Want proof that you don't need big, specialized equipment to produce a mind-controlled robot arm? Just look at a recent University of Toronto student project. Ryan Mintz and crew have created an arm that you control using little more than a brainwave-sensing headset (no longer a rarity) and a laptop. The team's software is smart enough to steer the arm using subtle facial gestures, such as clenching your jaw or winking; it also knows when you've relaxed.
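To picture how that kind of gesture-based control might work in practice, here's a minimal, hypothetical sketch that maps detected headset gestures to arm commands. The article doesn't describe the team's actual software or the headset's SDK, so every name below (GestureEvent, ArmController, GESTURE_TO_COMMAND, the 0.6 threshold) is an illustrative assumption rather than the students' implementation.

```python
# Hypothetical sketch only: maps facial-gesture detections to arm commands.
# The real project's software and headset interface are not described in the article.

from dataclasses import dataclass


@dataclass
class GestureEvent:
    """A single facial-gesture detection reported by the headset."""
    kind: str          # e.g. "jaw_clench", "wink_left", "wink_right", "relaxed"
    strength: float    # detection confidence in [0.0, 1.0]


class ArmController:
    """Stand-in for whatever interface actually drives the physical arm."""
    def move(self, direction: str) -> None:
        print(f"arm: move {direction}")

    def stop(self) -> None:
        print("arm: stop")


# Each recognized gesture becomes an arm command; relaxing halts the arm.
GESTURE_TO_COMMAND = {
    "jaw_clench": "forward",
    "wink_left": "left",
    "wink_right": "right",
}

THRESHOLD = 0.6  # ignore weak detections to avoid accidental movement


def dispatch(event: GestureEvent, arm: ArmController) -> None:
    if event.kind == "relaxed":
        arm.stop()
    elif event.strength >= THRESHOLD and event.kind in GESTURE_TO_COMMAND:
        arm.move(GESTURE_TO_COMMAND[event.kind])


if __name__ == "__main__":
    arm = ArmController()
    # Simulated stream of gesture events in place of a live headset feed.
    for ev in [GestureEvent("jaw_clench", 0.8),
               GestureEvent("wink_left", 0.4),   # below threshold, ignored
               GestureEvent("wink_right", 0.9),
               GestureEvent("relaxed", 1.0)]:
        dispatch(ev, arm)
```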
The hardware is designed primarily with mobility and prosthetic limbs in mind; the current gesture system could steer a wheelchair, for example. In the long run, the students hope to improve the accuracy to the point where merely thinking about an action is enough to get it done; unlike some rival approaches, there would be no need for a physical connection to muscles or the nervous system. The University of Toronto effort still faces stiff competition, but it shows that quadriplegics and others with limited body control could eventually claim some independence through easily accessible (and hopefully affordable) technology.
Filed under: Robots, Wearables
Source: University of Toronto