Scientists control robot with mind
18 Jan 2007 by Evoluted New Media
In a development that would not be out of place on the big screen, scientists have created a way to control a robot by using signals from a human brain.
The US research team has demonstrated that an individual can “order” a robot to move to specific locations and pick up specific objects simply by generating brain signals that reflect the intended instruction. The match between the thought command and the robot’s movements was 94% accurate.
“This is really a proof-of-concept demonstration,” said Rajesh Rao, a researcher from the University of Washington who leads the project. “It suggests that one day we might be able to use semi-autonomous robots for such jobs as helping disabled people or performing routine tasks in a person's home.”
The controlling individual - in this case a graduate student in Rao’s lab - wears a cap dotted with 32 electrodes. The electrodes pick up brain signals from the scalp using a technique called electroencephalography (EEG). The person watches the robot's movements on a computer screen via two cameras, one mounted on the robot and another above it.
When the robot’s camera sees the objects that are to be picked up, it passes the information to the user's computer screen, where each object lights up in random order. When the object the person wants happens to light up, the brain registers a flicker of surprise; the computer detects this activity and relays the chosen object to the robot, which then picks it up.
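The selection scheme described above can be illustrated with a toy simulation. This is only a sketch under loose assumptions - the object names, signal amplitudes, and scoring logic below are invented for illustration and are not the team's actual software - but it captures the idea: objects are highlighted in random order, the brain signal spikes with "surprise" when the intended object lights up, and the system picks the object whose highlights coincided with the strongest responses.

```python
import random

def simulated_eeg_response(highlighted, target):
    """Mock signal amplitude: a noisy baseline, plus a larger
    'surprise' component when the highlighted object is the one
    the user actually wants. (Amplitudes are invented values.)"""
    baseline = random.gauss(1.0, 0.2)
    surprise = random.gauss(3.0, 0.3) if highlighted == target else 0.0
    return baseline + surprise

def select_object(objects, target, rounds=5):
    """Highlight every object once per round in random order,
    accumulate the simulated responses, and choose the object
    with the strongest total response."""
    scores = {obj: 0.0 for obj in objects}
    for _ in range(rounds):
        for obj in random.sample(objects, len(objects)):
            scores[obj] += simulated_eeg_response(obj, target)
    return max(scores, key=scores.get)

objects = ["block", "ball", "cup"]
print(select_object(objects, target="cup"))
```

Averaging over several highlight rounds is what makes this robust to a noisy signal: a single spurious spike cannot outweigh the repeated surprise responses tied to the intended object.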
“One of the important things about this demonstration is that we're using a 'noisy' brain signal to control the robot,” Rao says. “The technique for picking up brain signals is non-invasive, but that means we can only obtain brain signals indirectly from sensors on the surface of the head, and not where they are generated deep in the brain. As a result, the user can only generate high-level commands such as indicating which object to pick up or which location to go to, and the robot needs to be autonomous enough to be able to execute such commands.”
Rao’s team has plans to extend the research to use more complex objects and equip the robot with skills such as avoiding obstacles in a room. This will require more complicated commands from the “master’s” brain and more autonomy on the part of the robot.