Monkeys move virtual arm with their minds

6 Oct 2011, 10:10
Comment (1)

Remember the hit movie Avatar, where the human brain alone could control a lifelike hybrid body, seeing what it sees and feeling what it feels?


Scientists at Duke University are one step closer to making that concept a reality, with important applications for medicine. They have developed a system through which a monkey can control a virtual arm with its brain and also feel sensations from the appendage.


The ultimate goal is to build a robotic body suit controlled entirely by brain activity, which will provide tactile feedback to the wearer, says Dr. Miguel Nicolelis, study co-author and neuroscientist at Duke University.


This could potentially enable quadriplegic individuals and people with locked-in syndrome to move, walk and feel textures with robotic hands and feet. "In essence, we are going to provide a new body to these patients," Nicolelis said. "It’s almost like a whole-body vest, but the vest is going to carry the patient’s body."


The new study, published in the journal Nature, represents a milestone in brain-machine interfaces: it demonstrates that motor signals can be decoded from the brain at the same time that sensory feedback is delivered back to it, said Sliman Bensmaia, an assistant professor at the University of Chicago who was not involved in the research.


The experiment involved implanting hairlike filaments that act as sensors into the brain, about 3 to 4 millimeters deep. In monkeys, these implants have been shown to work for seven years without any complications, suggesting they are safe to use in humans, Nicolelis said.


In this brain-machine-brain interface, signals from the animal's brain were sent to the avatar arm, and a feedback signal was then sent back to the animal's brain.


The monkeys saw the avatar arm and three targets that appeared visually identical, but had different associated textures that could be felt with the virtual hand. A signal sent to their brains indicated a texture, which the monkeys used to determine the corresponding target.


Using only their minds, they directed the virtual arm to differentiate targets according to their texture. One monkey chose correctly more than 85% of the time, the other more than 60%.
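The closed loop described above — decode where the hand moves from motor-cortex activity, then return a texture-coded stimulation pattern from whichever target the hand lands on — can be sketched in a few lines. Everything below is illustrative: the function names, the pulse patterns, and the toy decoder are assumptions for the sketch, not the study's actual algorithms.

```python
# Each target looks identical on screen but carries a distinct
# "texture", encoded here as a different temporal pattern of
# stimulation pulses delivered back to sensory cortex.
TEXTURE_PATTERNS = {
    "reward":  [1, 0, 1, 0, 1, 0],   # high-frequency pulse train
    "decoy_a": [1, 0, 0, 1, 0, 0],   # lower-frequency train
    "decoy_b": [0, 0, 0, 0, 0, 0],   # no stimulation
}

def decode_hand_position(neural_activity):
    """Stand-in for the motor decoder: pick the target whose
    associated channel shows the strongest drive (real decoders
    regress continuous hand kinematics from many channels)."""
    return max(range(len(neural_activity)), key=neural_activity.__getitem__)

def stimulation_for(target_name):
    """Feedback leg of the loop: the pulse pattern sensory cortex
    receives while the virtual hand rests on this target."""
    return TEXTURE_PATTERNS[target_name]

def trial(neural_activity, target_names):
    """One trial: decode which target the hand moves to, then
    deliver that target's texture feedback."""
    idx = decode_hand_position(neural_activity)
    name = target_names[idx]
    return name, stimulation_for(name)

targets = ["decoy_a", "reward", "decoy_b"]
name, pattern = trial([3, 9, 2], targets)  # strongest drive toward target 1
print(name, pattern)
```

The point of the sketch is the round trip: the same trial both reads from the brain (the decode step) and writes back to it (the stimulation pattern), which is what distinguishes this work from motor-only interfaces.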


The monkeys did not touch these objects with their real skin, or move their actual bodies. Sounds like Avatar, although Nicolelis points out that his group came up with the idea first, publishing preliminary research in Scientific American in 2002.


Nicolelis's team previously demonstrated this concept in 2008, when his group got a rhesus monkey in North Carolina to mentally control the walking patterns of a robot in Japan.


Researchers implanted electrodes into the monkey's brain and, while the monkey walked on a treadmill, the electrodes recorded responses from the brain's sensory and motor cortex, and sensors on the monkey's leg tracked walking patterns.


All of this data helped predict how fast the monkey's legs moved and the stride length. Meanwhile, in Japan, a robot received that information and began moving in sync with the monkey in real time, even though it was thousands of miles away.
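The prediction step described above — mapping recorded firing rates onto walking parameters such as leg speed — is, at its simplest, a linear decoding problem: find a weight per neuron so that a weighted sum of firing rates tracks the measured kinematics. The sketch below fits such weights to made-up data by gradient descent; the actual decoders operated on many more channels and used more sophisticated filters, so treat every number and name here as illustrative.

```python
def fit_weights(rates, speeds, lr=0.01, epochs=2000):
    """Fit per-neuron weights by stochastic gradient descent on the
    squared error between predicted and observed leg speed."""
    n = len(rates[0])
    w = [0.0] * n
    for _ in range(epochs):
        for x, y in zip(rates, speeds):
            err = sum(wi * xi for wi, xi in zip(w, x)) - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
    return w

def predict(w, x):
    """Decoded speed = weighted sum of the current firing rates."""
    return sum(wi * xi for wi, xi in zip(w, x))

# Made-up training data: firing rates of 3 neurons alongside the
# leg speed measured by the sensors on the monkey's leg.
rates  = [[1.0, 0.5, 0.2], [0.8, 1.2, 0.1],
          [0.2, 0.4, 1.5], [1.1, 0.9, 0.3]]
speeds = [0.74, 0.90, 0.56, 0.97]

w = fit_weights(rates, speeds)
print(round(predict(w, [1.0, 1.0, 0.5]), 2))
```

Once fitted, the weights can be applied to new neural activity in real time, which is what allowed the decoded kinematics to be streamed to the robot in Japan as the monkey walked.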


Most astonishingly, the monkey continued transmitting these signals to make the robot walk for a few minutes after researchers turned off the treadmill.


The next challenge in this area of research is to come up with more sophisticated algorithms for sensory feedback, Bensmaia said in an e-mail. The patient must receive all of the information from a robotic arm in an intuitive fashion.


"Ideally, the sensations evoked by the neuroprostheses would resemble those evoked by the native limb. Otherwise, patients will receive a barrage of signals from the arm which may serve to confuse rather than assist them," Bensmaia said.


Nicolelis hopes to demonstrate a full-body robotic suit at the 2014 FIFA World Cup in his home country of Brazil, by having a quadriplegic child walk onto the field and deliver the opening kickoff.


His team's research is part of the Walk Again Project, an international consortium of researchers looking to restore movement in people who can't move their limbs.


"In the next decade, with demonstrations like this and a few others that are coming in the next months, we are going to see the emergence of these new neuroprosthetic devices, controlled by thought alone," he said. "It's going to be a pretty major development in rehabilitation medicine."

  • Jane Alonso

    Wow, that is truly fantastic. This nanotechnology is a big step in the future of medicine, science and humanity )

    06 Oct 2011 at 10:46