Monkey Brain Control: The Future of Robotic Prostheses?
File this under "the future is now": in a series of experiments at Duke
University Medical Center, researchers fitted two monkeys with
electrodes in their brains and trained them to move a virtual arm across
a computer screen to grab virtual objects and "feel" their different
textures -- all using only their brains. It's the first demonstration of
what the researchers call a brain-machine-brain interface (BMBI).
The potential is obviously enormous. The technology could help people
with paralysis control prosthetic limbs just by thinking about it, and
even let them experience intuitive tactile sensations.
"Someday in the near future, quadriplegic patients will take advantage of this technology not only to move their arms and hands and to walk again, but also to sense the texture of objects placed in their hands, or experience the nuances of the terrain on which they stroll with the help of a wearable robotic exoskeleton," said the study's lead researcher, Dr. Miguel Nicolelis, a professor of neurobiology at Duke University Medical Center and co-director of the Duke Center for Neuroengineering.
Over the past few years, Nicolelis has been working on perfecting brain-machine interfaces, devices that allow individuals to, say, move a robotic arm by controlling it with their brain. The problem with these devices, however, is that the flow of information goes in only one direction, from the brain to the machine. Brain-machine interfaces don't deliver feedback from machine to brain -- which is what's necessary for that all-important sense of touch.
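To make that one-way flow concrete, here is a minimal Python sketch. Everything in it is an assumption for illustration only: the channel count, the linear decoder, and the simulated firing rates stand in for whatever the real systems use.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CHANNELS = 96                       # hypothetical electrode count, not the study's
W = rng.normal(size=(2, N_CHANNELS))  # stand-in decoder weights; real systems fit these to data

def decode_velocity(firing_rates):
    """Brain -> machine: map motor-cortex firing rates to a 2-D arm velocity.
    Note the missing return path: the arm's state never reaches the brain."""
    return W @ firing_rates

rates = rng.poisson(lam=5.0, size=N_CHANNELS).astype(float)  # simulated spike counts
vx, vy = decode_velocity(rates)  # the machine moves; the brain feels nothing back
```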
Enter the BMBI. For the new study, reported in Nature, Nicolelis' team implanted two sets of electrodes in monkeys' brains: one in the motor cortex, which controls movement, and another in the somatosensory cortex, which processes the sense of touch.
The electrodes simultaneously stimulated and received neural activity: in the motor cortex, they translated the activity of nearby motor neurons to figure out how the monkey wanted to move the virtual arm; in the somatosensory cortex, the electrodes stimulated neurons to deliver a sensation of texture about the virtual objects the animal was "touching."
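In rough pseudocode terms, the loop now runs in both directions. The sketch below extends the one-way decoder above with a return path; the pulse-train encoding of texture is an invented scheme for illustration, not the study's actual stimulation method.

```python
import numpy as np

rng = np.random.default_rng(1)
N_MOTOR = 96                       # hypothetical motor-cortex channel count
W = rng.normal(size=(2, N_MOTOR))  # stand-in linear decoder, as before

def decode_intent(motor_rates):
    """Brain -> machine: translate motor-cortex activity into arm velocity."""
    return W @ motor_rates

def texture_to_pulse_train(pulse_hz, duration_s=0.1):
    """Machine -> brain: encode a virtual texture as a train of
    microstimulation pulses on the somatosensory electrodes.
    Mapping textures to pulse frequencies is an assumed scheme."""
    n_pulses = int(pulse_hz * duration_s)
    return np.linspace(0.0, duration_s, n_pulses)  # pulse times, in seconds

# One tick of the brain-machine-brain loop:
motor_rates = rng.poisson(lam=5.0, size=N_MOTOR).astype(float)
velocity = decode_intent(motor_rates)                 # move the virtual arm
pulse_times = texture_to_pulse_train(pulse_hz=200.0)  # let the brain "feel" the object
```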
Together, the two sets of electrodes allowed communication from the brain to the computer and back to the brain again, mimicking the natural brain-body feedback loop. The researchers tested the BMBI by presenting the electrode-fitted monkeys with three objects that looked identical on the screen, but "felt" different when touched by the virtual limb. The goal was to get the monkeys to touch the object that they sensed was different by moving the virtual arm to it -- again, using only their brains.
When they virtually touched the right object, they were rewarded with juice. After a few exploratory attempts, the animals began consistently selecting the target object. And their success wasn't random: the monkeys kept picking the correct object even when the positions of the identical-looking objects were switched.
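The logic of that control is easy to simulate. In the hypothetical sketch below (none of the names or numbers come from the study), an agent that picks by felt texture stays near 100% reward no matter how the objects are shuffled, while an agent that always picks the same position sits near chance, one in three.

```python
import random

def run_trial(pick):
    """One trial: three visually identical objects, one carrying the
    rewarded texture; positions are shuffled every trial, so position
    alone cannot predict the right answer."""
    textures = ["target", "distractor", "distractor"]
    random.shuffle(textures)              # the position switch
    choice = pick(textures)               # the agent "feels" each texture
    return textures[choice] == "target"   # juice reward if correct

texture_agent = lambda textures: textures.index("target")  # uses felt texture
position_agent = lambda textures: 0                        # always picks the same spot

trials = 1000
print(sum(run_trial(texture_agent) for _ in range(trials)) / trials)   # ~1.0
print(sum(run_trial(position_agent) for _ in range(trials)) / trials)  # ~0.33
```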
As ScienceNow reported:
"It's definitely a milestone in brain-computer interfaces," says neuroscientist Sliman Bensmaia of the University of Chicago, who is developing touch-feedback systems for human prosthetics. Too many of the robotic arms now being developed, even very advanced ones, he says, ignore the importance of touch. "Sensory feedback is critical to doing anything," he says. Even mundane tasks like picking up a cup require a great deal of concentration so the wearer does not drop or crush it.
Nicolelis' next goal is to create a prototype for a brain-controlled robotic exoskeleton in time to debut at the 2014 World Cup in the researcher's native Brazil. A huge soccer fan, Nicolelis dreams of having a Brazilian paralysis patient walk onto the field to deliver the opening kick. Stay tuned to see whether Nicolelis and his international consortium of research centers can turn science fiction into fact.
"Someday in the near future, quadriplegic patients will take advantage of this technology not only to move their arms and hands and to walk again, but also to sense the texture of objects placed in their hands, or experience the nuances of the terrain on which they stroll with the help of a wearable robotic exoskeleton," said the study's lead researcher, Dr. Miguel Nicolelis, a professor of neurobiology at Duke University Medical Center and co-director of the Duke Center for Neuroengineering.
Over the past few years, Nicolelis has been working on perfecting brain-machine interfaces, devices that allow individuals to, say, move a robotic arm by controlling it with their brain. The problem with these devices, however, is that the flow of information goes in only one direction, from the brain to the machine. Brain-machine interfaces don't deliver feedback from machine to brain -- which is what's necessary for that all-important sense of touch.
Enter the BMBI. For the new study, reported in Nature, Nicolelis' team implanted two sets of electrodes in monkeys' brains: one in the motor cortex, which controls movement, and another in the somatosensory cortex, which processes the sense of touch.
The electrodes simultaneously stimulated and received neural activity: in the motor cortex, they translated the activity of nearby motor neurons to figure out how the monkey wanted to move the virtual arm; in the somatosensory cortex, the electrodes stimulated neurons to deliver a sensation of texture about the virtual objects the animal was "touching."
This allowed communication from the brain to the computer and back to the brain again, mimicking the natural brain-body feedback loop. The researchers tested the BMBI by presenting the electrode-fitted monkeys with three objects that looked identical on the screen, but "felt" different when they were touched by the virtual limb. The goal was to get the monkeys to touch the object that they sensed was different by moving the virtual arm to it -- again, using only their brains.
When they virtually touched the right object, they were rewarded with juice. After a few exploratory attempts, the animals began consistently selecting the target object. Their ability to distinguish texture was proven not to be random -- the monkeys got it right even when the position of the identical objects was switched.
Reported ScienceNow:
"It's definitely a milestone in brain-computer interfaces," says neuroscientist Sliman Bensmaia of the University of Chicago, who is developing touch-feedback systems for human prosthetics. Too many of the robotic arms now being developed, even very advanced ones, he says, ignore the importance of touch. "Sensory feedback is critical to doing anything," he says. Even mundane tasks like picking up a cup require a great deal of concentration so the wearer does not drop or crush it.
Nicolelis' next goal is to create a prototype for a brain-controlled robotic exoskeleton in time to debut at the 2014 World Cup in the researcher's native Brazil. A huge soccer fan, Nicolelis' fantasy is to allow a Brazilian paralysis patient to walk onto the field to deliver the opening kick. Stay tuned to see if Nicolelis and his international consortium of research centers can turn science fiction into fact.