Monday, October 10, 2011

A Virtual Arm That Talks With The Brain

Scientists have created a virtual arm that monkeys can move with their thoughts—and the arm can send information back to the brain about the textures of what it touches. Neuroscientist Miguel Nicolelis talks about how this may lead to a full-body suit that helps paralyzed people move and walk again.

IRA FLATOW, host: You're listening to SCIENCE FRIDAY. I'm Ira Flatow.
Picking up a wineglass may seem like a simple task, but there's a lot of nervous processing going on between your fingers and your brain, you know. How do you know how gently to hold it so you don't crush the glass in your hand, or how tightly so it doesn't slip to the floor? Your fingers send signals of touch back to your brain so you can feel the glass, and the brain responds, and there's a lot of interesting coordination going on there. But what if you had a prosthetic hand that cannot feel? Life gets a little tougher. Scientists are working to solve this problem using mind control, and what they're working on, in this case, is a virtual arm for monkeys.
Miguel Nicolelis is a neuroscientist at Duke University in North Carolina and director of the Center for Neuroengineering there. He's also a co-author on a study published this week in the journal Nature about the new virtual limb he and his colleagues created. Welcome back to SCIENCE FRIDAY, Dr. Nicolelis.
Dr. MIGUEL NICOLELIS: Oh, thank you for having me. It's a great pleasure.
FLATOW: Tell us - you're welcome. Tell us about what you had these monkeys do.
NICOLELIS: Well, basically, we trained these animals to use their own brain activity to control an avatar arm and hand that could explore virtual objects that we created for them. And to select the correct object, the one that, you know, led them to receive a juice reward, they had to basically explore the texture of these objects using this virtual hand, these virtual fingers, because every time they touched an object, we were sending an electrical message back directly to their brains that described its texture. So they basically could do this entire tactile discrimination task just by using their brains, with no interference of their bodies whatsoever. And they actually learned to do that.
FLATOW: So they used mind control to control something, a pointer on a computer that would touch the objects?
NICOLELIS: Well, it was a real virtual arm, you know...
FLATOW: It was a real arm.
NICOLELIS: ...like a monkey arm.
FLATOW: Right.
NICOLELIS: And, basically, we learned later that the monkeys very likely assumed that that was their own arm, because when we touched this virtual arm with, you know, virtual probes, the tactile part of their brains responded to the touch, as if we had touched their own body. So it seems like they were assuming, or assimilating, the virtual arm as part of their own body. So they were able to use their brain activity to move it, and the signals that we sent back to their brains to interpret what they had encountered in that virtual space.
FLATOW: And how much fine motor coordination could you get there, or was this just a proof of concept?
NICOLELIS: Well, it was pretty good. It was, you know, an evolution from what we have done since we started this field about 12 years ago. Now they can, you know, move in 2-D, and we are working on 3-D. And they can actually explore the objects, touch them, and press the correct object, assuming that they detected the correct texture. And every time they did that, they got a drop of fruit juice, which they pretty much enjoy.
FLATOW: And so they had to be trained how to do this.
NICOLELIS: Yes, they had to be trained. But what really shocked us was that in four to nine sessions, four to nine days, they actually learned the whole thing, to interpret these signals and to associate them with particular objects.
FLATOW: Hmm. And the wires that go from the brain to the virtual arm, do you need to hook these up to a specific part of the brain to get the right signals?
NICOLELIS: Well, not exactly that specific. You know, we used to think that each part of the brain does just one function, but we are discovering, through, you know, a decade of this type of research, that messages, like motor messages, are distributed across multiple areas of the cortex of these animals. So in this case, we sampled one of these areas, the motor cortex, to basically generate the motor commands that we needed to move the avatar arm, and we delivered the tactile feedback message to the somatosensory cortex, one of the areas involved in processing normal touch.
But what really surprised us is that we basically established this brain-machine-brain interface without any interference from the animal's body. And in a few sessions, the animal just basically took that for granted and performed the task just fine.
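The closed loop described here, decoding motor cortex activity into avatar movement while encoding virtual textures as microstimulation of somatosensory cortex, can be sketched as a toy simulation. Everything below is hypothetical: the unit count, the random decoder weights, the 100-millisecond bin, and the pulse-train encoding are illustrative stand-ins, not the actual parameters used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 96 recorded motor cortex units, a 2-D avatar.
N_UNITS, DIM = 96, 2

# Linear decoder weights. These are random stand-ins; a real decoder
# would be fit to calibration data recorded while the monkey moves.
W = rng.normal(scale=0.001, size=(N_UNITS, DIM))

def decode_velocity(spike_counts):
    """Motor side of the loop: one bin of spike counts -> avatar velocity."""
    return spike_counts @ W

def texture_to_pulse_train(period_ms, duration_ms=100.0):
    """Sensory side of the loop: encode a texture as pulse times within
    one bin. A coarser texture gets a sparser pulse train."""
    return np.arange(0.0, duration_ms, period_ms)

# One cycle of the closed loop.
position = np.zeros(DIM)
spikes = rng.poisson(lam=5.0, size=N_UNITS)    # one 100-ms bin of activity
position = position + decode_velocity(spikes)  # move the avatar

# While the avatar touches an object, a train like this would be
# delivered to somatosensory cortex as microstimulation.
pulses = texture_to_pulse_train(period_ms=20.0)
```

The key point the sketch illustrates is that the loop never passes through the body: spike counts go out one side, stimulation pulses come back in the other.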
FLATOW: Wow. So they quickly picked this up.
NICOLELIS: They quickly picked it up, yes.
FLATOW: Can you explain why that might be?
NICOLELIS: Well, one of the theories that we have is that as primates, as we use tools to basically increase our reach in the world, the brain actually incorporates these tools as an extension of the body representations that the brain has. So every primate brain creates a body image, this sense that we inhabit a particular body.
But, you know, we think that this is a flexible model, a flexible representation that can include or assimilate even the tools that we use in our daily routines. So our cars, our baseball bats, our soccer balls, everything that we use as a tool becomes part of us for the brain. So we think that the results that we got in this study support that vision: that the avatar arm was assimilated by the brain as if it were an extension of the animal's own body.
FLATOW: Of course, everyone hearing your report and listening to us now is going to say: Do you think people could learn to use a virtual arm like this?
NICOLELIS: Oh, absolutely. In fact, we want to go even further. We are hoping that this research can help people who suffer severe lesions of the spinal cord, lesions that leave patients severely paralyzed from that level down. They will be able to use this brain-machine-brain interface to control a whole-body robotic vest, an exoskeleton that will restore full-body mobility to these patients, and also restore the ability to sense what they encounter as they move around in the world.
FLATOW: And what kind of prosthetics can you make, then, for them? Would they be incorporating the touch and the feel and the movement in - all into one?
NICOLELIS: Yes. We're already, actually, building a prototype of this exoskeleton, this vest. Our colleagues at the Technical University of Munich, led by my good friend Gordon Cheng, are already building a prototype of this exoskeleton that we hope to test in clinical trials very soon.
FLATOW: Mm-hmm. Is that the Walk Again Project?
NICOLELIS: Yes. That's part of this international consortium between Duke, the Technical University of Munich, the Polytechnic School in Lausanne and the Natal Institute of Neuroscience in Brazil, where we hope to basically put all this technology together to create a whole-body prosthetic device that may allow quadriplegic and paraplegic patients to regain mobility and the sense of touch as they interact through a brain-machine-brain interface.
FLATOW: And so what - how many neurons do you need to measure for a full bodysuit?
NICOLELIS: Yeah, that's a very good question. We are crossing, at this moment, the barrier of a thousand neurons recorded simultaneously. So we are getting very close to recording the activity, the electrical signals, produced by 1,000 neurons. We expect that between 1,000 and 10,000 neurons will allow us to start controlling fundamental movements of this exoskeleton. So that's our goal for the next three years, to reach about 10,000 neurons simultaneously.
FLATOW: And it's possible. I imagine you're confident. Yeah.
NICOLELIS: Yeah. We have a new family of sensors that we are developing, 3-D sensors to record brain electrical activity. We hope to reach this level in the next few years.
FLATOW: You know, I've heard the noise of neurons firing, and, you know, they sound like static electricity.
NICOLELIS: Yes.
FLATOW: How do you make sense of all of that?
NICOLELIS: Well, there are messages embedded in the noise. In fact, one of my sons says that when he listened to these brainstorms that we recorded from hundreds of neurons, it sounded like popcorn in a microwave while you listen to an AM station that is not well-tuned.
(SOUNDBITE OF LAUGHTER)
NICOLELIS: But there is a message in there, and we can actually extract these motor commands very quickly using, actually, very simple mathematical models that run at the same time scale that our brains actually produce movements out of these electrical storms.
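The "very simple mathematical models" mentioned here can be illustrated with a linear decoder fit by ordinary least squares, mapping binned spike counts to 2-D hand velocity. This is only a sketch on simulated data, assuming a linear spikes-to-velocity relationship; the sizes, noise level, and "true" weights exist only in this simulation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated calibration data: T time bins of spike counts from N units,
# paired with the 2-D hand velocity observed in each bin. W_true is the
# hidden relationship this toy example plants in the data.
T, N = 2000, 50
W_true = rng.normal(size=(N, 2))
X = rng.poisson(lam=4.0, size=(T, N)).astype(float)   # spike counts
Y = X @ W_true + rng.normal(scale=0.5, size=(T, 2))   # noisy velocities

# Fit the decoder with ordinary least squares: minimize ||X W - Y||^2.
W_fit, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Decoding a new bin of activity is then a single matrix-vector
# product, cheap enough to run at the brain's own time scale.
new_counts = rng.poisson(lam=4.0, size=N).astype(float)
velocity = new_counts @ W_fit
```

The design point is that the expensive step, fitting, happens once offline; the online step that has to keep up with the brain is just one small matrix multiply per bin.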
FLATOW: 1-800-989-8255. Let's get some phone calls in. Rick in Elizabethtown, Kentucky. Hi, Rick.
RICK: Hi. How are you today?
FLATOW: Hi, there.
RICK: Yeah. Listen, in 1973, while I was still in the Army, I was part of a research program the Army was doing on single-muscle motor coordination. They would actually insert a barbed wire into certain muscles of the body, and with the use of a computer, we were required to move that single muscle and isolate it just using our brain. We went through this for about a six-month period, and everybody was doing pretty well. And my comment is, with the effort that was put forth back then, I'm really surprised that we haven't been able to, if you will, finalize this. I'm surprised it hasn't accelerated any faster than it has.
FLATOW: Almost 40 years later, why haven't we gotten any further? Miguel?
NICOLELIS: Well, it's interesting that you mention that, because one of the first things I did in grad school was - my thesis adviser actually made me do the same test, and I was amazed myself that I could control a single muscle and do that kind of experiment. Well, the technology to record large-scale brain activity the way we do now only started to be developed really in the mid-'80s. And we only reached, you know, 100 neurons recorded simultaneously in the late '90s.
And so from the late '90s to now, we discovered that we could link brains to machines using, you know, computer devices that are, you know, cheaply available now, and we could actually perform the kind of experiments that we reported this week. So it took a while to be able to actually get the kind of brain-derived signal that you need to run this type of experiment. It's much more difficult to do that than to record the electrical signals from muscles. So that was one of the major reasons it took so long for us to get to this point.
FLATOW: It's one thing to probe a monkey's brain with wires. How do you expect to do that with people, where you can't do that?
NICOLELIS: Well, no. You can. In fact, we do already. There are several patients - there are thousands of patients, tens of thousands of patients, who carry a stimulator either in the brain or in the periphery, in the inner ear, to restore neurological functions or to control diseases like Parkinson's disease. Basically, these electrodes, these sensors, will be tested, you know, in patients. The benefit that you get is pretty high compared to, you know, what the patient is suffering right now in terms of a lesion or a handicap.
So we think that the technology is safe, and at this level, we will be able to provide to patients a relief and a benefit that, you know, justifies a chronic implant on the surface of the brain.
FLATOW: I'm Ira Flatow. This is SCIENCE FRIDAY, from NPR.
Talking with Miguel Nicolelis about his experiments. Now, a lot of people are tweeting and asking whether or not you cut the limbs off these monkeys. You didn't do that.
NICOLELIS: No. No. No, no. You don't need to do anything to the monkeys. The monkeys are actually pretty happy. And now that we have a wireless interface to broadcast these signals, these brain-derived signals, the monkeys really can move freely and actually enjoy playing with the video games. They actually get a lot of pleasure playing these games.
FLATOW: So even though you don't know what it feels like to them to get this stimulation back, they must like it.
NICOLELIS: Oh, they certainly like it. And besides, every time they select the correct object - and they reach a very high level of proficiency after a few weeks of training - they get a drop of fruit juice. And any monkey will do anything for you for that fruit juice.
(SOUNDBITE OF LAUGHTER)
FLATOW: I think you'd have to up the ante for people.
NICOLELIS: Yes, yes. We have to change the reward a little bit.
(SOUNDBITE OF LAUGHTER)
FLATOW: Let's go to the phones again. Harvey in Oakland. Hi, Harvey. Harvey, are you there? No. He's not. I guess one of the things he was going to ask about is possible military uses of these. People are wondering all the time: if you can control a robot by thinking about it, could you make, for example, a bomb-sniffing robot, so that humans don't have to go in there, and control it with your own brainwaves?
NICOLELIS: Yeah. Well, I have no military application in my research. You know, we are all involved in rehabilitation medicine. But several scientists in the U.S. have proposed ideas, for instance, to have robots do exactly what he said, to go into environments that are too dangerous for people, like, for instance, as we had recently, the nuclear accidents in Japan. The idea would be that you would send robots to do the job of fixing the place, with no need to send humans, because the risk was so severe, so dangerous.
So there are some ideas about that, but they are far from, you know, doing anything concrete, because, of course, to get the best level of motor control, the kind that we get in our experiments, you have to implant these sensors, microchips, a few millimeters inside the brain. And that will never be justified for an application like that in a regular person who doesn't have a severe disability.
FLATOW: Let's go to Maggie in Stamford, Connecticut. Hi, Maggie.
MAGGIE: Hi. I was wondering if or how you plan on incorporating proprioception into this robotic arm, so where it is in space.
FLATOW: Yeah, three-dimensional.
NICOLELIS: Yes. Well, we want to test whether the same technique that we used here, electrical microstimulation of the cortex, can be applied to other sensory channels to restore other sensory modalities, like temperature or proprioception. That's part of the studies that we are starting to carry out at Duke right now. So this was the proof of concept that we could close this loop and get these animals to feel something, to interpret these signals that are related to texture, to fine touch.
But now, you know, the door is open to test other possibilities. And, in fact, we are even testing in rodents, in rats, other modalities that are not even related to the regular modalities that we all share. So we are trying to see if, with this, our rats can sense physical parameters like infrared light, for instance, which rats normally don't perceive. So it opens a lot of new avenues for this type of research.
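One simple way to feed a signal the animal cannot natively sense, such as infrared light, into the brain is a rate code: the sensor reading sets the microstimulation pulse rate. The sketch below is purely illustrative; the function name, the normalized 0-to-1 sensor range, and the 300 Hz ceiling are assumptions, not values from the rat experiments.

```python
def infrared_to_pulse_rate(ir_reading, max_rate_hz=300.0):
    """Map a normalized infrared reading (0.0 to 1.0) to a
    microstimulation pulse rate: stronger signal -> faster pulses.
    The 300 Hz ceiling is an illustrative cap, not a real spec."""
    ir_reading = min(max(ir_reading, 0.0), 1.0)  # clamp to sensor range
    return ir_reading * max_rate_hz

# As a rat approaches an infrared source, the felt pulse rate rises:
rates = [infrared_to_pulse_rate(x) for x in (0.0, 0.5, 1.0)]
```

The animal's job, as with texture, is to learn what the changing pulse rate means, which is exactly the kind of association the monkeys picked up in a few sessions.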
FLATOW: Thanks, Maggie.
MAGGIE: Thanks.
FLATOW: That's about all the time we have. I want to thank you very much for taking the time, because, as you say, there are a lot of things in the future where you might apply this that we haven't even thought about.
NICOLELIS: Well, I appreciate it. Thank you very much for the invitation.
FLATOW: You're welcome. Miguel Nicolelis is a neuroscientist at Duke University in North Carolina and director of the Center for Neuroengineering there.
