Reporting in Nature, researchers write that two individuals, both paralyzed by stroke, made reach-and-grasp movements using a thought-controlled robotic arm. One participant was even able to sip a drink by herself. Neuroengineer Dr. Leigh Hochberg discusses the paper and the ongoing trial.
Copyright © 2012 National Public Radio®. For personal, noncommercial use only. See Terms of Use. For other uses, prior permission required.
IRA FLATOW, HOST:
(Unintelligible) at the beginning of the program about Cathy Hutchinson not having been able to drink anything without the help of caregivers for 15 years. She was paralyzed from the neck down. But she's very famous, very famous this week, because thanks to new technology described in the journal Nature, she took a very famous sip of coffee this week. You probably saw it on television or the Internet.
And she was able to do this just by thinking about it. This is part of an ongoing clinical trial called BrainGate II, and last April, when she imagined picking up a coffee thermos and taking a drink, a robotic arm read her mind, and it did the rest. And Cathy had - Cathy and a second stroke victim also used thought-controlled robotic arms to reach and grasp for targets.
How long before this can be used outside the lab? We all want to know that. What improvements need to be made before it's used for everyday tasks? Dr. Leigh Hochberg is a critical care neurologist at Massachusetts General Hospital, a researcher at the Providence VA Medical Center, and an associate professor of engineering at Brown. Leigh Hochberg is also co-author of a paper in Nature and director of the BrainGate II trial. Welcome to SCIENCE FRIDAY.
LEIGH HOCHBERG: Thanks very much, Ira.
FLATOW: How are you handling all this publicity?
HOCHBERG: Well, I'm very pleased with what our two participants were able to achieve, that's for sure.
FLATOW: And this, of course, never happens in a vacuum. This has been going on for - it's taken years, right, to get to where you are today.
HOCHBERG: I agree. The results that we describe in this issue of Nature are really the result of more than 40 years of public investment in fundamental, translational and clinical science - to get to the point that we understand enough about the brain, and enough about one particular part of the brain known as the motor cortex, that we can begin to use those signals, we hope, in a useful way for people with paralysis.
FLATOW: And tell us exactly the setup that was involved here.
HOCHBERG: Sure. So both of the participants in our ongoing trial, the two that we report in this particular paper, had brainstem strokes years ago. That left them tetraplegic - that is, unable to move their arms or their legs, and in addition, they were both unable to speak. So in many ways, they were locked in, had the same syndrome as Jean-Dominique Bauby, the gentleman who wrote the book "The Diving Bell and the Butterfly" simply by blinking his eyes.
Both of them enrolled in the trial, and there was a small chip of electrodes in what we refer to as an array, about four-by-four millimeters - that's about the size of a baby aspirin - that was placed right into the top of the motor cortex. Motor cortex sits right on the top of the brain, and for people without physical disabilities, it's an important part of the brain for the control of voluntary movement.
That little array of electrodes, about 96 electrodes on there, each of those electrodes can record from one or more individual neurons or brain cells and essentially listen in to the electrical activity that of course is the language of the nervous system, the language of the brain.
We can then, while recording that electrical neural activity, send it down through some wires to what we call a pedestal - that's really a little plug that protrudes up above the head. And then during the research sessions, we take a cable, plug that connector into some computers, and the job of those computers is to decode, as we call it, that neural activity - that is, to translate that person's intention to move into the control of an external device, such as a robotic arm.
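(A minimal sketch of what that "decoding" step can look like in its simplest linear form. The 96-channel count comes from the interview; the 100-millisecond bins, the weight matrix, and the 3-D velocity output are illustrative assumptions, not the BrainGate software.)

```python
# Illustrative linear neural decoder -- an assumption-laden sketch, not the trial's code.
import numpy as np

N_CHANNELS = 96          # electrodes in the array, per the interview
BIN_S = 0.1              # assumed 100 ms spike-count bins

def decode_velocity(spike_counts, W, b):
    """Map one bin of spike counts (96,) to a 3-D velocity command
    for an external device such as a robotic arm endpoint."""
    rates = spike_counts / BIN_S       # convert counts to firing rates (Hz)
    return W @ rates + b               # linear filter: velocity = W * rates + b

# Example: a stand-in (random) pre-calibrated filter driving one decode step.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.01, size=(3, N_CHANNELS))   # stand-in weights
b = np.zeros(3)
counts = rng.poisson(lam=2.0, size=N_CHANNELS)     # one bin of simulated spike counts
print(decode_velocity(counts, W, b))               # vx, vy, vz command
```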
FLATOW: All right, we're going to talk more with Leigh Hochberg about the robotic arm. Our number, 1-800-989-8255, if you'd like some information about it, if you'd like to talk about it. That's our number. You can also tweet us @scifri, @-S-C-I-F-R-I. Don't go away, we'll be right back after this break.
(SOUNDBITE OF MUSIC)
FLATOW: This is SCIENCE FRIDAY from NPR. I'm Ira Flatow, talking with Dr. Leigh Hochberg, who is a critical care neurologist at Mass General and researcher at Providence VA Medical Center, associate professor of engineering at Brown, and also the co-author of the paper in Nature and director of the BrainGate II trial, in which his subjects were able to think about moving and picking up objects, and one of them was even able to pick up a mug of coffee and sip out of a straw.
How much training did that take, Dr. Hochberg?
HOCHBERG: It didn't actually require much training, if any training at all, of the participants. What it did require was training or building of what we call the algorithm or the filter. We had to essentially train the computer to recognize each of our participants' brain activity that we were recording.
Essentially, we asked the participant to think about the movement of their own hand or to imagine the movement of their own arm and hand while we were recording that neural activity. We then create a map of how, for example, a robot arm's movement might be mapped to their own imagined movement of their own hand.
And once that map is built - that is, the filter is calibrated, then they begin to use the system.
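(A companion sketch of that calibration, or "filter building," step, assuming a generic ridge-regression fit from binned firing rates to the cued movement the participant was imagining. The trial's actual algorithm isn't specified here, so this is an illustrative stand-in, not the paper's method.)

```python
# Illustrative filter calibration by ridge regression -- an assumed stand-in method.
import numpy as np

def calibrate_filter(rates, intended_vel, ridge=1.0):
    """rates: (T, 96) binned firing rates recorded while the participant imagines moving.
    intended_vel: (T, 3) cued velocities toward targets during those same bins.
    Returns weights W (3, 96) and bias b (3,) for a linear decoder."""
    X = np.hstack([rates, np.ones((rates.shape[0], 1))])   # add a bias column
    A = X.T @ X + ridge * np.eye(X.shape[1])               # regularized normal equations
    coef = np.linalg.solve(A, X.T @ intended_vel)          # shape (97, 3)
    return coef[:-1].T, coef[-1]                           # W, b

# Example: fit from ~1 minute of simulated calibration data (600 bins of 100 ms).
rng = np.random.default_rng(1)
W, b = calibrate_filter(rng.poisson(2.0, size=(600, 96)).astype(float),
                        rng.normal(size=(600, 3)))
```

The fitted W and b would then plug into a decode step like the one sketched earlier.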
FLATOW: You know, it's funny that you say that you recorded how they think about moving their hand. We don't consciously think about it, do we?
HOCHBERG: That's exactly right, we don't. When somebody without a physical disability thinks about reaching up and picking up a coffee cup, there is no conscious thought of doing that. It just works. You just reach out and pick up that cup. And that's one of the nice features, if you will, of the motor cortex. Its job is to control an external device. It just happens that that external device is usually one's own arm and hand.
And one can do that while speaking. I'm waving my hand somewhat wildly - which nobody can see at the moment - and I can do that while I'm speaking, and that's why these powerful signals in the motor cortex might be useful as the basis for an assistive technology.
FLATOW: Well, as you say, an assistive technology - robotics to help people be more on their own. But I imagine that somewhere down the line, you'd like to be able to rejuvenate the arm itself, would you not?
HOCHBERG: Yeah, exactly. I think for people with locked-in syndrome, such as the two participants in the study - that is, no functional movement of their limbs and unable to speak - a comparatively simpler goal and one that we've published some work on over the past few years would be the simple control of a computer cursor on a screen.
And if one could provide point-and-click control over a cursor 24 hours a day, seven days a week, that would certainly be a useful assistive technology for somebody with severely limited communication.
But just as you said, for somebody with paralysis, somebody who's unable to move their arms and their legs, the real dream for the research is to one day reconnect brain to limb - to take those signals out of the brain, to route them back down to the peripheral nerves in the arm, to stimulate those nerves and to have somebody use their own arm and hand to pick up that coffee again.
FLATOW: So that's pretty far down the road, I would imagine.
HOCHBERG: We're making some progress. The latter half of that technology, functional electrical stimulation, is already well-established. A few hundred people have had arm and/or leg functional electrical stimulation systems placed.
We work closely with our colleagues at the Cleveland Functional Electrical Stimulation Center in this endeavor, and we have some early results published last year about one of our participants, who was thinking about the movement of her own hand along the tabletop plane and moving a simulation of one of these FES devices as though she was moving her own arm. So we're making some progress.
FLATOW: Now, the woman we saw on TV, Cathy Hutchinson, the fact that she was able to do this, 15, nearly 15 years after her stroke, isn't that surprising in itself, that she's still able to be in good enough shape to do that?
HOCHBERG: I think it's a very telling and thankfully a very encouraging story from a neuro-rehabilitation standpoint. This part of motor cortex, it essentially had been disconnected from her arms for the past 15 years. She'd had no functional use of either arm or hand. Yet when we were listening to those brain signals, and she thought about using her arm and hand, those neurons were firing away much as if there was no disconnection, there was no injury, and that's what allowed us to harness those signals towards the control of that robotic limb.
FLATOW: So where do you go from here? This is sort of a proof of concept. Do you move on further here?
HOCHBERG: There's no doubt that this is early on in the research. The - we have a lot more work to do in improving the reliability. As mentioned, we really want these types of devices to be functional 24 hours a day, seven days a week. So there's more basic neuroscience to learn about this part of the brain, there's more computational neuroscience to learn in how to most effectively and reliably decode these signals.
And right now our technology is truly hard-wired. There's a little plug (unintelligible) that protrudes up above the head, and that needs to become a fully implanted technology, much like a cardiac pacemaker or a deep brain stimulator - which in the early days had some wires coming through the body but, appropriately, became fully implanted.
And there are many groups that are now involved in fully implanting these systems, including my colleague, Arto Nurmikko, here at Brown.
FLATOW: And possibly making it wireless, a Wi-Fi or some connection to the robotic arm, something like that?
HOCHBERG: That's right - to bring the neural signals out of the body, whether by radio frequency or infrared, without having to have a little plug protruding through the skin. And then the decoding would happen to drive whatever that external device may be, whether it be a cursor on a screen, a robotic assistive device or, as we all hope one day, somebody's own limb.
FLATOW: Dr. Hochberg, good luck to you, and thank you for taking time to be with us today.
HOCHBERG: Thanks so much.
FLATOW: Dr. Leigh Hochberg is a critical care neurologist at Mass General and researcher at Providence VA Medical Center, associate professor of engineering at Brown University and director of the BrainGate II trial.
Copyright © 2012 National Public Radio®. All rights reserved. No quotes from the materials contained herein may be used in any media without attribution to National Public Radio. This transcript is provided for personal, noncommercial use only, pursuant to our Terms of Use. Any other use requires NPR's prior permission. Visit our permissions page for further information.
NPR transcripts are created on a rush deadline by a contractor for NPR, and accuracy and availability may vary. This text may not be in its final form and may be updated or revised in the future. Please be aware that the authoritative record of NPR's programming is the audio.