Progress in brain-computer interfaces
Not long ago, the idea of controlling a device with one’s thoughts alone was purely in the realm of science fiction. Until the 1980s, people with complete paralysis, such as from a neuromuscular disease or spinal cord injury, made do with the help of caregivers, call buttons, pointers and alphabet boards. Since the 1980s, computers have revolutionized the lives of many people with disabilities, who can use them even if they can move only one muscle, such as in a finger or toe.
Thought-controlled devices aren’t yet on store shelves. But companies are developing and testing prototypes, and consumers will likely see them on the market within the next few years.
Last summer, Matthew Nagle became the first person to receive a brain implant that allowed him to move a computer cursor with his thoughts alone.
Nagle, a 26-year-old from Weymouth, Mass., was paralyzed from the shoulders down in 2001, when a knife wound to his neck irreparably damaged his spinal cord.
As scientists and reporters at New England Sinai Hospital and Rehabilitation Center in Stoughton, Mass., watched, Nagle opened e-mail, played a video game, drew a circle on a computer screen, changed the channel and volume on a television set, moved a robotic arm, and opened and closed a prosthetic hand.
Decades of research
Behind Nagle’s achievements are decades of research, says Leigh Hochberg, a neurologist and neuroscientist associated with Harvard Medical School in Boston, Brown University in Providence, R.I., and several other New England institutions.
The system Nagle is using is also being tested in people with amyotrophic lateral sclerosis (ALS), muscular dystrophy and other disabling conditions. Called BrainGate, it’s a patented product of Cyberkinetics Neurotechnology Systems of Foxborough, Mass. (See www.cyberkineticsinc.com for information about the trials and a video of BrainGate in action.)
For some 40 years, Hochberg says, scientists have been trying to learn enough about recording signals from inside the brain to allow people with paralysis to use those recordings to control external devices.
A few years ago, researchers began training monkeys to play video games, while they watched and recorded the signals coming from the motor cortex, an area at the top of the brain that controls movement.
“We wanted to learn what these patterns meant and how they related to the movements of an animal’s limb,” Hochberg says.
Next, they recorded signals from dozens of neurons (nerve cells) at once while a monkey played video games or grabbed a peanut, and the scientists practiced discerning the animal’s hand position from the neuronal signal pattern alone.
The real “aha” moment arrived when they found that a monkey whose joystick had been disconnected could still control the cursor, apparently with his brain.
“The signals are sent directly to the computer from an array of electrodes sitting on the motor cortex,” Hochberg says. “The computer decodes the signals and turns them into movements of the cursor on the screen.”
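The decoding step Hochberg describes can be pictured as a simple mapping from many neurons’ firing rates to an intended cursor velocity. The sketch below is illustrative only, with made-up data and a plain least-squares fit; BrainGate’s actual decoding algorithms are more sophisticated, but the core idea — calibrate a mapping from neural activity to intended movement, then apply it to new activity — is the same.

```python
import numpy as np

rng = np.random.default_rng(0)

n_neurons, n_samples = 24, 500

# Hypothetical tuning: each neuron's firing rate varies linearly with the
# intended 2-D cursor velocity (vx, vy), plus a little noise.
tuning = rng.normal(size=(n_neurons, 2))
velocity = rng.normal(size=(n_samples, 2))            # intended movements
rates = velocity @ tuning.T + 0.1 * rng.normal(size=(n_samples, n_neurons))

# Calibration: least-squares fit of a decoding matrix from rates to velocity.
decoder, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# Decoding: turn a new pattern of firing rates into a cursor command.
new_velocity = np.array([[1.0, -0.5]])                # what the user intends
new_rates = new_velocity @ tuning.T                   # what the array records
decoded = new_rates @ decoder                         # what the cursor does
print(np.round(decoded, 2))                           # close to [1.0, -0.5]
```

In a real system the calibration data come from asking the user to imagine tracking a moving target, rather than from a known velocity signal.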
The question then became: Would the system work in humans, and would it work if the nervous system had sustained damage?
From brain to mouse
During the 1980s, John Donoghue was exploring what happened to the motor cortex of an animal after it sustained an injury to the nervous system outside the brain.
Donoghue, a neuroscientist at Brown and director of Cyberkinetics Neurotechnology Systems, a company he co-founded in 2001, says his main interest has always been in how the brain adapts to changing conditions and injuries.
As he began studying people with degenerative nervous system conditions like ALS, as well as strokes and acute injuries to the spinal cord, he found that activity in the motor cortex was preserved. In ALS, he says, the deeper layers of the motor cortex degenerate, but the upper layers are relatively spared.
“What we’ve learned in the last few decades is that the signals available in the cortex are a good representation of what you want to do with your hand,” Donoghue says. “If you were moving a mouse, and I could sample a couple of dozen cells from your motor cortex, I could tell what you were doing.”
Apparently, that goes for people with injuries and diseases of the nervous system, too, and BrainGate is built on this phenomenon. The signals from a sensor, which measures 4 millimeters (about 0.16 inches) by 4 millimeters and is implanted in the motor cortex, represent the person’s intended action and “go from the brain straight into the mouse port of the computer,” Donoghue explains.
He admits that the system requires brain surgery, but he doesn’t consider that a major drawback. (A hole has to be drilled through the skull, through which the device is passed and implanted in the brain tissue.)
“There are legitimate cautions to our technology,” he says. “It is brain surgery, and it isn’t a Lamborghini yet. But that’s how science progresses.”
While Hochberg and Donoghue were developing the implantable BrainGate, Jonathan Wolpaw and colleagues at the New York State Department of Health’s Wadsworth Center in Albany were busy developing a system that they say can accomplish approximately the same thing without brain surgery.
“The control we get noninvasively is in the same ballpark as what we get invasively,” says Wolpaw, a neurologist and neurophysiologist who, like Donoghue, describes his core interest as the study of adaptability in the nervous system. Unless and until invasive systems are “substantially better,” he says, there’s “no reason to put things into the brain.”
Wolpaw knows the BrainGate team and in fact is involved in a project with Leigh Hochberg. But he says he doesn’t share the “prevailing infatuation with putting things into the brain,” noting that “if it were your brain, that’s probably not the way you’d look at it.”
In 2004, Wolpaw’s group showed that their system, which uses brain waves recorded from 64 electrodes embedded in the scalp, allowed people with spinal cord injuries to move a cursor just by thinking about their movements. Subjects gained expertise with training, he says, and the “use of motor imagery lessens in importance over time, as performance becomes more automatic.”
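Noninvasive systems like Wolpaw’s rely on sensorimotor rhythms in the EEG, such as the 8–12 Hz “mu” rhythm over motor cortex, which weakens when a person imagines moving. The toy example below, with synthetic data, shows the basic measurement: mu-band power drops during motor imagery, and that change can drive a cursor. It is a minimal sketch of the principle, not the Wadsworth group’s actual signal processing.

```python
import numpy as np

fs = 256                        # sample rate, Hz
t = np.arange(fs) / fs          # one second of "EEG"

def mu_power(signal, fs, band=(8.0, 12.0)):
    """Mean spectral power in the given frequency band, via FFT."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[mask].mean()

rng = np.random.default_rng(1)
noise = rng.normal(scale=0.5, size=fs)

rest = np.sin(2 * np.pi * 10 * t) + noise            # strong 10 Hz mu rhythm
imagery = 0.2 * np.sin(2 * np.pi * 10 * t) + noise   # mu suppressed by imagery

print(mu_power(rest, fs) > mu_power(imagery, fs))    # True
```

In practice the system combines such band-power measurements from many scalp electrodes, with weights tuned to each user, which is part of why training improves control over time.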
Keeping it superficial
The best compromise between an invasive system, which requires brain surgery but picks up very strong signals, and a noninvasive system, which avoids surgery but can’t match that signal quality, could be recording from the surface of the brain, Wolpaw says, using a technique called electrocorticography — ECoG for short.
“With ECoG, you’re below the skull, on the surface of the brain,” Wolpaw says. “The signals are similar to EEG but much better — bigger, more robust and in a higher frequency range. It looks like they’ll be very useful.
“The ultimate method, if we go to an invasive method, will probably be ECoG. For a variety of reasons, it may be more practical and as good or nearly as good as sticking something into the brain. And you’re not contending with the same reaction of brain tissue.”
Striking a balance
“The fact that you can establish a way of communicating with somebody who’s totally locked in [has no muscle movement or ability to speak, as is true in some neuromuscular conditions] is a good thing,” says neurologist Lawrence Phillips at the University of Virginia in Charlottesville, where he directs the MDA clinic. “But making it possible for someone to become totally locked in is, in my view, not a good thing.” Technology, Phillips says, may make it too easy for doctors to dismiss the gravity of locked-in syndrome by assuming that there are always ways of getting around it.
Phillips believes the emotional and financial burden to potential users shouldn’t be neglected in the rush to embrace new technology. A “gee whiz, look what we can do” presentation that doesn’t address the ethical issues, he says, “isn’t presenting a balanced view.”