I have a short news story about exchanging information between machines and people's brains, now online at Communications of the ACM. This is a difficult field to capture in a few hundred words: there's a lot of progress, but researchers are pursuing many different approaches, and they're not all addressing the same problem.
For example, some researchers hope to provide much-needed help to people with disabilities, while others see an opportunity for new user interfaces for games.
Naturally, people will be willing to spend far more for rehabilitation. In addition, recreational use pretty much rules out (I hope!) any approach that requires surgically implanting something in the skull. Even the researchers who are exploring rehabilitation don't yet feel confident exposing people to the risks of surgery, because they can't be sure of any benefits. As a result, these studies mostly involve patients who have received implants for other reasons.
If surgery is ruled out, there are fairly few ways to get at what's going on in the brain. With huge, expensive machines, you can do functional MRI, but that doesn't look particularly practical. Both Honda and Hitachi are using near-infrared monitoring of blood flow, with impressive results. But the best-established measurement is EEG, which picks up electrical signals through electrodes pasted to the scalp.
One up-and-coming technique that I mention in the story is called ECoG, or electrocorticography. Like EEG, it measures the "field potentials" that result from the combined actions of many neurons. However, the electrodes are in an array that is draped over the surface of the brain (yes, under the skull), so the signal is much cleaner.
Finally, there are approaches like BrainGate that put an array of a few dozen electrodes right into the cortex, where they can monitor the spikes of individual neurons. 60 Minutes did a story a while ago that showed people using this technology to move a computer mouse.
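To give a flavor of the first step in that kind of system, here is a minimal sketch of pulling spike times out of a raw intracortical recording by simple threshold crossing. The threshold rule and the noise estimate are illustrative assumptions on my part, not a description of BrainGate's actual pipeline.

```python
import numpy as np

def detect_spikes(voltage, threshold_sds=4.5):
    """Return sample indices where an extracellular recording crosses
    a noise-based threshold (one electrode, 1-D array of voltages)."""
    # Estimate the noise level robustly with the median absolute deviation,
    # so the spikes themselves don't inflate the estimate.
    noise_sd = np.median(np.abs(voltage)) / 0.6745
    # Extracellular spikes are mostly negative-going deflections.
    threshold = -threshold_sds * noise_sd
    below = voltage < threshold
    # A spike starts wherever the signal first drops below threshold.
    return np.flatnonzero(below[1:] & ~below[:-1]) + 1
```

A decoder would then bin these spike times into firing rates and map the rates of many neurons onto, say, cursor velocity; that mapping is where most of the hard work lives.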
If the implants are to be practical, they will need to be powered and interrogated remotely, not through a bundle of wires snaking through the skull. Many people are exploring wireless interfaces for this purpose, as described by my NYU classmate Prachi Patel in IEEE Spectrum.
Brain-machine interfaces can run in either direction. My story dealt mostly with trying to tap the output of the brain, for example letting paralyzed people control a wheelchair or computer mouse. But input devices, such as artificial cochleas or retinas, are also advancing rapidly. To my surprise, Rahul Sarpeshkar, who works on both directions, told me the issues are not that different.
My guess would have been that input to the brain can take advantage of the brain's natural plasticity, letting it adapt to a crude incoming signal. To usefully interpret the output of haphazardly placed electrodes, by contrast, researchers need to do a great deal of sophisticated signal processing, which can slow things down.
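As an illustration of what that processing involves, here is a minimal sketch of one common pipeline for field-potential signals like EEG or ECoG: band-pass filter each channel, compute its power in a frequency band, and feed those features to a linear decoder. The 8-12 Hz band, the sampling rate, and the filter order are assumptions for the example; real systems tune all of these per subject.

```python
import numpy as np
from scipy.signal import butter, lfilter

def band_power(signal, fs, low, high):
    """Average power of a single channel within the [low, high] Hz band."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    return float(np.mean(lfilter(b, a, signal) ** 2))

def decode_window(window, fs=256.0, weights=None, bias=0.0):
    """Turn one short window of multi-channel data into a control value.

    window: (n_channels, n_samples) array of field-potential samples
    weights: per-channel weights of a linear decoder, learned offline
    """
    # One feature per channel: power in the 8-12 Hz "mu" band, which
    # changes when a person imagines moving a limb.
    features = np.array([band_power(ch, fs, 8.0, 12.0) for ch in window])
    if weights is None:
        weights = np.ones_like(features) / len(features)
    return float(features @ weights + bias)
```

Each window has to be collected, filtered, and decoded before the cursor moves at all, which is one source of the sluggishness mentioned above.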
The toughest thing about this sort of story, though, is time. There's real progress, but there's a long way to go. Once the proof of principle is in hand, there's still a lot of hard work to do, some of which may involve major decisions about basic aspects of the system. It's hard to communicate the progress that's being made without getting into details that interest only specialists.
Even when the engineering lets the blind see or the lame walk, writing about it is a hard sell for the general public.