MIND-READING COULD SOON BE REALITY
By linking sounds to patterns of brain activity, scientists may be on the way to helping us hear the thoughts of other people.
- By Emily Sohn
Wed Feb 1, 2012 07:09 AM ET
By looking only at maps of electrical activity in the human brain, scientists were able to tell which words a person was listening to. The discovery is a major step toward being able to “hear” the thoughts of people who can’t speak.
“If someone was completely paralyzed, or if a patient had locked-in syndrome with no movement, but the brain was still active and we could understand it well enough, we could develop devices to take advantage of that and restore communication,” said Brian Pasley, a neuroscientist at the University of California, Berkeley.
“It’s still very early,” he added. “And a lot of work still needs to be done.”
For decades, scientists have been trying to understand how our brains manage to process audible sounds and extract abstract meaning from words and sentences. As part of that effort, a large body of work in animals has helped home in on the brain regions involved in hearing and responding to sounds.
To see how those findings might apply in people, Pasley and colleagues enlisted the help of 15 patients with epilepsy or brain tumors who had electrodes attached to the surface of their brains in order to map out the source of their seizures. With electrodes in place, participants listened to about 50 different speech sounds in the form of sentences and words, both real and fake, such as “jazz,” “peace,” “Waldo,” “fook” and “nim.”
After mapping out the brain’s electrical responses to each sound, the research team found that they could predict which of two sounds from the study set the brain was responding to, and they could do it with about 90 percent accuracy.
Decoding the brain’s perception of sound in this way, Pasley said, is sort of like learning how a piano works.
“If you understand the relationship between the keys and their sounds, you could turn on the TV and watch someone perform with the sound off,” he said. “And just by looking at what keys were being pressed, you could understand what sounds were being played.”
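The decoding Pasley describes can be illustrated with a toy two-alternative classifier. Everything below is a sketch on synthetic data: the electrode count, the simulated response patterns, and the template-matching rule are illustrative assumptions, not the study's actual method or its real recordings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for electrode recordings: assume each sound evokes a
# characteristic spatial pattern of activity, plus trial-to-trial noise.
n_electrodes = 64
patterns = {"jazz": rng.normal(size=n_electrodes),
            "peace": rng.normal(size=n_electrodes)}

def record(sound, noise=1.0):
    """Simulate one noisy trial of brain activity evoked by a sound."""
    return patterns[sound] + rng.normal(scale=noise, size=n_electrodes)

# "Map out" each sound's response by averaging many training trials,
# like learning which piano key makes which sound.
templates = {s: np.mean([record(s) for _ in range(50)], axis=0)
             for s in patterns}

def decode(trial):
    """Guess the sound whose template best correlates with the trial."""
    return max(templates,
               key=lambda s: np.corrcoef(trial, templates[s])[0, 1])

# Two-alternative accuracy on fresh, unseen trials.
tests = ([("jazz", decode(record("jazz"))) for _ in range(100)] +
         [("peace", decode(record("peace"))) for _ in range(100)])
accuracy = np.mean([truth == guess for truth, guess in tests])
```

In the piano analogy, the templates play the role of knowing which key produces which sound; decoding a new trial is watching the keys with the volume off and inferring what was played.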
The new work represents a substantial step forward both in what we know about how the brain processes sound and in potential applications for people with disabilities, said Jonathan Wolpaw, chief of the Laboratory of Neural Injury and Repair at the New York State Department of Health’s Wadsworth Center.
Just as some devices already allow people to move robotic arms with their thoughts, brain-machine interfaces might someday give speech back to people who have lost it.
“This work could be extremely relevant if you had someone who could no longer talk and you wanted to use the brain signals produced while the person was thinking about what he wanted to say to provide artificial speech,” Wolpaw said. “The techniques they have developed are definitely relevant to how you’d go about doing that.”
Still, many hurdles remain before applications become practical. The new study looked at just a limited number of the sounds that make up the English language, for example, and many words are likely to produce identical electrical signatures in the brain. The study also focused only on how the brain hears sounds. Further testing needs to explore whether the electrical patterns are the same when people try to say or imagine those sounds.
For now, the new work represents an incremental but important advance that will likely lead to many more.
“The results are quite encouraging,” Wolpaw said. “They’ve gone farther than others have gone. There’s obviously a long way to go, but this is a big step.”