
Scientists eavesdrop on the thoughts of humans and ferrets

Scientists have been able to decode the brain patterns for hearing certain words. The hope is to one day restore speech to a paralyzed patient.
Written by Christie Nicholson, Contributor

[Image: Ferret Attack]

Scientists can decode our thoughts and translate them into computer language so that a person can move inanimate objects, like a robotic arm, just by thinking about moving it. Now it looks like they are moving on to perform similar magic in the speech area of the brain. The hope is that neuroscientists may one day be able to hear the unspoken thoughts of a paralyzed patient and translate them through an audio device. Today a team out of the University of California, Berkeley, has come a step closer to realizing that hope.

They have successfully decoded the patterns of neural firing in the temporal lobe of the brain, the area responsible for hearing, as a subject listens to normal speech. From this pattern they could discern the words the person had heard. Sort of like eavesdropping. Their work is published today in the journal PLoS Biology.

But the patterns for hearing a word may not be the same as the patterns for imagining saying a word, and that correspondence is what a mental conversation would depend on. If scientists can crack the code, they could either feed the imagined words into a synthesizer to give them vocal life or build some kind of interface that types them out.

The scientists believe this experiment will pave the way for a speech prosthetic for patients, because previous studies have shown that when people imagine speaking a word, the same brain regions light up as when they actually say the word out loud.

It even works with ferrets. In previous studies scientists read words out loud to ferrets and recorded the resulting neural patterns. Later they were able to guess which words had been read to a ferret based solely on its neural firing patterns. Of course, these were regular ferrets, not ones that understand the English language.

For the current paper, researchers analyzed the brain activity of 15 epilepsy patients. The subjects were undergoing brain surgery and had 256 electrodes monitoring the electrical patterns in their temporal lobes for about a week. Specifically, the scientists recorded brain activity while the patients listened to 10 minutes of conversation, then used the data to reconstruct the sounds the patients had heard.
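To get a feel for what that reconstruction step involves, here is a minimal sketch, assuming a simple linear decoder trained to map multichannel electrode recordings onto a sound spectrogram. The shapes, variable names, and simulated data are illustrative assumptions, not the Berkeley team's actual pipeline.

```python
import numpy as np

# Hypothetical sketch: reconstruct a sound spectrogram from multichannel
# brain recordings with a linear decoder. All data here is simulated.

rng = np.random.default_rng(0)

n_samples = 6000      # time points recorded while speech was played
n_electrodes = 256    # electrodes over the temporal lobe
n_freq_bands = 32     # frequency bands of the target spectrogram

# Stand-ins for real data: neural activity X and the heard spectrogram Y.
X = rng.standard_normal((n_samples, n_electrodes))
true_W = rng.standard_normal((n_electrodes, n_freq_bands))
Y = X @ true_W + 0.1 * rng.standard_normal((n_samples, n_freq_bands))

# Fit decoder weights on the first 90% of the recording, hold out the rest.
split = int(0.9 * n_samples)
W, *_ = np.linalg.lstsq(X[:split], Y[:split], rcond=None)

# Reconstruct the held-out spectrogram and check how well it correlates
# with what was actually heard.
Y_hat = X[split:] @ W
r = np.corrcoef(Y_hat.ravel(), Y[split:].ravel())[0, 1]
print(f"held-out correlation: {r:.2f}")
```

The better the reconstructed spectrogram matches the original audio, the more reliably the decoded signal can be turned back into recognizable words.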

Researchers liken this to a pianist who can watch another pianist's hands through a soundproof window and still imagine perfectly what the music sounds like.

Of course, humans can understand words even when they sound quite different (think of accents), so the real challenge for this research will be to work out which units of speech are the most meaningful, whether that turns out to be a syllable, a phoneme, or something else altogether.

[photo via emeryc]

This post was originally published on Smartplanet.com
