By Philippa Roxby
Health reporter
Scientists have developed devices that can translate paralysed people's brain signals into words faster than before, according to two papers published in the journal Nature.
Pat Bennett, 68, who has motor-neurone disease (MND), tested the technology and said it could help her stay connected to the world.
Implants in her brain decode the words she wants to say.
The US researchers now want to improve their technology further.
Their ultimate aim is for people who can no longer talk, because of strokes, brain diseases or paralysis, to be able to communicate their thoughts in real time.
'Good guess'
Ms Bennett used to ride horses and jog every day before being diagnosed, in 2012, with a disease that attacks areas of the brain that control movement, causing eventual paralysis.
Her speech was the first thing affected.
For the Stanford University research, a surgeon implanted four sensors the size of pills into Ms Bennett's brain, in areas key to producing speech.
When she tells her lips, tongue and jaw to make sounds to form words, an algorithm decodes information coming out of her brain.
"This system is trained to know what words should come before other ones, and which phonemes make what words," study co-author Dr Frank Willett, Stanford said.
"If some were wrongly interpreted, it can still take a good guess."
After four months of training the software to interpret Ms Bennett's speech, her brain activity was being translated into words on a screen at 62 words per minute, about three times the speed of previous technology.
Normal conversations are about 160 words per minute, the researchers say, and they are yet to produce a device people can use in everyday life.
One word in 10 was decoded incorrectly when the system was limited to a 50-word vocabulary, and errors rose to about one in four when it drew on a 125,000-word vocabulary.
"But it's a big advance toward restoring rapid communication to people with paralysis who can't speak," Dr Willett said.
And Ms Bennett said it meant "they can... perhaps continue to work, maintain friends and family relationships".
'Normal conversations'
In another study, from the University of California San Francisco (UCSF), Ann, who has severe paralysis following a stroke, was able to speak through a digital avatar, complete with her own facial expressions.
Scientists decoded signals from more than 250 paper-thin electrodes implanted on the surface of Ann's brain and used an algorithm to recreate her voice, based on a recording of her speaking at her wedding.
The system reached nearly 80 words per minute and made fewer mistakes than previous methods, with a larger vocabulary.
"It's what gives a user the potential, in time, to communicate almost as fast as we do and to have much more naturalistic and normal conversations," researcher Sean Metzger, who helped develop the technology, said.
Study author Dr Edward Chang was "thrilled" to see the success of the brain interface in real time.
Improvements in artificial intelligence (AI) had been "really key", he said, and there were now plans to look at turning the technology into a medical device.
The study comes hot on the heels of scientists showing they could reconstruct a Pink Floyd song based only on recordings of brain activity from a patient listening to it.