Promising BCI tech can translate thoughts to speech for ALS patients
Technology may provide more comfortable way to communicate

New research is bringing mind-to-speech technology a step closer to reality for people with paralysis, including those with amyotrophic lateral sclerosis (ALS) who lose the ability to speak.
A proof-of-principle study shows that a brain-computer interface (BCI) can translate a person’s silent thoughts into spoken words and could provide a more fluent and natural way to communicate than current systems.
The study, “Inner speech in motor cortex and implications for speech neuroprostheses,” was published in Cell.
New technology can read the brain’s ‘inner speech’
ALS is a neurological disease that causes progressive muscle weakness and paralysis, which can affect muscles in the mouth and throat, making it difficult or impossible for patients to speak.
BCIs are an emerging technology that uses sensors implanted in the brain to detect electrical activity, which advanced computer algorithms then decode. Researchers have been exploring this technology as a way to let people with paralysis speak and use a computer.
Previous work has shown that it’s possible to use BCIs in the motor cortex (the part of the brain that controls movement) to decode attempted speech. Essentially, the patient tries to move their mouth and talk, and even though that movement is not happening due to muscle weakness, the BCI detects the brain signals and decodes those signals to figure out what the patient is trying to say.
Although this approach has shown promise, it has some limitations. For someone who’s paralyzed, trying to move the mouth can be tiring and slow. Plus, some people with ALS can still move their mouths a small amount, so this approach can lead to unintended noises or breathing problems.
In this study, researchers explored an alternative approach: detecting the brain’s inner speech, words that are thought but not spoken.
“Inner speech (also called ‘inner monologue’ or self-talk) is the imagination of speech in your mind — imagining the sounds of speech or the feeling of speaking, or both,” Frank Willett, PhD, co-author of the study at Stanford, said in a university news story. “We wanted to know whether a BCI could work based only on neural activity evoked by imagined speech, as opposed to attempts to physically produce speech.”
The new study included four people who are participating in BrainGate2 (NCT00912041), a clinical trial testing BCI technology in people with ALS and other paralyzing conditions. Three of the patients in this study had ALS, and the fourth had developed paralysis due to a stroke. All of them had already been implanted with BCIs in their motor cortex.
For this study, patients were first instructed to try to say certain words, and then to only think the same words. The BCIs and computer algorithms were then used to decode both the attempted speech and the inner monologue.
The researchers found that, when patients think words but don’t try to say them, there’s still detectable activity in the motor cortex. In fact, the electrical activity of the motor cortex during inner monologue looked much like attempted speech, but the signals were weaker.
“We found that inner speech evoked clear and robust patterns of activity in these brain regions. These patterns appeared to be a similar, but smaller, version of the activity patterns evoked by attempted speech,” Willett said.
The researchers found that these inner monologue signals could be decoded well enough to allow some communication, though this system wasn’t as accurate as using attempted speech. Still, this study demonstrates a proof-of-principle for the concept.
Privacy concerns
While detecting the brain’s inner monologue could make communication easier for some BCI users, there’s also a theoretical concern that the BCI might detect, and speak aloud, thoughts that are meant to be private.
“In other words, a BCI could end up decoding something the user intended only to think, not to say aloud,” Willett said.
Currently, the technology isn’t accurate enough for unintended utterances to be a major concern, according to Willett. Nonetheless, the researchers explored some potential features that could enhance privacy. One example is having patients think of a specific keyword that turns the BCI on or off, letting them deliberately choose which parts of their inner monologue are broadcast and which remain private.
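To illustrate the keyword idea in the abstract, here is a minimal sketch in Python. This is purely hypothetical, not the study’s actual system: it assumes the BCI already produces a stream of decoded words, and uses made-up “wake” and “sleep” keywords to gate which words are spoken aloud.

```python
# Hypothetical sketch of keyword-gated decoding (not the study's actual software).
# Assumes decoded inner speech arrives as a stream of words; only words between
# an imagined "wake" keyword and a "sleep" keyword are broadcast aloud.

def gate_inner_speech(decoded_words, wake="unlock", sleep="lock"):
    """Yield only the words the user deliberately unlocks with the wake keyword."""
    broadcasting = False
    for word in decoded_words:
        if word == wake:
            broadcasting = True       # user opts in to broadcasting
        elif word == sleep:
            broadcasting = False      # user returns to private thought
        elif broadcasting:
            yield word                # only unlocked speech is spoken aloud

# Example: private thoughts before "unlock" and after "lock" stay silent.
stream = ["private", "thought", "unlock", "hello", "there", "lock", "secret"]
print(list(gate_inner_speech(stream)))  # ['hello', 'there']
```

The design point is simply that the gate is controlled by the user’s own imagined keyword, so nothing is spoken unless the user has deliberately switched broadcasting on.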
The researchers also stressed that this type of technology is still in the earliest stages of clinical testing and fine-tuning, so more work will be required to create optimal privacy safeguards.
“It’s worth pointing out that implanted BCIs are not yet a widely available technology and are still in the earliest phases of research and testing,” Willett said. “They’re also regulated by federal and other agencies to help us uphold the highest standards of medical ethics.”