Brain Implant May Someday Help ALS Patients Communicate
Brain-machine interface could predict words based on neuronal activity
Researchers have developed a brain-machine interface (BMI) that could someday facilitate communication for people who have lost their ability to speak, including those with amyotrophic lateral sclerosis (ALS).
After being trained to recognize certain patterns of nerve cell activity, the BMI could accurately predict the words that a person with tetraplegia (paralysis affecting both the arms and the legs) was thinking, without the person speaking or attempting to speak in any way.
Although the findings are preliminary, the researchers believe the device holds significant potential for people with brain injuries or neurological diseases who have trouble communicating.
“Neurological disorders can lead to complete paralysis of voluntary muscles, resulting in patients being unable to speak or move, but they are still able to think and reason. For that population, an internal speech BMI would be incredibly helpful,” Sarah Wandelt, a California Institute of Technology (Caltech) graduate student, said in a university press release.
Researchers set out to use a brain-machine interface to predict internal speech
Wandelt recently presented the findings in a poster, titled “Decoding speech and internal speech from populations of single units from the supramarginal gyrus in a tetraplegic human,” at the Society for Neuroscience conference, in San Diego.
Broadly, BMIs aim to provide a direct link between the brain’s electrical signals and an external device, like a computer, that can decode those signals and translate them into actionable outputs.
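In code, that "decode and translate" step can be pictured with a short sketch. The example below is a hypothetical Python illustration, not the study's actual software: it assumes the brain's electrical signals have already been summarized as a vector of per-channel firing rates, and it matches that vector against learned per-word activity templates to produce a text output.

```python
import numpy as np

# A minimal, hypothetical sketch of a BMI pipeline: neural signals
# in, an actionable output (here, text) out. All names, shapes, and
# words are illustrative assumptions, not the study's software.

WORDS = ["yes", "no", "water", "help"]          # toy output vocabulary
rng = np.random.default_rng(0)
templates = rng.normal(size=(len(WORDS), 32))   # learned per-word activity patterns

def decode(firing_rates: np.ndarray) -> str:
    """Translate one window of recorded activity into an output word
    by matching it to the closest learned template."""
    distances = np.linalg.norm(templates - firing_rates, axis=1)
    return WORDS[int(np.argmin(distances))]

# Simulated incoming signal that resembles the stored pattern for "water":
signal = templates[2] + rng.normal(scale=0.1, size=32)
print(decode(signal))  # -> water
```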
“You may already have seen videos of people with tetraplegia using BMIs to control robotic arms and hands, for example to grab a bottle and to drink from it or to eat a piece of chocolate,” Wandelt said.
Now, the researchers set out to use a BMI to predict speech. While previous studies have demonstrated an ability to predict speech based on activity in motor areas of the brain, this was only possible when the participant whispered or acted out the words they were thinking.
Predicting speech from thoughts alone (internal speech), without any movement, is a more challenging task. Before the study, it wasn't clear which brain region should be targeted to decode this internal speech.
“In the past, algorithms that tried to predict internal speech have only been able to predict three or four words and with low accuracy or not in real time,” Wandelt said.
In the new study, the researchers implanted an electrical recording device in the supramarginal gyrus, a brain region involved in processing spoken words, and recorded the electrical activity of nerve cells there.
First, the BMI algorithm was trained over the course of 15 minutes to recognize the patterns of nerve cell signaling produced when the participant, who has tetraplegia, internally spoke each of eight different words.
To test the algorithm, the participant was given a visual or auditory cue for one of the words and instructed to say it internally; after a delay, they were asked to say the word out loud.
Results showed the algorithm could predict the word the participant was thinking of with up to 91% accuracy.
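To make the train-then-test procedure concrete, here is a minimal sketch of how such a word decoder could be trained and scored. It is a hypothetical illustration, not the study's analysis: it simulates firing-rate patterns for eight placeholder words (the article doesn't list the actual words), fits a standard linear classifier, and estimates how often held-out trials are labeled correctly, analogous to the accuracy figure reported above.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)

n_words = 8        # vocabulary size used in the study
n_channels = 64    # number of recording channels (illustrative)
n_trials = 20      # training trials per word (illustrative)

# Simulate training data: each word evokes a characteristic pattern
# of firing rates, observed with trial-to-trial noise.
word_patterns = rng.normal(10.0, 3.0, size=(n_words, n_channels))
X = np.vstack([
    pattern + rng.normal(0.0, 1.0, size=(n_trials, n_channels))
    for pattern in word_patterns
])
y = np.repeat(np.arange(n_words), n_trials)

# Fit the decoder and estimate how often it predicts the correct
# word on held-out trials (cf. the reported ~91%).
decoder = LinearDiscriminantAnalysis()
accuracy = cross_val_score(decoder, X, y, cv=5).mean()
print(f"decoding accuracy: {accuracy:.0%}")
```

With cleanly separable simulated patterns, accuracy approaches 100%; the empirical question the study addresses is how separable real internal-speech signals are.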
Importantly, while the device can accurately predict words from a person’s thoughts, the researchers noted it can’t be used to read minds: it must first be trained on each individual’s brain activity while that person focuses on specific words.
“These findings suggest robust [supramarginal gyrus] modulation during internal speech, indicating that internal speech BMIs can be built using signals from single brain areas,” the researchers wrote.
The supramarginal gyrus can also be used to build BMIs for functions other than speech, they noted.
“We have previously shown that we can decode imagined hand shapes for grasping from the human supramarginal gyrus,” said Richard Andersen, PhD, Wandelt’s advisor at Caltech.
“Being able to also decode speech from this area suggests that one implant can recover two important human abilities: grasping and speech,” added Andersen, who is also a James G. Boswell Professor of Neuroscience and director of the Tianqiao and Chrissy Chen Brain-Machine Interface Center at Caltech.
The researchers hope to publish their findings in a study titled “Online internal speech decoding from single neurons in a human participant.” The study has been submitted to a journal but has not yet undergone peer review.