Man with ALS controls devices using brain-computer interface

CortiCom can predict an intended action by recognizing brain signaling patterns

by Lindsey Shapiro, PhD


A man with amyotrophic lateral sclerosis (ALS) was able to control external devices — lighting and a TV — with only his thoughts using a brain-computer interface (BCI) device, according to a recent study.

Called Cortical Communication (CortiCom), the BCI system learns to predict a person’s intended action by recognizing their unique brain signaling patterns. After an initial learning phase, the system reliably enabled the man to control external devices for three months without needing to be recalibrated.

“What’s amazing about our study is that the accuracy didn’t change over time, it worked just as well on Day 1 as it did on Day 90,” Nathan Crone, MD, the study’s senior author and a professor at Johns Hopkins University, said in a news release. “Our results may be the first steps in realizing the potential for independent home use of speech BCIs by people living with severe paralysis.”

The patient is part of the ongoing CortiCom clinical trial (NCT03567213) at Johns Hopkins involving people with significant movement impairments due to spinal cord injury, stroke, or neuromuscular disease, including ALS. Recruitment of participants, ages 22-70, may still be ongoing. The study, “Stable Decoding from a Speech BCI Enables Control for an Individual with ALS without Recalibration for 3 Months,” was published in Advanced Science.


What is BCI?

BCI technologies are designed to use a person’s brain signaling patterns to interpret their intentions, then use that information to control a device. The goal is to help people whose conditions, such as ALS, impair mobility or speech, enabling them to perform daily activities they normally couldn’t do independently.

“It’s a very exciting time in the field of brain-computer interfaces,” said Crone, the trial’s principal investigator. “For those who have lost their ability to communicate due to a variety of neurological conditions, there’s a lot of hope to preserve or regain their ability to communicate with family and friends.”

The clinical trial at Johns Hopkins Medicine is evaluating whether CortiCom can help ALS patients control devices over six months. The report covered data from Tim Evans, 62, who was diagnosed with ALS in 2014.

Evans, who has severe speech and swallowing impairments, was implanted with the CortiCom device in 2022. Thin sheets of sensors (electrodes) no larger than a postage stamp were placed on the surface of his brain in areas involved in speech and upper limb function.

Over the next several weeks, he practiced repeating six commands aloud — up, down, left, right, enter, and back — as they appeared on a screen, while the electrodes recorded his brain activity and a computer algorithm learned to recognize certain patterns of activity.

Once the training was done, Evans was asked to use those same commands to control a communication board with his thoughts for about five minutes a day over three months, and he was able to do so with more than 90% accuracy. This performance was sustained over the three months without the system being recalibrated or trained with new data.
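The study’s actual decoding algorithm isn’t detailed here, but the general idea — learn a signature brain-activity pattern for each command during training, then match new activity against those stored patterns — can be sketched with simulated data. Everything below (channel counts, trial counts, the nearest-centroid matching rule) is an illustrative assumption, not the CortiCom implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
commands = ["up", "down", "left", "right", "enter", "back"]
n_channels = 64          # hypothetical number of surface electrodes
trials_per_command = 40  # hypothetical number of training repetitions

# Simulate training data: each command evokes a distinct average activity
# pattern across the electrodes, plus trial-to-trial noise.
templates = rng.normal(size=(len(commands), n_channels))
X_train = np.concatenate([
    t + 0.5 * rng.normal(size=(trials_per_command, n_channels))
    for t in templates
])
y_train = np.repeat(np.arange(len(commands)), trials_per_command)

# "Training": store the average pattern (centroid) for each command.
centroids = np.stack([X_train[y_train == c].mean(axis=0)
                      for c in range(len(commands))])

def decode(trial):
    """Return the command whose stored pattern is closest to this trial."""
    distances = np.linalg.norm(centroids - trial, axis=1)
    return commands[int(np.argmin(distances))]

# Decode a new, unseen trial of the "left" command.
new_trial = templates[2] + 0.5 * rng.normal(size=n_channels)
print(decode(new_trial))
```

Because the stored patterns are fixed after training, decoding a new trial requires no recalibration — which is the property the study found held for three months in real use.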

“While Tim’s speech was difficult for most human listeners to understand, the BCI was able to accurately translate his brain activity into computer commands,” said Shiyu Luo, the study’s first author and a graduate student in biomedical engineering at Johns Hopkins.

Using BCI in daily life

The device also enabled Evans to control external devices, like lights or streaming TV applications.

“In addition to expressing how he was feeling or what he wanted, Tim was able to use the BCI to turn a light on and off, and to select videos to watch on YouTube,” Luo said.

“It’s wonderful,” Evans said. “I can turn on the TV and turn off the lights without getting up … I can see the possibilities for other patients.”

Evidence suggested the approach could also work when Evans didn’t say a command out loud but only silently mouthed it, with an accuracy of 85.2%.

“It still remains to be seen whether the same level of performance can be achieved in people living with ALS who are unable to [produce speech] and/or articulate,” the researchers wrote.

The algorithm’s performance was found to be best when using signals from both sensory and motor brain areas, with regions involved in lip, tongue, and jaw movements being particularly important, Luo said.

Its reliability over time may be due in part to the type of electrodes used, according to Crone. While other approaches use electrodes that penetrate the brain to record from individual nerve cells, these record from larger populations of nerve cells at the brain’s surface.

“These population responses appear to be more stable over time,” Crone explained. “They don’t change from day to day as much, so the BCI algorithm we used for controlling the computer interface did not require recalibration or retraining for at least three months.”

Without a need for frequent recalibration, the BCI could be used in daily life.

The results suggest that an implanted BCI system decoding speech commands “can reliably control assistive devices over long time periods with only initial model training and calibration, supporting the feasibility of unassisted home use” for people with speech problems due to neurological conditions such as ALS, the researchers wrote.

The researchers are planning to move the BCI equipment to Evans’ home, pending approval from the U.S. Food and Drug Administration. Work is also underway to train the BCI on a broader vocabulary to make it effective for different types of tasks.

“There’s still a lot more work to be done to bring this to all patients who could use this technology,” Crone said.