A research project underway at the University of Rhode Island (URI) aims to help people with advanced amyotrophic lateral sclerosis (ALS) and other severely limiting disorders communicate better by interacting with a computer that can translate their thoughts.
Among the consequences of motor diseases like ALS is the progressive loss of the fine motor skills needed for communication. Technologies known as brain-computer interfaces (BCIs) can analyze and interpret the brain signals of people with limited muscle control, helping them communicate.
Conceptually, a BCI works just as its name implies: a computer takes readings from the brain and turns them into some form of communication.
Existing BCI systems, however, have notable limitations. For instance, they often require a person to have fine control over their eyes, and eye-gaze control diminishes in the later stages of ALS. A person's brain activity can also vary considerably from day to day, limiting the effectiveness of a BCI system over longer periods.
The project, funded by a three-year, roughly $250,000 grant from the National Science Foundation, aims to overcome some of these limitations by developing personalized algorithms that take into account changes in brain activity over time, in order to make the BCI more robust.
Day-to-day changes in brain activity “are speculated to be associated with several factors, including cognitive fluctuations and environmental factors,” Yalda Shahriari, PhD, a professor of engineering at URI and the project’s lead researcher, said in a university news story. “Developing personalized algorithms will enable us to predict these fluctuations and optimize performance based on each patient’s specifications and needs.”
The project incorporates two technologies to measure brain activity. The first, electroencephalography (EEG), uses electrodes to measure the brain's electrical signals. The second, functional near-infrared spectroscopy (fNIRS), uses near-infrared light to measure the amount of oxygen in different regions of the brain; more oxygen generally indicates better blood flow and, therefore, greater brain activity.
While EEG is typically a part of brain-computer interface technologies, fNIRS is not. Project researchers believe that combining the two will give their brain-computer interface more information to work with, allowing for more fine-tuned communication based on brain states.
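The idea of a hybrid system can be illustrated with a toy sketch: feature vectors derived from each modality are combined into one richer description of each trial before any decoding step. The array shapes, feature names, and simple concatenation below are assumptions made for illustration, not the URI team's actual pipeline.

```python
import numpy as np

# Hypothetical sketch of hybrid BCI feature fusion: per-trial feature
# vectors from two modalities are concatenated so a downstream decoder
# sees both kinds of information at once. All numbers here are random
# stand-ins, not real recordings.
rng = np.random.default_rng(0)

n_trials = 20
eeg_features = rng.normal(size=(n_trials, 8))    # e.g., EEG band powers per channel (assumed)
fnirs_features = rng.normal(size=(n_trials, 4))  # e.g., oxygenation changes per region (assumed)

# Fuse the modalities into one feature vector per trial.
hybrid = np.concatenate([eeg_features, fnirs_features], axis=1)

print(hybrid.shape)  # (20, 12): each trial now carries both kinds of information
```

In this sketch the fusion is a plain concatenation; a real system would weight or model each modality according to how reliable it is for a given person and session.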
fNIRS may also be particularly informative for people with limited ability to move their eyes or control eye movement for extended periods.
“We will use a hybrid of EEG and fNIRS signals to compensate for each neuroimaging modality shortage and use the complementary features obtained from each modality to improve our system,” Shahriari said.
The process of developing the refined brain-computer interface involves an oddball paradigm, a psychological framework in which the brain receives a series of repetitive stimuli (e.g., several copies of the same picture) interrupted by a different, rare stimulus (e.g., a new picture). The brain's response to the different stimulus is recorded. In this study, the setup involves a grid of letters and numbers with intermittent flashes of a matrix of digits, requiring participants to do some mental math.
“By giving the patient higher demanding tasks to focus on, we can trigger several cognitive functions and extract the associated signatures or neural biomarkers,” Shahriari said. “The computer can then decode the pattern of neural activities that appear after the patient performs the tasks. The patterns can be used for diagnostic and communication purposes.”
Shahriari is currently working with the National Center for Adaptive Neurotechnologies, the Rhode Island Chapter of the ALS Association, and Rhode Island Hospital to add more participants to the study.
“Our analysis of the data becomes much more powerful if we can significantly increase the number of patients in the study,” Shahriari said.
Doug Sawyer, a study participant who was diagnosed with ALS 11 years ago, said: “Taking part in the brain activity study has been very rewarding. I enjoy learning new things and staying abreast of the latest technology. Dr. Shahriari and her team have been willing to share their progress. They make me feel as if I’m part of their team and not just a test number.”
Sawyer, 57, works as a design engineer and communicates with his office using eye movement, but finds that his gaze weakens as he tires.
In addition to the research itself, the URI project also aims to help educate students — from those in elementary school to those seeking advanced degrees — about the field of BCI technology. For example, a curriculum called “Engineering the Brain” is being developed for middle school students interested in BCI technology.
Further aims include providing training to women and other groups under-represented in research today.