Stanford Researchers Develop Brain-Controlled Prosthesis That’s Nearly As Good As One-finger Typing

by Charles Moore

Years of work by a team of researchers at Stanford University's School of Engineering have finally yielded a technique that continuously corrects brain readings, giving people with spinal cord injuries a more precise way to tap out computer commands with a thought-controlled cursor. A pilot clinical trial for human use is underway.

When people type or perform other precise tasks, their brains and muscles normally work together with no conscious effort. But when neurological disease or a spinal cord injury severs the connection between brain and limbs, motions that were once nearly effortless become difficult or impossible.

Consequently, researchers have worked to restore a degree of motor function to people affected by injury or disease through thought-controlled prostheses — devices that tap into relevant regions of the brain, bypassing damaged connections and delivering thought commands to devices such as virtual keypads.

However, making this actually work is exceedingly difficult. Brains are complex, and our actions and thoughts are orchestrated by millions of neurons: biological switches that fire faster or slower in dynamic patterns.

In a news release, Tom Abate, Associate Director of Communications for Stanford Engineering, notes that brain-controlled prostheses currently work from a sample of only a few hundred neurons, yet must estimate motor commands that involve millions of neurons. Consequently, tiny errors in the sample (neurons that fire too fast or too slow) reduce the precision and speed of thought-controlled keypads.

However, an interdisciplinary team led by Stanford professor of electrical engineering Krishna Shenoy has developed a technique to make brain-controlled prostheses more precise. In essence, the Shenoy lab's prosthesis analyzes the neuron sample and makes dozens of corrective adjustments to its estimate of the brain's electrical pattern, all in the blink of an eye.
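
The details live in the team's paper (discussed below), but the flavor of such a corrective step can be sketched in a few lines of Python. Everything here (the dynamics matrix `A`, the blending weight `alpha`, the bin size) is an illustrative placeholder, not the lab's actual algorithm:

```python
import numpy as np

def corrective_update(prev_state, observed_rates, A, alpha=0.5):
    """One corrective adjustment: nudge the noisy neural sample toward
    the trajectory predicted by a learned dynamics model."""
    predicted = A @ prev_state                     # where the dynamics say the population should be next
    return alpha * predicted + (1 - alpha) * observed_rates

# Toy usage: a 192-neuron sample corrected every 15 ms bin,
# i.e. dozens of corrections within the duration of an eyeblink.
rng = np.random.default_rng(0)
n_neurons = 192
A = 0.95 * np.eye(n_neurons)                       # placeholder dynamics; a real model is learned from data
state = rng.poisson(5.0, n_neurons).astype(float)
for _ in range(20):                                # ~300 ms of 15 ms bins
    noisy_sample = state + rng.normal(0.0, 2.0, n_neurons)
    state = corrective_update(state, noisy_sample, A)
```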

The research team at Prof. Shenoy's Neural Prosthetic Systems Laboratory (NPSL) at Stanford conducts neuroscience, neuroengineering, and translational research to better understand how the brain controls movement and to design medical systems that assist people with movement disabilities. The lab's neuroscience research investigates the neural basis of movement preparation and generation using a combination of electro-/opto-physiological, behavioral, computational, and theoretical techniques, while its neuroengineering research investigates the design of high-performance and robust neural prostheses — also known as brain-computer interfaces (BCIs) and brain-machine interfaces (BMIs). These systems translate neural activity from the brain into control signals for prosthetic devices, which can assist people with paralysis by restoring lost motor functions. The NPSL is also involved in translational research, including BrainGate2, an FDA pilot clinical trial co-directed by Profs. Shenoy and Henderson.

Dr. Shenoy's team has developed a brain-controlled cursor that operates a virtual keyboard, intended for use by people with paralysis or amyotrophic lateral sclerosis (ALS, also known as Lou Gehrig's disease). ALS degrades an individual's ability to move; a thought-controlled keypad would allow a person with paralysis or ALS to drive an electronic wheelchair and use a computer or tablet.

“Brain-controlled prostheses will lead to a substantial improvement in quality of life,” Dr. Shenoy observes. “The speed and accuracy demonstrated in this prosthesis results from years of basic neuroscience research and from combining these scientific discoveries with the principled design of mathematical control algorithms.”

The new corrective technique, developed by the NPSL and described in a paper published in the journal Nature Communications, is based on a recently gained understanding of how monkeys naturally perform arm movements. The researchers studied how normal, healthy monkeys used their arms, hands, and fingers to reach for targets presented on a video screen. Their objective was to learn, over hundreds of experiments, what the electrical patterns from a 100- to 200-neuron sample looked like during a normal reach: in other words, to understand the “brain dynamics” underlying reaching arm movements.

“These brain dynamics are analogous to rules that characterize the interactions of the millions of neurons that control motions,” says Jonathan Kao, a doctoral student in electrical engineering and first author of the Nature Communications paper on the research. “They enable us to use a tiny sample more precisely.”
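
As a rough sketch of what “learning brain dynamics” from such recordings could look like in code, one might fit a linear model that predicts the sampled population's activity in the next time bin from its activity in the current bin. The synthetic data and plain least-squares fit below are assumptions for illustration only, not the lab's actual modeling pipeline:

```python
import numpy as np

# Synthetic stand-in for recorded reach data: trials x time bins x neurons
# (hundreds of reaches, a 100- to 200-neuron sample, short time bins).
rng = np.random.default_rng(1)
n_trials, n_bins, n_neurons = 300, 50, 150
rates = rng.poisson(4.0, size=(n_trials, n_bins, n_neurons)).astype(float)

# Pair each time bin with the next one, pooled across all trials.
X = rates[:, :-1, :].reshape(-1, n_neurons)   # population state at time t
Y = rates[:, 1:, :].reshape(-1, n_neurons)    # population state at time t + 1

# Least-squares fit of a linear dynamics matrix A such that Y ~ X @ A.
A, residuals, rank, _ = np.linalg.lstsq(X, Y, rcond=None)

# A now summarizes how the sampled population tends to evolve from one bin
# to the next: the kind of regularity the article calls "brain dynamics".
```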

The paper, entitled “Single-trial dynamics of motor cortex and their applications to brain-machine interfaces” (Nature Communications 6, Article number: 7759, doi:10.1038/ncomms8759), is coauthored by Jonathan C. Kao, Paul Nuyujukian, and Krishna V. Shenoy of the Bioengineering, Electrical Engineering, and Neurobiology Departments and the Stanford Neurosciences Institute at Stanford University; Stephen I. Ryu of the Palo Alto Medical Foundation in Palo Alto, California; Mark M. Churchland of the Columbia University Department of Neuroscience in New York; and John P. Cunningham of the Columbia University Department of Statistics.

Paul Nuyujukian is a postdoctoral researcher with an MD in neurosurgery and a PhD in electrical engineering; Stephen Ryu is a neurosurgeon with the Palo Alto Medical Foundation and consulting professor of electrical engineering; Mark Churchland and John Cunningham are assistant professors in the Columbia University Neuroscience and Statistics Departments respectively.

The coauthors note that increasing evidence suggests neural population responses have their own internal drive, or dynamics, that describe how the neural population evolves through time. An important prediction of neural dynamical models, they say, is that previously observed neural activity is informative about noisy, yet-to-be-observed activity on single trials, and may thus have a ‘denoising’ effect.

To investigate this prediction, the researchers built and characterized dynamical models of single-trial motor cortical activity, finding that these models capture salient dynamical features of the neural population and are informative of future neural activity on single trials. To assess how neural dynamics might beneficially denoise single-trial neural activity, the investigators incorporated neural dynamics into a brain-machine interface (BMI). In online experiments, they found that a neural dynamical BMI achieves substantially higher performance than its non-dynamical counterpart. These results, they note, provide evidence that neural dynamics beneficially inform the temporal evolution of neural activity on single trials and may directly improve the performance of BMIs.
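
A hedged sketch of how a dynamical model might be folded into a BMI decode loop: each incoming bin of neural activity is first denoised with the learned dynamics (as in the earlier sketch), then mapped to a 2-D cursor velocity by a linear readout. The readout matrix `W`, the blending weight, and the bin size are illustrative assumptions; the decoder described in the paper is considerably more principled:

```python
import numpy as np

def bmi_step(prev_state, observed_rates, A, W, alpha=0.5, dt=0.015):
    """One decode cycle of an illustrative dynamical BMI.

    A : dynamics matrix learned from past recordings (see earlier sketch)
    W : 2 x n_neurons readout mapping denoised rates to cursor velocity;
        a placeholder here, calibrated from training data in a real system
    """
    denoised = alpha * (A @ prev_state) + (1 - alpha) * observed_rates
    velocity = W @ denoised                       # (vx, vy) for the on-screen cursor
    return denoised, velocity * dt                # displacement over this time bin

# Toy usage: integrate cursor position over ~0.5 s of 15 ms bins.
rng = np.random.default_rng(2)
n_neurons = 150
A = 0.95 * np.eye(n_neurons)                      # placeholder dynamics
W = rng.normal(0.0, 0.01, size=(2, n_neurons))    # placeholder readout
state = rng.poisson(4.0, n_neurons).astype(float)
cursor = np.zeros(2)
for _ in range(33):
    sample = state + rng.normal(0.0, 2.0, n_neurons)
    state, step = bmi_step(state, sample, A, W)
    cursor += step
```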


Image Caption: In an experiment, monkeys use their brainwaves to accurately move a cursor over blinking dots on a computer screen. The research may lead to devices such as a wheelchair that paralyzed people can drive with their own brain waves. (Image courtesy of Shenoy Lab)

Tom Abate notes that these findings helped Dr. Shenoy's team distill their understanding of brain dynamics into an algorithm that analyzes the electrical signals their prosthetic device measures from the sampled neurons. The algorithm tweaks these measured signals so that the sample's dynamics more closely match baseline brain dynamics, with the goal of making the thought-controlled prosthetic more precise.

To test this algorithm, the Stanford researchers trained two monkeys to choose targets on a simplified keypad consisting of several rows and columns of blank circles. When a light flashed on a given circle, the monkeys were trained to reach for that circle with their arms.

To set a performance baseline, the researchers measured how many targets the monkeys could tap with their fingers in 30 seconds. The monkeys averaged 29 correct finger taps in 30 seconds.

The actual experiment scored only the virtual taps coming from the monkeys' brain-controlled cursor. Although a monkey may still have moved its fingers, the researchers counted a hit only when the algorithm-corrected, brain-controlled cursor landed on the target, Tom Abate explains. The prosthetic scored 26 thought-taps in 30 seconds, about 90 percent as fast as the monkeys' fingers.
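
The “about 90 percent” figure is simply the ratio of the two tap counts:

```python
finger_taps = 29    # correct finger taps in 30 seconds (baseline)
thought_taps = 26   # thought-taps via the corrected brain-controlled cursor
print(f"{thought_taps / finger_taps:.0%}")   # 90%
```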

Abate notes that thought-controlled keypads are not unique to the Shenoy lab, and other brain-controlled prosthetics use different techniques to solve the problem of sampling errors. However, of several alternative techniques tested by the Stanford team, the closest resulted in 23 targets in 30 seconds.

The goal of all this research is to provide thought-controlled prosthetics to people with ALS, who currently may use an eye-tracking system to direct a cursor, or a “head mouse” that tracks the movement of the head. However, both are fatiguing to use, and neither offers the natural, intuitive control that comes from readings taken directly from the brain.

The U.S. Food and Drug Administration recently gave Dr. Shenoy’s team the green light to conduct a pilot clinical trial of its thought-controlled cursor on people with spinal cord injuries.

“This is a fundamentally new approach that can be further refined and optimized to give brain-controlled prostheses greater performance and therefore greater clinical viability,” says Dr. Shenoy.

Funding for the experiments came from a Director’s Pioneer Award from the National Institutes of Health, a T-R01 Award from the National Institutes of Health, and two programs from the Defense Advanced Research Projects Agency: REPAIR (Reorganization and Plasticity to Accelerate Injury Recovery) and Neuro-FAST (Neuro Function, Activity, Structure, and Technology).

Sources:
Stanford Engineering
Stanford News Service
Nature Communications

Image Credits:
Stanford University
Shenoy Lab