Wearable Facial Sensor May Help ALS Patients More Easily Communicate


by Marisa Wexler, MS


A lightweight, wearable sensor is being developed that can detect minute facial movements and translate them into messages, potentially helping people with amyotrophic lateral sclerosis (ALS) communicate more easily.

Once mass-produced, each device is expected to cost about $10.

The sensor is described in the study “Decoding of facial strains via conformable piezoelectric interfaces,” published in Nature Biomedical Engineering.

People with ALS gradually lose the ability to control their muscles. One consequence of this loss is that communication becomes harder: the ability to speak, type, or write declines as the disease progresses.

Technology can help ALS patients communicate. For instance, the famed physicist Stephen Hawking, who had a slow-progressing form of the disease, communicated using an infrared sensor that detected twitches of his cheek, moving a cursor across rows and columns of letters. While generally effective, this means of communication has drawbacks: it is slow, and the required equipment is cumbersome.

“These devices are very hard, planar, and boxy, and reliability is a big issue. You may not get consistent results, even from the same patients within the same day,” Canan Dagdeviren, a professor at Massachusetts Institute of Technology (MIT) and study co-author, said in a press release.

Dagdeviren and colleagues developed a new sensor that utilizes piezoelectric materials to detect small facial movements, like a twitch or smile, which even people with advanced ALS usually retain to some degree.

A piezoelectric material is one that generates an electric voltage when a mechanical force — that is, movement — is applied to it. This voltage can be measured and translated into discrete movements using a machine learning algorithm.
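The study itself does not publish its decoding code, but the idea of turning a voltage trace into a discrete movement can be sketched with a toy classifier. Everything below is an illustrative assumption: the trace shapes, the two hand-picked features (peak and energy), and the nearest-centroid rule stand in for the machine learning algorithm the researchers actually used.

```python
# Toy sketch (not the authors' pipeline): classify synthetic "voltage
# traces" into facial movements with a nearest-centroid rule.

def features(trace):
    """Summarize a voltage trace by its peak amplitude and total energy."""
    peak = max(abs(v) for v in trace)
    energy = sum(v * v for v in trace)
    return (peak, energy)

# Hypothetical reference traces, one per facial movement.
templates = {
    "small smile": [0.1, 0.3, 0.5, 0.3, 0.1],
    "open mouth":  [0.2, 0.8, 1.2, 0.8, 0.2],
    "pursed lips": [0.05, 0.15, 0.25, 0.15, 0.05],
}
centroids = {name: features(t) for name, t in templates.items()}

def classify(trace):
    """Assign a trace to the movement whose feature centroid is nearest."""
    f = features(trace)
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(f, c))
    return min(centroids, key=lambda name: dist(centroids[name]))

print(classify([0.18, 0.75, 1.1, 0.7, 0.2]))  # prints "open mouth"
```

A real system would train on many labeled recordings per person rather than single hand-written templates, but the core step is the same: map a continuous voltage signal to one of a small set of movement labels.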

The researchers dubbed their sensor cFaCES, short for "conformable facial code extrapolation sensor." The sensor is thin and lightweight, can be worn on the face, and can be covered with makeup matched to the wearer's skin tone, making it unobtrusive.

“Not only are our devices malleable, soft, disposable, and light, they’re also visually invisible,” Dagdeviren said. “You can camouflage it and nobody would think that you have something on your skin.”

The components in cFaCES are also easy to manufacture, the researchers report. They estimate that, when manufactured at scale, each device would cost about $10 to produce.

To determine the most useful place to put the sensors, the researchers used digital image correlation. Essentially, they analyzed pictures of people making a variety of facial movements to identify areas with the most physical strain for the sensor to detect.

“We had subjects doing different motions, and we created strain maps of each part of the face,” said Rachel McIntosh, an undergraduate at MIT and study co-author. “Then we looked at our strain maps and determined where on the face we were seeing a correct strain level for our device, and determined that [the cheek] was an appropriate place to put the device for our trials.”

The researchers tested their sensor on healthy volunteers and on two ALS patients (one male, one female). The sensor was able to distinguish between three expressions — a small smile, an open mouth, and pursed lips — with an accuracy rate of 86.8% with the volunteers and 75% with the patients.

The highest accuracy was achieved when four piezoelectric sensors were used.

“The cFaCES design allows for a maximum of four such sensing elements, but further increasing the number of elements used for [detection] could potentially make decoding accuracy even higher,” the researchers wrote.

Being able to distinguish between these small facial movements means the sensor could be incorporated into communication technologies — essentially, any given facial movement could be programmed to communicate a given word or idea.

“We can create customizable messages based on the movements that you can do,” Dagdeviren said. “You can technically create thousands of messages that right now no other technology is available to do. It all depends on your library configuration, which can be designed for a particular patient or group of patients.”
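The "library configuration" Dagdeviren describes amounts to a lookup table from decoded movements to messages, customizable per patient. The mapping below is a minimal sketch with invented example messages, not the actual configuration used in the study.

```python
# Illustrative per-patient "message library": each decoded facial
# movement maps to a configurable message. All entries are assumptions.

library = {
    "small smile": "Yes",
    "open mouth": "I need help",
    "pursed lips": "No",
}

def movement_to_message(movement, library):
    """Return the message configured for a decoded movement, if any."""
    return library.get(movement, "(unrecognized movement)")

print(movement_to_message("pursed lips", library))  # prints "No"
```

Because the table is just data, it could be swapped out or extended per patient without changing the decoding step, which matches the customizability the researchers describe.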

The researchers have filed for a patent on this technology, and are planning additional testing. Besides helping people with ALS communicate, they suggest their device may also be useful for monitoring disease progression or treatment effectiveness.