Computer scientists at Stony Brook University have been awarded $200,000 by the ALS Association to continue the development of a mobile app that helps people with amyotrophic lateral sclerosis (ALS), and others whose mobility has been impaired, to regain some of their independence.
According to a press release, the EyeCanDo app is designed to greatly reduce the burden of ALS for both patients and caregivers by enabling patients to control smart home devices, write messages, and listen to music, among other things, using only their eyes.
The two-year grant was awarded to professors Xiaojun Bi, PhD, and Fusheng Wang, PhD, who, along with Stony Brook students and a local high school student, initially developed the application for the 2018 Mount Sinai Medicine Hackathon. The app earned them first prize in the competition.
ALS is a condition in which the loss of motor neurons — those controlling voluntary movement — causes patients to progressively lose their ability to move, communicate, and conduct most of their daily activities.
The disease has a significant impact on quality of life, as patients eventually rely on a caregiver for everything, including mundane tasks like turning off the lights. But researchers are aiming to reduce that burden by tracking eye movement, which usually is spared in patients with ALS.
EyeCanDo is an app for iOS mobile devices that uses Apple's TrueDepth camera system, the technology behind facial recognition in iPhones and iPads. By creating a detailed 3D map of a user's face, the technology can follow eye movement accurately enough to determine where on the screen the user is looking.
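As a rough illustration of how an app can tap into this capability, Apple's ARKit framework exposes TrueDepth-based face tracking, and its `ARFaceAnchor` provides a `lookAtPoint` estimate of where the user's eyes converge. The sketch below is a minimal, hypothetical example of reading that gaze estimate; it is not EyeCanDo's actual implementation.

```swift
import ARKit

// Minimal sketch (not the EyeCanDo implementation): ARKit's face tracking
// delivers an ARFaceAnchor whose lookAtPoint estimates the point the
// user's eyes converge on, in the face anchor's own coordinate space.
class GazeTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a device with a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // lookAtPoint is relative to the face anchor; a real app would
        // transform it into world space and project it onto the screen
        // to drive an on-screen cursor or selection target.
        let gaze = face.lookAtPoint
        print("Gaze (face space): \(gaze.x), \(gaze.y), \(gaze.z)")
    }
}
```

In practice, an eye-controlled interface would smooth these per-frame estimates and map them to screen coordinates, since raw gaze data is noisy.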
By combining this cutting-edge eye-tracking technology with smart home devices, the EyeCanDo app will allow a person with ALS to control multiple features of their home, such as setting the room temperature, turning on a fan, shutting off the television, or turning on the lights.
It also is designed to help patients write and send messages and listen to music, and includes a text-to-speech feature, supporting a broad range of patient and caregiver needs.
Bi and Wang, professors in the Department of Computer Science, believe the app likely will help others with motor impairment and communication difficulties caused by underlying diseases such as cerebral palsy, multiple sclerosis, and muscular dystrophy, as well as by stroke, brain injury, or spinal cord injury.
Meanwhile, the team will be working to combine human-computer interaction and artificial intelligence technologies to tailor the software to each individual patient depending on their needs, while maintaining high accuracy and stability.
Besides Bi and Wang, others with strong experience in bioinformatics, machine learning, and human-computer interfaces also will participate in the research. Clinicians and patients at the ALS Clinic at Stony Brook University Hospital, New York, will then be involved in testing and evaluating the upgraded mobile app.