Inspiration
Communication is essential to the human experience. But individuals who suffer motor paralysis from ALS (amyotrophic lateral sclerosis), GBS (Guillain-Barré syndrome), spinal cord injury (SCI), or brain-stem infarction have difficulty conveying their intentions because the motor neurons controlling their voluntary muscles have been compromised. The Christopher & Dana Reeve Foundation reports that some 5.6 million people in the US live with some form of paralysis, with about 700,000 of them reporting that they were "completely unable to move". Of those 5.6 million people, about 250,000 are under the age of 20. The eyes are directly connected to the brain; for someone who suffers trauma or disease that causes paralysis, they are often the last part of the body over which control remains.
What it does
EyeSense is a messaging app that lets users send messages by blinking. The Muse EEG headband can detect blinks, and our app analyzes the blink pattern and triggers the corresponding event. In this version of the product, blinking twice sends an SMS reading "I need a doctor" to a designated "doctor", and blinking three times sends "I need you here" to a designated "parent".
How we built it
We developed a Java application that reads Open Sound Control (OSC) data from the Muse device. The application isolates and analyzes the user's blinking pattern, and once a pattern is recognized it triggers the appropriate event: when the user blinks twice or three times, a specific message is sent to a designated mobile number. To send the SMS, the Java application calls a PHP web service, hosted on a Linode server, that uses the Twilio API.
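As a rough illustration of the data flow, here is a minimal sketch of the Java side. It assumes the JavaOSC library (com.illposed.osc, 0.2-style API), that the Muse software streams blink events on the /muse/elements/blink OSC path to local UDP port 5000, and a hypothetical PHP endpoint URL; the port, path, class names, and URL are placeholders rather than our exact configuration.

```java
import com.illposed.osc.OSCListener;
import com.illposed.osc.OSCMessage;
import com.illposed.osc.OSCPortIn;

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.Date;

public class EyeSenseReceiver {

    // Placeholder URL for the PHP/Twilio web service running on the Linode server.
    private static final String SMS_ENDPOINT = "https://example.com/eyesense/send_sms.php";

    public static void main(String[] args) throws Exception {
        // Listen for OSC packets streamed from the Muse headband on UDP port 5000.
        OSCPortIn receiver = new OSCPortIn(5000);

        receiver.addListener("/muse/elements/blink", new OSCListener() {
            public void acceptMessage(Date time, OSCMessage message) {
                // Muse reports blinks as an integer flag; treat nonzero as "blink detected".
                Object[] arguments = message.getArguments();
                if (arguments.length > 0 && ((Number) arguments[0]).intValue() != 0) {
                    // In the full app this event feeds the blink-sequence detector
                    // (sketched under "Challenges we ran into" below).
                    System.out.println("Blink detected");
                }
            }
        });

        receiver.startListening();
        System.out.println("Listening for Muse blink events...");

        // OSCPortIn listens on its own thread; keep the main thread alive.
        Thread.currentThread().join();
    }

    /** POSTs a recipient role and message body to the PHP web service, which calls Twilio. */
    static void sendSms(String recipient, String body) {
        try {
            String payload = "to=" + URLEncoder.encode(recipient, "UTF-8")
                    + "&body=" + URLEncoder.encode(body, "UTF-8");
            HttpURLConnection conn = (HttpURLConnection) new URL(SMS_ENDPOINT).openConnection();
            conn.setRequestMethod("POST");
            conn.setDoOutput(true);
            conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
            try (OutputStream out = conn.getOutputStream()) {
                out.write(payload.getBytes(StandardCharsets.UTF_8));
            }
            System.out.println("SMS request returned HTTP " + conn.getResponseCode());
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
```

The sendSms helper is what the blink-sequence logic (described under Challenges) calls once it has decided which message to send.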
Challenges we ran into
We ran into several challenges:
- We had initially hoped to take advantage of Muse's raw EEG capabilities. Unfortunately, no one on our team had experience reading raw EEG data, so we could not interpret it well enough to trigger controlled, predictable events. We felt we did not have enough time for a crash course in EEG interpretation and decided to use Muse's simpler eye-blink detection instead.
- The Muse developer website had little documentation, so we learned the message format by trial and inspection.
- Formulating an algorithm to identify the blink sequence (a sketch of our approach appears after this list).
- Configuring the Linode server - we had to install several missing libraries.
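On the blink-sequence point above, the approach boils down to counting blinks that arrive close together and acting on the count once no further blink shows up within a short gap. The sketch below is illustrative: the 1.5-second gap and the class name are assumptions, and the real application hands the result to the SMS sender shown earlier instead of printing.

```java
import java.util.Timer;
import java.util.TimerTask;

/**
 * Groups blinks that occur close together into a sequence and acts on the
 * count once no further blink arrives within GAP_MS.
 */
public class BlinkSequenceDetector {

    // If no new blink arrives within this window, the sequence is treated as finished.
    private static final long GAP_MS = 1500;

    private final Timer timer = new Timer(true);
    private TimerTask pending;
    private int blinkCount = 0;

    /** Called by the OSC listener each time Muse reports a blink. */
    public synchronized void registerBlink() {
        blinkCount++;
        // Restart the "sequence finished" countdown on every blink.
        if (pending != null) {
            pending.cancel();
        }
        pending = new TimerTask() {
            @Override
            public void run() {
                finishSequence();
            }
        };
        timer.schedule(pending, GAP_MS);
    }

    private synchronized void finishSequence() {
        int count = blinkCount;
        blinkCount = 0;
        pending = null;

        // Map the completed sequence to a message; in the real app this calls
        // the PHP/Twilio web service instead of printing.
        if (count == 2) {
            System.out.println("Send to doctor: \"I need a doctor\"");
        } else if (count == 3) {
            System.out.println("Send to parent: \"I need you here\"");
        }
        // Single blinks and longer runs are ignored in this version.
    }
}
```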
Accomplishments that we're proud of
We are proud that we accomplished the following:
- We were able to interface the Muse device with our Java application.
- We figured out how to take the blink data Muse reports and turn it into something useful.
- The thing works!
What we learned
No one on the team had experience developing with hardware, so this was a great learning opportunity for all of us. We also learned the value of teamwork: we each brought a different set of skills (one of us has no coding experience at all), we come from different cultures, and we have different perspectives, but we were able to come up with what we believe is a promising product.
What's next for EyeSense
The Muse wearable device is currently marketed as a meditation aid, but it can do much more. EEG is being studied as a noninvasive brain-computer interface, and with further development we could use EEG data to let users perform more sophisticated actions. EEG also has many potential applications in epilepsy and stress-management research. Our short-term goal is to accomplish our initial idea: using Muse to monitor how a hospital patient is feeling (e.g. anxious, stressed, calm) and sending that information at specific events (e.g. very high anxiety levels) to someone outside the hospital (e.g. an emergency contact or a parent).