Inspiration

In previous years at RevolutionUC, we have been enthralled by the variety of hardware available, and this year we were determined to use the Muse brainwave-sensing headband. We therefore decided to center our project on a problem we see in everyday life and to build a solution that would benefit the most users.

ADHD is the single most common childhood disorder, with nearly 11% of school-age children having been diagnosed. As cases in the country have increased in the last few years, we decided it would be appropriate to look at this issue through a digital lens.

Our Project

Our project leverages the Muse Headband to detect levels of gamma brain-wave activity and creates a graphical display that allows patients to learn how to control their thought patterns through biofeedback. In biofeedback, an individual monitors a real-time display of a body measurement and consciously makes an effort to change it. By exercising this way, they learn to consciously control what would otherwise be an unconsciously regulated bodily function.

While biofeedback is classically applied to physiological metrics such as blood pressure, heart rate, body temperature, and pain sensations, we decided to apply biofeedback to the user's brainwave patterns themselves. The user can thus learn to consciously influence their brainwave activity, leading to greater control over consciousness, focus, and alertness.

How it Works

The Muse headband uses four electrodes on the forehead and ears to detect the voltage levels around them, a technique known as electroencephalography (EEG). This raw data is interpreted by our wrapper for the muse-io application, which uses the OSC protocol to transmit data from the headset to any OSC-capable receiver.
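To give a feel for what travels over the wire, here is a minimal, standard-library-only sketch of decoding one OSC message (address string, type-tag string, then big-endian float arguments, each string null-padded to a 4-byte boundary). This is an illustration of the OSC wire format, not our actual muse-io wrapper; the `/muse/eeg` address is the path muse-io uses for raw EEG samples.

```python
import struct

def _read_padded_string(buf, offset):
    """Read a null-terminated OSC string, advancing to the next 4-byte boundary."""
    end = buf.index(b"\x00", offset)
    s = buf[offset:end].decode("ascii")
    offset = end + 1
    offset += (-offset) % 4  # OSC strings are null-padded to multiples of 4
    return s, offset

def parse_osc_message(packet):
    """Parse a simple OSC message whose arguments are all float32 ('f') values."""
    address, offset = _read_padded_string(packet, 0)
    typetags, offset = _read_padded_string(packet, offset)
    args = []
    for tag in typetags.lstrip(","):
        if tag == "f":  # big-endian 32-bit float
            (value,) = struct.unpack_from(">f", packet, offset)
            args.append(value)
            offset += 4
    return address, args

# Example: a /muse/eeg message carrying one voltage reading per electrode
packet = (b"/muse/eeg\x00\x00\x00"
          + b",ffff\x00\x00\x00"
          + struct.pack(">4f", 1.0, 2.0, 3.0, 4.0))
address, voltages = parse_osc_message(packet)
```

In practice a library such as python-osc handles this parsing and dispatches each address to a registered handler on a listener thread.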

The program that receives the raw data is written in Python. We spawn a thread that listens for OSC messages and use a dispatcher to handle these events asynchronously. Then we use a mathematical technique called the fast Fourier transform (FFT) to move the raw signal from the time domain to the frequency domain, decomposing the voltage-versus-time graph into a breakdown of amplitude versus frequency. From this result, we average the frequencies associated with gamma and beta waves, which represent conscious alertness and focus.
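The band-averaging step can be sketched with NumPy's real-valued FFT. The band edges and the 220 Hz sample rate are assumptions for illustration (beta and gamma band definitions vary slightly in the literature, and the actual rate depends on the muse-io configuration):

```python
import numpy as np

FS = 220.0  # assumed raw EEG sample rate in Hz

def band_average(signal, fs, low_hz, high_hz):
    """Average FFT amplitude of a real signal over a frequency band."""
    spectrum = np.abs(np.fft.rfft(signal))            # time domain -> amplitude spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)  # bin center frequencies in Hz
    mask = (freqs >= low_hz) & (freqs <= high_hz)
    return spectrum[mask].mean()

def focus_score(signal, fs=FS):
    """Average amplitude across the beta (13-30 Hz) and gamma (30-44 Hz) bands."""
    beta = band_average(signal, fs, 13.0, 30.0)
    gamma = band_average(signal, fs, 30.0, 44.0)
    return (beta + gamma) / 2.0
```

A pure 20 Hz tone (inside the beta band) scores much higher than a 5 Hz tone, which is the behavior the biofeedback loop relies on.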

The resulting value is shown in an easy-to-understand graphical interface: a horizontal colored meter that fills in proportion to the user's focus level, refreshing at a high rate with very low latency.

Challenges

One of the most challenging aspects of this project was the interface from the OSC protocol used by Muse to the Python application we needed to process the data in real time. Another challenge was creating the graphical output, as we ran into countless multi-threading and thread-locking issues.
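One pattern that sidesteps much of the locking trouble is a producer-consumer queue: the network thread pushes computed focus scores and the display loop drains them, so neither thread touches the other's state directly. A minimal sketch with stand-in threads (the real producer would be the OSC listener):

```python
import queue
import threading

focus_values = queue.Queue()  # thread-safe handoff; no explicit locks needed

def listener(n):
    """Stand-in for the OSC thread: push n focus scores, then a sentinel."""
    for i in range(n):
        focus_values.put(float(i))
    focus_values.put(None)  # sentinel marks end of stream

def display_loop(results):
    """Stand-in for the GUI thread: drain scores until the sentinel arrives."""
    while True:
        value = focus_values.get()
        if value is None:
            break
        results.append(value)

results = []
producer = threading.Thread(target=listener, args=(5,))
producer.start()
display_loop(results)
producer.join()
```

Because `queue.Queue` does its own internal locking, the consumer sees every value exactly once and in order, which is precisely the guarantee a live-updating meter needs.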

What we Learned

We learned how to use the Muse headband to accurately and quickly measure data about the brain as it functions. We learned how to connect with this device's software to extract the raw data in real time, and we learned how to pipe this data into a Python program.

We learned about signal processing with (Fast) Fourier Transforms, and explored the mathematical significance behind this immensely useful (and cool) process. We also learned how to create a live-updating graphical interface in Python, including all of the threading "gotchas" that go along with it.

Beyond data and programming, building this project was a learning experience in other ways as well. Because our team was formed from two partnerships that hadn't worked together before, we learned a lot about balancing and utilizing each member's strengths. As we began working together and sharing fresh ideas and perspectives, our productivity increased markedly, and we were able to attack problems using each person's ideas, knowledge, and past experience.

What's next for Focus Friend

Right now, Focus Friend is in the prototype/proof-of-concept stage. Our future plans include:

- Mobile support for Android and iOS
- A platform for parents and educators to monitor the focus levels of their students and view reports
- Possible expansion of the technology to other applications, not just ADHD
