Inspiration

We noticed that some people cannot easily navigate or move around their spaces. We wanted to empower these people by giving them the tools to embody a robot using brain control.

What it does

It allows users to control a robot, such as Spot, using their brain signals.

How we built it

EEG pipeline

  • Wrote a visual entrainment script in PsychoPy (Python) that flashes stimuli at fixed frequencies.
  • Wrote a real-time EEG processing script that takes in continuous voltage data from 8 channels, cleans and filters it with a Butterworth filter and independent component analysis (ICA), runs a fast Fourier transform (FFT), and classifies which stimulus frequency showed the largest increase in power over the past five seconds.
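The entrainment script's core job is deciding, on each monitor refresh, whether a given stimulus should be drawn. A minimal frame-based scheduling sketch (pure Python; the function name and the 60 Hz refresh assumption are ours, not the project's actual values):

```python
def stim_visible(frame_idx: int, flicker_hz: float, refresh_hz: float = 60.0) -> bool:
    """Return True if the stimulus should be drawn on this frame.

    Frame-based flicker: the stimulus toggles every half period.
    Works cleanly when flicker_hz evenly divides refresh_hz / 2.
    """
    frames_per_half_period = refresh_hz / (2.0 * flicker_hz)
    # Count how many half-periods have elapsed; even -> on, odd -> off.
    return int(frame_idx / frames_per_half_period) % 2 == 0

# Example: a 10 Hz flicker at 60 Hz refresh is on for 3 frames, off for 3.
pattern = [stim_visible(i, 10.0) for i in range(6)]
# pattern -> [True, True, True, False, False, False]
```

In PsychoPy, a function like this would gate each `ImageStim.draw()` call inside the per-frame loop between `win.flip()` calls.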
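The classification step above can be sketched as follows (NumPy/SciPy; the function names, the 250 Hz sample rate, the 5-30 Hz passband, and the candidate frequencies are illustrative assumptions, not the project's actual values):

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 250  # assumed sample rate in Hz

def band_power(window: np.ndarray, target_hz: float, fs: int = FS) -> float:
    """Power at the FFT bin nearest target_hz for a 1-D signal window."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(window.size, d=1.0 / fs)
    return spectrum[np.argmin(np.abs(freqs - target_hz))]

def classify(prev: np.ndarray, curr: np.ndarray,
             candidates=(7.5, 10.0, 12.0, 15.0)) -> float:
    """Pick the candidate frequency whose power grew most between windows.

    prev/curr are consecutive 5 s single-channel windows, already
    bandpass-filtered and (in the real pipeline) ICA-cleaned.
    """
    gains = [band_power(curr, f) - band_power(prev, f) for f in candidates]
    return candidates[int(np.argmax(gains))]

# Demo with synthetic data: noise, then noise plus a 12 Hz sine.
t = np.arange(5 * FS) / FS
rng = np.random.default_rng(0)
sos = butter(4, (5, 30), btype="bandpass", fs=FS, output="sos")
prev = sosfiltfilt(sos, rng.normal(size=t.size))
curr = sosfiltfilt(sos, rng.normal(size=t.size) + 2.0 * np.sin(2 * np.pi * 12.0 * t))
# classify(prev, curr) -> 12.0
```

The real script would run this over a sliding buffer fed by the LSL stream rather than precomputed arrays.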

Challenges we ran into

  • The visual entrainment script is supposed to have four images flashing at different frequencies, but neither PsychoPy nor React could achieve this reliably. We should have tried MATLAB.
  • Real-time EEG processing is difficult to debug because it requires a constant stream of data and multiple functions running concurrently.
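The first challenge is partly a display constraint: with frame-based flicker, a monitor can only hit frequencies of the form refresh / (2k). A quick check (the 60 Hz refresh rate is an assumption about the setup):

```python
def achievable_flicker_rates(refresh_hz: float = 60.0,
                             max_half_period_frames: int = 8) -> list:
    """Flicker rates reachable by toggling a stimulus every k frames."""
    return [refresh_hz / (2 * k) for k in range(1, max_half_period_frames + 1)]

rates = achievable_flicker_rates()
# 30.0, 15.0, 10.0, 7.5, 6.0, 5.0, ... - four distinct SSVEP targets fit,
# but arbitrary frequencies (e.g. 11 Hz) need frame skipping or interpolation.
```

Dropped frames shift the effective rate further, which is one reason browser-based (React) flicker is hard to keep stable.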

Accomplishments that we're proud of

  • Debugging nonstop for several hours and finally understanding most of Lab Streaming Layer (LSL).

What we learned

  • Learned how to conduct a full literature review for EEG projects in a short amount of time, and gained a deeper understanding of real-time signal processing.

What's next for Insight OS

Built With
