Inspiration
According to the World Health Organization, an estimated 1.3 billion people worldwide, or 16% of the global population, experience some form of disability. Millions of these individuals live with motor impairments such as ALS, Parkinson's disease, muscular dystrophy, or limb loss, and struggle to perform simple daily tasks that most people take for granted. Even basic actions like opening a pill bottle or turning on a light can require full-time assistance from a caretaker, which can create emotional strain and a loss of independence for the patient.
Current assistive solutions for individuals with severe motor impairments rely on physical caregiver support, voice-activated systems, or expensive robotic prosthetics and aids. While advanced brain-computer interfaces (BCIs) exist, they are typically confined to research and medical facilities rather than everyday patient use.
With Cognia, we want to bridge this gap by combining the neural technology of an EEG headset with an adaptable, microcontroller-driven mechanism that can be tailored to the user's specific needs.
What it does
Cognia allows users with motor impairments to perform simple physical actions using brain signals. In our prototype, an Insight EEG headset detects the user's intent, which triggers an Arduino-powered mechanism to open a makeshift pill capsule. The system can be adapted to other everyday tasks, giving users more independence and control in their daily lives. Cognia also functions as an AI-driven therapist through a companion website: using performance-metric readings from the EEG headset together with a short questionnaire, it generates an AI assessment and gives you feedback to act on going forward.
How we built it
Software:
Following the Emotiv BCI documentation, we used the Cortex API to access the data streams we needed (Mental Commands and Performance Metrics) and retrieve data from the headset. Per Emotiv's API flow, we connected to the headset over Python WebSockets using the JSON-RPC protocol. We then processed the incoming streams and isolated the information relevant to mental commands, specifically the intensity of the trained user's thought. For our prototype, we focused on two states, neutral and open, where the "open" command triggers the mechanism to open the box. Once a command arrives over the WebSocket, it is sent to the Arduino through a serial port. The Arduino functions as a state machine that tracks the lid's state; on receiving the signal, it powers the motor mechanism to open the capsule.
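The trigger decision can be sketched as follows. The sample shape (`{"com": [action, power], ...}`) reflects the Cortex API's mental-command stream as documented by Emotiv, and the 0.5 threshold is a placeholder that would be tuned per user:

```python
import json

POWER_THRESHOLD = 0.5  # trigger level; tuned experimentally per user (assumption)

def should_open(raw_message: str, threshold: float = POWER_THRESHOLD) -> bool:
    """Return True when a Cortex "com" sample carries a strong "open" intent.

    Samples on the mental-command stream arrive as JSON of the form
    {"com": [action, power], ...}, where power is in [0, 1].
    """
    msg = json.loads(raw_message)
    sample = msg.get("com")
    if not sample:
        return False  # not a mental-command sample (e.g. a JSON-RPC reply)
    action, power = sample
    return action == "open" and power >= threshold
```

In the full loop, every message received over the WebSocket passes through this check, and a positive result writes a single byte to the Arduino over the serial port (e.g. `ser.write(b'O')` with pyserial).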
For the AI analysis of performance metrics, we created a website that collects both qualitative and quantitative data from the user. The site asks the user a series of questions while they wear the headset for approximately 40 seconds to record performance data. We gathered six key metrics measured by the EEG headset, each ranging from 0 to 1, and used AI to analyze the user’s mental state. The user can then interact with the AI to receive personalized feedback on their performance and overall well-being.
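A minimal sketch of how the ~40-second recording could be condensed for the AI step. The metric names are assumptions based on Emotiv's performance-metrics stream, and `build_prompt` is a hypothetical helper illustrating how averaged metrics and questionnaire answers might be combined into one prompt:

```python
from statistics import mean

# Metric names assumed from the Emotiv performance-metrics stream;
# each sample value lies in [0, 1].
METRICS = ["engagement", "excitement", "stress", "relaxation", "interest", "focus"]

def summarize_session(samples):
    """Average each metric over the recording window.

    `samples` is a list of dicts mapping metric name -> value in [0, 1].
    """
    return {m: round(mean(s[m] for s in samples), 3) for m in METRICS}

def build_prompt(summary, answers):
    """Combine averaged metrics and questionnaire answers into one text
    block to send to the AI model (hypothetical format)."""
    lines = [f"{m}: {v:.3f}" for m, v in summary.items()]
    lines += [f"Q: {q} A: {a}" for q, a in answers.items()]
    return "\n".join(lines)
```

Averaging first keeps the prompt small and stable regardless of how many samples the headset delivers during the window.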
Challenges we ran into
A major challenge was getting the EEG headset signal to communicate with the Arduino in real time. We initially struggled to route data through Python and send consistent serial commands, but we solved this by using Python WebSockets and the JSON-RPC protocol to receive consistent data from the headset, which our script then relays to the Arduino microcontroller over serial.
Another issue was controlling the motor precisely. The motor was inconsistent, spinning too fast or too slow, for too long or not long enough. To solve this, we used an L293D driver with an external battery and implemented a tuned open/close window. With the driver, we could experimentally fine-tune both how long the motors run and how fast.
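The timed open/hold/close logic runs on the Arduino, but it can be sketched in Python for illustration. The timing constants here are placeholders; on the real device they were tuned experimentally with the L293D driver:

```python
# Illustration of the Arduino lid state machine (actual firmware is C++).
OPEN_S = 0.8  # seconds of motor drive to open/close the lid (placeholder)
HOLD_S = 3.0  # seconds to keep the lid open (placeholder)

class LidController:
    def __init__(self):
        self.state = "CLOSED"
        self.deadline = 0.0

    def on_trigger(self, now):
        """Handle an "open" signal; ignored unless the lid is fully
        closed, which debounces repeated triggers mid-motion."""
        if self.state == "CLOSED":
            self.state = "OPENING"  # motor forward
            self.deadline = now + OPEN_S

    def tick(self, now):
        """Advance the state machine once the current timed window elapses."""
        if now < self.deadline:
            return self.state
        if self.state == "OPENING":
            self.state = "HOLDING"  # motor off, lid open
            self.deadline = now + HOLD_S
        elif self.state == "HOLDING":
            self.state = "CLOSING"  # motor reverse
            self.deadline = now + OPEN_S
        elif self.state == "CLOSING":
            self.state = "CLOSED"
        return self.state
```

Running the motor for a fixed, tuned window rather than relying on feedback keeps the mechanism simple and repeatable on hardware without encoders.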
Accomplishments that we're proud of
Built a functional brain-controlled prototype in under 24 hours
Developed a stable Arduino control system with timed open/hold/close logic
Designed and 3D-printed custom mounts for the motors and bottle alignment
Integrated EEG-trigger training to get consistent results
Worked effectively as a team, combining neurotech, hardware, and coding under time pressure
What we learned
How to interface brainwave data with embedded hardware via serial communication
Importance of debouncing, timing, and state-machine logic in Arduino projects
How to design and 3D-print mechanical parts to move components
Strategies for rapid prototyping and debugging under tight deadlines
How to manage separate power sources (Arduino, external battery) for logic and motors with resets
What's next for Cognia
Since Cognia's core technology sends relevant data from the EEG headset to motor-driven modules, it can be expanded significantly to meet users' and patients' needs. Cognia was designed to adapt to any given task, so we plan to prototype new add-ons, such as ones that flip light switches or open cabinets.