Inspiration

"About 100 million people around the world need a prosthesis to replace an amputated arm or leg..." (University of Southampton, 2019). Therefore, the need to have prostheses for the motor area of ​​the body is of greater importance to meet SDG 10.2 of 2023, and thus "enhance and promote the social, economic and political inclusion of all people, regardless of their age, sex, disability…” (United Nations, 2022).

Sustainable Development Goal 10.2 link

Yet even though the market offers roughly five different types of prostheses (passive, body-powered, myoelectric, hybrid, etc.), none of them manages to achieve social equality for the wearer.

I am interested in solving this problem in order to offer a low-cost, high-tech prosthetic alternative to people with motor disabilities, and thus contribute to meeting SDG 10.2, by creating a relationship between the user's mind and the prosthesis, as if it were a real limb.

I consider brain-computer interfaces (BCIs) an underexplored area, and their application in prosthetics is incredibly far-reaching. With this technology it is possible to reduce the social inequality experienced by people with amputations, giving them the opportunity to control their prosthesis mentally and thus raising their quality of life by making them feel fully autonomous and included.

What it does

So far I have developed software with a classification algorithm that recognizes imagined movements of both hands. The user imagines a movement of their right hand, and that movement is reflected in the robotic prosthesis.

How I built it

  • I obtained electroencephalographic (EEG) signals from a 16-year-old girl using the Unicorn Hybrid Black headset
  • I used Lab Streaming Layer (LSL) in Visual Studio to exchange streaming data between the Unicorn Hybrid Black software and OpenViBE. (By the way, I selected this headset because it delivers high-quality EEG data from 8 electrodes sampled at 24-bit resolution and 250 Hz per channel, plus it's the one my school could lend me)
  • I selected the LSL driver to start acquiring signals from the subject (a minimal Python sketch of this acquisition step appears after this list)
  • I placed the headset and injected saline solution into the C3 and C4 electrodes to increase conductivity. (These positions sit over the motor cortex, where movements of the right and left hand are measured)
  • I adapted a Graz Visualization BCI scenario
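For reference, the acquisition step above can be sketched in a few lines of Python with pylsl. This is a minimal sketch under assumptions about my setup (a stream of type "EEG", a 10 s discovery timeout), not the exact OpenViBE configuration:

```python
# Minimal sketch: reading the Unicorn's EEG stream over Lab Streaming Layer.
# Assumes the LSL driver is already publishing a stream of type "EEG".
from pylsl import StreamInlet, resolve_byprop

# Find the first EEG stream on the network (the Unicorn delivers 8 channels
# at 250 Hz with 24-bit resolution).
streams = resolve_byprop("type", "EEG", timeout=10.0)
if not streams:
    raise RuntimeError("No EEG stream found - is the LSL driver running?")

inlet = StreamInlet(streams[0])

# Pull samples continuously; each sample holds one value per electrode.
while True:
    sample, timestamp = inlet.pull_sample()
    print(timestamp, sample)
```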

Sample of the Graz scenario link

In the Graz visualization, the left screen is shown for 30 s. Then, on the center screen, a green cross appears as a fixation point before the cue. Finally, on the right screen, a red arrow pointing left or right indicates which hand movement the subject should imagine.

  • I concatenated all the signals into a single file and applied a 5-30 Hz band-pass filter
  • I ran the classifier-trainer scenario, which measures the accuracy of the trained algorithm: it achieved 77.6% (a conceptual sketch of the filtering and training steps follows this list)
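OpenViBE runs the filtering and training inside the scenario boxes, but conceptually the offline pipeline looks like the sketch below: a 5-30 Hz Butterworth band-pass followed by a linear classifier. The log-variance features, data shapes, and choice of LDA are my assumptions for illustration, not necessarily the exact configuration of the scenario:

```python
# Conceptual sketch of the offline training pipeline (OpenViBE does this
# inside the scenario; the data here is a random placeholder).
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

FS = 250  # Unicorn sampling rate (Hz)

def bandpass(epochs, low=5.0, high=30.0, fs=FS, order=4):
    """5-30 Hz Butterworth band-pass along the time axis."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, epochs, axis=-1)

def log_variance(epochs):
    """Log variance per channel - a common motor-imagery feature."""
    return np.log(np.var(epochs, axis=-1))

# Placeholder epochs: 40 trials x 8 channels x 2 s of samples, random labels.
X_raw = np.random.randn(40, 8, 2 * FS)
y = np.random.randint(0, 2, size=40)  # 0 = left hand, 1 = right hand

X = log_variance(bandpass(X_raw))
clf = LinearDiscriminantAnalysis()
print("Cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```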

Classifier Trainer link

  • I designed and 3D-printed a hand prosthesis

Hand prosthesis link

  • I coded a Python script that receives an array of markers from the classifier algorithm and separates the right-hand signals from the left-hand ones
  • I sent the processed information to an Arduino to control the servo motors of the prosthesis (a rough sketch of this bridge script follows)
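A rough sketch of that bridge script is below. The serial port name, baud rate, single-character commands, and marker codes (OpenViBE's GDF stimulations 769/770 for left/right hand) are all assumptions about my setup; on the Arduino side, a small sketch with the Servo library would map each command to a hand pose:

```python
# Rough sketch: forward classifier markers from LSL to the Arduino over serial.
# Port name, baud rate, and marker codes below are placeholders.
import serial
from pylsl import StreamInlet, resolve_byprop

LEFT_HAND, RIGHT_HAND = 769, 770  # assumed OpenViBE GDF stimulation codes

arduino = serial.Serial("/dev/ttyACM0", 9600, timeout=1)  # placeholder port

streams = resolve_byprop("type", "Markers", timeout=10.0)
if not streams:
    raise RuntimeError("No marker stream found")
inlet = StreamInlet(streams[0])

while True:
    marker, _ = inlet.pull_sample()
    code = int(marker[0])
    if code == RIGHT_HAND:
        arduino.write(b"R")  # Arduino moves the servos to the right-hand pose
    elif code == LEFT_HAND:
        arduino.write(b"L")  # ...and to the left-hand pose
```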

Challenges we ran into

  • It was hard to get access to the EEG headset; I had to spend a lot of time at school with the equipment
  • It involved a lot of hard work 😭
  • I had to pay for the 3D-printing filament myself

Accomplishments that I'm proud of

Even though the prosthesis still needs some final fixes, I'm happy that the algorithm finally works. I have been thinking about this project for a long time, and I decided to build it at this hackathon because the theme fits perfectly. Racing against time was fun!

What's next for Mind-controlled prosthesis

  • Obtain funding so I can continue developing this project
  • Visit public hospitals in my local area to test this prototype with people

Built With

arduino, openvibe, python, lab-streaming-layer, unicorn-hybrid-black
