Inspiration

For several weeks following my grandmother's surgery, she remained confined to her bed, unable to talk or even move. In the hospital, we improvised a paper communication board, painstakingly guiding her gaze to each word on the board until she made a grunting noise, signifying that we had stumbled upon the word she needed to convey. The process was tedious and frustrating for both the caregiver and the patient, especially when patients find themselves waiting, reliant on nurses or caregivers, just to express their needs. Witnessing her struggle, along with that of countless other hospitalized individuals facing similar challenges, we knew there had to be a better way to communicate without depending on others.

What it does

Eye Talk is a digital, customizable communication board that allows people to select options and speak them with only their eyes. The user simply looks at the box containing the word they want, then blinks twice to select it. The chosen option is spoken aloud through speakers, and, if requested, their nurse or caregivers receive a text message. We are giving patients the ability to speak through their eyes.
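As a rough sketch of the notification step, a selection could trigger an SMS through Twilio's Python SDK. The credentials, phone numbers, and `format_alert` helper below are placeholders for illustration, not the project's actual code:

```python
def format_alert(patient_name, word):
    """Build the SMS body sent when the patient selects an option.
    (Hypothetical helper; the real message text may differ.)"""
    return f"{patient_name} has requested: {word}"


def send_alert(word, to_number, patient_name="Patient"):
    """Notify a caregiver of the selected word via SMS.
    Standard Twilio SDK usage; the SID, token, and numbers are placeholders."""
    from twilio.rest import Client  # pip install twilio
    client = Client("ACCOUNT_SID", "AUTH_TOKEN")
    client.messages.create(
        body=format_alert(patient_name, word),
        from_="+15550000000",  # Twilio phone number (placeholder)
        to=to_number,          # caregiver's phone number
    )
```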

How we built it

We used the AdHawk MindLink glasses to track the patient’s eye position and gaze direction. We calibrated the values to a computer screen so that the coordinates of where they are looking transfer accurately onto screen coordinates, no matter how far away the screen is positioned. When the user looks at the desired option and blinks to select it, the corresponding button is clicked and the message is read out loud. A text message can also be sent to a nurse or to family caregivers outside of the hospital. We used Tkinter for the GUI and Twilio to send SMS messages, and we built a homepage using Taipy that links to the communication board.
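The angle-to-pixel conversion can be sketched with basic trigonometry. The screen dimensions below are assumed values, and the real AdHawk SDK delivers gaze data in its own format; this only illustrates the projection idea:

```python
import math

# Screen geometry (assumed values for illustration).
SCREEN_W_PX, SCREEN_H_PX = 1920, 1080   # resolution in pixels
SCREEN_W_M, SCREEN_H_M = 0.53, 0.30     # physical size in metres


def gaze_to_screen(yaw_rad, pitch_rad, distance_m):
    """Project a gaze direction onto the screen plane.

    yaw_rad / pitch_rad: horizontal / vertical gaze angles relative to the
    screen centre (0, 0 means looking straight at the centre).
    distance_m: distance from the eye to the screen.
    Returns pixel coordinates, clamped to the screen bounds.
    """
    # Offset on the screen plane, in metres, via simple trigonometry:
    # the further the screen, the more a given angle moves the point.
    dx = distance_m * math.tan(yaw_rad)
    dy = distance_m * math.tan(pitch_rad)
    # Convert metres to pixels, with the origin at the top-left corner.
    x = SCREEN_W_PX / 2 + dx * (SCREEN_W_PX / SCREEN_W_M)
    y = SCREEN_H_PX / 2 - dy * (SCREEN_H_PX / SCREEN_H_M)
    return (min(max(round(x), 0), SCREEN_W_PX - 1),
            min(max(round(y), 0), SCREEN_H_PX - 1))
```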

Challenges we ran into

  • Calibrating the eye tracker to be precise
  • Deriving the formula that maps a gaze direction to a point on the screen, given the user’s distance from the screen and the angle they’re looking at it from
  • Detecting where the eye is looking immediately after a blink
  • Averaging the many noisy data points to find the most accurate place they are looking
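Two of these challenges, stabilizing noisy gaze samples and deciding when blinks should count as a selection, can be sketched roughly like this (the window size and timing threshold are made-up values, not our tuned ones):

```python
from collections import deque
from statistics import median


class GazeSmoother:
    """Report the per-axis median of the last few gaze points; the median
    is robust to the jitter and outlier samples raw eye-tracker streams
    contain."""

    def __init__(self, window=15):
        self.xs = deque(maxlen=window)
        self.ys = deque(maxlen=window)

    def add(self, x, y):
        self.xs.append(x)
        self.ys.append(y)

    def estimate(self):
        return median(self.xs), median(self.ys)


class DoubleBlinkDetector:
    """Treat two blinks within max_gap_s seconds as one 'click'."""

    def __init__(self, max_gap_s=0.6):
        self.max_gap_s = max_gap_s
        self._last = None

    def on_blink(self, t):
        """Call with each blink's timestamp; True means a double blink."""
        if self._last is not None and t - self._last <= self.max_gap_s:
            self._last = None  # reset so a triple blink isn't two clicks
            return True
        self._last = t
        return False
```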

Accomplishments that we're proud of

We’re proud that we were able to determine the precise location on the screen that the user is looking at. Doing this required a lot of careful calibration and a ton of math, but being able to click a button precisely using just your eyes was really cool.
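Once a stable on-screen point is known, finding which board option it falls on is a simple grid hit-test. The 2×3 layout and labels here are placeholders for whatever the board actually shows:

```python
# Hypothetical board layout; the real board's words and grid may differ.
BOARD = [["Water", "Food", "Pain"],
         ["Nurse", "Bathroom", "Adjust bed"]]


def cell_at(x, y, screen_w=1920, screen_h=1080):
    """Return the board word under pixel (x, y), dividing the screen
    into an even grid of rows x columns."""
    rows, cols = len(BOARD), len(BOARD[0])
    col = min(int(x * cols / screen_w), cols - 1)
    row = min(int(y * rows / screen_h), rows - 1)
    return BOARD[row][col]
```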

What we learned

It was everyone’s first time hacking with hardware, and although we had intended to do a software project, we learnt a lot from using the AdHawk glasses and had a great time. We learnt how to use the glasses and analyze the data they produce, and we also learnt how to create GUIs in Python and link them to hardware.

What's next for Eye Talk

  • Add more options to the communication board and make the words customizable
  • Add a visual keyboard option so patients can type words that aren’t on the board
  • Let family members send messages that display on the board, so they can communicate with the patient even when they are not at the hospital
