Inspiration
We noticed several problems with traditional walking sticks and guide dogs:
- Too expensive over the long term
- Limited independence for the user
- Limited detection ability (e.g. no awareness of obstacles above waist height)
What it does
SeeTrue uses ultrasonic sensors, a gyroscope, and computer vision (the YOLOv9 model) to generate haptic feedback cues describing the environment, delivered through a glove worn by the user and an attachment to their walking stick.
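To make the vision-to-haptics idea concrete, here is a minimal sketch of a detection pass. It assumes the Ultralytics package (which ships YOLOv9 weights such as "yolov9c.pt") and a webcam at index 0; treat it as an illustration rather than our exact code.

```python
# Illustrative only: run YOLOv9 on webcam frames and reduce each detection
# to a (label, side) pair, a stand-in for the haptic cues sent to the glove.
# Assumes the Ultralytics package, which ships YOLOv9 weights ("yolov9c.pt").
import cv2
from ultralytics import YOLO

model = YOLO("yolov9c.pt")
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    result = model(frame, verbose=False)[0]
    width = frame.shape[1]
    for box in result.boxes:
        label = model.names[int(box.cls)]
        cx = float(box.xywh[0][0])            # bounding-box centre x, pixels
        side = "left" if cx < width / 2 else "right"
        print(label, side)                    # stand-in for a haptic cue

cap.release()
```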
How we built it
We first constructed the circuitry, then built distance measurement with the ultrasonic sensors. Next we integrated and tested a suitable real-time computer vision model, and finally worked out how to encode the camera data and send it back to the circuit over WiFi before assembling the final product.
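As a sketch of the distance-measurement step, a MicroPython loop for an HC-SR04-style ultrasonic sensor on an ESP32 might look like the following; the pin numbers and sensor model here are assumptions for illustration, and our actual firmware may differ.

```python
# MicroPython sketch for reading an HC-SR04-style ultrasonic sensor on an
# ESP32. Pin numbers and the sensor model are assumptions for illustration.
from machine import Pin, time_pulse_us
import time

trig = Pin(5, Pin.OUT)
echo = Pin(18, Pin.IN)

def distance_cm():
    # A 10 us pulse on TRIG starts a measurement; ECHO then goes high for
    # the round-trip time of the ultrasonic burst.
    trig.off()
    time.sleep_us(2)
    trig.on()
    time.sleep_us(10)
    trig.off()
    duration = time_pulse_us(echo, 1, 30000)   # timeout ~30 ms (~5 m range)
    if duration < 0:
        return None                            # no echo within the timeout
    return duration / 58                       # ~58 us of echo per cm

while True:
    d = distance_cm()
    if d is not None:
        print("{:.1f} cm".format(d))
    time.sleep_ms(100)
```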
Challenges we ran into
- The YOLOv9 model (https://github.com/WongKinYiu/yolov9?tab=readme-ov-file) we used had a limited set of labels, so our practical application only works indoors with a limited number of detectable obstacles
- Communicating the camera data back to the ESP32 board over WiFi (see the sketch after this list)
- Getting the motors to rotate as needed for haptic feedback
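One way to approach the WiFi challenge is to send compact detection summaries rather than raw frames. The sketch below shows a minimal laptop-side sender; the ESP32 address, port, and JSON message format are assumptions for illustration, not our exact protocol.

```python
# Hypothetical laptop-side sender: forwards per-frame detection summaries
# to the ESP32 over TCP instead of streaming raw camera frames.
import json
import socket

ESP32_ADDR = ("192.168.4.1", 8080)   # assumed ESP32 soft-AP IP and port

def send_detections(detections):
    """detections: list of (label, side, distance_cm) tuples for one frame."""
    payload = json.dumps(detections).encode("utf-8") + b"\n"
    with socket.create_connection(ESP32_ADDR, timeout=1.0) as sock:
        sock.sendall(payload)

send_detections([("chair", "left", 120.0), ("person", "right", 250.0)])
```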
Accomplishments that we're proud of
- We decomposed the problem well, making sure that each component worked independently of the others before bringing them together
What we learned
- How to interface between multiple technologies and languages
- How to make the most effective use of limited hardware
What's next for SeeTrue
- Compacting the attachment so it is lighter and uses a smaller webcam instead of a phone camera
- Emergency features, e.g. sending an SOS if the stick is dropped for longer than 20 seconds (see the sketch after this list)
- Using a better real-time computer vision model, perhaps similar to YOLO but trained on broader data suited to the day-to-day applications the user wants
- Machine learning to adapt feedback to each individual user
- Syncing with mobile apps for voice assistance
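As a rough sketch of how the SOS-on-drop idea could work: watch the accelerometer for a near-free-fall reading, then raise the alarm if the stick stays still for 20 seconds afterwards. The thresholds and the read_accel/send_sos callbacks below are hypothetical; this is a proposal, not implemented code.

```python
# Hypothetical sketch of the proposed SOS feature: flag a drop when total
# acceleration approaches free fall, then send an SOS if the stick is not
# picked back up within 20 seconds. Thresholds and callbacks are assumed.
import time

FREEFALL_G = 0.3    # total acceleration (in g) below this suggests a drop
MOTION_G = 0.25     # deviation from 1 g that counts as being picked up
SOS_AFTER_S = 20

def watch_for_drop(read_accel, send_sos):
    """read_accel() -> (ax, ay, az) in g; send_sos() raises the alarm."""
    dropped_at = None
    while True:
        ax, ay, az = read_accel()
        g = (ax * ax + ay * ay + az * az) ** 0.5
        if g < FREEFALL_G:                  # near free fall: stick dropped
            dropped_at = dropped_at or time.time()
        elif abs(g - 1.0) > MOTION_G:       # clear motion: picked back up
            dropped_at = None
        if dropped_at and time.time() - dropped_at >= SOS_AFTER_S:
            send_sos()
            dropped_at = None
        time.sleep(0.05)
```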