Inspiration
We wanted to make navigating everyday life as seamless as possible for people who are visually impaired.
What it does
Using the mapping technology of Google's Project Tango, we acquire an accurate point cloud of the world in front of you. A Myo armband then provides haptic feedback on the direction you should move toward, and sound cues convey the size and distance of objects ahead, letting you navigate areas that include steps up and down onto elevated ledges (a sketch of one way to detect such a ledge follows).
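As one illustration of the ledge case, a step can be flagged by comparing floor heights at two forward ranges in the point cloud. Below is a minimal, self-contained Java sketch; the Point type, the sampling ranges, and the 8 cm threshold are our assumptions for illustration, not the project's exact implementation.

```java
import java.util.List;

public class LedgeDetector {
    /** Depth point in the device frame: x right, y up, z forward (meters). */
    public record Point(float x, float y, float z) {}

    /** Median height (y) of points whose forward distance lies in [zMin, zMax). */
    static double medianFloorHeight(List<Point> cloud, double zMin, double zMax) {
        double[] ys = cloud.stream()
                .filter(p -> p.z() >= zMin && p.z() < zMax)
                .mapToDouble(Point::y)
                .sorted()
                .toArray();
        return ys.length == 0 ? Double.NaN : ys[ys.length / 2];
    }

    /** Positive: step up; negative: step down; near zero: level ground. */
    static double ledgeHeight(List<Point> cloud) {
        double near = medianFloorHeight(cloud, 0.5, 1.0); // just ahead of the user
        double far  = medianFloorHeight(cloud, 1.0, 1.5); // one stride further out
        return far - near;
    }

    public static void main(String[] args) {
        List<Point> cloud = List.of(
                new Point(0f, -1.4f, 0.7f),   // floor just ahead
                new Point(0f, -1.2f, 1.2f));  // raised floor further out
        double h = ledgeHeight(cloud);
        if (h > 0.08)       System.out.println("step up, ~" + h + " m");
        else if (h < -0.08) System.out.println("step down, ~" + -h + " m");
        else                System.out.println("level ground");
    }
}
```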
How I built it
Using Project Tango to get a point cloud of the space in front of you, we measure relative distances and determine a heading that leads to an obstacle-free location. The Myo vibrates more strongly the more you are oriented toward an impassable path. The frequency of the sound pulses corresponds to an object's distance, and the number of notes played corresponds to its size.
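Here is a rough sketch of that logic in plain Java, with the Tango and Myo SDK calls omitted; the Point type, the corridor width, and the ±60° scan range are assumptions, not the team's stated parameters. It scans candidate headings for the one with the most clearance and maps the user's angular deviation from it to a vibration level.

```java
import java.util.List;

public class ObstacleFeedback {
    /** Depth point in the device frame: x right, y up, z forward (meters). */
    public record Point(float x, float y, float z) {}

    /** Scan candidate headings and return the one with the most clearance. */
    public static double clearestHeading(List<Point> cloud, double corridorHalfWidth) {
        double best = 0.0, bestClearance = -1.0;
        // Sweep +/-60 degrees in 5-degree steps (assumed scan range).
        for (double h = -Math.PI / 3; h <= Math.PI / 3; h += Math.PI / 36) {
            double c = corridorClearance(cloud, h, corridorHalfWidth);
            if (c > bestClearance) { bestClearance = c; best = h; }
        }
        return best;
    }

    /** Distance to the nearest point inside a corridor along the given heading. */
    static double corridorClearance(List<Point> cloud, double heading, double halfWidth) {
        double nearest = Double.MAX_VALUE;
        double cos = Math.cos(heading), sin = Math.sin(heading);
        for (Point p : cloud) {
            // Rotate by -heading so the candidate direction lies along +z.
            double x = p.x() * cos + p.z() * sin;
            double z = -p.x() * sin + p.z() * cos;
            if (z > 0 && Math.abs(x) < halfWidth) nearest = Math.min(nearest, z);
        }
        return nearest;
    }

    /**
     * Map angular deviation from the clear heading to a vibration level in
     * [0, 1]: facing the clear path is silent, facing a blocked one is maximal.
     */
    public static double vibrationLevel(double userHeading, double clearHeading) {
        return Math.min(1.0, Math.abs(userHeading - clearHeading) / (Math.PI / 3));
    }
}
```

On the device, the output of vibrationLevel would be sampled against the phone's current yaw and forwarded to the Myo's vibration command as the user turns.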
Challenges I ran into
Linking together the right sound tools to produce combinations of tones, and finding the best way to map spatial data to sound.
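To make the spatial-to-sound mapping concrete, here is a self-contained sketch using Java's built-in javax.sound.sampled synthesis; the write-up doesn't name the actual sound stack, and the linear distance-to-period map and the size thresholds below are assumptions for illustration.

```java
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.SourceDataLine;

public class Sonifier {
    private static final float SAMPLE_RATE = 44100f;

    /** Closer obstacles -> shorter period between pulses (assumed linear map). */
    static long pulsePeriodMillis(double distanceMeters) {
        double d = Math.max(0.3, Math.min(distanceMeters, 4.0));
        return (long) (150 + (d - 0.3) / (4.0 - 0.3) * 850); // 150 ms .. 1 s
    }

    /** Bigger obstacles -> more notes per pulse (1..4, assumed thresholds). */
    static int noteCount(double obstacleWidthMeters) {
        if (obstacleWidthMeters < 0.5) return 1;
        if (obstacleWidthMeters < 1.0) return 2;
        if (obstacleWidthMeters < 2.0) return 3;
        return 4;
    }

    /** Play one pulse: an ascending run whose length encodes object size. */
    static void playPulse(int notes) throws LineUnavailableException {
        AudioFormat fmt = new AudioFormat(SAMPLE_RATE, 8, 1, true, false);
        try (SourceDataLine line = AudioSystem.getSourceDataLine(fmt)) {
            line.open(fmt);
            line.start();
            double[] scale = {440, 554.37, 659.25, 880}; // A major arpeggio
            for (int n = 0; n < notes; n++) {
                byte[] tone = sineTone(scale[n], 120);
                line.write(tone, 0, tone.length);
            }
            line.drain();
        }
    }

    private static byte[] sineTone(double freqHz, int millis) {
        byte[] buf = new byte[(int) (SAMPLE_RATE * millis / 1000)];
        for (int i = 0; i < buf.length; i++) {
            buf[i] = (byte) (Math.sin(2 * Math.PI * freqHz * i / SAMPLE_RATE) * 100);
        }
        return buf;
    }

    public static void main(String[] args) throws Exception {
        // Example: a 1.2 m-wide obstacle 0.8 m away -> fast pulses of 3 notes.
        System.out.println("pulse period (ms): " + pulsePeriodMillis(0.8));
        playPulse(noteCount(1.2));
    }
}
```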
Accomplishments that I'm proud of
Bridging two senses, touch and hearing, to stand in for one that some people don't have the opportunity to experience.
What I learned
How to convert 3D point maps into useful measurements, and how to translate that information into feedback for the body's other senses.
What's next for SignalSight
Using the PCL (Point Cloud Library) to extract even more insight from the spatial data, and adopting more advanced sound-processing tools to help the user understand even more about their surroundings.