Inspiration

Around 40 million people around the world are blind or vision-impaired, yet the most common assistive solutions are still canes and guide dogs. What would it be like for them to "see" the world the way a machine sees it, all in a hands-free experience that goes beyond reality?

What it does

SenSee is an embodied AI system that pairs Snap AR Spectacles with a LiDAR-equipped wearable vest to explore how perception can be extended beyond sight for visually impaired users. The vest vibrates in the direction of nearby objects, while the Spectacles use Gemini AI to tell the user what is in front of them, translating the visual environment into spoken and spatial cues that support navigation and awareness.

How we built it

We built our tool both physically and in Lens Studio. We soldered a LiDAR sensor and fabricated a paper haptic vest with an array of motors. The sensor takes real-time depth samples across 64 "zones", which are then downsampled into a 4x4 grid. Each of the 16 motors on the haptic vest is triggered with an intensity based on the distance reading for its cell of the grid, informing the user that there is an obstacle in that direction. The vibration prompts the user to turn away from the obstacle, and the Spectacles then give descriptive information about the scene in front of them.
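To make the mapping concrete, here is a minimal sketch of the vest's depth-to-vibration step. It assumes an 8x8-zone depth frame (64 zones) and a 4x4 motor grid; the distance thresholds, the nearest-reading downsampling scheme, and the printed output standing in for PWM writes are illustrative assumptions, not the exact values or firmware from our build.

```cpp
#include <algorithm>
#include <array>
#include <cstdint>
#include <cstdio>

// Illustrative constants -- real thresholds come from tuning on the vest.
constexpr int kSensorDim = 8;    // sensor reports an 8x8 grid of depth zones (64 total)
constexpr int kVestDim   = 4;    // vest has a 4x4 grid of vibration motors (16 total)
constexpr int kMinMm     = 200;  // closer than this -> full intensity
constexpr int kMaxMm     = 2000; // farther than this -> motor off

// Downsample the 8x8 depth frame to 4x4 by taking the nearest reading in each
// 2x2 block, so a small close obstacle is not averaged away (an assumption;
// a mean or median per block would also work).
std::array<int, kVestDim * kVestDim>
downsample(const std::array<int, kSensorDim * kSensorDim>& mm) {
    std::array<int, kVestDim * kVestDim> out{};
    for (int r = 0; r < kVestDim; ++r) {
        for (int c = 0; c < kVestDim; ++c) {
            int nearest = kMaxMm;
            for (int dr = 0; dr < 2; ++dr)
                for (int dc = 0; dc < 2; ++dc)
                    nearest = std::min(nearest, mm[(2 * r + dr) * kSensorDim + (2 * c + dc)]);
            out[r * kVestDim + c] = nearest;
        }
    }
    return out;
}

// Map a distance to a 0-255 duty cycle: the closer the obstacle, the stronger the vibration.
uint8_t intensity(int mm) {
    mm = std::clamp(mm, kMinMm, kMaxMm);
    return static_cast<uint8_t>(255 * (kMaxMm - mm) / (kMaxMm - kMinMm));
}

int main() {
    // Fake frame: everything far away except one close obstacle in the upper-left block.
    std::array<int, kSensorDim * kSensorDim> frame;
    frame.fill(1800);
    frame[0 * kSensorDim + 1] = 350;  // obstacle ~35 cm away

    auto grid = downsample(frame);
    for (int r = 0; r < kVestDim; ++r) {
        for (int c = 0; c < kVestDim; ++c)
            std::printf("%3u ", intensity(grid[r * kVestDim + c]));  // would be a PWM write on hardware
        std::printf("\n");
    }
}
```

On the actual vest, each of the 16 intensity values would drive one motor through its driver IC instead of being printed.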

Challenges we ran into

  • Getting the LiDAR sensor to work, particularly problems with an I2C level shifter and signal integrity
  • Soldering the numerous motors and ICs required to make the haptic vest
  • Setting up the Spectacles and Lens Studio environment on Windows computers (3 out of 5 people on our team are Windows users)
  • Integrating text-to-speech in Lens Studio
  • The accessibility of the Spectacles' form factor; it was hard to use the device with two hands, since one hand was needed to keep them from falling off
  • Getting access to the Spectacles dev team when we ran into bugs and system problems

Accomplishments that we're proud of

  • Getting the LiDAR sensor to read values without excessive noise
  • Getting the haptic vest to work (successfully powering and triggering all motors)
  • Getting the Spectacles to pair with all our developers' computers (this took up most of the first day)
  • Finding a text-to-speech example and integrating it into our idea

What we learned

  • The affordances of AR technology in the accessibility space
  • The importance of documentation and developer tools
  • The general workflow of Lens Studio

What's next for SenSee

  • Integrating the glasses and vest using BLE or another wireless protocol
  • Adding multiple LiDAR sensors, along with better mounting locations and algorithm tuning, to increase the sensing distance and make readings more consistent
  • Prototyping 3D-printed mounts that attach the hardware to regular glasses for better fit and affordability

Link to Presentation

Link to 2-min Demo

Built With
