Inspiration

We were inspired by the Deaf community and the communication barriers its members face every day. More than 430 million people worldwide live with disabling hearing loss, yet accessibility tools remain limited. We wanted to create something that not only translates sign language but also respects Deaf culture and builds real inclusion using wearable technology like Meta Glass.

What it does

SynSight translates American Sign Language (ASL) into text and speech in real time through a smart add-on for Meta Glass. The app also provides reverse translation, turning spoken words into on-screen text, enabling smooth two-way communication between Deaf and hearing users.

How we built it

We used Python, TensorFlow/PyTorch, and OpenCV for gesture recognition, integrating MediaPipe to track hand movements. The interface was built with React, and data is processed through Firebase for real-time updates. The prototype connects to the Meta Glass SDK and MetaBand sensors to synchronize gestures and context.
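As a rough illustration of the recognition step, the sketch below shows one common way to turn per-hand landmarks (such as the 21 (x, y, z) points MediaPipe Hands reports) into a position- and scale-invariant feature vector for a gesture classifier. The function name and the exact normalization are our own illustrative assumptions, not the project's actual code:

```python
# Hypothetical preprocessing sketch: convert 21 hand landmarks into a
# flat feature vector a gesture classifier could consume. Assumes the
# MediaPipe Hands convention that landmark 0 is the wrist.

def landmarks_to_features(landmarks):
    """Shift landmarks relative to the wrist and scale by the largest
    coordinate span, so the same sign yields similar features no matter
    where the hand sits in the frame or how close it is to the camera."""
    wx, wy, wz = landmarks[0]                      # landmark 0 = wrist
    shifted = [(x - wx, y - wy, z - wz) for x, y, z in landmarks]
    # Largest absolute coordinate after centering; floor avoids div-by-zero.
    span = max(max(abs(c) for pt in shifted for c in pt), 1e-6)
    return [c / span for pt in shifted for c in pt]  # flat 63-dim vector

# Example with a fake "hand": wrist at (0.5, 0.5, 0.0), fingers fanned out
# along x in normalized image coordinates.
fake_hand = [(0.5, 0.5, 0.0)] + [(0.5 + i * 0.01, 0.5, 0.0) for i in range(1, 21)]
features = landmarks_to_features(fake_hand)
```

A vector like this can be fed to a small TensorFlow or PyTorch classifier trained on labeled ASL gestures; normalizing first is one way to reduce the calibration sensitivity mentioned below.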

Challenges we ran into

Training the ASL recognition model was time-consuming due to limited labeled datasets. Integrating computer vision with wearable devices required handling latency and calibration issues. We also struggled to distinguish visually similar hand signs and to adapt the UI for real-time feedback.

Accomplishments that we're proud of

We built a functional prototype that detects common ASL gestures with strong accuracy and delivers responsive feedback on Meta Glass. Our team successfully combined hardware and AI into a single user-friendly system that helps bridge the communication gap for the Deaf community.

What we learned

We learned the importance of inclusive design and how technology can strengthen accessibility. Working with machine learning for gesture recognition taught us about model optimization, dataset preprocessing, and the value of teamwork in building complex prototypes under time constraints.

What's next for SynSight

Next, we plan to expand our dataset to support full-sentence ASL translation, integrate voice emotion detection, and refine our Meta Glass interface. Our long-term goal is to make SynSight publicly available and empower the global Deaf community through technology.
