Inspiration

  • The visually impaired are an underserved community, and we wanted to create something that could genuinely help them connect with the world in a new way. We were inspired by how AI can turn visual data into language, giving people the ability to "see" through sound.

What it does

  • Sightline is a progressive web app (PWA) that empowers visually impaired users to experience their surroundings through AI-powered descriptions.
  • It features two modes: Describe Mode and Live Mode.
  • In Describe Mode, users tap anywhere on the screen to take a photo, and Sightline instantly provides a detailed, human-like description of what’s in the image (a rough capture sketch follows this list).
  • In Live Mode, users simply point their camera around, and the app continuously narrates what it sees in real time, no photos required.
  • With Sightline, we’re transforming visual information into accessible, meaningful experiences that help users navigate and connect with the world around them.
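
Below is a minimal sketch of how Describe Mode's client-side capture step could work, assuming a standard `<video>` element fed by getUserMedia and a hypothetical `/api/describe` route; the function name and route path are illustrative, not our exact code.

```ts
// Sketch of Describe Mode: grab the current camera frame, send it to a
// server route for description, and play back the narrated audio it returns.
async function describeCurrentFrame(video: HTMLVideoElement): Promise<void> {
  const canvas = document.createElement("canvas");
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  canvas.getContext("2d")!.drawImage(video, 0, 0);

  // Strip the data-URL prefix so the server receives raw base64 JPEG data.
  const image = canvas.toDataURL("image/jpeg", 0.8).split(",")[1];

  const res = await fetch("/api/describe", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ image }),
  });

  // The route is assumed to respond with synthesized speech (audio/mpeg).
  const audio = new Audio(URL.createObjectURL(await res.blob()));
  await audio.play();
}
```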

How we built it

  • We built Sightline using Next.js for both the frontend and backend.
  • We used Gemini API for advanced image analysis and scene understanding.
  • We used ElevenLabs for generating natural, human-sounding voice narration.
  • Together, these technologies create an experience where AI vision meets AI voice; a rough sketch of how they could fit together on the server follows.
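
As a hedged illustration of the pipeline, here is a minimal Next.js route handler that asks Gemini for a scene description and then asks ElevenLabs to voice it. It assumes the App Router and the @google/generative-ai SDK; the model name, prompt, and voice ID are placeholders rather than our production values.

```ts
// app/api/describe/route.ts (illustrative path)
import { GoogleGenerativeAI } from "@google/generative-ai";

const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY!);

export async function POST(req: Request) {
  const { image } = await req.json(); // base64-encoded JPEG from the client

  // 1. Ask Gemini to describe the scene.
  const model = genAI.getGenerativeModel({ model: "gemini-1.5-flash" });
  const result = await model.generateContent([
    { inlineData: { data: image, mimeType: "image/jpeg" } },
    "Describe this scene clearly and concisely for a blind or low-vision user.",
  ]);
  const description = result.response.text();

  // 2. Turn the description into natural-sounding speech with ElevenLabs.
  const voiceId = process.env.ELEVENLABS_VOICE_ID!; // placeholder voice
  const tts = await fetch(
    `https://api.elevenlabs.io/v1/text-to-speech/${voiceId}`,
    {
      method: "POST",
      headers: {
        "xi-api-key": process.env.ELEVENLABS_API_KEY!,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ text: description, model_id: "eleven_turbo_v2" }),
    }
  );

  // 3. Stream the audio straight back to the PWA.
  return new Response(tts.body, {
    headers: { "Content-Type": "audio/mpeg" },
  });
}
```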

Challenges we ran into

  • Integrating multiple APIs efficiently while maintaining fast real-time response in Live Mode.
  • Managing the balance between accuracy and latency when processing continuous camera input (see the pacing sketch after this list).
  • Ensuring accessibility and intuitive design for users with visual impairments.
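
One way to keep Live Mode's latency bounded is to describe a single frame at a time and pace the loop, rather than streaming every frame. The sketch below is illustrative, reusing the describeCurrentFrame helper sketched earlier (passed in as a callback); the pacing interval is an assumed number, not a measured value.

```ts
// Sketch of Live Mode pacing: one frame in flight at a time, with a minimum
// gap between requests so narration never backs up behind stale frames.
const MIN_GAP_MS = 2000; // assumed pacing interval, tuned by experimentation

async function runLiveMode(
  video: HTMLVideoElement,
  describeFrame: (video: HTMLVideoElement) => Promise<void>,
  signal: AbortSignal
): Promise<void> {
  while (!signal.aborted) {
    const started = Date.now();
    try {
      await describeFrame(video); // wait for the full round-trip
    } catch {
      // Skip the frame on transient network/API errors and keep going.
    }
    const elapsed = Date.now() - started;
    if (elapsed < MIN_GAP_MS) {
      await new Promise((resolve) => setTimeout(resolve, MIN_GAP_MS - elapsed));
    }
  }
}
```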

Accomplishments that we're proud of

  • Successfully built both Describe and Live modes within the hackathon timeframe.
  • Created a working prototype that combines computer vision and voice synthesis seamlessly.
  • Designed an app that could have a meaningful real-world impact.

What we learned

  • How to integrate multimodal AI tools (vision + speech) into a single cohesive app.
  • The importance of designing interfaces that prioritize accessibility and ease of use.
  • How to build and ship a progressive web app (PWA).

What's next for Sightline

  • Adding voice-activated controls for hands-free use.
  • Adding haptic (vibration) feedback in response to user actions.
  • Migrating the PWA to native iOS and Android apps.
  • Conducting user testing with visually impaired individuals to gather feedback and improve usability.
  • Eventually launching Sightline as a free accessibility tool for the community.

Built With

  • Next.js
  • Gemini API
  • ElevenLabs