Video demo: https://drive.google.com/file/d/1MmmLX3Pf4cuSbbscgkJjEWgz4qH2dfGu/view?usp=sharing

Inspiration

Fall is the time to get your act together. The start of the school year brings academic pressure, along with a new recruiting season! But after a summer of slacking off and relaxing, you realize it's been a long, long time since you last practiced your speaking skills. Suddenly fall is throwing school presentations, job interviews, and other super-important speaking situations right at you, and you've totally forgotten how to articulate your words! Introducing... RTQlate! (AR-TI-CU-late)

What it does

RTQlate is a speaking assistant and feedback tool with five features:

  1. Auto-summarized flashcards in the convenient form of a physical wearable
  2. Real-time eye-tracking to measure eye contact
  3. Playback and audio transcription
  4. Sentiment analysis
  5. Enunciation and clarity level indicator

How we built it

  1. The flashcard bullet points are displayed on an LCD screen driven by an Arduino (programmed in C++), with a push button to flip through the cards. The bullet points themselves are automatically summarized using the OpenAI API.
  2. We track eye motion with the GazeTracking library to monitor whether you are looking at the screen or away from it.
  3. We used AssemblyAI to accomplish audio transcription, clarity detection, and sentiment analysis.
  4. Firebase is used for email authentication for account logins and signups.
  5. The backend server is written in Python using Flask.
  6. The frontend is written in React and styled with Tailwind CSS.
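As a rough sketch of how step 2 could work: GazeTracking classifies gaze direction per video frame, and those per-frame flags can be aggregated into an eye-contact percentage. The helper name and the list-of-booleans representation are our assumptions, not code from the project.

```python
def eye_contact_score(frames_on_screen):
    """Return the percentage (0-100) of sampled frames with eye contact.

    `frames_on_screen` is one boolean per sampled video frame; in a
    GazeTracking-based setup each flag would come from a per-frame
    check like gaze.is_center() (hypothetical wiring, for illustration).
    """
    if not frames_on_screen:
        return 0.0  # no frames sampled yet
    return 100.0 * sum(frames_on_screen) / len(frames_on_screen)
```

For example, a recording where the speaker looked at the screen in 3 of 4 sampled frames would score 75.0.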
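For step 3, AssemblyAI's sentiment analysis returns sentence-level results, each labeled POSITIVE, NEUTRAL, or NEGATIVE. A minimal sketch of turning those labels into overall proportions (the function name and input shape are assumptions mimicking the API's result entries):

```python
from collections import Counter

def summarize_sentiment(results):
    """Tally sentence-level sentiment labels into proportions.

    `results` mimics AssemblyAI's sentiment_analysis_results entries:
    dicts carrying a "sentiment" key of POSITIVE / NEUTRAL / NEGATIVE.
    """
    counts = Counter(r["sentiment"] for r in results)
    total = sum(counts.values()) or 1  # avoid dividing by zero
    return {label: counts.get(label, 0) / total
            for label in ("POSITIVE", "NEUTRAL", "NEGATIVE")}
```

A speech with mostly POSITIVE sentences would then surface as a high POSITIVE proportion in the feedback view.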
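And for step 1, the summarized bullets have to fit the narrow LCD. A small sketch of wrapping a bullet to display width, assuming the common 16-column Arduino character LCD (the width and helper name are our assumptions):

```python
import textwrap

def lcd_lines(bullet, width=16):
    """Wrap one summarized bullet point into lines for a character LCD.

    Assumes a 16-column display (typical 16x2 Arduino LCD); the bullet
    text itself would come from the OpenAI API summarization step.
    """
    return textwrap.wrap(bullet, width)
```

The Arduino side can then page through these fixed-width lines as the push button is pressed.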

Challenges we ran into

  • Choosing a topic relating to the four seasons (😰)
  • Connecting all the components (i.e. a functional frontend, backend, and Arduino piece) together

Accomplishments that we're proud of

  • Completing a working app with a functional hardware component!
  • Figuring out how to use AssemblyAI for sentiment analysis
  • Learning how to track eye movement with the limited resources that were available online
  • Having a nice frontend

What we learned

  • This was our first time working with speech analysis; we learned how to use AssemblyAI!
  • First time working with account authentication with Firebase
  • Also some of our first times working with a Python server (Flask)
  • Our first hardware hack!

What's next for RTQlate

  • Combining the data from the eye tracking and the speech analysis to give an overall score
  • Implementing machine learning to track a user's progress over time and automatically adapt to their strengths and weaknesses
  • Feedback on body language
  • Visualizing eye movement on a chart!
  • Deployment!
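The first roadmap item, combining eye tracking and speech analysis into one score, could look something like a weighted blend of the individual signals. Everything here is hypothetical: the function, the choice of signals, and the weights are placeholders, not tuned values.

```python
def overall_score(eye_contact_pct, clarity_pct, positivity_pct,
                  weights=(0.4, 0.4, 0.2)):
    """Blend three 0-100 feedback signals into one 0-100 overall score.

    The signals and weights are illustrative assumptions; a tuned
    version would learn or calibrate these from user data.
    """
    w_eye, w_clarity, w_pos = weights
    return (w_eye * eye_contact_pct
            + w_clarity * clarity_pct
            + w_pos * positivity_pct)
```

Keeping the weights as a parameter leaves room for the planned ML step to adjust them per user.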
