Video demo: https://drive.google.com/file/d/1MmmLX3Pf4cuSbbscgkJjEWgz4qH2dfGu/view?usp=sharing
Inspiration
Fall is the time to get your shit together. It brings academic pressure with the start of the school year, plus a whole new recruiting season! But since you were slacking off and relaxing all summer, you now realize it's been a long, long time since you last practiced your speaking skills. Suddenly fall is throwing school presentations, job interviews, and other super-important speaking situations right at you, and you've totally forgotten how to articulate your words! Introducing... RTQlate! (AR-TI-CU-late)
What it does
RTQlate is a 5-feature speaking assistant and feedback provider:
- Auto-summarized flashcards in the convenient form of a physical wearable
- Real-time eye-tracking to measure eye contact
- Playback and audio transcription
- Sentiment analysis
- Enunciation and clarity level indicator
How we built it
- The flashcard bullet points are displayed on an LCD, driven by an Arduino programmed in C++. A push button flips through the cards, and the points themselves are auto-summarized with the OpenAI API (a rough sketch of the summarization step follows this list).
- We track your eyes with the GazeTracking library to monitor whether you are looking at the screen or away from it (sketched below).
- We used AssemblyAI for audio transcription, clarity detection, and sentiment analysis (example request below).
- Firebase handles email authentication for account logins and signups.
- The backend server is written in Python using Flask (a minimal auth-gated route is sketched below).
- The frontend is written in React and styled with Tailwind CSS
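
To make the flashcard pipeline concrete, here is a minimal sketch of the summarization step, assuming the OpenAI chat completions API. The model name, prompt wording, and the 16-character limit (for a 16x2 LCD) are illustrative assumptions, not our exact values:

```python
# Minimal sketch: condense speaking notes into short LCD-sized bullet points.
# Assumes the OpenAI Python client >= 1.0; model and prompt are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def summarize_to_bullets(notes: str, max_points: int = 5) -> list[str]:
    """Summarize notes into at most `max_points` short flashcard bullets."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # assumed model; any chat model works here
        messages=[
            {
                "role": "system",
                "content": (
                    f"Summarize the user's notes into at most {max_points} "
                    "bullet points, each under 16 characters so they fit on "
                    "a small character LCD. Return one bullet per line."
                ),
            },
            {"role": "user", "content": notes},
        ],
    )
    text = response.choices[0].message.content
    # Strip list markers and blank lines so each entry is a clean LCD line.
    return [line.strip("-• ").strip() for line in text.splitlines() if line.strip()]
```

The resulting bullets can then be sent to the Arduino (for example over a serial connection) and paged through with the push button.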
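
Eye contact is measured with the GazeTracking library roughly as follows; the "percentage of frames looking at the screen" metric here is an illustrative proxy, not our exact scoring:

```python
# Sketch of the eye-contact loop using the GazeTracking library and OpenCV.
import cv2
from gaze_tracking import GazeTracking

gaze = GazeTracking()
webcam = cv2.VideoCapture(0)

frames_total = 0
frames_on_screen = 0

while True:
    ok, frame = webcam.read()
    if not ok:
        break
    gaze.refresh(frame)  # analyze the current webcam frame
    frames_total += 1
    if gaze.is_center():  # roughly facing the camera/screen
        frames_on_screen += 1
    cv2.imshow("RTQlate eye tracking", gaze.annotated_frame())
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to stop
        break

webcam.release()
cv2.destroyAllWindows()
print(f"Eye contact: {100 * frames_on_screen / max(frames_total, 1):.1f}% of frames")
```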
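
The AssemblyAI calls follow their standard upload, transcribe, poll flow. A hedged sketch (the API key, audio path, and polling interval are placeholders): sentiment labels come back per sentence in `sentiment_analysis_results`, and the per-word confidence scores in the transcript are one possible proxy for enunciation clarity.

```python
# Sketch of transcription + sentiment analysis against AssemblyAI's v2 REST API.
import time

import requests

API_KEY = "your-assemblyai-key"  # placeholder
HEADERS = {"authorization": API_KEY}


def transcribe_with_sentiment(audio_path: str) -> dict:
    # 1) Upload the recorded speech.
    with open(audio_path, "rb") as f:
        upload = requests.post(
            "https://api.assemblyai.com/v2/upload", headers=HEADERS, data=f
        )
    audio_url = upload.json()["upload_url"]

    # 2) Request a transcript with sentiment analysis enabled.
    job = requests.post(
        "https://api.assemblyai.com/v2/transcript",
        headers=HEADERS,
        json={"audio_url": audio_url, "sentiment_analysis": True},
    ).json()

    # 3) Poll until the transcript is ready, then return the full result.
    while True:
        result = requests.get(
            f"https://api.assemblyai.com/v2/transcript/{job['id']}", headers=HEADERS
        ).json()
        if result["status"] in ("completed", "error"):
            return result
        time.sleep(2)
```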
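
Finally, a minimal sketch of how the Flask backend can gate an endpoint behind Firebase email authentication. The route name, request shape, and service-account path are assumptions; `verify_id_token` is Firebase Admin's standard server-side check for tokens issued to the logged-in React client.

```python
# Sketch: Flask route that only runs analysis for authenticated Firebase users.
import firebase_admin
from firebase_admin import auth, credentials
from flask import Flask, jsonify, request

# Path to the Firebase service-account key is a placeholder.
firebase_admin.initialize_app(credentials.Certificate("serviceAccountKey.json"))
app = Flask(__name__)


@app.route("/api/feedback", methods=["POST"])  # hypothetical endpoint
def feedback():
    # The frontend sends the Firebase ID token it received at login/signup.
    token = request.headers.get("Authorization", "").removeprefix("Bearer ")
    try:
        user = auth.verify_id_token(token)
    except Exception:
        return jsonify({"error": "unauthorized"}), 401

    # ... run transcription / sentiment analysis here ...
    return jsonify({"uid": user["uid"], "status": "ok"})


if __name__ == "__main__":
    app.run(debug=True)
```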
Challenges we ran into
- Choosing a topic relating to the four seasons (😰)
- Connecting all the components (i.e. a functional frontend, backend, and Arduino piece) together
Accomplishments that we're proud of
- Completing a working app with a functional hardware component!
- Figuring out how to use AssemblyAI for sentiment analysis
- Learning how to track eye movement with the limited resources that were available online
- Having a nice frontend
What we learned
- This was our first time working with speech analysis, and we learned how to use AssemblyAI!
- First time working with account authentication via Firebase
- For most of us, also a first time working with a Python server (Flask)
- Our first hardware hack!
What's next for RTQlate
- Combining the data from the eye tracking and the speech analysis to give an overall score
- Implementing machine learning to track a user's progress over time and automatically adapt to their strengths and weaknesses
- Feedback on body language
- Visualizing eye movement on a chart!
- Deployment!