Inspiration
On the way to the hackathon, we were taking in the scenery while a Spotify playlist played in the background. That's when we collectively hit on the idea for an app that captures a picture and plays a matching playlist.
What it does
The user opens our app and takes a picture of themselves. Using Microsoft's Emotion API, we determine the user's mood (happy, angry, sad, fear, or neutral) and recommend a playlist to match using Spotify's API.
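Under the hood, the mood detection boils down to one HTTP call. Here is a minimal sketch of what that request might look like, not our exact code: the `westus` endpoint region and the subscription key are assumptions (both come from the Azure portal), error handling is omitted, and org.json (bundled with Android) parses the response.

```kotlin
import org.json.JSONArray
import java.io.File
import java.net.HttpURLConnection
import java.net.URL

// Hypothetical endpoint and key; real values come from the Azure portal.
const val EMOTION_ENDPOINT = "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize"
const val SUBSCRIPTION_KEY = "YOUR_KEY_HERE"

// Sends the captured photo to the Emotion API and returns the emotion with the
// highest confidence score (e.g. "happiness", "anger", "sadness", "fear", "neutral").
fun detectMood(photo: File): String? {
    val connection = (URL(EMOTION_ENDPOINT).openConnection() as HttpURLConnection).apply {
        requestMethod = "POST"
        setRequestProperty("Ocp-Apim-Subscription-Key", SUBSCRIPTION_KEY)
        setRequestProperty("Content-Type", "application/octet-stream")
        doOutput = true
    }
    connection.outputStream.use { it.write(photo.readBytes()) }

    val body = connection.inputStream.bufferedReader().use { it.readText() }
    val faces = JSONArray(body)
    if (faces.length() == 0) return null  // no face detected in the photo

    // Pick the strongest emotion for the first face found.
    val scores = faces.getJSONObject(0).getJSONObject("scores")
    return scores.keys().asSequence().maxByOrNull { scores.getDouble(it) }
}
```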
How we built it
We built Trebl using Spotify’s API, Java, Kotlin, and XML in Android Studio.
Challenges we ran into
We explored multiple approaches to detecting emotion in faces. One was loading a pre-trained model into the TensorFlow Lite framework to classify faces locally as happy, angry, sad, fearful, or neutral. We also looked into Google Cloud Vision to achieve the same result, but after a myriad of unforeseen barriers we settled on Microsoft's Emotion API. Switching from one machine learning model to another ate up a lot of our time and forced unnecessary refactoring of our code; the sketch below shows the kind of abstraction that would have contained that churn.
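In hindsight, hiding each backend behind a single interface from the start would have made every switch a one-line change. This is a hypothetical sketch, not code from our repo; the class names and `TODO` bodies are placeholders.

```kotlin
// One shared contract for every backend we evaluated. The rest of the app
// depends only on this type, so swapping backends never touches it.
interface EmotionClassifier {
    // Returns "happy", "angry", "sad", "fear", or "neutral",
    // or null if no face is found in the image.
    fun classify(imageBytes: ByteArray): String?
}

// Local inference with a pre-trained TensorFlow Lite model.
class TfLiteClassifier : EmotionClassifier {
    override fun classify(imageBytes: ByteArray): String? =
        TODO("run the bundled TFLite model on imageBytes")
}

// Remote inference via Microsoft's Emotion API.
class EmotionApiClassifier(private val subscriptionKey: String) : EmotionClassifier {
    override fun classify(imageBytes: ByteArray): String? =
        TODO("POST imageBytes to the Emotion API and parse the scores")
}
```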
Accomplishments that we're proud of
Most of us had little to no experience developing Android apps, and all of us came out of these 36 hours having done some Android development.
What we learned
We learned Kotlin, the fundamentals of Android development, and how to make good use of publicly available APIs over HTTP.
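As a concrete example of the kind of request we mean, here is a sketch of how Spotify's Web API search can turn a detected mood into a playlist. The bearer token is a placeholder (a real one comes from Spotify's OAuth flow), and this is an illustration of the technique rather than our exact implementation.

```kotlin
import org.json.JSONObject
import java.net.HttpURLConnection
import java.net.URL
import java.net.URLEncoder

// Placeholder token; obtain a real one through Spotify's OAuth flow.
const val SPOTIFY_TOKEN = "YOUR_ACCESS_TOKEN"

// Searches Spotify for a playlist matching the detected mood (e.g. "happy")
// and returns its URI, which the app can hand off for playback.
fun findPlaylistFor(mood: String): String? {
    val query = URLEncoder.encode(mood, "UTF-8")
    val url = URL("https://api.spotify.com/v1/search?q=$query&type=playlist&limit=1")
    val connection = (url.openConnection() as HttpURLConnection).apply {
        setRequestProperty("Authorization", "Bearer $SPOTIFY_TOKEN")
    }
    val body = connection.inputStream.bufferedReader().use { it.readText() }
    val items = JSONObject(body)
        .getJSONObject("playlists")
        .getJSONArray("items")
    return if (items.length() > 0) items.getJSONObject(0).getString("uri") else null
}
```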
What's next for Trebl
- Displaying a real-time prediction of your mood before snapping a picture