Inspiration
Harry came up with this idea after recognizing that people often have difficulty remembering the small yet important details in a conversation.
What it does
Listen Up streams a conversation from the device microphone, listens for keywords, and determines which items or subjects the other person might be interested in or feel positively about. For example, if the other person in the conversation mentions how cool they find Germany to be as a country, Listen Up stores "Germany" as a location of interest.
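To make the idea concrete, here is a minimal sketch of the core "interest extraction" step in TypeScript. The Entity shape mirrors what an entity-sentiment API typically returns; the InterestRecord name and the 0.25 threshold are illustrative assumptions rather than the exact values used in the app.

```typescript
// Sketch: given entities detected in one utterance, keep only the ones the
// speaker seems positive about. Names and threshold are assumptions.
interface Entity {
  name: string;                                     // e.g. "Germany"
  type: string;                                     // e.g. "LOCATION"
  sentiment: { score: number; magnitude: number };  // score in [-1, 1]
}

interface InterestRecord {
  topic: string;
  category: string;
  score: number;
  heardAt: number; // timestamp in ms
}

const POSITIVE_THRESHOLD = 0.25; // assumed cutoff for "positive feelings"

function extractInterests(entities: Entity[]): InterestRecord[] {
  return entities
    .filter((e) => e.sentiment.score >= POSITIVE_THRESHOLD)
    .map((e) => ({
      topic: e.name,
      category: e.type,
      score: e.sentiment.score,
      heardAt: Date.now(),
    }));
}

// A sentence like "Germany is such a cool country" would typically yield a
// LOCATION entity with a positive score, producing one InterestRecord.
```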
How we built it
Listen Up was built using JavaScript/TypeScript and the React framework. To transcribe and analyze the conversation, a number of Microsoft Azure and Google Cloud natural language processing APIs were used. First, Microsoft Azure's speech-to-text API was used to transcribe the streamed conversation. Then, Microsoft Azure and Google Cloud natural language processing were utilized to identify keywords and how positively the person feels about each topic. Finally, these details were stored in a Firebase database.
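Below is a minimal sketch of how such a pipeline might be wired together in TypeScript, assuming the Azure Speech SDK for JavaScript (microsoft-cognitiveservices-speech-sdk), the Google Cloud Natural Language REST endpoint for entity sentiment, and the Firebase v9 web SDK. The keys, collection name, and sentiment threshold are placeholders for illustration, not necessarily what the team used.

```typescript
import * as speechSdk from "microsoft-cognitiveservices-speech-sdk";
import { initializeApp } from "firebase/app";
import { getFirestore, collection, addDoc } from "firebase/firestore";

// Placeholder credentials/config -- swap in real values.
const AZURE_SPEECH_KEY = "<azure-speech-key>";
const AZURE_REGION = "<azure-region>";
const GOOGLE_NL_KEY = "<google-cloud-api-key>";
const firebaseConfig = { /* Firebase project config */ };

const db = getFirestore(initializeApp(firebaseConfig));

// 1. Continuously transcribe the microphone with Azure speech-to-text.
function startListening(): void {
  const speechConfig = speechSdk.SpeechConfig.fromSubscription(AZURE_SPEECH_KEY, AZURE_REGION);
  const audioConfig = speechSdk.AudioConfig.fromDefaultMicrophoneInput();
  const recognizer = new speechSdk.SpeechRecognizer(speechConfig, audioConfig);

  recognizer.recognized = (_sender, event) => {
    if (event.result.reason === speechSdk.ResultReason.RecognizedSpeech && event.result.text) {
      analyzeAndStore(event.result.text).catch(console.error);
    }
  };

  recognizer.startContinuousRecognitionAsync();
}

// 2. Ask Google Cloud Natural Language for entities and per-entity sentiment,
// 3. then store anything the speaker seems positive about in Firestore.
async function analyzeAndStore(utterance: string): Promise<void> {
  const response = await fetch(
    `https://language.googleapis.com/v1/documents:analyzeEntitySentiment?key=${GOOGLE_NL_KEY}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        document: { type: "PLAIN_TEXT", content: utterance },
        encodingType: "UTF8",
      }),
    }
  );
  const { entities = [] } = await response.json();

  for (const entity of entities) {
    if (entity.sentiment?.score >= 0.25) { // assumed "positive" cutoff
      await addDoc(collection(db, "interests"), {
        topic: entity.name,    // e.g. "Germany"
        category: entity.type, // e.g. "LOCATION"
        score: entity.sentiment.score,
        heardAt: Date.now(),
      });
    }
  }
}

startListening();
```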
Challenges we ran into
One challenge we ran into while creating Listen Up was using Google Cloud in the web app. It was the first time anyone on the team had used Google Cloud, so we struggled to integrate it into the product, particularly when setting up the API: we had to work through authentication issues and get it running from a webpage before Listen Up could come together.
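One common way to handle browser-side authentication problems like this is to keep Google Cloud credentials off the page entirely and route requests through a small backend; the team's actual fix may have differed, so the following is only a hypothetical sketch. It uses a Firebase Callable Cloud Function (firebase-functions) and the @google-cloud/language Node client, which picks up credentials automatically inside the Functions environment.

```typescript
import * as functions from "firebase-functions";
import { LanguageServiceClient } from "@google-cloud/language";

// Runs server-side, so no Google Cloud credentials ever reach the browser.
const languageClient = new LanguageServiceClient();

// Callable function: the web app sends an utterance and gets back the
// detected entities with their per-entity sentiment.
export const analyzeUtterance = functions.https.onCall(async (data: { text: string }) => {
  const [result] = await languageClient.analyzeEntitySentiment({
    document: { content: data.text, type: "PLAIN_TEXT" },
  });
  return result.entities ?? [];
});
```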
What we learned
Each member learned something new. For example, Avery learned a bunch of new things ranging from API calls to using Microsoft Azure and Google Cloud. Harry learned that he loves Azure, but struggles with Google Cloud. Oliver learned nothing and knew everything (he's actually big brain). Jeffrey had always wanted to improve his front-end skills, so he got the opportunity to play around with and understand the React framework a lot more.