Overview

We created a Smart Glasses hack called SmartEQ! This unique hack leverages the power of machine learning and facial recognition to determine the emotion of the person you are talking to and to conduct sentiment analysis on your conversation, all in real time! Since we couldn’t get a pair of Smart Glasses (hint hint MLH Hardware Lab?), we built our MVP with a microphone and webcam standing in for the camera and mic that would sit on a pair of Smart Glasses.

Inspiration

Millions of children around the world have Autism Spectrum Disorder, and for these kids it can be difficult to understand the emotions of the people around them. That makes it very hard to make friends and get along with their peers, which in turn hurts their well-being and can lead to issues such as depression. We wanted to help these children understand others’ emotions and enable them to build long-lasting friendships. We learned about research studies exploring how technology could help kids on the autism spectrum understand the emotions of others and thought, hey, let’s see if we can use our weekend to build something that helps out!

What it does

SmartEQ determines the mood of the person in frame based on their facial expressions and sentiment analysis of their speech. It then picks the most probable emotion from the combined image and sentiment signals and reports its confidence as a percentage. SmartEQ helps a child on the autism spectrum better understand the emotions of the person they are conversing with.
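The write-up above glosses over the fusion step, so here is a simplified sketch of the idea as a weighted vote over the two signals. The weights and the sentiment-to-emotion mapping are illustrative assumptions, not our exact logic:

```python
# Hypothetical fusion of Face API emotion scores with speech sentiment.
# FACE_WEIGHT, SPEECH_WEIGHT, and SENTIMENT_TO_EMOTION are made-up values.

FACE_WEIGHT = 0.7
SPEECH_WEIGHT = 0.3

# Map a sentiment label onto rough emotion probabilities.
SENTIMENT_TO_EMOTION = {
    "positive": {"happiness": 1.0},
    "neutral": {"neutral": 1.0},
    "negative": {"sadness": 0.5, "anger": 0.5},
}

def fuse(face_scores, sentiment_label):
    """Combine both signals and return (top_emotion, confidence_percent)."""
    combined = {emotion: FACE_WEIGHT * p for emotion, p in face_scores.items()}
    for emotion, p in SENTIMENT_TO_EMOTION.get(sentiment_label, {}).items():
        combined[emotion] = combined.get(emotion, 0.0) + SPEECH_WEIGHT * p
    top = max(combined, key=combined.get)
    return top, round(100 * combined[top], 1)

# A mostly-happy face plus positive speech:
print(fuse({"happiness": 0.8, "neutral": 0.2}, "positive"))
# -> ('happiness', 86.0)
```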

How we built it

The lovely front end you are seeing on screen is built with React.js, and the back end is a Python Flask server. For the machine learning predictions we used a whole bunch of Microsoft Azure Cognitive Services APIs: Speech to Text to transcribe the microphone input, Text Analytics to run sentiment analysis on that transcript, and the Face API to predict the emotion of the person in frame.
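Here is a trimmed-down sketch of the two Azure REST calls at the heart of the back end. The endpoints and keys are placeholders, and error handling is omitted:

```python
# Minimal sketch of the two Azure Cognitive Services calls behind SmartEQ.
# The endpoint/key values are placeholders, not real credentials.
import requests

FACE_ENDPOINT = "https://<region>.api.cognitive.microsoft.com"
TEXT_ENDPOINT = "https://<region>.api.cognitive.microsoft.com"
FACE_KEY = "<face-api-key>"
TEXT_KEY = "<text-analytics-key>"

def detect_emotion(image_bytes):
    """Ask the Face API for emotion scores on one webcam frame."""
    resp = requests.post(
        f"{FACE_ENDPOINT}/face/v1.0/detect",
        params={"returnFaceAttributes": "emotion"},
        headers={
            "Ocp-Apim-Subscription-Key": FACE_KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )
    faces = resp.json()
    if not faces:
        return None
    # Scores for the first face in frame, e.g. {"happiness": 0.8, ...}
    return faces[0]["faceAttributes"]["emotion"]

def sentiment(text):
    """Score transcribed speech with the Text Analytics sentiment endpoint."""
    resp = requests.post(
        f"{TEXT_ENDPOINT}/text/analytics/v3.0/sentiment",
        headers={"Ocp-Apim-Subscription-Key": TEXT_KEY},
        json={"documents": [{"id": "1", "language": "en", "text": text}]},
    )
    return resp.json()["documents"][0]["sentiment"]
```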

Challenges we ran into

Newton and Max came to QHacks as a confident duo with the initial challenge of snagging some more teammates to hack with! For Lee and Zarif, this was their first hackathon, and they both came solo. Together, we ended up forming a pretty awesome team :D. But that’s not to say everything went as perfectly as our newfound friendships did. Newton and Lee built the front end while Max and Zarif built out the back end, and, as you may have guessed, when we went to connect our code together, just about everything went wrong. We kept hitting the request limits on our free Azure accounts, ran into very weird Socket.IO bugs that broke our entire hack, and had to make sure Max didn’t drink fewer than 5 Red Bulls per hour.

Accomplishments that we're proud of

We all worked with technologies that we were not familiar with, so we learned a lot while developing a functional prototype. We synthesised different forms of machine learning by integrating speech-to-text with sentiment analysis, which let us detect a person’s emotions from their spoken words alone. We combined that speech pipeline with facial recognition for a holistic read on a person’s emotions. We used Socket.IO to create real-time input and output data streams, so predictions update live as the conversation unfolds.
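As a rough illustration, the server side of that pipeline could be wired up as in the Flask-SocketIO sketch below, reusing the detect_emotion and sentiment helpers from the earlier snippet. The event names are illustrative, not our exact protocol:

```python
# Sketch of the real-time pipeline with Flask-SocketIO.
# Event names ("frame", "transcript", "prediction") are illustrative.
from flask import Flask
from flask_socketio import SocketIO, emit

app = Flask(__name__)
socketio = SocketIO(app, cors_allowed_origins="*")

@socketio.on("frame")
def handle_frame(image_bytes):
    # Run the Face API on the incoming webcam frame.
    emit("prediction", {"source": "face", "emotion": detect_emotion(image_bytes)})

@socketio.on("transcript")
def handle_transcript(text):
    # Score the speech-to-text output with sentiment analysis.
    emit("prediction", {"source": "speech", "sentiment": sentiment(text)})

if __name__ == "__main__":
    socketio.run(app, port=5000)
```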

What we learned

We learnt about web sockets: how to build a web app around them and how to debug web socket errors. We also learnt how to harness the power of Microsoft Azure's Machine Learning and Cognitive Services libraries. And we learnt that "cat" gets a positive sentiment score while "dog" gets a neutral one, which makes no sense whatsoever because dogs are definitely way cuter than cats. (Zarif strongly disagrees)
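For the curious, reproducing that oddity takes two lines with the sentiment helper sketched earlier (scores depend on the Text Analytics model version, so your mileage may vary):

```python
print(sentiment("cat"))  # we saw "positive"
print(sentiment("dog"))  # we saw "neutral"
```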

What's next for SmartEQ

We would deploy our hack onto real Smart Glasses :D. We would first test it with small sample groups to figure out what works and what doesn’t, and after we smooth out the kinks, we could publish it as an added technology for Smart Glasses. The app would also be useful for people with social-emotional agnosia, a condition, often caused by brain trauma, that leaves a person unable to process facial expressions. This technology has many other cool applications, too! For example, companies could integrate it into their online meeting tools. That would be especially valuable for HR managers, who could monitor their employees’ emotional well-being at work and implement initiatives to help if employees are not happy.
