Inspiration
I initially wanted to build an app that simulates how someone would look without makeup. I embarked on it, successfully extracted faces, and reliably determined whether a face was wearing makeup. I then tried to use colour accentuation to dull down the effect of the makeup, but running a reverse rasterizing algorithm failed to achieve the required effect. So I pivoted to smile and emotion detection, since I felt that, used well, it could help alert parents to children who may be unwell.
What it does
You take photos of your child at several times of the day; the app detects the emotion in each photo and combines the results into a depression score for your child.
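The aggregation step might look something like the following sketch. The emotion labels, weights, and 0–100 scale here are illustrative assumptions, not the app's actual scoring model:

```python
# Hypothetical sketch: aggregate per-photo emotion readings into a daily score.
# Emotion names mirror the categories the Microsoft Face API returns; the
# weighting scheme is an assumption for illustration only.

NEGATIVE_EMOTIONS = {"sadness", "anger", "fear", "contempt"}

def depression_score(readings):
    """readings: one dict per photo, mapping emotion name -> confidence (0..1).
    Returns a 0..100 score where higher means more negative emotion observed."""
    if not readings:
        return 0.0
    total = 0.0
    for emotions in readings:
        negative = sum(v for k, v in emotions.items() if k in NEGATIVE_EMOTIONS)
        positive = emotions.get("happiness", 0.0)
        total += max(0.0, negative - positive)  # net negative affect per photo
    return round(100.0 * total / len(readings), 1)

day = [
    {"happiness": 0.9, "sadness": 0.05},
    {"happiness": 0.1, "sadness": 0.7, "fear": 0.1},
]
print(depression_score(day))  # → 35.0
```

Averaging over a whole day of photos smooths out single-moment noise, which is why the app asks for several photos rather than one.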
How I built it
I used the Microsoft Face API for face recognition and OpenCV for smile and emotion detection, and the Android SDK to capture the photos.
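A minimal sketch of how smile detection with OpenCV's bundled Haar cascades can work. The cascade choices and thresholds are assumptions (the original code is not shown), and the import is guarded so the pure-Python helper works without OpenCV installed:

```python
# Sketch: detect a smile inside the largest detected face using OpenCV's
# stock Haar cascades. Parameters here are illustrative assumptions.
try:
    import cv2  # pip install opencv-python
except ImportError:  # allow largest_rect() to be used without OpenCV
    cv2 = None

def largest_rect(rects):
    """Pick the largest (w*h) detection, assuming it is the subject's face."""
    return max(rects, key=lambda r: r[2] * r[3]) if len(rects) else None

def detect_smile(image_path):
    """Return True if a smile is found inside the largest detected face."""
    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    smile_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_smile.xml")
    gray = cv2.cvtColor(cv2.imread(image_path), cv2.COLOR_BGR2GRAY)
    face = largest_rect(face_cascade.detectMultiScale(gray, 1.3, 5))
    if face is None:
        return False
    x, y, w, h = face
    roi = gray[y:y + h, x:x + w]  # search for the smile only inside the face
    # minNeighbors is set high because the smile cascade is noisy.
    smiles = smile_cascade.detectMultiScale(roi, scaleFactor=1.7, minNeighbors=20)
    return len(smiles) > 0
```

Restricting the smile search to the face region is what keeps the noisy smile cascade usable; run on the whole frame it produces many false positives.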
Challenges I ran into
Setting up OpenCV and the NumPy package in Python.
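Part of what makes this setup fiddly is that OpenCV represents every image as a NumPy array, so both packages have to be installed and compatible. A quick sanity check of the NumPy side, using a tiny hand-built "image" (the shape and values are arbitrary):

```python
# OpenCV images are NumPy arrays of shape (height, width, channels).
# Build a tiny 4x4 BGR image and check its shape and pixel sum.
import numpy as np

img = np.zeros((4, 4, 3), dtype=np.uint8)  # 4x4 black image, 3 channels
img[1:3, 1:3] = (255, 255, 255)            # white 2x2 square in the middle
print(img.shape, int(img.sum()))           # → (4, 4, 3) 3060
```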
Accomplishments that I'm proud of
I managed to understand OpenCV and use it to build smile detection algorithms. I also learned how to use the Microsoft Face API.
What I learned
How OpenCV works and how to build smile detection with it, and how to integrate the Microsoft Face API into an app.
What's next for Happy Meter
Release and testing