Inspiration

We want merchants to be able to analyze customers' behavior toward their advertisements and offer better products based on visual data reports.

What it does

BehaviourStack uses OpenCV and machine learning to analyze facial emotions, age, and gender. BehaviourStack plays a video that you choose in your file browser, while your device's camera works behind the scenes to capture and analyze viewers' facial expressions. When the video ends, a PDF report is generated in your local application folder and also stored on our web server's file system using Blockstack's Gaia storage technology. A minimal sketch of that capture-and-classify loop is shown below.
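
The sketch below illustrates the idea under our own assumptions: a trained Keras model saved as "emotion_model.h5", 48x48 grayscale face crops, OpenCV's stock Haar cascade for face detection, and a fixed emotion label list. These names and sizes are placeholders for illustration, not necessarily the exact hackathon setup.

```python
# Hypothetical capture loop: watch the viewer through the device camera,
# detect faces with OpenCV, classify each face with a Keras model, and
# append one row per detection to a CSV that the report step consumes.
import csv
import cv2
import numpy as np
from tensorflow.keras.models import load_model

# Assumed label order and model file; both are placeholders.
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]
model = load_model("emotion_model.h5")

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

camera = cv2.VideoCapture(0)  # device camera watching the viewer
with open("frames.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["frame", "emotion", "confidence"])
    frame_idx = 0
    while True:
        ok, frame = camera.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
            face = cv2.resize(gray[y:y + h, x:x + w], (48, 48)) / 255.0
            probs = model.predict(face.reshape(1, 48, 48, 1), verbose=0)[0]
            writer.writerow([frame_idx,
                             EMOTIONS[int(np.argmax(probs))],
                             float(np.max(probs))])
        frame_idx += 1
camera.release()
```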

How we built it

We built the ML model that detects and analyzes human facial expressions using TensorFlow with Keras. We then wrote an OpenCV script that locates human faces and lets the model do the classification behind the scenes. The OpenCV script stores its results in a spreadsheet frame by frame, and that spreadsheet is passed to another Python script that reads it and generates the data reports as a PDF (a rough sketch of that last step follows).
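
As a rough illustration of the report step, the sketch below reads the per-frame spreadsheet and renders a small PDF. The column names, the ~30 fps assumption, and the choice of matplotlib's PdfPages backend are our assumptions; the source does not name the actual PDF library or report layout.

```python
# Hypothetical report generator: turn the frame-by-frame CSV produced by the
# OpenCV script into a two-page PDF of summary charts.
import pandas as pd
import matplotlib.pyplot as plt
from matplotlib.backends.backend_pdf import PdfPages

df = pd.read_csv("frames.csv")  # columns assumed: frame, emotion, confidence

with PdfPages("behaviour_report.pdf") as pdf:
    # Page 1: overall distribution of detected emotions across the video.
    fig, ax = plt.subplots()
    df["emotion"].value_counts().plot.bar(ax=ax)
    ax.set_title("Emotion distribution over the whole advertisement")
    ax.set_xlabel("Emotion")
    ax.set_ylabel("Frames")
    pdf.savefig(fig)
    plt.close(fig)

    # Page 2: how each emotion trends over time, bucketed per second
    # (assuming roughly 30 frames per second).
    fig, ax = plt.subplots()
    per_second = (df.groupby([df["frame"] // 30, "emotion"])
                    .size()
                    .unstack(fill_value=0))
    per_second.plot(ax=ax)
    ax.set_title("Emotions over time")
    ax.set_xlabel("Second")
    ax.set_ylabel("Frames")
    pdf.savefig(fig)
    plt.close(fig)
```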

Challenges we ran into

Due to the time constraints of the hackathon, we were only able to finish the machine learning model and the visual data report generator. We have not finished the GUI that lets users browse for and play videos, and we have not finished building the web server on Blockstack's Gaia storage.

Accomplishments that we're proud of

We are proud of our machine learning model, since none of us had any kind of ML experience before.

What we learned

Machine Learning

What's next for BehaviourStack

We will try to finish what we set out to build after the hackathon.

Built With

python, opencv, tensorflow, keras, blockstack