Inspiration

Help people with autism and memory loss.

What it does

When the camera detects a face, the app analyzes the person's facial expression and tries to determine how they are feeling.
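
As a rough illustration of that flow (not the project's actual code), the sketch below maps a detected face to a feeling using the Mobile Vision Face API's smiling probability; the classifyFirstFace helper and the 0.6/0.2 thresholds are our own illustrative assumptions.

```java
import android.content.Context;
import android.graphics.Bitmap;
import android.util.SparseArray;

import com.google.android.gms.vision.Frame;
import com.google.android.gms.vision.face.Face;
import com.google.android.gms.vision.face.FaceDetector;

public class FeelingClassifier {

    // Detects faces in a camera frame and maps the first one to a rough
    // feeling. The 0.6/0.2 thresholds are illustrative assumptions, not
    // values from the original project.
    public static String classifyFirstFace(Context context, Bitmap bitmap) {
        FaceDetector detector = new FaceDetector.Builder(context)
                .setTrackingEnabled(false)
                .setClassificationType(FaceDetector.ALL_CLASSIFICATIONS)
                .build();
        try {
            SparseArray<Face> faces =
                    detector.detect(new Frame.Builder().setBitmap(bitmap).build());
            if (faces.size() == 0) {
                return "no face detected";
            }
            float smiling = faces.valueAt(0).getIsSmilingProbability();
            if (smiling == Face.UNCOMPUTED_PROBABILITY) return "unknown";
            if (smiling > 0.6f) return "happy";
            if (smiling < 0.2f) return "unhappy";
            return "neutral";
        } finally {
            detector.release();
        }
    }
}
```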

How we built it

Using the Google Mobile Vision Face API, we determined where a person's face was and how they were feeling. Then, using plain Android graphics, we drew information about people's faces on the screen. Finally, we used the GearVR Framework to display the live view on the VR headset, turning it into an AR experience.
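
For the drawing step, here is a minimal sketch of what an Android graphics overlay could look like, assuming the Face objects come from com.google.android.gms.vision.face and the feeling label comes from a classification step like the one above; the FaceOverlay class, its colors, and its sizes are hypothetical, not the project's actual code.

```java
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.PointF;

import com.google.android.gms.vision.face.Face;

public class FaceOverlay {

    private final Paint boxPaint = new Paint();
    private final Paint textPaint = new Paint();

    public FaceOverlay() {
        boxPaint.setColor(Color.GREEN);
        boxPaint.setStyle(Paint.Style.STROKE);
        boxPaint.setStrokeWidth(4f);
        textPaint.setColor(Color.GREEN);
        textPaint.setTextSize(48f);
    }

    // Draws a bounding box around the detected face and writes the inferred
    // feeling just above it.
    public void draw(Canvas canvas, Face face, String feeling) {
        PointF topLeft = face.getPosition(); // top-left corner of the face bounds
        float left = topLeft.x;
        float top = topLeft.y;
        canvas.drawRect(left, top,
                left + face.getWidth(), top + face.getHeight(), boxPaint);
        canvas.drawText(feeling, left, top - 12f, textPaint);
    }
}
```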

Challenges we ran into

Our own inexperience, difficulty wrangling dependencies, and the lack of any free way to compute face similarity.

Accomplishments that we're proud of

We went from knowing no Android development to building an app that solves a real problem.

What we learned

How an Android project is structured, how to use the libraries available to us, and how to develop an Android app quickly.

What's next for EyeGlass

A full port to a VR headset.

Built With

Android, Google Mobile Vision Face API, GearVR Framework