Inspiration

We wanted to create a seamless, gesture-based authentication system for AR headsets that can easily switch between users in shared environments. Additionally, we wanted to integrate health-tracking features such as heart rate monitoring and tremor detection, specifically for early-onset Parkinson's disease. The idea of personalizing environments after authentication while simultaneously collecting health data inspired us to combine the power of AR with IoT and machine learning.

What it does

SnapSecure uses Snap AR Spectacles to authenticate users with a gesture, such as a finger tap. Once authenticated, the physical environment adjusts to the user’s preferences (e.g., changing the color of lights). Meanwhile, the system collects health data like heart rate and tracks tremors, which can support early diagnosis of diseases like Parkinson's as well as general health monitoring for the user. We even built models that work across different headsets, from the Snap AR Spectacles to the Apple Vision Pro and Meta Quest 2.

How we built it

We developed the authentication system using Snap AR Spectacles with gesture detection. The gesture-recognition models were built in Python using just 3 training gestures per person! Heart rate is estimated with a ballistocardiogram (BCG) approach: we band-pass the motion signal with a Butterworth filter and pick out the dominant cardiac frequency with a Fast Fourier Transform. Heart rate is derived from head-motion data, while tremor detection uses hand-motion data. A Heroku-hosted backend handles requests from Lens Studio, and a Raspberry Pi bridges the headset and physical devices like lamps, ensuring that environmental changes can be triggered based on user preferences.
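The write-up doesn't pin down the gesture model, so here is a minimal sketch of one way to authenticate from only three enrollment gestures: a nearest-template matcher over resampled hand trajectories. The `GestureAuthenticator` class, the 50-sample resolution, and the distance threshold are illustrative assumptions, not our exact implementation.

```python
import numpy as np

def normalize(traj):
    """Resample a (T, 3) hand-position trajectory to a fixed length
    and zero-center it so templates are comparable across attempts."""
    t_old = np.linspace(0, 1, len(traj))
    t_new = np.linspace(0, 1, 50)
    resampled = np.stack(
        [np.interp(t_new, t_old, traj[:, d]) for d in range(traj.shape[1])],
        axis=1,
    )
    return resampled - resampled.mean(axis=0)

class GestureAuthenticator:
    """Nearest-template authentication: each user enrolls with a few
    example gestures; a login attempt matches the closest template."""

    def __init__(self, threshold=1.5):
        self.templates = {}          # user_id -> list of normalized trajectories
        self.threshold = threshold   # max accepted distance (tuned on held-out data)

    def enroll(self, user_id, trajectories):
        # Three trajectories per user are enough for template matching.
        self.templates[user_id] = [normalize(t) for t in trajectories]

    def authenticate(self, traj):
        attempt = normalize(traj)
        best_user, best_dist = None, np.inf
        for user_id, temps in self.templates.items():
            dist = min(np.linalg.norm(attempt - t) / len(t) for t in temps)
            if dist < best_dist:
                best_user, best_dist = user_id, dist
        # Reject the attempt entirely if no template is close enough.
        return best_user if best_dist < self.threshold else None
```

Template matching works well in the few-shot regime because it needs no gradient training: enrolling a new user is just storing their three normalized gestures.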
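The BCG pipeline maps closely onto SciPy primitives. A minimal sketch, assuming a 1-D head-motion signal sampled at a known rate; the 0.8–3 Hz cardiac band (roughly 48–180 BPM) and the filter order are our choices here, not values taken from the project.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_heart_rate(head_motion, fs, low_hz=0.8, high_hz=3.0):
    """Estimate heart rate (BPM) from a 1-D head-motion signal.

    Band-pass with a Butterworth filter to isolate the cardiac band,
    then take the dominant FFT frequency in that band as the heart rate.
    """
    # Zero-phase band-pass so filtering does not shift the signal in time.
    b, a = butter(4, [low_hz, high_hz], btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, head_motion)

    # Magnitude spectrum of the filtered signal.
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fs)

    # Dominant frequency inside the cardiac band, converted to BPM.
    band = (freqs >= low_hz) & (freqs <= high_hz)
    dominant_hz = freqs[band][np.argmax(spectrum[band])]
    return dominant_hz * 60.0
```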
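On the hardware side, the Raspberry Pi only needs to expose a small endpoint the backend can call after authentication succeeds. A sketch assuming Flask and a relay-driven lamp on GPIO pin 17 via gpiozero; the route name, pin number, and `PREFERENCES` table are hypothetical stand-ins for our actual wiring and data flow.

```python
# Runs on the Raspberry Pi.
from flask import Flask, request, jsonify
from gpiozero import LED  # a relay behaves like a simple on/off output

app = Flask(__name__)
lamp = LED(17)  # assumed relay-controlled lamp on GPIO 17

# Hypothetical per-user preferences; in the real system these arrive
# from the backend once a gesture authenticates the user.
PREFERENCES = {"alice": {"lamp_on": True}, "bob": {"lamp_on": False}}

@app.route("/apply-preferences", methods=["POST"])
def apply_preferences():
    data = request.get_json(silent=True) or {}
    prefs = PREFERENCES.get(data.get("user"))
    if prefs is None:
        return jsonify(error="unknown user"), 404
    # Trigger the physical environment change for this user.
    lamp.on() if prefs["lamp_on"] else lamp.off()
    return jsonify(status="ok", applied=prefs)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```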

Challenges we ran into

One of the biggest challenges was sending HTTP requests efficiently between Lens Studio and our backend, especially with motion data that needed to be processed in real time. Request-rate limits forced us to cut some of our scope, and we ultimately implemented real-time gesture detection with a decision tree running directly inside Lens Studio instead of streaming motion data back and forth; the sketch below shows the offline training side of that approach.
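To make that concrete: a small tree can be trained offline in Python and its learned thresholds hand-ported into a Lens Studio script as plain if/else checks. The motion features and toy data below are hypothetical; only the decision-tree-in-Lens-Studio approach comes from our build.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical per-window motion features: [mean speed, peak
# acceleration, dominant frequency]. Labels: 0 = idle, 1 = finger tap.
X = np.array([[0.02, 0.1, 0.5], [0.30, 2.4, 4.0],
              [0.03, 0.2, 0.6], [0.28, 2.1, 3.8]])
y = np.array([0, 1, 0, 1])

# A shallow tree keeps per-frame inference cheap enough to re-implement
# as a handful of threshold checks inside a Lens Studio script.
tree = DecisionTreeClassifier(max_depth=2).fit(X, y)

# Print the learned split thresholds for hand-porting to Lens Studio.
print(export_text(tree, feature_names=["speed", "peak_acc", "freq"]))
```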

Accomplishments that we're proud of

We successfully built an end-to-end system that not only authenticates users with gestures but also adjusts the physical environment based on user preferences. Additionally, we integrated health-tracking features, which could have significant implications for early disease detection. We also created models that work across multiple headsets, which could pave the way for unified, motion-based user profiles in the future.

What we learned

We gained valuable experience working with AR technology and integrating it with IoT devices. We also learned how to handle real-time data communication between hardware and software, and how to improve the accuracy of our health-tracking algorithms.

What's next for SnapSecure

We plan to improve the accuracy of our health-detection models and make the system more scalable. Future features could include more customizable environmental actions, such as unlocking a door with a gesture, and additional health metrics covering a wider range of diseases and cognitive states, such as emotion and stress.
