OurLife: A Second Brain for Those Who Need It Most
"I still remember my 11-year-old self when my great grandma was near the end of her life. I would visit her once every 1-2 weeks, and listen while my mom told her again who I was. Sometimes, even though she didn't remember me, she saw me and knew that I must be someone important to her. And she would smile." - Victor
Problem
Early-stage Alzheimer’s patients don’t lose their identity all at once; the loss begins with short-term continuity. They forget what just happened, who they just met, or what they were about to do. These moments lead to social embarrassment, safety risks, and a gradual erosion of self and independence, which in turn accelerates disease progression.
However helpful they are, family and caregivers cannot be present at all times. Many of the loneliest and most difficult moments arrive unpredictably, often with no family or friends nearby. This inspired us to explore how wearable AR can provide gentle, human-centered support and comfort.
Our Solution
OurLife is an assistive memory system for people living with dementia and Alzheimer’s disease. Using Ray-Ban Meta glasses, Meta Quest, and AI, it captures daily moments, provides real-time guidance, and helps caregivers support loved ones with dignity, safety, and connection.
- "What did I just do?" — When someone feels lost, they can ask: "What was I doing?" The system plays back a short summary of the last few minutes—like a friendly recap from a supportive friend.
- Object-Triggered Reminiscence – “This is your book. Your husband, Wang, gave it to you on your anniversary. Do you remember that day?”
- Location-Triggered Reminder – GPS-based spoken reminders like “Don’t forget to bring your keys!” (see the sketch after this list).
- Invisible Social Support – “This is your granddaughter, Lily. You met her last week; say hi to her!”
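As a rough illustration of the location-triggered reminders, here is a minimal Swift sketch built on CoreLocation region monitoring and AVSpeechSynthesizer, the same frameworks named under "How we built it" below. The coordinates, radius, and reminder phrasing are placeholder examples, not our production configuration.

```swift
import CoreLocation
import AVFoundation

// Hypothetical sketch: speak a reminder when the wearer leaves a
// caregiver-defined region. Coordinates and phrasing are examples.
final class LocationReminder: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private let speech = AVSpeechSynthesizer()

    func start() {
        locationManager.delegate = self
        locationManager.requestAlwaysAuthorization()

        // Caregiver-configured "home" geofence (example values).
        let home = CLCircularRegion(
            center: CLLocationCoordinate2D(latitude: 37.4847, longitude: -122.1477),
            radius: 50, // meters
            identifier: "home")
        home.notifyOnExit = true
        locationManager.startMonitoring(for: home)
    }

    // Fires when the wearer steps outside the monitored region.
    func locationManager(_ manager: CLLocationManager, didExitRegion region: CLRegion) {
        let utterance = AVSpeechUtterance(string: "Don't forget to bring your keys!")
        speech.speak(utterance) // played through the glasses' open-ear speakers
    }
}
```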
How we built it
We built a four-component XR system that connects Ray-Ban Meta glasses, a multi-modal AI backend, Meta Quest, and a caregiver dashboard into one coherent, human-centered workflow.
Patient Device (Ray-Ban Meta Glasses)
- Meta Wearables SDK (DAT) – Uses MWDATCamera for H.264 video streaming and frame-level head-POV image capture.
- Multi-modal query capture – On a user long-press, captures a head-POV image, an on-device STT transcript via Apple's SFSpeechRecognizer, and GPS coordinates via CoreLocation.
- WebSocket transport – Sends structured JSON payloads (image URL, transcript, location) to the backend using a custom WebSocket client (see the sketch after this list).
- Voice UI output – AI responses are spoken aloud through the open-ear speakers using AVSpeechSynthesizer, with live transcription mirrored in the caregiver app.
- Continuous passive context capture – Every 20 seconds, captures a head-POV image with context, uploads it to the backend, and indexes it into a custom RAG store; captures are distilled into hourly and daily summaries.
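To make the query path concrete, here is a minimal Swift sketch of the long-press flow described above: the uploaded snapshot's URL, the SFSpeechRecognizer transcript, and the CoreLocation fix are encoded as one JSON payload and sent over a WebSocket (shown here with URLSessionWebSocketTask). The field names and the QueryClient type are illustrative assumptions, not our exact schema.

```swift
import Foundation
import CoreLocation

// Illustrative payload shape: image URL, transcript, and location,
// matching the structured JSON described above. Field names are assumptions.
struct QueryPayload: Codable {
    let imageURL: String   // GCS URL of the uploaded head-POV snapshot
    let transcript: String // on-device STT result from SFSpeechRecognizer
    let latitude: Double
    let longitude: Double
    let timestamp: Date
}

// Minimal WebSocket client sketch using URLSessionWebSocketTask.
final class QueryClient {
    private let socket: URLSessionWebSocketTask

    init(url: URL) {
        socket = URLSession.shared.webSocketTask(with: url)
        socket.resume()
    }

    // Bundle the three signals into one JSON message and send it.
    func sendQuery(imageURL: String, transcript: String, location: CLLocation) throws {
        let payload = QueryPayload(
            imageURL: imageURL,
            transcript: transcript,
            latitude: location.coordinate.latitude,
            longitude: location.coordinate.longitude,
            timestamp: Date())
        let data = try JSONEncoder().encode(payload)
        socket.send(.data(data)) { error in
            if let error { print("send failed: \(error)") }
        }
    }

    // Await the AI's answer and hand it to the voice UI
    // (spoken via AVSpeechSynthesizer, mirrored to the caregiver app).
    func receiveAnswer(speak: @escaping (String) -> Void) {
        socket.receive { result in
            if case .success(.string(let answer)) = result {
                speak(answer)
            }
        }
    }
}
```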
Backend (Google Cloud Platform)
- Gemini 2.0 Flash – Multi-modal reasoning over image, text, and location context.
- Cloud Firestore – Persistent storage for people, places, and memory records.
- Cloud Storage (GCS) – Image upload and retrieval pipeline for query snapshots.
- Retrieval-Augmented Generation (RAG) – Personalized memory retrieval for context-aware responses.
- Real-time APIs – WebSocket-based query/response channel and REST endpoints for memory access (a client-side sketch follows below).
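For illustration, here is a hedged Swift sketch of a client call to one of these REST endpoints; the /memories/recent path and the MemoryRecord fields are assumptions for the example, not the deployed API surface.

```swift
import Foundation

// Illustrative record shape for memories persisted in Firestore
// and surfaced over REST. Field names are assumptions.
struct MemoryRecord: Codable {
    let id: String
    let summary: String  // Gemini-generated description of the moment
    let imageURL: String // GCS link to the head-POV snapshot
    let capturedAt: Date
}

// Fetch recent memories from a hypothetical REST endpoint.
func fetchRecentMemories(baseURL: URL) async throws -> [MemoryRecord] {
    let url = baseURL.appendingPathComponent("memories/recent")
    let (data, _) = try await URLSession.shared.data(from: url)
    let decoder = JSONDecoder()
    decoder.dateDecodingStrategy = .iso8601
    return try decoder.decode([MemoryRecord].self, from: data)
}
```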
Memory Visualizer (Meta Quest)
- Unity 2022.3 + Meta XR SDK – VR application built for Quest 3.
- Daily memory sync – Curated moments from daytime captures packaged into nightly VR sessions (manifest sketched after this list).
- Immersive replay pipeline – First-person photo/video playback in panoramic and theater modes.
- Interactive recall prompts – In-VR Q&A interactions for memory reinforcement.
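To show what the nightly hand-off might carry, here is a hedged Swift sketch of a session manifest the backend could package for the Quest app (which would consume the same JSON from C#/Unity). All type and field names are illustrative assumptions.

```swift
import Foundation

// Illustrative manifest for one nightly VR session.
struct RecallPrompt: Codable {
    let question: String       // e.g., "Who visited you this afternoon?"
    let expectedAnswer: String // used to gently confirm or reinforce
}

struct VRMoment: Codable {
    let mediaURL: String        // photo or video from the day's captures
    let mode: String            // "panoramic" or "theater" replay
    let narration: String       // short summary read aloud during replay
    let prompts: [RecallPrompt] // in-VR Q&A for memory reinforcement
}

struct NightlySession: Codable {
    let date: String        // calendar day the moments were captured
    let moments: [VRMoment]
}
```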
Caregiver App
- Next.js web app – Caregiver-facing configuration and monitoring dashboard.
- People & location management – Manage face profiles, reminders, and spatial anchors.
- Memory timeline – Review captured moments and AI interactions.
- System synchronization – Real-time sync with patient device and backend services.
┌─────────────────┐ ┌─────────────────┐ ┌─────────────────┐
│ RAY-BAN META │────▶│ CLOUD BACKEND │────▶│ META QUEST │
│ (Daytime) │ │ (GCP/Gemini) │ │ (Nighttime) │
│ │ │ │ │ │
│ • See & Hear │ │ • Understand │ │ • Immerse │
│ • Whisper Help │ │ • Remember │ │ • Reinforce │
│ • Capture │ │ • Protect │ │ • Connect │
└─────────────────┘ └─────────────────┘ └─────────────────┘
│ │ │
└───────────────────────┴───────────────────────┘
│
┌─────────────────┐
│ CAREGIVER APP │
│ (Next.js) │
│ │
│ • Configure │
│ • Monitor │
│ • Respond │
└─────────────────┘
Challenges we ran into
Developing for the Ray-Ban Meta glasses presented several challenges:
- Limited SDK functionality restricted access to buttons, the on-glass display, and sensors, making it difficult to design features that integrate seamlessly into everyday life.
- Scarce developer resources, with minimal sample code, documentation, and tutorials, slowed the learning process.
- Integrating real-time inference, audio output, and cross-device communication required careful handling of latency, permissions, and platform constraints to keep the system reliable and responsive.
Accomplishments that we’re proud of
We are proud to have built a fully integrated, multi-device XR prototype in such a short timeframe. We found a way to continuously collect, store, and visualize context so the system can act as a second brain, and we unified these capabilities into a coherent, human-centered system that provides social, memory, and safety support.
What we learned
Designing for memory and care requires careful attention to timing, tone, and restraint: a response should say just enough to help without overloading the user. We also learned that orchestrating across multiple platforms is difficult, especially with four separate components that must stay in sync.
What’s next for OurLife
Next, we plan to apply to Meta's AI glasses for social good fund. With further development, we can reduce latency, enrich context, and expand personalization tools for caregivers. As the SDK matures, we hope to take fuller advantage of the Ray-Ban Meta glasses and to design for longer battery life.
In the long term, we aim to explore how just-in-time XR support can be safely deployed in real homes to support aging with dignity and independence.