Inspiration

High school is a fleeting chapter of our lives. At senior sunrise, I remember feeling a wave of sadness realizing how quickly those moments slip away. Here at Hack the North, we wanted to create something that ensures no one has to watch their peak experiences fade into memory alone. RORA was born from the belief that students, friends, and families deserve the power to relive their most meaningful moments.

What it does

RORA is an AI-powered memory preservation platform that transforms everyday photos and videos into immersive VR experiences. Users can capture, generate, and explore their memories in a virtual world, complete with narration, 3D reconstructions, and shareable experiences that feel as real as the day they happened.

How we built it

We combined a Flutter mobile app for capturing and managing memories with a Unity-based VR environment for immersive playback. Google Gemini Veo powers AI-generated videos, AWS S3 ensures secure cloud storage, and VAPI provides real-time narration. Photogrammetry with COLMAP brings static images to life as 3D environments, while FastAPI services tie it all together into a seamless system.
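For context, here is a minimal sketch of the capture-to-cloud hand-off on the backend: the Flutter app posts a photo or video to a FastAPI endpoint, which streams it to S3 and marks it for downstream processing. The bucket name, key scheme, and response shape are illustrative assumptions, not our exact production code.

```python
# capture_api.py - minimal sketch of the upload path (assumed names, not exact production code)
import uuid

import boto3
from fastapi import FastAPI, UploadFile

app = FastAPI()
s3 = boto3.client("s3")
BUCKET = "rora-memories"  # assumed bucket name


@app.post("/memories")
async def upload_memory(file: UploadFile):
    # Give each capture a unique key so uploads from different users never collide.
    key = f"raw/{uuid.uuid4()}/{file.filename}"

    # Stream the file object straight to S3 without buffering it all in memory.
    s3.upload_fileobj(file.file, BUCKET, key)

    # Downstream services (video generation, photogrammetry, narration) pick the
    # object up from here; in our system that hand-off happens asynchronously.
    return {"bucket": BUCKET, "key": key, "status": "queued"}
```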

Challenges we ran into

Integrating large AI video generation pipelines with limited storage and bandwidth was a constant battle. Creating smooth VR playback on the Meta Quest 3 from diverse user content pushed our Unity and optimization skills to their limits. Finally, balancing performance and immersion without sacrificing user experience took rounds of iteration.

Technical Obstacles:

- Built COLMAP from source with CUDA support (required SQLite3, Ceres, and graphics libraries)
- Configured headless GPU servers for OpenGL-dependent CV tools (Xvfb, virtual displays); see the sketch after this list
- Worked around single-GPU limitations in academic software despite having eight H100s
- Debugged file path mismatches between COLMAP output and Instant-NGP input
- Solved NeRF-to-mesh artifacts through parameter tuning and post-processing
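As an example of the headless-display workaround, this is roughly how COLMAP's GPU SIFT stages can be driven on a display-less server by wrapping each call in xvfb-run. It is a sketch under assumptions: paths are placeholders, and option names may differ slightly between COLMAP versions.

```python
# headless_colmap.py - sketch of running OpenGL-dependent COLMAP stages on a headless GPU box
# (paths are placeholders; COLMAP option names may differ slightly by version)
import subprocess

WORKDIR = "/data/scene"  # placeholder project directory


def run_headless(cmd: list[str]) -> None:
    # xvfb-run -a spins up a throwaway virtual X display so tools that expect
    # an OpenGL/GLX context don't crash on a server with no monitor attached.
    subprocess.run(["xvfb-run", "-a", *cmd], check=True)


# GPU-accelerated SIFT feature extraction.
run_headless([
    "colmap", "feature_extractor",
    "--database_path", f"{WORKDIR}/database.db",
    "--image_path", f"{WORKDIR}/images",
    "--SiftExtraction.use_gpu", "1",
])

# GPU-accelerated exhaustive matching.
run_headless([
    "colmap", "exhaustive_matcher",
    "--database_path", f"{WORKDIR}/database.db",
    "--SiftMatching.use_gpu", "1",
])
```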

We also had to rebuild core tools: the standard COLMAP packages lacked CUDA support, so we dug into the COLMAP and AliceVision source code to enable GPU acceleration on the H100 architecture.
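The rebuild itself comes down to configuring COLMAP's CMake build with CUDA enabled and targeting the H100's compute capability (9.0). The sketch below shows the general shape of that configure step; the option names are an approximation and may differ between COLMAP releases.

```python
# build_colmap.py - sketch of the from-source COLMAP build with CUDA enabled
# (CMake option names are an approximation and may differ between COLMAP releases)
import subprocess

SRC = "/opt/src/colmap"   # placeholder checkout of the COLMAP source tree
BUILD = f"{SRC}/build"

# Configure with CUDA on and target the H100's compute capability (9.0).
subprocess.run([
    "cmake", "-S", SRC, "-B", BUILD,
    "-DCUDA_ENABLED=ON",
    "-DCMAKE_CUDA_ARCHITECTURES=90",
], check=True)

# Compile and install; Ceres, SQLite3, and the graphics libraries mentioned
# above must already be present for the configure step to succeed.
subprocess.run(["cmake", "--build", BUILD, "--parallel"], check=True)
subprocess.run(["cmake", "--install", BUILD], check=True)
```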

Solution: an automated pipeline that handles COLMAP feature extraction → GPU matching → bundle adjustment → NeRF training → multi-quality mesh export with artifact removal.
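Stitched together, the pipeline is a fixed sequence of external tools driven from one script. The outline below is a simplified sketch: the COLMAP subcommands are real CLI stages, while the training and export steps are hypothetical placeholders standing in for our Instant-NGP invocation and artifact-removal post-processing, not exact commands.

```python
# reconstruction_pipeline.py - simplified sketch of the automated reconstruction pipeline
# (COLMAP subcommands are real CLI stages; the train/export steps are placeholders
#  for the Instant-NGP training and post-processing, not exact commands)
import subprocess

SCENE = "/data/scene"  # placeholder scene directory


def run(cmd: list[str]) -> None:
    print("->", " ".join(cmd))
    subprocess.run(cmd, check=True)


# 1. Feature extraction and GPU matching into a shared database.
run(["colmap", "feature_extractor",
     "--database_path", f"{SCENE}/database.db",
     "--image_path", f"{SCENE}/images"])
run(["colmap", "exhaustive_matcher",
     "--database_path", f"{SCENE}/database.db"])

# 2. Sparse reconstruction, then a global bundle adjustment pass.
run(["colmap", "mapper",
     "--database_path", f"{SCENE}/database.db",
     "--image_path", f"{SCENE}/images",
     "--output_path", f"{SCENE}/sparse"])
run(["colmap", "bundle_adjuster",
     "--input_path", f"{SCENE}/sparse/0",
     "--output_path", f"{SCENE}/sparse/0"])

# 3. NeRF training and multi-quality mesh export (hypothetical scripts standing in
#    for Instant-NGP training plus artifact-removal post-processing).
run(["python", "train_nerf.py", "--scene", SCENE])        # hypothetical script
run(["python", "export_meshes.py", "--scene", SCENE,      # hypothetical script
     "--qualities", "low,medium,high"])
```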

Accomplishments that we're proud of

We’re proud to have built an end-to-end platform in just one weekend that connects mobile, AI, cloud, and VR into one cohesive product. Seeing our first memory come alive on the Meta Quest 3 was a moment we’ll never forget. It felt like we had bottled time itself.

What we learned

We learned how to combine cutting-edge AI tools with human emotion, and how powerful technology becomes when it preserves what truly matters: our memories. We also grew our skills in large-scale data handling, VR optimization, and user-centered design.

What's next for RORA

Next steps include multiplayer VR experiences so friends can relive moments together, more advanced photorealistic reconstructions, and integration with wearable devices for automatic memory capture. Ultimately, our goal is simple: we want to ensure that no memory worth keeping is ever lost.
