Inspiration
We wanted to create a seamless way to explore location-based insights and social media content using cutting-edge AR and VR technologies. By integrating AI and geospatial tools, we sought to offer users an interactive experience that connects them with relevant and meaningful information tied to their environment. Our goal was to reimagine how people engage with past events, social trends, and places.
What it does
MindBubble introduces "Foxy," an AI-powered AR companion that delivers location-based insights through interactive bubbles. Users can ask Foxy for information about a specific location or topic. The app generates contextualized social media-like posts and groups them into categories displayed as bubbles around the user. By interacting with these bubbles, users can explore detailed posts and even locate real-world places where events or activities were tagged.
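To give a sense of the content model, here is an illustrative sketch of the data shape behind a bubble, written in TypeScript. Field names are assumptions for illustration, not our exact schema:

```typescript
// Illustrative shape of the content Foxy returns (not our exact schema).
interface BubblePost {
  author: string;      // generated persona name
  text: string;        // social media-style post body
  latitude?: number;   // optional tag pointing to a real-world place
  longitude?: number;
}

interface BubbleCategory {
  label: string;        // e.g. "Food", "History", "Events"
  posts: BubblePost[];  // posts shown when the user opens the bubble
}
```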
How we built it
The system runs on Azure, with an Azure Function handling keyword- and GPS-based requests. We integrated OpenAI’s ChatGPT to generate social media-like content, which we group dynamically for display. We developed two client applications:
- A Meta Quest 3 app using passthrough for mixed reality.
- A phone-based AR app using Google Geospatial VPS for precise positioning.
Both applications render interactive bubbles and place Foxy as the user's guide in their AR or VR environment.
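To make the backend flow concrete, here is a minimal sketch of such a function, assuming the Azure Functions v4 Node.js programming model and the official OpenAI SDK. The function name, model, and prompt are illustrative rather than our exact code:

```typescript
import { app, HttpRequest, HttpResponseInit } from "@azure/functions";
import OpenAI from "openai";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

// Illustrative endpoint: takes a keyword plus GPS coordinates and asks the
// model for grouped, social media-style posts about that location.
app.http("bubbles", {
  methods: ["GET"],
  authLevel: "anonymous",
  handler: async (request: HttpRequest): Promise<HttpResponseInit> => {
    const keyword = request.query.get("keyword") ?? "nearby";
    const lat = request.query.get("lat");
    const lon = request.query.get("lon");

    const completion = await openai.chat.completions.create({
      model: "gpt-4o-mini", // illustrative model choice
      messages: [
        {
          role: "user",
          content:
            `Write short social media-style posts about "${keyword}" ` +
            `near latitude ${lat}, longitude ${lon}. ` +
            `Group them into a few named categories and return JSON.`,
        },
      ],
    });

    return {
      status: 200,
      headers: { "Content-Type": "application/json" },
      body: completion.choices[0].message.content ?? "[]",
    };
  },
});
```

The clients call this URL endpoint with the user's query and position, then render the returned categories as bubbles around the user.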
Challenges we ran into
Integrating with real social media APIs proved complex, since many platforms require paid subscriptions or company verification. To overcome this, we pivoted to generating relevant content with ChatGPT. Ensuring precise geolocation accuracy for AR and handling device-specific optimizations were also challenging but rewarding. Since the Meta Quest has no built-in GPS, we needed another way to obtain coordinates: we used a public web service to look up our IP address and convert it into rough coordinates, as sketched below. Working with Lens Studio (Snap) for the first time was also difficult. While developing for the Spectacles, our Microsoft computer's Intel GPU driver was not compatible with Lens Studio, so we had to override it, which caused difficulties with our other programs, such as Adobe applications. We managed to create some animated visuals, but bringing the full experience to the Spectacles was not possible in time.
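A minimal sketch of that IP-based fallback, using ip-api.com as an example public geolocation endpoint (the specific service and field names are illustrative; accuracy is roughly city-level):

```typescript
// Rough coordinates from the device's public IP, as a fallback when the
// headset has no GPS. ip-api.com is shown as an example; any similar
// public geolocation service would work.
interface IpLocation {
  lat: number;
  lon: number;
}

async function coordsFromIp(): Promise<IpLocation> {
  const response = await fetch("http://ip-api.com/json/");
  const data = await response.json();
  return { lat: data.lat, lon: data.lon };
}

coordsFromIp().then((loc) =>
  console.log(`Approximate position: ${loc.lat}, ${loc.lon}`)
);
```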
Accomplishments that we're proud of
- Successfully developed two working clients (Quest 3 and phone AR apps) that deliver consistent and immersive experiences.
- Implemented an Azure Function exposed as an HTTP URL endpoint.
- Seamlessly integrated ChatGPT to generate dynamic, contextualized social media-like content.
- Leveraged Google Geospatial VPS to achieve precise AR placement for location-based information.
- Created an intuitive and engaging interface with Foxy as the central user companion.
- Designed low-fidelity prototypes of the bubble visualizations for the Snap Spectacles in Lens Studio.
What we learned
- The complexities of working with real-world geolocation and AR technologies.
- How to optimize Azure functions and ChatGPT for dynamic, real-time content creation.
- The importance of user experience design in making complex technologies feel natural and engaging.
What's next for MindBubble
We aim to enhance the service by integrating real-time data sources, such as public events and live reviews, while exploring verified social media API partnerships. Expanding Foxy’s capabilities, including richer voice interaction and personalization, is also on the roadmap. Additionally, we plan to refine the VR experience, adding global virtual travel scenarios and more detailed 3D environments for exploring tagged locations.
See it in Action
- Phone AR: https://youtube.com/shorts/nTmskXDfWy0
- Quest MR: https://youtu.be/IwBJprK_vJM
- Snap Spectacles AR: https://www.youtube.com/watch?v=WEUaV4Hj4Zo

