Inspiration

zr-map started as my graduate thesis, where the goal was to integrate real-time data to visualize physical environments in XR. I wanted to explore an emerging niche: the intersection of real-time analytics and the XR market. The aim was an XR system where users can stand inside a digital twin of a city, stadium, or campus and watch the real world update around them, without physically being there.

What it does

zr-map is a multiplayer application that lets users teleport across global environments and venues and explore real-time data visualizations and media streams in photorealistic environments. In the application, users can:

  • View live freeway traffic (Google Directions API)
  • See real-time weather (Meteosource)
  • Read IoT telemetry (Cisco LoRaWAN sensors → Azure SQL)
  • Watch 360° venue footage
  • Navigate between digital twins (stadiums, industrial facilities, campuses)
  • Interact with AI avatars capable of context-aware dialogue using uploaded knowledge banks

The application runs on Meta Quest and supports 100+ simultaneous users.

How we built it

An architectural overview of the end-to-end integration: 3D environments were generated from OpenStreetMap coordinates using Google Photorealistic 3D Tiles, Blender (via the Blosm add-on), the ArcGIS SDK, and Unity. IoT sensors streamed telemetry to Azure IoT Hub, which was processed through Stream Analytics into Azure SQL. Early versions connected Unity directly to SQL for rapid iteration.

  • To fetch traffic and weather, I built lightweight Unity controllers:

```csharp
UnityWebRequest req = UnityWebRequest.Get(url);
// Ratio of travel time in current traffic to free-flow travel time
float delay = (float)leg.duration_in_traffic.value / leg.duration.value;
string status = delay > 1.3f ? "Heavy"
              : delay > 1.1f ? "Moderate"
              : "Smooth";
```

  • Weather followed a similar pattern:

```csharp
string api = $"{baseUrl}?lat={lat}&lon={lon}&sections=current&key={apiKey}";
// WeatherResponse is a wrapper type for the "current" section of the payload
WeatherData data = JsonUtility.FromJson<WeatherResponse>(json).current;
```

  • Teleportation was implemented through ray-based XR Interaction Toolkit logic:

```csharp
// xrOrigin is the scene's XROrigin (Unity.XR.CoreUtils)
if (rayHitValid)
    xrOrigin.MoveCameraToWorldLocation(worldDestination);
```

  • Multiplayer used Unity Netcode, character IK, and synchronized teleports so all users see each other moving between environments in real time.

  • To read IoT telemetry via direct SQL integration, the following query was run against Azure SQL:

```sql
SELECT TOP (2)
    JSON_VALUE(telemetry, '$.temperature.value') AS temperature,
    JSON_VALUE(telemetry, '$.humidity.value')    AS humidity,
    DATEADD(SECOND, CAST(timestamp AS BIGINT) / 1000, '1970-01-01') AS readable_timestamp
FROM [dbo].[sensordata]
ORDER BY readable_timestamp DESC;
```

  • For media streams, we used 360° footage stored in Azure Blob Storage and streamed it into Meta Quest.
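The traffic classification is simple enough to verify outside Unity. A minimal Python sketch of the same thresholds (1.1 and 1.3 are the ratios used in the C# controller; the function name is illustrative):

```python
def classify_traffic(duration_in_traffic: float, duration: float) -> str:
    """Classify congestion from the ratio of in-traffic time to free-flow time."""
    delay = duration_in_traffic / duration
    if delay > 1.3:
        return "Heavy"
    if delay > 1.1:
        return "Moderate"
    return "Smooth"

print(classify_traffic(900, 600))  # 900 s in traffic vs. 600 s free-flow
```

Both durations come back from the Directions API in seconds, so only their ratio matters, not the units.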
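The `DATEADD` arithmetic in the telemetry query converts an epoch-milliseconds value into a readable timestamp; the same conversion, sketched in Python for quick sanity-checking of sensor rows:

```python
from datetime import datetime, timezone

def readable_timestamp(epoch_ms: int) -> datetime:
    # Mirror of the SQL: DATEADD(SECOND, CAST(timestamp AS BIGINT) / 1000, '1970-01-01')
    return datetime.fromtimestamp(epoch_ms // 1000, tz=timezone.utc)

print(readable_timestamp(1700000000000))
```

Note that integer division truncates to whole seconds, matching the SQL's integer `/ 1000`.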

Challenges we ran into

This was niche software involving real-time data integration with XR, so several challenges came up:

  • Meta Quest firewalls blocked direct SQL and local FastAPI traffic entirely. I had to rebuild the pipeline around cloud-friendly HTTPS endpoints.
  • Cloud API management: Azure Functions pricing rose to $2,251/month, even when idle.
  • 3D model baking of digital twins with Simplygon was financially unrealistic, forcing a hybrid workflow of open-source models and manual generation.
  • UnityWebRequest + XR networking constraints meant I had to implement caching fallbacks (“last known good telemetry”) to avoid broken data streams.
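The "last known good telemetry" fallback mentioned above is a small, language-agnostic pattern; a hedged Python sketch (class and field names here are illustrative, not from the project's C#):

```python
import time
from typing import Optional

class TelemetryCache:
    """Serve the last successful reading when a fetch fails or times out."""

    def __init__(self, max_age_seconds: float = 300.0):
        self.max_age = max_age_seconds
        self._value: Optional[dict] = None
        self._stamp: float = 0.0

    def update(self, reading: dict) -> None:
        # Record a successful fetch along with when it happened.
        self._value = reading
        self._stamp = time.monotonic()

    def get(self) -> Optional[dict]:
        # Return the cached reading only while it is fresh enough to display.
        if self._value is not None and time.monotonic() - self._stamp <= self.max_age:
            return self._value
        return None

cache = TelemetryCache(max_age_seconds=60)
cache.update({"temperature": 21.5, "humidity": 48})
print(cache.get())
```

On a failed `UnityWebRequest`, the controller reads from the cache instead of clearing the overlay, so users see slightly stale data rather than a broken stream.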

Accomplishments that we're proud of

  • Published a fully documented technical thesis detailing the complete end-to-end architecture for integrating IoT, cloud, and real-time APIs into XR—serving as a reproducible blueprint for future digital-twin systems.
  • Delivered a functional XR prototype to beta testers, who highlighted the realism of the environments, the responsiveness of the data overlays, and the usefulness of teleporting across multiple live venues.
  • Engaged stadium operations teams, including management at BMO Stadium, who expressed interest in piloting the application for venue monitoring, fan engagement, and real-time event visualization.

What we learned

XR is only compelling when the data is alive: users instantly notice stale or missing telemetry. I also learned that cloud costs, not engineering difficulty, often decide architectural choices.

What's next for zr-map

  • Live 360° streaming from stadiums
  • Publishing on the Meta Quest Store
  • WebXR version for mobile access
  • Partnerships with venues for real-time fan experiences
  • Expanding to industrial digital twins and global cities
  • Growing the team
