Inspiration

Our project grew out of our shared experiences during COVID. Three of our teammates arrived from Austin yesterday, while our fourth teammate, Shrey, whom we had not seen since before COVID, arrived from Dallas. As we sat and caught up last night, we talked about the loneliness and isolation of COVID and how video calls and screens often failed to do justice to the genuine social interaction essential to life. This led us to ideate concepts around communication platforms and how VR could open a new avenue for solutions. We finally settled on SuperPosition, a product that enables shared VR interaction whenever and wherever, and found ourselves pursuing more than just communication: a hack that could drive innovation in other industries like healthcare and entertainment.

What it does

SuperPosition lets anyone stream real-world content into virtual reality at the tap of a button on an iPhone. This opens up a new micro-economy: people who don't have the time, money, or desire to attend a concert, sporting event, or club can enjoy those events wherever they are. Imagine watching a soccer or basketball game in VR, standing in the midst of the action at midcourt or midfield. The technology also lends itself to telehealth: virtual doctor and therapy appointments could serve as an alternative to in-person visits, increasing accessibility and convenience for patients, and doctors or surgeons could observe a patient or a surgery in 3D through VR, helping them offer better expert guidance. In education, it lets students attend virtual classes, interact with their teachers and classmates in real time, and explore new interactive learning methods. SuperPosition could even be used in industries such as real estate, where virtual tours would let customers get a feel for a property before visiting in person, saving time and effort.

How we built it

We spent a lot of time browsing the Apple Developer Documentation, figuring out different ways to capture depth data. At first, we considered processing an arbitrary video frame by frame with a pix2pix-style ML model such as Monodepth, but once we learned that recent iPhones (12 Pro and above) have a LiDAR scanner built into their rear camera, we decided to build an iOS app that streams LiDAR data directly to the Unity game engine, live-streaming a 3D environment and taking advantage of the latest advances in mobile LiDAR. After weighing the overall system architecture, we settled on having the iPhone stream depth data to a relay server on Heroku, which forwards it to Unity; Unity then renders the points and builds a mesh for better visualization. A sketch of the capture side follows.
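Here is a minimal sketch of that capture side, assuming ARKit's scene-depth API as the depth source; the class name and the `send(_:)` hook are illustrative, not our exact code:

```swift
import ARKit

/// Illustrative capture pipeline: pull LiDAR depth frames from ARKit
/// and hand each one off as raw bytes for streaming.
final class DepthStreamer: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth requires a LiDAR-equipped device (iPhone 12 Pro and up).
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics.insert(.sceneDepth)
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }

        // Copy the raw Float32 depth buffer so it can be framed
        // and pushed over the wire to the relay server.
        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }
        let bytes = CVPixelBufferGetBaseAddress(depthMap)!
        let data = Data(bytes: bytes, count: CVPixelBufferGetDataSize(depthMap))

        send(data) // hypothetical hook: forwards to the Heroku relay
    }

    func send(_ data: Data) { /* WebSocket send; sketched under "What we learned" */ }
}
```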

Challenges we ran into

  • Reprojecting AVDepthData in Swift into XYZ coordinates was very difficult. There weren't many examples online of converting raw depth data in a pixel buffer into a point cloud (see the sketch after this list).
  • Downsampling the color video data to match the LiDAR depth resolution.
  • Generating meshes in Unity from irregular point-cloud data.
  • Integrating our stack of very different technologies.
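The reprojection boils down to inverting the pinhole camera model using the camera intrinsics. Here is a minimal sketch, assuming a Float32 depth buffer and an intrinsics matrix already rescaled to the depth map's resolution; the function and parameter names are illustrative:

```swift
import simd
import CoreVideo

/// Unproject a Float32 depth buffer into camera-space XYZ points.
/// `intrinsics` is the 3x3 camera matrix (fx, fy, cx, cy), e.g. from
/// ARFrame.camera.intrinsics, scaled to the depth map's width/height.
func pointCloud(from depthMap: CVPixelBuffer, intrinsics: simd_float3x3) -> [SIMD3<Float>] {
    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

    let width = CVPixelBufferGetWidth(depthMap)
    let height = CVPixelBufferGetHeight(depthMap)
    let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
    let base = CVPixelBufferGetBaseAddress(depthMap)!

    // simd matrices are column-major: [column][row].
    let fx = intrinsics[0][0], fy = intrinsics[1][1]
    let cx = intrinsics[2][0], cy = intrinsics[2][1]

    var points: [SIMD3<Float>] = []
    points.reserveCapacity(width * height)

    for v in 0..<height {
        let row = base.advanced(by: v * rowBytes).assumingMemoryBound(to: Float32.self)
        for u in 0..<width {
            let z = row[u]
            guard z > 0, z.isFinite else { continue } // skip invalid depth returns
            // Invert the pinhole projection: pixel (u, v) + depth z -> XYZ.
            points.append(SIMD3<Float>((Float(u) - cx) * z / fx,
                                       (Float(v) - cy) * z / fy,
                                       z))
        }
    }
    return points
}
```

On the receiving end, Unity triangulates these points into the mesh used for visualization.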

Accomplishments that we're proud of

None of us had much prior experience with Swift, Objective-C, C#, or Unity, so shipping a working product in these languages was a great accomplishment. We also collaborated well and established a solid dev pipeline throughout our integration "hell."

What we learned

We learned a lot about the hardware in our phones and about interfacing with LiDAR sensors. We picked up new languages, including Objective-C, C#, and Swift, and figured out how to interface with Unity's point-cloud and mesh libraries. It was also our first time using WebSockets across multiple servers; a sketch of the phone-side client is below.
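For reference, here is a minimal sketch of a phone-side WebSocket client like ours, built on Foundation's URLSessionWebSocketTask; the class name and the placeholder URL in the usage example are illustrative, not our actual endpoint:

```swift
import Foundation

/// Illustrative WebSocket link from the phone to the relay server.
final class RelayClient {
    private let task: URLSessionWebSocketTask

    init(url: URL) {
        task = URLSession.shared.webSocketTask(with: url)
        task.resume()
    }

    /// Push one binary depth frame; the relay forwards the same
    /// framing on to Unity's listener.
    func send(frame: Data) {
        task.send(.data(frame)) { error in
            if let error = error {
                print("WebSocket send failed: \(error)")
            }
        }
    }

    func close() {
        task.cancel(with: .goingAway, reason: nil)
    }
}

// Usage (placeholder URL):
// let relay = RelayClient(url: URL(string: "wss://example-relay.herokuapp.com/stream")!)
// relay.send(frame: depthFrameData)
```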

What's next for SuperPosition

It's a promising project with many exciting applications in entertainment, health, travel, and sports. We will keep building, making the application scalable and secure. From there, we plan to ship a full MVP within the next month and bring on our first customers.
