Inspiration

Four foodies, one shared problem: The ordering experience at restaurants is stuck in the past and ripe for disruption.

As Augmented Reality (AR) enthusiasts, we saw bringing the conventionally 2D menu into the real world as a natural fit for AR. With more sensing and compute packed into mobile devices than ever before, including LiDAR scanners, 3D scanning capabilities, and computer vision, now is a uniquely viable time for mobile AR to transform the food ordering space.

With a diverse background in AR, iOS, and UX development across our team, we felt confident we could deliver this product!

What it does

New World Order consists of customer-facing and restaurant-facing apps; for this hackathon, we built the customer-facing side.

Our Augmented Reality iOS app lets customers browse a restaurant's menu and select items to view as 3D representations in AR, directly on their table. This helps them gauge portion size and learn far more than they would from an image or text alone.

If they like what they see, they can add it to their table and order directly from the app!

How we built it

We developed a concept we judged to be feasible, applicable, and useful, based on our personal experiences going to restaurants and ordering food.

We brainstormed UI and user flows using whiteboarding and wireframing in Figma, developed branding to represent the app visually, and prototyped everything in several directions (design, front-end, and AR implementation).

We built the app in Xcode using Swift and implemented ARKit to power the Augmented Reality experience. We used UIKit to build the on-screen UI (e.g. menu lists, buttons) and SceneKit to manage our 3D assets and place them in AR space, handling effects like realistic lighting and shadows.
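For the curious, here is a minimal sketch of what that wiring looks like: an ARSCNView configured for horizontal plane detection with SceneKit's automatic lighting enabled (class and property names are illustrative, not our exact source):

```swift
import ARKit
import SceneKit
import UIKit

final class MenuARViewController: UIViewController {
    // ARSCNView bridges ARKit's tracking with SceneKit's rendering.
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(sceneView)

        // Let SceneKit approximate the room's lighting so dishes look grounded on the table.
        sceneView.automaticallyUpdatesLighting = true
        sceneView.autoenablesDefaultLighting = true
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Track the device in world space and look for horizontal surfaces (tables).
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        configuration.environmentTexturing = .automatic
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```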

We used photogrammetry software (MagiScan) and created Neural Radiance Fields (Luma AI) to capture 3D USDZ scans of various food items.
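Once a dish is scanned, bringing its USDZ into SceneKit takes only a few lines. A rough sketch, assuming the asset ships in the app bundle (the helper and asset names are hypothetical):

```swift
import Foundation
import SceneKit

// Load a scanned dish (e.g. "padThai.usdz" bundled with the app) as a single SceneKit node
// that we can position and scale as one unit.
func loadDishNode(named assetName: String) -> SCNNode? {
    guard let url = Bundle.main.url(forResource: assetName, withExtension: "usdz"),
          let scene = try? SCNScene(url: url, options: nil) else {
        return nil
    }
    // Wrap the USDZ scene's contents in one node so the whole dish moves together.
    let dishNode = SCNNode()
    for child in scene.rootNode.childNodes {
        dishNode.addChildNode(child)
    }
    return dishNode
}
```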

To position AR objects precisely on the table, at the exact spot the user's phone is pointing, we leveraged ray casting to query the camera view for horizontal plane feature points and to detect whether virtual objects were already in that space. We also used this to keep users from placing multiple items on top of each other at the same location on the table.
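In ARKit/SceneKit terms, that placement check looks roughly like this; the `placedDishes` list, the ~15 cm overlap threshold, and the function name are illustrative assumptions rather than our exact code:

```swift
import ARKit
import SceneKit

// Place a dish where the center of the screen meets a detected horizontal plane,
// skipping the placement if another dish already occupies that spot.
func placeDish(_ dishNode: SCNNode, in sceneView: ARSCNView, avoiding placedDishes: inout [SCNNode]) {
    let screenCenter = CGPoint(x: sceneView.bounds.midX, y: sceneView.bounds.midY)

    // Build a raycast query from the screen center against detected horizontal plane geometry.
    guard let query = sceneView.raycastQuery(from: screenCenter,
                                             allowing: .existingPlaneGeometry,
                                             alignment: .horizontal),
          let result = sceneView.session.raycast(query).first else { return }

    let position = SCNVector3(result.worldTransform.columns.3.x,
                              result.worldTransform.columns.3.y,
                              result.worldTransform.columns.3.z)

    // Don't stack dishes: bail out if another item already sits within ~15 cm on the table plane.
    let occupied = placedDishes.contains { existing in
        let dx = existing.position.x - position.x
        let dz = existing.position.z - position.z
        return (dx * dx + dz * dz).squareRoot() < 0.15
    }
    guard !occupied else { return }

    dishNode.position = position
    sceneView.scene.rootNode.addChildNode(dishNode)
    placedDishes.append(dishNode)
}
```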

Challenges we ran into

One of the biggest challenges we ran into was that our USDZ files (the most widely supported format for iOS applications) did not preserve real-world scale. They often appeared far too large in AR, and for an app that aims to show users an accurate AR rendition of their food, real-world scale is critical. We eventually determined that the scanning software we were using was multiplying the real-world scale by a fixed factor, so we scaled our assets down to compensate.
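The fix itself is a one-liner once the factor is known; a minimal sketch, where the correction factor below is illustrative rather than the exact multiple we measured:

```swift
import SceneKit

// Our scanning pipeline exported USDZ assets at a fixed multiple of real-world size,
// so a single uniform scale on the wrapper node restores accurate portion sizes in AR.
func applyRealWorldScale(to dishNode: SCNNode, correctionFactor: Float = 0.01) {
    dishNode.scale = SCNVector3(correctionFactor, correctionFactor, correctionFactor)
}
```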

Originally, we attempted to use RealityKit and Reality Composer to handle our AR content. However, we found that they didn't offer enough flexibility for our needs, and the AR experience itself was slightly jittery. We made a late pivot to SceneKit, an older but more robust 3D engine, which gave us a stable AR experience without RealityKit's limitations.

Following that pivot, we faced challenges with the ray-casting feature. We were initially unable to tie a ray-casting query to the specific AR object we wanted to attach to the scene. We ultimately worked around the problem by removing the anchor auto-updater, so the program anchors the object in one place on the table rather than continually ray casting to check or move its position.
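A simplified sketch of that workaround, assuming a per-frame preview updater with a lock flag (the class, flag, and property names are illustrative):

```swift
import ARKit
import SceneKit

// A per-frame "preview" updater: the dish follows the spot the camera points at until the
// user confirms placement, after which we stop raycasting entirely so the node stays put.
final class DishPreviewUpdater: NSObject, ARSCNViewDelegate {
    weak var sceneView: ARSCNView?
    var previewNode: SCNNode?
    var screenCenter: CGPoint = .zero   // set from the main thread when the view lays out
    private var isLocked = false

    // Called when the user taps "place": from here on, no more raycast-driven repositioning.
    func lockPlacement() {
        isLocked = true
    }

    // SceneKit calls this every frame; we only move the preview while it is unlocked.
    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        guard !isLocked,
              let sceneView = sceneView,
              let previewNode = previewNode,
              let query = sceneView.raycastQuery(from: screenCenter,
                                                 allowing: .existingPlaneGeometry,
                                                 alignment: .horizontal),
              let result = sceneView.session.raycast(query).first else { return }

        previewNode.simdPosition = simd_float3(result.worldTransform.columns.3.x,
                                               result.worldTransform.columns.3.y,
                                               result.worldTransform.columns.3.z)
    }
}
```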

Accomplishments that we're proud of

We started this hackathon just over 24 hours ago with truly nothing. We were brand new to each other as teammates and started the ideation process from scratch, thinking through a broad range of ideas before solidifying our concept in the middle of the second day and getting to work.

Given those hurdles and delays, we're very proud that we could deliver a working prototype and video in time! It certainly came with a lot of work; it's 7am as I write this, and not because we woke up early!

Even more than that though, the biggest accomplishment of the past 36 hours has been our team dynamics. Despite the stressful environment, our group of 4 managed to keep laughing, learning, and having a great time the whole way through. That's what these events should be about, after all!

What we learned

For half our team, this was their first experience working with Augmented Reality on a smartphone; for all of us, it was our first experience with SceneKit and 3D scanning. We honed our skills in Swift, jogged our memory on UIKit, and sharpened our Figma sword.

But tools and technologies are transient. Values last forever. This hackathon reinforced in all of us the value of a prototyping, adaptive mindset. We had to fight the urge to keep digging deeper into one approach when it wasn't working after a few hours, and instead re-strategize to see whether a different tactic could work. Pulling our own heads out of the weeds like that wasn't easy, but it's an essential skill for an engineer to have.

This hackathon also embodied the adage “Success doesn’t bring fun, fun brings success”. We weren’t hyper-focused on trying to “win” or “out-compete” everyone, but instead just wanted to have a good time and learn along the way. We ended up having a much better team dynamic because of it, which allowed us to support one another through road bumps in the hacking process and ultimately make far more progress than any of us could have imagined.

What's next for New World Order

We all believe this idea has legs and want to continue developing it outside of HackSC. On the horizon for a New World Order: refining our customer-facing experience, adding more APIs, connecting to a complete backend, making it easier for restaurants to load menu items, and letting users pay for food directly through the AR environment. Additionally, we'll continue polishing our UI and app flows to make a new technology like AR easier for first-time users to navigate.

On the customer-side roadmap, we want to add ways for customers to customize and learn about their food in an interactive, fun way. For example, instead of dragging a slider in a 2D app to change the spice level, a customer might shake a virtual chili pepper over their AR food to make it more or less spicy. Such interactivity not only raises engagement between customers and our client businesses, but also makes the ordering experience itself more fun.

To round out the MVP, we'll build a restaurant-facing portal where owners can create 3D scans of their food using their smartphone (and a bunch of AI behind the scenes) and attach them to items on their menu.

And after that, our New World Order will be looking at becoming a new company as we look for our seed round of funding! :)
