✨ Inspiration

It all started with badges.

People already wear physical pins or stickers to show who they are — their name, role, interests, even their "social battery." We loved that idea, but thought: what if we could take it further?

What if these badges weren’t stuck to your chest… but floating above your head in AR?
And what if we could do it for everyone — not just once, but in a living, digital layer that updates in real time?

That’s how BeaconAR was born: a wearable social signal for the real world, designed for connection.


🚀 What it does

BeaconAR adds floating AR badges above people’s heads using Snap Spectacles.

These badges show:

  • Name
  • What you do
  • What you’re here for (chat, collab, hiring, etc.)
  • Whether you're open to connecting

You can also high-five someone and instantly sync — a fun, quick gesture to connect. And once you do, you’ll be able to find them again later. It’s both a conversation starter and a memory keeper.
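Conceptually, each badge is just a small piece of shared state that a high-five links together. A minimal sketch in JavaScript (field and function names are our own illustration, not Snap's API):

```javascript
// Hypothetical badge state synced between wearers.
// Field names are illustrative, not a Snap / Lens Studio API.
function makeBadge(name, role, intent) {
  return {
    name: name,          // who you are
    role: role,          // what you do
    intent: intent,      // "chat" | "collab" | "hiring" | ...
    openToConnect: true, // toggled live by the wearer
    connections: [],     // people you've high-fived
  };
}

// A high-five links two badges both ways, so each
// person can find the other again later.
function highFive(a, b) {
  if (!a.openToConnect || !b.openToConnect) return false;
  a.connections.push(b.name);
  b.connections.push(a.name);
  return true;
}
```

The two-way link is what makes the badge a "memory keeper" rather than just a label: either person can look up the connection afterward.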


🛠️ How we built it

  • Built for Snap Spectacles
  • Used Spectacles Interaction Kit for UI and gestures
  • Used Sync Kit to share connection status in real time
  • Integrated hand tracking for high-five detection

Everything syncs between users wearing Spectacles, with no phone interaction required in the moment. Just look around and connect.
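At its core, high-five detection is geometry: two tracked palms close enough together and moving toward each other. A rough sketch of that check in JavaScript (the thresholds and helper names are our own guesses, not values from the Spectacles Interaction Kit or its hand-tracking API):

```javascript
// Hypothetical high-five check between two tracked palms.
// Positions are {x, y, z} in meters; thresholds are assumptions,
// not constants from any Snap SDK.
const TOUCH_DISTANCE = 0.08;    // palms within ~8 cm
const MIN_APPROACH_SPEED = 0.5; // closing speed in m/s

function dist(p, q) {
  const dx = p.x - q.x, dy = p.y - q.y, dz = p.z - q.z;
  return Math.sqrt(dx * dx + dy * dy + dz * dz);
}

// prev*/curr*: palm positions on two consecutive frames, dt seconds apart.
// Requiring both proximity AND approach speed filters out palms that
// merely hover near each other.
function isHighFive(prevA, currA, prevB, currB, dt) {
  const closingSpeed = (dist(prevA, prevB) - dist(currA, currB)) / dt;
  return dist(currA, currB) < TOUCH_DISTANCE && closingSpeed > MIN_APPROACH_SPEED;
}
```

In a real lens this check would run per frame on hand-tracking data from each headset, with the positive result shared over the network so both wearers register the connection.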


🧩 Challenges we ran into

The biggest challenge? Testing.

We only had one headset and were working remotely from different cities. That made real-time, multiplayer AR development especially tough. Debugging interactions, syncing hand gestures, and validating presence — all had to be faked, simulated, or tested late at night when we could finally sync up.

Also: Snap currently doesn’t support syncing 100+ headsets in a space — so for now, large-scale AR social interaction is still a dream. But we believe that’s where this is headed, and we’re building for it.


🏆 Accomplishments that we're proud of

  • A fully working AR prototype
  • Badges update live based on user status
  • High-five gesture syncing feels natural and fun
  • The whole thing runs cleanly on-glasses, hands-free

And above all — it feels like something real. You can put the headset on, look around, and the room just… makes more sense.


📚 What we learned

  • Went deep into Snap's SDKs, especially Sync Kit and Interaction Kit
  • Learned the limitations of Spectacles when it comes to scaling up multiplayer
  • Realized we’ll need things like face recognition or shared spatial anchors to go further
  • Found new ways to think about presence, proximity, and intention in physical space

🔮 What's next for BeaconAR

From our presentation deck, here’s where we’re going next:

  • ✅ Start a dialogue with the Snap team about scalability and multi-user syncing
  • 🧠 Explore pathways to support shared AR states and large-scale interactions
  • 🛠️ Refine batch logic, last-status syncing, and opt-in behavior
  • 🧰 Develop organizer tools for badge setup and live moderation
  • 🧪 Run small event pilots to test with real users and iterate fast

We're excited to push BeaconAR further — not just for conferences, but for any moment where people come together in real life and want to connect better.

Built With

  • chatgpt
  • god's-help
  • lens-studio