Inspiration

The inspiration came from wanting to give AR experiences a tangible, real-world presence. AR is immersive, but it remains confined to the digital realm - we wanted to break that barrier and let users interact with physical objects through their Spectacles, creating a true bridge between the virtual and physical worlds.


What it does

Buns is an AR-controlled hardware assistant powered by Snap Spectacles. Users pinch a position on the ground in AR space, and Buns autonomously navigates to that location in the real world. The system uses:

  • Ground plane detection to set navigation targets
  • Pathfinding logic that calculates the turn angle and distance to each target
  • BLE communication between Spectacles and ESP32-controlled stepper motors
  • Autonomous navigation with turn-and-move logic to reach the target position (sketched below)
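
At its core, each navigation update is a simple turn-or-drive decision. The sketch below (in TypeScript, like the Lens Studio side of the project) illustrates that logic under simplifying assumptions: a flat 2D ground frame, a known robot heading, and illustrative command names and arrival threshold. It is a minimal illustration, not the exact code running on the prototype.

```typescript
// Minimal sketch of the turn-and-move decision. Assumes a flat 2D ground
// frame shared by the robot pose and the pinched target; command names and
// the arrival threshold are illustrative.

type Vec2 = { x: number; y: number };
type NavCommand = "TURN_LEFT" | "TURN_RIGHT" | "FORWARD" | "STOP";

const TURN_THRESHOLD_DEG = 5;    // align heading within 5° before driving
const ARRIVE_THRESHOLD_M = 0.1;  // assumed: stop within 10 cm of the target

function nextCommand(pos: Vec2, headingDeg: number, target: Vec2): NavCommand {
  const dx = target.x - pos.x;
  const dy = target.y - pos.y;
  const distance = Math.hypot(dx, dy);
  if (distance < ARRIVE_THRESHOLD_M) return "STOP";

  // Bearing to the target, then the signed error relative to the robot's heading.
  const bearingDeg = (Math.atan2(dy, dx) * 180) / Math.PI;
  let error = bearingDeg - headingDeg;
  // Normalize to (-180, 180] so the robot always takes the shorter turn.
  while (error > 180) error -= 360;
  while (error <= -180) error += 360;

  if (Math.abs(error) > TURN_THRESHOLD_DEG) {
    return error > 0 ? "TURN_LEFT" : "TURN_RIGHT";
  }
  return "FORWARD";
}
```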

How we built it

Hardware Stack:

  • Snap Spectacles
  • ESP32 microcontroller as BLE bridge
  • 2 Arduino Unos controlling 4 stepper motors (28BYJ-48 with ULN2003 drivers)
  • 4-wheel differential drive robot platform

Software Stack:

  • TypeScript in Lens Studio for Spectacles AR interface
  • BLE GATT protocol for wireless communication (custom service/characteristic UUIDs; see the protocol sketch after this list)
  • Arduino C++ for motor control logic
  • Ground plane interaction using Spectacles' spatial mapping
  • Custom pathfinding algorithm with angle calculation (atan2), turn alignment (5° threshold), and distance tracking
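
To make the wireless link concrete, here is a rough sketch of how navigation commands can be packed into short strings and written to the command characteristic. The UUIDs and command strings below are placeholders rather than our real values, and `write` stands in for whatever Lens Studio BLE call performs the GATT characteristic write.

```typescript
// Sketch of the string-based command protocol. UUIDs are placeholders, and
// `write` is a stand-in for the platform call that writes a GATT characteristic.

const SERVICE_UUID = "0000aaaa-0000-1000-8000-00805f9b34fb";      // placeholder
const COMMAND_CHAR_UUID = "0000bbbb-0000-1000-8000-00805f9b34fb"; // placeholder

type NavCommand = "TURN_LEFT" | "TURN_RIGHT" | "FORWARD" | "STOP";
type WriteFn = (serviceUuid: string, charUuid: string, payload: string) => void;

// Encode high-level intents as short ASCII strings for the Arduino side to parse.
function encodeCommand(cmd: NavCommand, amount: number = 0): string {
  if (cmd === "STOP") return "S";
  const prefix = cmd === "TURN_LEFT" ? "L" : cmd === "TURN_RIGHT" ? "R" : "F";
  return `${prefix}:${Math.round(amount)}`; // degrees for turns, centimeters for forward
}

function sendCommand(write: WriteFn, cmd: NavCommand, amount: number = 0): void {
  write(SERVICE_UUID, COMMAND_CHAR_UUID, encodeCommand(cmd, amount));
}
```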

Development Process: Built iteratively from gesture control → motor control → ground interaction → autonomous navigation.


Challenges we ran into

  1. Motor direction debugging - Several stepper motors spun in the wrong direction, which required extensive testing and flipping individual motor polarities across both Arduinos
  2. Touch/tap detection - Initial gesture detection methods failed; had to experiment with InteractionComponent, TapGestureModule, and ultimately ground plane pinching
  3. BLE communication reliability - Managing the string-based command protocol and ensuring consistent data transmission
  4. Coordinate system mapping - Translating AR space coordinates to robot movement commands while accounting for relative positioning
  5. Pathfinding calibration - Tuning turn angles (degrees per command) and distance per step for accurate navigation (see the sketch after this list)
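
Challenges 4 and 5 boil down to the same problem: continuous AR-space quantities (meters and degrees) have to be quantized into a discrete number of motor commands. The sketch below shows the shape of that mapping; the two calibration constants are made-up example values, not the numbers we converged on after tuning.

```typescript
// Illustration of the calibration mapping from AR-space quantities to discrete
// motor commands. Both constants are assumed example values.

const DEGREES_PER_TURN_COMMAND = 15;     // assumed: one turn command rotates ~15°
const METERS_PER_FORWARD_COMMAND = 0.05; // assumed: one forward command moves ~5 cm

function planCommands(headingErrorDeg: number, distanceM: number): string[] {
  const commands: string[] = [];

  // Quantize the turn: direction from the sign, count from the magnitude.
  const turnCount = Math.round(Math.abs(headingErrorDeg) / DEGREES_PER_TURN_COMMAND);
  const turnCmd = headingErrorDeg > 0 ? "TURN_LEFT" : "TURN_RIGHT";
  for (let i = 0; i < turnCount; i++) commands.push(turnCmd);

  // Quantize the drive distance the same way.
  const forwardCount = Math.round(distanceM / METERS_PER_FORWARD_COMMAND);
  for (let i = 0; i < forwardCount; i++) commands.push("FORWARD");

  return commands;
}
```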

Accomplishments that we're proud of

  • Seamless AR-to-hardware pipeline - Successfully bridged Spectacles AR input to physical robot movement via BLE
  • Autonomous ground-target navigation - Robot can pathfind to user-selected positions without manual control
  • Multi-Arduino coordination - Orchestrated 4 stepper motors across 2 Arduinos with synchronized commands
  • Intuitive gesture interface - Natural pinch-on-ground interaction feels magical and has no learning curve
  • Working prototype from scratch - Built the entire system from concept to functional demo, using official Snapchat samples as a foundation

What we learned

  • BLE protocol implementation and GATT services for IoT communication
  • Stepper motor control patterns and timing considerations
  • Spectacles Lens Studio development and spatial computing APIs
  • Coordinate system transformations between AR and physical space
  • Pathfinding algorithms (angle calculation with atan2, turn-align-move logic)
  • Hardware debugging techniques for motor polarity and direction issues
  • The importance of iterative testing when bridging digital and physical systems

What's next for Buns

  • Obstacle avoidance using ultrasonic sensors
  • Voice commands for more complex interactions ("Buns, go to the kitchen")
  • Object manipulation - Add a gripper arm for pick-and-place tasks
  • Multi-robot coordination - Control multiple Buns units simultaneously
  • Visual feedback - LED indicators on the robot showing status
  • Improved pathfinding - A* algorithm for complex route planning around obstacles (see the sketch after this list)
  • Persistent memory - Save favorite locations and patrol routes
  • Computer vision integration - Let Buns recognize and respond to objects/people in its environment
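
For the improved-pathfinding item, a grid-based A* planner is one likely shape of the upgrade. The sketch below is purely speculative: the occupancy grid, its resolution, and the Manhattan heuristic are assumptions, and nothing like it exists in the current prototype.

```typescript
// Speculative sketch of grid-based A* over a 2D occupancy grid (true = blocked).
// Grid construction, resolution, and obstacle detection are all open questions.

type Cell = { x: number; y: number };

function aStar(grid: boolean[][], start: Cell, goal: Cell): Cell[] | null {
  const rows = grid.length;
  const cols = grid[0].length;
  const key = (c: Cell) => `${c.x},${c.y}`;
  const h = (c: Cell) => Math.abs(c.x - goal.x) + Math.abs(c.y - goal.y); // Manhattan distance

  const open: Cell[] = [start];
  const cameFrom = new Map<string, Cell>();
  const gScore = new Map<string, number>([[key(start), 0]]);

  while (open.length > 0) {
    // Pop the open cell with the lowest f = g + h (a linear scan keeps the sketch short).
    open.sort((a, b) => (gScore.get(key(a))! + h(a)) - (gScore.get(key(b))! + h(b)));
    const current = open.shift()!;

    if (current.x === goal.x && current.y === goal.y) {
      // Walk the cameFrom links back to the start to reconstruct the path.
      const path: Cell[] = [current];
      let k = key(current);
      while (cameFrom.has(k)) {
        const prev = cameFrom.get(k)!;
        path.unshift(prev);
        k = key(prev);
      }
      return path;
    }

    const neighbors: Cell[] = [
      { x: current.x + 1, y: current.y },
      { x: current.x - 1, y: current.y },
      { x: current.x, y: current.y + 1 },
      { x: current.x, y: current.y - 1 },
    ];
    for (const n of neighbors) {
      const blocked = n.x < 0 || n.y < 0 || n.y >= rows || n.x >= cols || grid[n.y][n.x];
      if (blocked) continue;
      const tentative = gScore.get(key(current))! + 1;
      if (tentative < (gScore.get(key(n)) ?? Infinity)) {
        cameFrom.set(key(n), current);
        gScore.set(key(n), tentative);
        if (!open.some((c) => c.x === n.x && c.y === n.y)) open.push(n);
      }
    }
  }
  return null; // no route to the goal
}
```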
