Inspiration

Power Pet was inspired by lived experience with neurodivergence, research on autism, and working directly with autistic individuals. That experience showed us how much sensory regulation, predictability, and emotional support affect focus and comfort. Traditional productivity tools often increase stress instead of reducing it, especially for people with sensory sensitivities.

We were also inspired by calming digital experiences like Spirit City Lofi, which create gentle, low-pressure environments that make it easier to stay present and engaged. We wanted to bring that same calm into XR while making it tangible, not just visual.

Research available through NIH's PubMed Central (PMC11218162) suggests that companion animals can improve well-being and productivity by reducing stress and providing emotional support, while also introducing practical drawbacks such as care demands, cost, and unpredictability. A virtual-physical pet removes those barriers while preserving the benefits, making companionship accessible to more people.

Power Pet combines VR, touch, and responsive hardware to create a companion that supports focus and emotional regulation without the limitations of owning a real pet.

What it does

Power Pet is a hybrid VR and physical companion that reacts to users in real time. Users interact with a virtual pet on Meta Quest 3 using controller-free hand tracking, while a physical robotic pet mirrors the same movements in the real world. The pet responds to proximity, motion, and touch with expressive behaviors, helping users stay engaged without pressure. As users complete tasks or re-engage after a distraction, the pet provides gentle emotional rewards such as movement, eye contact, or calming animations, turning productivity into a supportive, sensory-friendly experience rather than a stressful one.

Beyond direct interaction, our goal is for Power Pet to passively collect non-invasive behavioral signals such as proximity, movement, touch duration, and re-engagement timing. This creates a lightweight dataset that can support exploratory research on how multi-sensory XR companions influence attention and emotional regulation over time, grounding the system in wellness-oriented design and research practices.
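
As a rough illustration of what that passive collection could look like on the companion's Linux side, here is a minimal logging sketch. The event names, fields, and CSV format are assumptions for illustration, not the final data schema.

    import csv
    import time

    # Illustrative logger for non-invasive behavioral signals.
    # Field names and the CSV format are assumptions for this sketch,
    # not Power Pet's actual data schema.
    FIELDS = ["timestamp", "event", "value"]

    class SignalLogger:
        def __init__(self, path="session_signals.csv"):
            self.file = open(path, "a", newline="")
            self.writer = csv.DictWriter(self.file, fieldnames=FIELDS)
            if self.file.tell() == 0:
                self.writer.writeheader()

        def log(self, event, value):
            # e.g. log("hand_proximity_m", 0.42) or log("touch_duration_s", 3.1)
            self.writer.writerow({"timestamp": time.time(), "event": event, "value": value})
            self.file.flush()

    logger = SignalLogger()
    logger.log("re_engagement_latency_s", 12.5)  # time to return to task after a distraction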

How we built it

  • VR experience on Meta Quest 3 built with A-Frame (WebXR), persistent anchors, hand tracking, and live WebSocket connectivity
  • Custom robotic pet hardware with a robotic arm, stepper motors, servos, and embedded systems that interface with VR over WebSockets, built around the Arduino Uno Q. The Uno Q's hybrid Linux + MCU workflow splits responsibilities: Linux handles networking in Python, the MCU handles real-time motor control, and a WebSocket (socket.io) bridge synchronizes VR state with physical movement (see the bridge sketch after this list)
  • Intentionally low-stimulation character designs and animations built in Blender/Maya, with expressive eyes, emotions, and customizable features
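
To make the VR-to-hardware pipeline concrete, below is a minimal sketch of the Linux-side bridge, assuming a python-socketio server, a pyserial link to the MCU, and a hypothetical "hand_pose" event emitted by the A-Frame client. The port, event names, and message format are illustrative, not our exact protocol.

    import json
    import eventlet
    import serial    # pyserial
    import socketio

    # Hypothetical serial link to the MCU side of the Uno Q; port and baud rate are assumptions.
    mcu = serial.Serial("/dev/ttyACM0", 115200, timeout=0.1)

    sio = socketio.Server(cors_allowed_origins="*")
    app = socketio.WSGIApp(sio)

    @sio.event
    def connect(sid, environ):
        print("VR client connected:", sid)

    # Hypothetical event carrying the tracked hand pose from the A-Frame WebXR client.
    @sio.on("hand_pose")
    def hand_pose(sid, data):
        # Forward only what the MCU needs for motor control, as one compact JSON line.
        cmd = {"pan": data.get("x", 0.0), "tilt": data.get("y", 0.0)}
        mcu.write((json.dumps(cmd) + "\n").encode())

    if __name__ == "__main__":
        # Linux handles networking; the MCU parses each JSON line and drives the steppers/servos.
        eventlet.wsgi.server(eventlet.listen(("0.0.0.0", 5000)), app)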

Challenges we ran into

  • Arduino Uno Q’s hybrid Linux and MCU architecture required a completely new debugging approach and custom bridging
  • Traditional Arduino workflows did not work with networking, requiring use of App Lab and Web UI tools
  • Hardware failures including burnt motor drivers, stalling motors, unstable structures, and power issues
  • Designing expressive emotions within WebXR limitations
  • Synchronizing physical and virtual movement reliably without jitter (see the smoothing sketch after this list)
  • Ensuring the system stayed calm and predictable rather than overstimulating
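
One way to tame the jitter (an illustrative sketch, not necessarily the exact approach we shipped) is exponential smoothing plus a deadband and an update-rate cap before commands reach the motors. The class and parameter values below are illustrative.

    import time

    class SmoothedAxis:
        """Exponential smoothing + deadband + rate cap for one motor axis.
        Parameter values here are illustrative, not tuned values."""

        def __init__(self, alpha=0.2, deadband=0.01, max_hz=30.0):
            self.alpha = alpha        # smoothing factor: lower = calmer motion
            self.deadband = deadband  # ignore changes smaller than this
            self.max_hz = max_hz      # cap on how often commands are sent
            self.value = 0.0
            self.last_sent = 0.0
            self.last_time = 0.0

        def update(self, raw):
            # Blend the new reading into the running estimate.
            self.value += self.alpha * (raw - self.value)
            now = time.monotonic()
            if now - self.last_time < 1.0 / self.max_hz:
                return None  # too soon to send another command
            if abs(self.value - self.last_sent) < self.deadband:
                return None  # change too small to matter; avoid twitching
            self.last_time = now
            self.last_sent = self.value
            return self.value  # value to forward to the MCU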

Accomplishments that we're proud of

  • Successfully synchronized VR hand-tracked interaction with physical motion
  • Built a working WebXR to hardware pipeline using Arduino Uno Q
  • Created a tangible XR companion that users can touch and interact with
  • Designed a calming, accessible character system from scratch
  • Achieved real-time responsiveness across VR, networking, and robotics
  • Delivered a working hybrid XR and hardware system under hackathon constraints

What we learned

  • How to achieve physical stability and software reliability on our prototype
  • WebXR enables incredibly fast iteration for spatial experiences
  • Hybrid systems require clear separation of networking and real-time control
  • Emotional design is technical: the behavioral response to each emotion must be engineered through state machines, sensor thresholds, and motor constraints to feel safe and intentional (see the sketch after this list)
  • Touch and other sensory channels affect how people relate to XR experiences
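
As a toy illustration of the "emotional design is technical" point, a tiny emotion state machine gated by sensor thresholds might look like the sketch below. The states, thresholds, and behavior names are placeholders, not Power Pet's shipped logic.

    # Toy emotion state machine: states, thresholds, and behavior names are
    # illustrative placeholders, not Power Pet's actual logic.
    CALM, CURIOUS, HAPPY = "calm", "curious", "happy"

    NEAR_M = 0.5          # a hand closer than this counts as "near"
    TOUCH_REWARD_S = 2.0  # sustained touch before the pet shows a happy behavior

    def next_state(hand_distance_m, touch_duration_s):
        if touch_duration_s >= TOUCH_REWARD_S:
            return HAPPY
        if hand_distance_m <= NEAR_M:
            return CURIOUS
        return CALM

    # Each state maps to deliberately gentle, bounded motor and eye behaviors.
    BEHAVIORS = {
        CALM:    {"ear_speed": 0.2, "eyes": "slow_blink"},
        CURIOUS: {"ear_speed": 0.5, "eyes": "track_hand"},
        HAPPY:   {"ear_speed": 0.8, "eyes": "squint_smile"},
    }

    state = next_state(hand_distance_m=0.3, touch_duration_s=0.0)
    print(state, BEHAVIORS[state])  # curious {'ear_speed': 0.5, 'eyes': 'track_hand'}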

What's next for Power Pet

  • Guided meditation so users can ground themselves and return to tasks
  • LLM-powered chat so users can talk to the pet when they feel stuck
  • A shop for skins and accessories bought with earned points, so progress unlocks meaningful rewards
  • Scent (aroma) cues as an additional sensory channel to help users regulate
  • Real-world context from thermo and movement Modulino sensors to detect stress/comfort proxies (such as a temperature drop or fidgeting) and trigger calming behaviors (see the sketch after this list)
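
Since the stress/comfort proxy is not built yet, the sketch below only shows the rough shape of the idea: threshold checks over temperature and movement readings that gate a calming behavior. All values and names are placeholders.

    # Placeholder sketch for the planned stress/comfort proxy; thresholds and
    # the calming trigger are assumptions, not implemented behavior.
    TEMP_DROP_C = 1.0    # temperature drop treated as a possible stress hint
    FIDGET_PER_MIN = 5   # movement spikes per minute treated as fidgeting

    def should_trigger_calming(temps_c, movement_spikes_per_min):
        temp_drop = (max(temps_c) - temps_c[-1]) if temps_c else 0.0
        return temp_drop >= TEMP_DROP_C or movement_spikes_per_min >= FIDGET_PER_MIN

    if should_trigger_calming([33.8, 33.6, 32.6], movement_spikes_per_min=7):
        print("trigger calming behavior: slow breathing animation, soft ear movement")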
