Inspiration

We were fascinated by the question: "Does the internet dream of itself?", posed by Werner Herzog in his film Lo and Behold: Reveries of the Connected World.

Would the internet dream about an interconnected web of billions of people? Would it dream about the past, or the future?

Technology is in the realm of the human. It is created by us, populated by us, and is an extension of us.

We think that if the internet could dream, it would dream about us.

This philosophical prompt led us to explore how digital objects could flow between devices, from the intimate space of your phone to an immersive, shared VR environment.

Users swipe through objects on their phone while an Arduino pulse sensor monitors their heartbeat in real time. This biometric data directly controls the visual effects in the internet's "dream world": the room pulses and breathes with the user's heart. Through these actions, the user comes to understand that their role in this interaction is to provide data, both conscious choices (swipes) and unconscious signals (heartbeat).
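The heartbeat-to-visuals mapping above can be sketched in a few lines. This is a minimal illustration, not the team's actual code: it assumes the sensor reports the inter-beat interval (IBI, in milliseconds), and the function names are ours.

```javascript
// Hedged sketch: turning raw heartbeat timing into a visual "pulse" value.
// Assumes the Arduino reports IBI (inter-beat interval) in milliseconds.

function bpmFromIbi(ibiMs) {
  // 60,000 ms per minute divided by the gap between beats.
  return 60000 / ibiMs; // e.g. 800 ms between beats -> 75 BPM
}

// Map time-since-last-beat to a 0..1 intensity that peaks on each
// heartbeat and decays until the next one, so the room appears to
// flash and fade in sync with the user's heart.
function pulseIntensity(msSinceBeat, ibiMs) {
  const phase = Math.min(msSinceBeat / ibiMs, 1); // 0 at the beat, 1 just before the next
  return 1 - phase; // linear decay; a shader could apply easing instead
}
```

A value like this can be fed straight into a shader uniform each frame to drive the breathing/pulsing effect.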

What it does

Does the Internet Dream of Itself? creates a bridge between your iPhone, Arduino hardware, and a Meta Quest headset. Users browse a deck of 3D objects on their phone through a Tinder-style swiping interface, and when they swipe right, the object materializes in the VR space in real time. Meanwhile, their heartbeat, captured by a pulse sensor, drives shader effects in the virtual environment, making the dream world respond to their physiological state. Grabbing objects in VR triggers room-wide animations, further blurring the line between user and environment. Multiple users can collaboratively populate a shared virtual dream world.

How we built it

  • VR Application: Unity 6 with OpenXR, XR Interaction Toolkit
  • Phone App: JavaScript/HTML/CSS wrapped in Capacitor for iOS, using Three.js for 3D model previews
  • Real-time Communication: Node.js WebSocket server relaying messages between phone and VR headset
  • Hardware Integration: Arduino with pulse sensor connected over WiFi (TCP), sending BPM and IBI (inter-beat interval) data to Unity in real time
  • Biometric Shaders: Custom shader effects that respond to heart rate, creating a breathing/pulsing environment
  • XR Interactions: Grabbable objects that trigger room animations when picked up
  • 3D Models: rendered on both the phone (preview) and in VR (spawned objects)
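The Node.js relay at the heart of the stack above can be sketched as a routing function. This is a hypothetical sketch of how such a relay might dispatch messages; the message types and field names are our assumptions, not the team's actual protocol, and the real server would wrap this in a WebSocket server that forwards the result to the right connected clients.

```javascript
// Hedged sketch of relay routing: phone swipes and Arduino heartbeats
// flow to the VR headset; VR grab events fan out to everyone.
// All names here (swipe_right, heartbeat, grab, etc.) are illustrative.

function route(rawMessage) {
  const msg = JSON.parse(rawMessage);
  switch (msg.type) {
    case 'swipe_right': // phone -> VR: spawn the chosen object
      return { to: 'vr', payload: { action: 'spawn', modelId: msg.modelId } };
    case 'heartbeat':   // Arduino -> VR: drive the biometric shaders
      return { to: 'vr', payload: { action: 'pulse', bpm: msg.bpm, ibi: msg.ibi } };
    case 'grab':        // VR -> all clients: trigger room-wide animation
      return { to: 'all', payload: { action: 'animate_room', objectId: msg.objectId } };
    default:
      return null; // unknown message types are dropped
  }
}
```

Keeping the routing logic pure like this makes it easy to test independently of the WebSocket plumbing.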

Pitch Deck

Demo

Full Demo Video

30-Second Reel

Built With
