Inspiration
Many people grow up being told what to aim for rather than being given the space to discover what truly resonates with them. Parents, teachers, social expectations, and economic pressure often shape life and career decisions before individuals ever get the chance to explore their own interests and strengths. Over time, this leads to misalignment, burnout, and a quiet sense of dissatisfaction.
We were inspired by the concept of Ikigai — the intersection of what you are good at, what you enjoy, what the world needs, and what you can be paid for. Today, discovering this alignment is slow, expensive, and inaccessible to most people. Dream.ee came from the question: what if self-discovery could be experiential, intuitive, and personal instead of abstract and prescriptive?
What it does
Dream.ee is a VR-based exploratory platform that generates adaptive worlds to help users understand what genuinely engages them. Users enter immersive environments on Meta Quest and explore freely, without instructions or predefined goals.
As users interact with the world, Dream.ee captures behavioral signals such as time spent on objects, repeated interactions, movement patterns, and exploration depth. These signals are analyzed in real time to infer what types of activities, challenges, and environments resonate with the user. Based on this, the system dynamically generates new worlds that better align with the user’s demonstrated interests.
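To make the signal capture concrete, here is a minimal Python sketch of what one behavioral event might look like. The field names are hypothetical, chosen only to illustrate the signals listed above, not our actual schema.

```python
from dataclasses import dataclass

@dataclass
class InteractionEvent:
    object_id: str            # which object the user engaged with
    dwell_seconds: float      # time spent focused on or near the object
    interactions: int         # repeated grabs, presses, or gaze returns
    distance_traveled: float  # recent movement, a rough proxy for exploration depth

# Example of a single event as it might arrive from the headset
event = InteractionEvent(object_id="telescope_01", dwell_seconds=12.4,
                         interactions=3, distance_traveled=8.2)
```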
The Experience Flow
A session begins with an EEG setup UI and a button that lets the user start vocally prompting the AI to generate the first world.
While the user prompts and explores the world, the system continuously tracks the EEG response of the user in the background. A lightweight analytics loop processes this data to identify engagement patterns and detect when interest begins to drop.
When the system determines that the user is ready for something new, a background agent brings up the UI and asks the user to prompt the AI for the next world. The new world is then loaded and seamlessly swapped in, creating a continuous, personalized exploration experience.
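As a rough illustration of the analytics loop above, the sketch below shows one simple way an interest-drop check could work, assuming the EEG pipeline already yields a single engagement value per sample. The class name, window size, and threshold are illustrative assumptions, not our production code.

```python
from collections import deque

class InterestDropDetector:
    """Flags when EEG-derived engagement falls well below its recent average."""

    def __init__(self, window: int = 120, drop_ratio: float = 0.7):
        self.history = deque(maxlen=window)  # rolling baseline of engagement values
        self.drop_ratio = drop_ratio         # fraction of baseline that counts as a drop

    def update(self, engagement: float) -> bool:
        self.history.append(engagement)
        if len(self.history) < self.history.maxlen:
            return False  # wait until the baseline is filled
        baseline = sum(self.history) / len(self.history)
        return engagement < baseline * self.drop_ratio
```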
How we built it
Dream.ee was built with Unity on Meta Quest 3 for the VR runtime, using the Brain Bit SDK2 for Unity to integrate the Brain Bit EEG sensor (a brain-computer interface). A lightweight FastAPI backend drives world generation, and worlds are represented as structured WorldSpec JSON objects, which lets us dynamically generate, load, and swap environments at runtime.
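The snippet below is a hedged sketch of how a FastAPI endpoint could hand a WorldSpec back to the headset. The WorldSpec fields and the /worlds/next path are assumptions made for illustration; the real schema is richer.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class WorldSpec(BaseModel):
    theme: str                # e.g. "quiet observatory at dusk"
    objects: list[str]        # interactable props to spawn
    ambience: str             # lighting and audio mood
    spawn_point: list[float]  # x, y, z for the player

@app.post("/worlds/next", response_model=WorldSpec)
def next_world(prompt: str) -> WorldSpec:
    # In the real system a creator agent expands the voice prompt into a full spec;
    # a canned spec stands in here so the request/response shape is visible.
    return WorldSpec(theme=prompt, objects=["campfire", "telescope"],
                     ambience="dusk", spawn_point=[0.0, 1.6, 0.0])
```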
An in-memory analytics service scores engagement based on dwell time and interaction frequency. A simple boredom detector triggers the generation of the next world, which is created by a creator agent and sent back to the headset to be loaded without breaking immersion.
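For context, here is a simplified sketch of how an in-memory engagement score and boredom trigger could combine dwell time and interaction frequency. The weights and thresholds are illustrative rather than the tuned values in our build.

```python
class EngagementTracker:
    """Keeps a simple in-memory engagement score and flags boredom."""

    def __init__(self, boredom_threshold: float = 0.3):
        self.scores: list[float] = []
        self.boredom_threshold = boredom_threshold

    def record(self, dwell_seconds: float, interactions: int) -> None:
        # Blend dwell time and interaction frequency into a rough 0-1 score.
        dwell_part = min(dwell_seconds / 30.0, 1.0)
        interaction_part = min(interactions / 5.0, 1.0)
        self.scores.append(0.6 * dwell_part + 0.4 * interaction_part)

    def is_bored(self) -> bool:
        # Trigger next-world generation when recent scores stay low.
        recent = self.scores[-10:]
        return bool(recent) and sum(recent) / len(recent) < self.boredom_threshold
```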
Challenges we ran into
One major challenge was identifying meaningful engagement signals without relying on heavy sensors or complex models. We had to design interaction metrics that were simple, robust, and expressive enough to guide world generation.
Another challenge was maintaining immersion during world transitions. Dynamically generating and swapping worlds in real time required careful handling of object lifecycles, preloading, and timing. We also had to balance building a minimal proof of concept against keeping the system flexible enough to support future multimodal inputs.
One more challenge was integrating the BCI with Unity. Our device was labelled Brain Bit 1, but it turned out to be a Brain Bit 2, which requires an entirely different set of backend functions to discover the device, connect to it, and stream data from the user's brain.
Accomplishments that we're proud of
We built a complete end-to-end loop where user behavior directly influences content generation in real time. The system can detect engagement, generate a new tailored world in the background, and seamlessly swap environments in VR.
We are also proud of how lightweight and modular the system is. Despite its simplicity, it clearly demonstrates the power of experiential, behavior-driven self-discovery.
What we learned
We learned that meaningful personalization does not require perfect models — clear signals and tight feedback loops are often enough. We also gained insight into designing adaptive VR systems that respond in real time without breaking immersion.
Most importantly, we learned that giving users freedom, rather than direction, surfaces far more authentic signals about who they are and what they enjoy.
What's next for Dream.ee
Next, we plan to incorporate additional multimodal signals such as voice tone and EEG-derived attention proxies to deepen our understanding of engagement. We also want to expand the richness of generated worlds by introducing more complex mechanics, narratives, and longer-term progression.
Ultimately, we envision Dream.ee as a platform for experiential self-discovery — one that helps people align their motivation, effort, and aspirations with what truly resonates with them.
Built With
- bash
- c#
- eeg-headband
- eleven-labs
- javascript
- meta-quest-3
- python
- react-3-fiber
- spark.js
- typescript
- worldlabs-api

