Inspiration
The inspiration for BounceBeat came from the desire to combine music with 3D spatial experiences. We wanted users to create music in an entirely new way, using unconventional sound elements such as glass breaking or a dog barking alongside traditional instrument sounds. The idea was to let users explore musical creativity while interacting with virtual elements that would be difficult or impossible to experience in real life.
What it does
BounceBeat is an interactive Mixed Reality app in which users create unique sound compositions by placing two types of blocks in their scanned environment. Each block can be assigned a sound and, if a preset provides them, a musical note. A ball bounces between the blocks, producing the assigned sound on each contact. Players can arrange the blocks freely to design their own interactive soundscapes, blending musical notes with experimental noises.
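To give a concrete picture of how a note preset could map note names to audio samples, here is a minimal Unity-style sketch. The `NotePreset` type, its fields, and the asset menu path are illustrative assumptions, not the project's actual code.

```csharp
// Hypothetical sketch of a note preset: a reusable asset that maps note names to audio clips.
using System;
using UnityEngine;

[CreateAssetMenu(menuName = "BounceBeat/Note Preset")]
public class NotePreset : ScriptableObject
{
    [Serializable]
    public struct NoteEntry
    {
        public string noteName; // e.g. "C4", "E4", "G4"
        public AudioClip clip;  // sample played when a block carrying this note is hit
    }

    public NoteEntry[] notes;

    // Returns the clip for a note name, or null if the preset does not contain it.
    public AudioClip GetClip(string noteName)
    {
        foreach (var entry in notes)
        {
            if (entry.noteName == noteName)
                return entry.clip;
        }
        return null;
    }
}
```

In this sketch a block would hold a reference to the preset plus a note name, so swapping presets changes the whole palette without reconfiguring individual blocks.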
How we built it
We built BounceBeat in Unity using the Meta Presence Platform, with Passthrough Mixed Reality blending virtual elements into the real world. Hand tracking, implemented with the Meta SDK, lets users place blocks directly in the scanned environment. The physics engine makes the ball bounce realistically between the blocks, triggering a different sound on each contact. Scene understanding and depth perception enable meaningful physical interactions with the environment, while snap functionality resets elements to their starting points.
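To illustrate the contact-triggered sound flow, here is a minimal sketch assuming standard Unity physics callbacks, an `AudioSource` on each block, and a ball object tagged "Ball". The `SoundBlock` class and its field names are hypothetical, not the project's actual implementation.

```csharp
// Minimal sketch of the core mechanic: a block plays its assigned sound when the ball hits it.
using UnityEngine;

[RequireComponent(typeof(AudioSource))]
public class SoundBlock : MonoBehaviour
{
    public AudioClip assignedClip; // the sound or musical note assigned to this block

    private AudioSource source;

    void Awake()
    {
        source = GetComponent<AudioSource>();
    }

    // Called by Unity's physics engine when another collider hits this block.
    void OnCollisionEnter(Collision collision)
    {
        if (collision.gameObject.CompareTag("Ball") && assignedClip != null)
        {
            source.PlayOneShot(assignedClip);
        }
    }
}
```

Keeping the sound lookup on the block itself means the ball needs no special logic beyond its Rigidbody, which fits the physics-driven design described above.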
Challenges we ran into
We faced several challenges during the development of BounceBeat. Snap and physics interactions did not work as smoothly as intended, which created issues with our initial 3D grid platform and affected the app's responsiveness. Additionally, the grid's implementation led to visual clutter and was not user-friendly, prompting us to make it an optional feature to enhance the overall user experience.
Accomplishments that we're proud of
Despite being a newly formed team, we got a working prototype up and running quickly. We’re also pleased with the variety of sounds included in the app, which broadens its creative possibilities. BounceBeat’s combination of music and spatial design reflects our team’s effort to explore new ideas.
What we learned
We gained valuable insights into the complexities of integrating physics-based gameplay with creative sound design in XR environments. Balancing these elements required careful consideration and adjustment. Additionally, we realized that our project planning could have been more structured. Better organization and foresight could have streamlined development and helped us address challenges more effectively.
What's next for BounceBeat
In the future, we plan to add multiplayer functionality, allowing users to collaborate and create soundscapes together in real time. We’re also working on a customization feature that lets users upload their own sounds and design their own blocks. These additions will expand the creative possibilities and make the experience even more engaging. We also designed a holo ball to visualize the ball’s trajectory and its interactions with the blocks, helping users anticipate and control its movement; time constraints kept it out of the current version, but it remains a near-term plan to enhance the user experience.
Additionally, we’re exploring the possibility of incorporating AI assistance. When users want to create more complex or challenging melodies and are struggling to find the right path, the AI assistant could help guide them, suggesting block placements and sound combinations to achieve their desired musical outcome. This feature would further enrich the experience, making it accessible to both beginners and advanced users.
