Inspiration
Our inspiration was really a culmination of two ideas. Distractions are everywhere, and technology is constantly trying to grab our attention. Truly being present in the moment is difficult when a cellphone is nearby. Even if phones are turned off and placed face down, their mere presence reduces people's available cognitive capacity (Ward et al., 2017).
Stable Diffusion can generate some really cool images with AI, and it can even remove unwanted objects intruding on a photograph. Combining those two ideas, we created a product that can remove distracting objects from view. Your phone can sit on your desk for when you need it, but you cannot see it and thus cannot be distracted by it.
Works Cited: Ward, A. F., Duke, K., Gneezy, A., & Bos, M. W. (2017). Brain Drain: The Mere Presence of One's Own Smartphone Reduces Available Cognitive Capacity. Journal of the Association for Consumer Research, 2(2), 140-154.
What it does
Diminished Reality was envisioned as a presence assistant that helps you focus by blocking out tech distractions. Using a mixed reality headset, we created a product that can remove distracting objects from view. Your phone can sit on your desk for when you need it, but you cannot see it and thus cannot be distracted by it. While it was not the focus of our presentation, the concept can be expanded to block other unwanted objects from view. Not seeing cigarettes or alcohol when you are trying to avoid them could help people succeed in making the life changes they want. For ethical reasons, we do not allow any blocking of living creatures, human or animal. We wanted people to be able to put distracting objects "out of sight, out of mind," but not vulnerable communities: allowing the rich to never see homeless people is more dystopian than protopian. Here is a quick trailer presentation of our concept.
How we built it
Tech stack: Computer Vision, Figma, HTC Vive XR Elite, Machine Learning, OpenCV, Python, Stable Diffusion, Unity
Challenges we ran into
We needed a color passthrough headset to bring our vision to life. We used the HTC Vive XR Elite, which has color passthrough, but as it had only been released a few days earlier, its SDK was not well documented. The HTC team was incredibly helpful, but we had to constantly ask questions and find different workarounds to bring our dream to reality. We initially planned to use object tracking to track objects anywhere in the world, but the headset was limited to tracking stationary objects on the floor. We made our design come to life with the workaround of treating the table as the "floor" and only tracking stationary objects. Getting the headset to connect to Unity was also a bit more complicated than expected.
For Stable Diffusion, we connected a few APIs to find the right model to work with. We were able to run it on the computer's GPU for basic image generation, but we kept running out of memory for the inpainting functionality we needed. We did eventually get it working and eliminating objects from view as we wanted, but its processing speed was slow. We didn't have enough time to properly optimize it or find a faster approach, so we switched to OpenCV to blur the image at the last minute. This wasn't ideal, but it was "good enough" for the time given.
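For the curious, the last-minute fallback boiled down to a Gaussian blur over the region the tracker reports. The sketch below is a minimal, illustrative version only; the function name, bounding-box format, and kernel size are assumptions for the example, not our exact code:

```python
import cv2

def blur_region(frame, bbox, ksize=(51, 51)):
    """Blur a rectangular region of a BGR frame (e.g. where the phone sits).

    frame: numpy array from the passthrough feed
    bbox:  (x, y, w, h) rectangle supplied by the object tracker
    ksize: Gaussian kernel size; larger means stronger blur (must be odd)
    """
    x, y, w, h = bbox
    roi = frame[y:y + h, x:x + w]
    frame[y:y + h, x:x + w] = cv2.GaussianBlur(roi, ksize, 0)
    return frame
```

A blur obviously doesn't make the phone disappear the way inpainting does, but it runs in real time, which is what we needed on demo day.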
Accomplishments that we're proud of
Fusing multiple technologies together using mixed reality and bleeding-edge hardware. Mixed reality is the next "thing," and we worked closely with the HTC team to test their brand-new device with new use cases.
We proved the potential for AI in AR/VR, fusing them together in an interesting fashion. We built entirely new pipelines, end-to-end, for state-of-the-art AI inpainting in mixed reality.
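In spirit, the inpainting half of that pipeline looks something like the sketch below. This is a simplified illustration assuming the Hugging Face diffusers library and a public Stable Diffusion inpainting checkpoint; the model name, prompt, and file names are placeholders, not our exact setup:

```python
import torch
from diffusers import StableDiffusionInpaintPipeline
from PIL import Image

# Load a Stable Diffusion inpainting model onto the GPU in half precision
# to keep VRAM usage down (the part we kept fighting at the hackathon).
pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting",
    torch_dtype=torch.float16,
).to("cuda")
pipe.enable_attention_slicing()  # trade a little speed for lower memory use

# image: a passthrough frame; mask: white where the phone is, black elsewhere.
image = Image.open("passthrough_frame.png").convert("RGB")
mask = Image.open("phone_mask.png").convert("RGB")

# Ask the model to paint a plausible desk surface over the masked-out phone.
result = pipe(
    prompt="an empty wooden desk, nothing on it",
    image=image,
    mask_image=mask,
).images[0]
result.save("frame_without_phone.png")
```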
What we learned
As always, MVP is the real MVP of any hack: the minimum viable product is the most valuable player. We had grand dreams and want to keep pushing them in the next steps for Diminished Reality, but our bare-bones approach still showcased what the concept can do. Being on the bleeding edge of hardware is always tricky; good documentation often follows hardware releases rather than preceding them. Hack with new tech at your own peril!
What's next for Diminished Reality
We love how we managed to remove objects from view, but we still want to use Stable Diffusion if we can get the performance we need, which would allow for more seamless and realistic object removal. We also want to use world object tracking to enable optional trigger removal, so we could automatically remove alcohol, cigarettes, or other triggering objects from the world.