Inspiration
Haptic Definition is inspired by our team's passion for hardware. With our cutting-edge suit and gloves, anyone can immerse themselves in what they see on the screen. We wanted to bring immersion anywhere, since 4D experiences are limited by physical constraints such as seating and venue.
What it does
Haptic Definition redefines HD: users can turn any video into an interactive experience. Upload any MP4 file to our website, put on a suit and gloves laced with vibration motors and heating pads, and feel the sensations depicted in the video.
How we built it
Our hardware is custom-built, featuring ESP32 microcontrollers, temperature pads, vibration motors, and a hand-stitched suit-and-glove combo. This involved an immense amount of soldering to wire all of the motors together. The heat pads were sewn onto the insides of our gloves, and we sewed pockets to keep the motors in place. We crimped wires to connect the vest to the gloves, allowing for full-body integration. Meanwhile, our backend uses Google Gemini 1.5 to analyze the video and determine what sensations a viewer should feel and when, with Bluetooth and WebSocket communication bridging the gap between web and suit. Our stylish video-viewing interface uses modern frontend technologies like Tailwind.
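To give a sense of the analysis step, here is a minimal sketch of how a Python backend can ask Gemini 1.5 Pro to turn an uploaded MP4 into timed sensation cues using the `google-generativeai` Files API. The prompt wording and the cue schema (`zone`, `effect`, `intensity`) are illustrative assumptions for this sketch, not our exact production prompt.

```python
import time
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")

# Upload the MP4 via the Files API and wait for server-side processing.
video = genai.upload_file("clip.mp4")
while video.state.name == "PROCESSING":
    time.sleep(2)
    video = genai.get_file(video.name)

# Illustrative prompt: the cue schema below is an assumption, not our real one.
prompt = (
    "Watch this video and return a JSON list of haptic cues, e.g. "
    '[{"start_s": 1.2, "end_s": 3.0, "zone": "chest", '
    '"effect": "vibrate", "intensity": 0.8}].'
)

model = genai.GenerativeModel("gemini-1.5-pro")
response = model.generate_content([video, prompt])
print(response.text)  # parsed downstream into motor and heat-pad commands
```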
Challenges we ran into
Achieving Haptic Definition was no easy feat. The suit construction proved to be the most difficult part: our vibration motors were thin and fragile, often snapping if bent the wrong way. We initially wanted to include cold temperatures as well, but realized we lacked proper heat sinks to deliver the chilliest experiences. Communication between suit and web ended up being far more convoluted than anticipated; the ESP32 had a difficult time connecting to a WebSocket server, forcing us to fall back to an alternative: Bluetooth.
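For concreteness, here is one way a Python backend can push a cue to the suit over BLE using the `bleak` library. The device address, characteristic UUID, and payload format are hypothetical placeholders; the real values depend on the firmware's BLE service definition.

```python
import asyncio
from bleak import BleakClient

# Hypothetical address and characteristic; real values come from the
# ESP32 firmware's BLE service definition.
SUIT_ADDRESS = "AA:BB:CC:DD:EE:FF"
CUE_CHAR_UUID = "0000ffe1-0000-1000-8000-00805f9b34fb"

async def send_cue(payload: bytes) -> None:
    """Write one sensation cue to the suit's cue characteristic."""
    async with BleakClient(SUIT_ADDRESS) as client:
        await client.write_gatt_char(CUE_CHAR_UUID, payload)

# Example payload format (also an assumption): zone, effect, duration in ms.
asyncio.run(send_cue(b"chest,vibrate,200"))
```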
Accomplishments that we're proud of
We're proud of creating a project that highlights hardware while still utilizing AI. As AI is the buzz nowadays, it's always interesting to see how we can apply it in new and unique ways. We're also proud of connecting so many different technologies to bring this project to life: it's a blend of hardware, AI, WebSockets, and frontend technologies.
What we learned
On the hardware side, we learned how to sew and improved our soldering skills. This was the first time we had worked with vibration motors and heat pads, so we had to figure out how to properly power all of our components, which we ultimately did through the ESP32.
On the software side, we learned how to use the Gemini API to analyze videos, a feature newly added to Gemini 1.5 Pro. It was also interesting to learn the different ways to cut a video into frames, as some methods are far more efficient than others.
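As an example of that efficiency gap, here is a minimal sketch, assuming OpenCV, that seeks directly to one frame per sampling interval instead of decoding every frame and discarding most of them:

```python
import cv2

def sample_frames(path: str, every_s: float = 1.0):
    """Grab one frame every `every_s` seconds by seeking with POS_MSEC,
    rather than reading the whole stream frame by frame."""
    cap = cv2.VideoCapture(path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0  # some containers report 0
    duration_ms = cap.get(cv2.CAP_PROP_FRAME_COUNT) / fps * 1000
    frames, t = [], 0.0
    while t < duration_ms:
        cap.set(cv2.CAP_PROP_POS_MSEC, t)  # jump straight to the timestamp
        ok, frame = cap.read()
        if not ok:
            break
        frames.append((t / 1000, frame))
        t += every_s * 1000
    cap.release()
    return frames
```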
What's next for Haptic Definition
We want to make full immersion a reality. The next step is to add more features toward this goal, such as a PTC thermistor to simulate cold temperatures. Haptic Definition is the first of its kind; it's a suit unlike any other. While other products may exist for games, this is the first that can be used for movies and videos on the spot.
Built With
- dc-vibration-motor
- esp32
- fastapi
- node.js
- python
- react
- sewing
- soldering
- tailwind
- typescript
- vite