Inspiration
We've all tried air drumming to our favorite songs. But what if any surface could become a drum? What if your kitchen table, desk, or even textbooks could transform into a full drum kit?
We watched street performers turn buckets into instruments and got inspired. We wanted to democratize music creation by removing the barrier of expensive equipment. With just drumsticks, a webcam, and our app, anyone can start drumming anywhere.
What it does
VODKA (Virtual Online Drum Kit App) transforms any ordinary surface into a virtual drum kit using computer vision and accelerometer data.
The workflow:
- Point your webcam at any surface (table, floor, pillows). Our model segments the frame into distinct regions.
- Use our ESP32-powered drumsticks with motion sensors
- Hit any surface and hear the appropriate drum sound, with velocity sensitivity (see the sketch after this list)
- Capture your performance and share it with friends!
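To make the hit-to-sound step concrete, here is a minimal sketch of how an accelerometer spike from a stick, paired with the segmented region currently under the stick tip, could be mapped to a drum sample with velocity-scaled volume. The thresholds, sample files, and function names are illustrative assumptions, not our exact implementation:

```python
import pygame

pygame.mixer.init()

# Hypothetical mapping from segmented surface regions to drum samples
REGION_SOUNDS = {
    "snare": pygame.mixer.Sound("samples/snare.wav"),
    "kick": pygame.mixer.Sound("samples/kick.wav"),
    "hihat": pygame.mixer.Sound("samples/hihat.wav"),
}

HIT_THRESHOLD = 2.5  # acceleration (in g) that counts as a hit; assumed tuning value
MAX_ACCEL = 8.0      # acceleration mapped to full volume; assumed tuning value

def on_stick_sample(accel_magnitude: float, tip_region: str) -> None:
    """Handle one accelerometer reading paired with the region the
    camera currently sees under the drumstick tip."""
    if accel_magnitude < HIT_THRESHOLD:
        return  # ordinary stick motion, not a hit
    sound = REGION_SOUNDS.get(tip_region)
    if sound is None:
        return  # tip is outside any mapped drum zone
    # Scale playback volume by how hard the surface was struck
    sound.set_volume(min(accel_magnitude / MAX_ACCEL, 1.0))
    sound.play()
```

In the real pipeline, the acceleration readings arrive from the ESP32 sticks and the region label comes from the vision service, but the mapping idea is the same.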
How we built it
Hardware Stack:
- ESP32 microcontrollers (2x) - one per drumstick
- MPU6050 6-axis motion sensors (accelerometer + gyroscope)
- Electrical tape

CV/ML Pipeline:
- YOLOv8-nano (drumstick tip detection)
- FastSAM, trained for material/surface segmentation

Backend:
- Flask, Python services, pygame.mixer

Frontend:
- React + Vite, Socket.IO
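For context on the CV/ML pipeline, the sketch below shows roughly how the two models can be wired together with the ultralytics package: detect the drumstick tip with YOLOv8-nano, segment the frame with FastSAM, and report which region the tip falls in. The weight file names and the region-lookup details are assumptions for illustration, not our exact training or inference code:

```python
import cv2
from ultralytics import YOLO, FastSAM

# Assumed weight files: a YOLOv8-nano model fine-tuned on drumstick tips and a
# FastSAM checkpoint trained for surface/material segmentation.
tip_model = YOLO("weights/drumstick_tip_yolov8n.pt")
seg_model = FastSAM("weights/surface_fastsam.pt")

def locate_tip_and_region(frame):
    """Return the stick tip's (x, y) pixel position and the index of the
    segmented surface region it falls in, or None if nothing is found."""
    # 1. Detect the drumstick tip (take the first detection for simplicity)
    tips = tip_model(frame, verbose=False)[0].boxes
    if len(tips) == 0:
        return None
    x1, y1, x2, y2 = tips.xyxy[0].tolist()
    tip_x, tip_y = int((x1 + x2) / 2), int((y1 + y2) / 2)

    # 2. Segment the frame into candidate drum zones
    masks = seg_model(frame, verbose=False)[0].masks
    if masks is None:
        return (tip_x, tip_y), None

    # Masks may be at the model's inference resolution, so scale the tip
    # coordinates before indexing into them.
    mh, mw = masks.data.shape[1:]
    fh, fw = frame.shape[:2]
    mx, my = int(tip_x * mw / fw), int(tip_y * mh / fh)
    for region_idx, mask in enumerate(masks.data):
        if mask[my, mx] > 0:  # tip lies inside this region
            return (tip_x, tip_y), region_idx
    return (tip_x, tip_y), None

# Example: run on a single webcam frame
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    print(locate_tip_and_region(frame))
cap.release()
```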
Challenges we ran into
- None of us had ever touched hardware before this project
- At one point, we pushed a commit that somehow killed every running process
Accomplishments that we're proud of
- Got our hardware component to work!
- Getting a convoluted workflow (drum hit -> sound) to work across multiple ingestion streams
What we learned
- How (not) to solder, plus hardware in general
- Surface segmentation is pretty hard
What's next for VODKA
- Host drumstick tip inference on Baseten, since inference latency is critical
Built With
- esp32
- fastsam
- flask
- pygame
- python
- react
- socket.io
- vite
- yolo

