Inspiration
The recycling industry is facing a silent, explosive crisis. Every year, over 5,000 fires erupt in U.S. recycling facilities, fueled almost entirely by hidden lithium-ion batteries that residents mistakenly toss into the blue bin.
We call this "wish-cycling"—the hope that something is recyclable when it actually isn't. But this wishful thinking has a massive cost: it endangers workers, destroys infrastructure, and costs universities and cities billions in contamination fees.
We realized that once the trash is in the truck, it's too late. The only way to stop the fires (and save the money) is to catch the hazard at the rim. That’s why we built BinTek.
What it does
BinTek is an intelligent "smart lid" add-on that retrofits existing campus trash cans. It acts as a firewall for the waste stream:
Detects: An ultrasonic sensor detects when a user approaches or drops an item.
Analyzes: The system triggers a camera to capture the falling object in mid-air.
Identifies: Using EyePop.ai's computer vision, it instantly labels the object (e.g., "Lithium Battery," "Starbucks Cup," "Banana Peel").
Acts: (Future State) It physically rejects the item or flags the bin for immediate hazardous waste pickup, preventing the battery from ever reaching the compactor truck.
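The detect → analyze → identify → act flow above can be sketched as a simple routing function. This is an illustrative sketch only: the label strings and action names are hypothetical, not BinTek's actual API.

```python
# Hypothetical routing step for the "Acts" stage: map a vision label
# (as returned by an EyePop-style classifier) to what the lid should do.
# Label and action names are illustrative assumptions.

HAZARDOUS = {"Lithium Battery"}
COMPOST = {"Banana Peel"}
RECYCLABLE = {"Starbucks Cup"}  # assumption: treated as a recyclable cup

def route(label: str) -> str:
    """Decide the lid's action for one identified object."""
    if label in HAZARDOUS:
        # Keep the battery out of the compactor truck entirely.
        return "flag_hazardous_pickup"
    if label in COMPOST:
        return "compost"
    if label in RECYCLABLE:
        return "recycle"
    return "landfill"
```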
How we built it
We built BinTek as a hybrid of precise hardware and high-speed software.

The Hardware

Arduino Uno: Handles the ultrasonic distance sensing. We implemented a dynamic calibration system that measures the bin depth on startup so the sensor automatically ignores the bottom of the can.
Ultrasonic Physics: We used the speed of sound to determine object presence, halving the round-trip echo time:

$$Distance = \frac{Speed_{sound} \times Time_{echo}}{2}$$
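The echo-time formula and the startup calibration can be expressed in a few lines. This is a sketch of the idea, not the actual firmware; function names and the tolerance value are assumptions.

```python
# Sketch of the ultrasonic distance logic (illustrative, not the real firmware).
# The sensor reports the round-trip echo time in microseconds; sound travels
# ~343 m/s (0.0343 cm/us) at room temperature, and the pulse covers the
# distance twice (out and back), hence the division by 2.

SPEED_OF_SOUND_CM_PER_US = 0.0343

def echo_to_distance_cm(echo_us: float) -> float:
    return SPEED_OF_SOUND_CM_PER_US * echo_us / 2

def calibrate_bin_depth(empty_bin_echoes_us) -> float:
    """On startup, average readings of the empty bin to learn its depth."""
    return sum(echo_to_distance_cm(t) for t in empty_bin_echoes_us) / len(empty_bin_echoes_us)

def object_present(echo_us: float, bin_depth_cm: float, tolerance_cm: float = 4.0) -> bool:
    """Anything echoing noticeably above the calibrated bin floor is a drop."""
    return echo_to_distance_cm(echo_us) < bin_depth_cm - tolerance_cm
```

The calibration step is what lets the same lid fit bins of different depths: the "floor" is measured, not hard-coded.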
Continuity Camera: We leveraged the high-quality optics of an iPhone connected to a Mac to bypass the poor quality of standard webcams.
The Software

EyePop.ai: We integrated EyePop’s API to handle the heavy lifting of object detection and labeling. This allowed us to train our model on specific trash items rapidly.

Python Controller: We wrote a custom multi-threaded controller script.

Thread 1: Listens to the Arduino via Serial for the DROP signal.

Thread 2: Runs an "Always-On" camera buffer to ensure we capture the frame with $<5ms$ latency.
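The two-thread structure can be sketched as follows. This is a simplified stand-in: a `queue.Queue` plays the role of the Serial port, and frames are plain objects; the real controller would read from pySerial and a camera feed.

```python
# Sketch of the two-thread controller (assumed structure, not the real script).
# Thread 2's job is modeled by FrameBuffer: a ring buffer that is always being
# filled, so a DROP event can grab frames that were captured *before* the
# signal arrived, hiding camera wake-up latency.

import threading
import queue
import collections

class FrameBuffer:
    """Always-on ring buffer holding the most recent camera frames."""
    def __init__(self, maxlen: int = 30):
        self._frames = collections.deque(maxlen=maxlen)
        self._lock = threading.Lock()

    def push(self, frame):
        with self._lock:
            self._frames.append(frame)

    def snapshot(self, n: int = 5):
        """Return the last n frames (the 'burst') without stopping capture."""
        with self._lock:
            return list(self._frames)[-n:]

def serial_listener(events: "queue.Queue", buffer: FrameBuffer, on_drop):
    """Thread 1: block on messages from the Arduino; on DROP, hand the
    freshest frames to the identification step."""
    while True:
        msg = events.get()
        if msg == "DROP":
            on_drop(buffer.snapshot())
        elif msg == "STOP":
            break
```

Because the buffer thread never stops capturing, the DROP handler only has to *read* memory, which is what keeps the effective capture latency low.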
Challenges we ran into
Fighting Gravity with "Smart Burst" Logic: Gravity is fast. By the time our camera woke up for a single shot, the trash had often already hit the bottom. We couldn't just take one picture; we had to take five. We implemented a high-speed Burst Shutter that captures a sequence of frames in under 200ms.
The Real Fix: We didn't just send all 5 images to the AI. We built a pre-processing algorithm to analyze the burst, identify which frame contained the object in clear view, and select only the "Best Frame" to feed into our identification algorithm.
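A minimal, pure-Python stand-in for that best-frame selection: frames are represented as flat lists of grayscale values, and variance serves as a toy "interest" score (a real implementation might use an OpenCV variance-of-Laplacian sharpness measure instead; both the score and the frame format here are assumptions).

```python
# Toy best-frame selector (illustrative stand-in for the pre-processing step).
# Idea: a frame with the object clearly in view deviates more from the
# uniform bin background than a frame of the empty bin.

def frame_score(pixels) -> float:
    """Variance of grayscale values as a crude 'object visible' signal."""
    n = len(pixels)
    mean = sum(pixels) / n
    return sum((p - mean) ** 2 for p in pixels) / n

def best_frame(burst):
    """Pick the single frame from the burst to send to the vision API."""
    return max(burst, key=frame_score)
```

Sending one selected frame instead of all five keeps the API round-trip (and cost) down without losing the frame that actually shows the object.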
Built With
- arduino
- fastapi
- postgresql
- python
- react