Inspiration
The inspiration came from a desire to explore sophisticated drone software without the heavy financial burden of premium hardware. That raised a question: what if the software could run on any drone and be made available as open source?
What it does
The software lets any drone track facial movement and hand gestures, flying and moving so that the user stays in the center of the frame. This can be applied at many different levels! We aim for our technology to improve the experience of photographers through hands-off control, and to lower the barrier to entry for drones by making them simpler to use.
How we built it
We mainly used Python, with the help of libraries and frameworks such as PyTorch, YOLOv8, MediaPipe, OpenCV, tkinter, PIL, and DJITello.
Challenges we ran into
While implementing hand-gesture commands, we hit a setback and ran into a problem that remains unsolved (for now). Integrating face recognition, hand recognition, and the drone control functions was harder than we anticipated, since there were many moving parts to connect. None of us had any UI experience, so building the interface was a challenge too.
Accomplishments that we're proud of
We implemented pivot-tracking and move-tracking features. Pivot-tracking keeps the drone stationary while it rotates on its axis to follow the user. Move-tracking is essentially a hands-free leash for your drone: it follows you wherever you go!
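The core of pivot-tracking can be sketched as a small proportional controller: rotate toward the detected face until it is horizontally centred. Everything below is an illustrative sketch, not our actual code; `yaw_command` and the constants are hypothetical names, and in the real system the face centre would come from a detector (e.g. YOLOv8 or MediaPipe) and the resulting speed would be sent to the drone via the Tello SDK.

```python
# Hypothetical pivot-tracking sketch: turn the drone so the face stays
# horizontally centred in the camera frame.

FRAME_WIDTH = 960      # Tello camera frame width in pixels (assumption)
DEAD_ZONE = 50         # ignore small offsets to avoid jitter
MAX_YAW_SPEED = 60     # clamp for the yaw velocity command

def yaw_command(face_center_x: int) -> int:
    """Proportional controller: positive value -> rotate clockwise."""
    error = face_center_x - FRAME_WIDTH // 2
    if abs(error) < DEAD_ZONE:
        return 0  # face is close enough to centre; hold position
    # scale the pixel error into the drone's yaw-speed range, then clamp
    speed = int(error * MAX_YAW_SPEED / (FRAME_WIDTH // 2))
    return max(-MAX_YAW_SPEED, min(MAX_YAW_SPEED, speed))
```

Calling this once per frame gives smooth rotation: a face far to the right produces a fast clockwise turn, a face near the centre produces no command at all.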
We implemented accurate hand-gesture recognition, although we have yet to attach new functions to the gestures.
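A simple gesture classifier in the spirit of our MediaPipe pipeline can be sketched like this. MediaPipe Hands returns 21 landmarks per hand, and a finger can be treated as "extended" when its tip landmark sits above its middle joint (smaller y in image coordinates). The landmark indices follow MediaPipe's numbering; the gesture names and thresholds are our own placeholders, not the project's real labels.

```python
# Sketch: classify a hand pose from landmark y-coordinates alone.
# (tip index, PIP joint index) for the index/middle/ring/pinky fingers,
# using MediaPipe Hands' 21-landmark numbering.
FINGERS = [(8, 6), (12, 10), (16, 14), (20, 18)]

def count_extended(landmark_y: list[float]) -> int:
    """landmark_y: y of each of the 21 hand landmarks (0 = top of image)."""
    return sum(1 for tip, pip in FINGERS if landmark_y[tip] < landmark_y[pip])

def classify(landmark_y: list[float]) -> str:
    """Map the extended-finger count to a placeholder gesture label."""
    return {0: "fist", 4: "open_palm"}.get(count_extended(landmark_y), "other")
```

In practice the thumb needs its own x-axis test and the labels would be tuned per command, but this captures the basic idea of turning landmark geometry into discrete gestures.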
Much of the tooling was brand new to us, but we were still able to learn it and build functional software.
What we learned
Understanding project scope and what can realistically be done in limited time was an important lesson that we will carry forward. We also learned several new frameworks, including MediaPipe, YOLOv8, DJITello, tkinter, and PIL.
What's next for Open Droid
Next, we plan to attach functions to the hand gestures and add a slingshot feature. Since our hand-recognition software can detect two hands, the left hand will control the drone's mode (low fly, high fly, slow fly, fast fly, default) and the right hand will control functions (go back, come closer, circle around me, slingshot, land on my hand, etc.).
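The planned two-hand scheme is naturally a pair of dispatch tables. This is a design sketch of work we have not built yet; the gesture names, mode names, and the `interpret` function are all hypothetical placeholders.

```python
# Planned (not yet implemented) two-hand control scheme:
# left-hand gesture selects a flight mode, right-hand gesture an action.
MODES = {"one": "low_fly", "two": "high_fly", "three": "slow_fly",
         "four": "fast_fly", "fist": "default"}
ACTIONS = {"open_palm": "go_back", "point": "come_closer",
           "circle": "circle_around_me", "pinch": "slingshot",
           "flat": "land_on_hand"}

def interpret(left_gesture: str, right_gesture: str) -> tuple[str, str]:
    """Map a pair of recognised gestures to a (mode, action) command,
    falling back to safe defaults for unrecognised gestures."""
    return MODES.get(left_gesture, "default"), ACTIONS.get(right_gesture, "none")
```

Keeping the mapping in plain dictionaries would let users remap gestures to commands without touching the recognition code.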
After accomplishing these goals, we would like to make the software more user-friendly and release it as open source!