Inspiration
One of our teammates spoke with a fire chief, who described the danger of sending a firefighter into a burning building just to collect data (so the department knows what kind of help to send in). Based on that story, we saw an opportunity to keep people from risking their lives to obtain that data. Our airplane-style drone gathers the relevant information without putting anyone in harm's way, helping firefighters save the lives of those affected by the fire.
What it does
We ideated two versions of the drone, both designed to gather information about a fire environment quickly and cheaply. The smaller model, which we demoed, flies in and out of a fire zone and uses a mounted ESP32 microcontroller to record video, identify where a fire is burning, and estimate its intensity, so the fire department can gauge where to allocate resources. It uses its camera to detect red shades and estimate the prevalence of flame at each point along its flight.
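The red-shade detection can be sketched as a per-pixel threshold: count a pixel as flame when its red channel is bright and its green and blue channels are dim, then report the fraction of such pixels as a rough intensity measure. This is a minimal illustration of the idea, not the code that flew on the ESP32; the threshold values below are invented for the example.

```python
import numpy as np

def fire_fraction(frame_bgr, red_min=150, other_max=100):
    """Fraction of pixels that look flame-coloured.

    A pixel counts as flame when its red channel is bright and its
    green and blue channels are dim -- a cheap per-pixel test a
    microcontroller can run without a full colour-space conversion.
    (Thresholds are illustrative, not the values flown.)
    """
    b, g, r = frame_bgr[..., 0], frame_bgr[..., 1], frame_bgr[..., 2]
    mask = (r >= red_min) & (g <= other_max) & (b <= other_max)
    return float(mask.mean())

# A frame whose top half is pure red and bottom half is black:
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[:50, :, 2] = 255  # BGR layout: channel 2 is red
print(fire_fraction(frame))  # → 0.5
```

In practice flames also contain orange and yellow, so a fielded detector would widen the test or work in a hue-based colour space.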
We also developed a depth-sensing system using stereoscopic cameras. It was too heavy to test directly on the smaller model, but would work on a larger version. The system builds a dense depth map and point cloud from two cheap cameras in real time, which could let the drone map an entire environment, including the locations of flames, debris, and other obstacles.
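The core of stereoscopic depth mapping is block matching: for each pixel in the left image, find how far the same small patch has shifted along the matching scanline of the right image. That shift (the disparity d) is inversely proportional to depth, z = f·B/d for focal length f and camera baseline B. The following is an unoptimized sketch of the technique, not our actual pipeline; function name and parameters are illustrative.

```python
import numpy as np

def block_match_disparity(left, right, block=5, max_disp=16):
    """Naive stereo block matching on rectified grayscale images.

    For each left-image pixel, slide a block-by-block window along
    the same scanline of the right image and keep the shift
    (disparity) with the lowest sum of absolute differences (SAD).
    Depth then follows from z = f * B / disparity.
    """
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y - half:y + half + 1,
                         x - half:x + half + 1].astype(np.int32)
            best_sad, best_d = None, 0
            for d in range(max_disp):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1].astype(np.int32)
                sad = int(np.abs(patch - cand).sum())
                if best_sad is None or sad < best_sad:
                    best_sad, best_d = sad, d
            disp[y, x] = best_d
    return disp
```

Real-time systems vectorize this search heavily (e.g. OpenCV's stereo matchers); the loop above only shows the idea behind the depth map and point cloud.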
How we built it
We obtained a rubber-band-powered glider and attached a motor and a 3D-printed propeller to it. We then mounted the ESP32 for onboard processing.
Challenges we ran into
We had some issues getting the propeller to print correctly, but resolved them by iterating on the design. The math behind the stereoscopic mapping was also complicated, but we persevered and achieved a working real-time depth map.
Accomplishments that we're proud of
We're proud that the drone flies, that its payload records and transmits information, and that the project has a future in saving lives.
What we learned
We learned how to program and manage microcontrollers, how to wire electronics on a mobile platform, how to solder, and how to implement computer vision in Python and C++.
What's next for AIRPLANE
The next steps are stronger motors and bigger wings so the airframe can carry the stereoscopic cameras. With more advanced microcontrollers and better maneuverability, the possibilities for onboard computer vision are wide open.
Built With
- c++
- esp-32