Inspiration

On a moonlit night by a serene lake in Germany, a blind girl finds comfort. A passerby, moved by her presence, seeks to share the beauty of their surroundings. Deeply inspired, he later composes the Moonlight Sonata in tribute to her unseen world.

As we later found out, this story is purely fictional, but the idea of giving vision to those who lack it inspired our team of three to create a new and innovative device for the visually impaired: a robot guide dog built on state-of-the-art computer vision foundations.

What it does

Our robot is designed to give the visually impaired a new way to perceive the world. For maximum accessibility, the user can steer it with a standard game controller. At its core is top-of-the-line depth-sensing technology: a highly specialized camera that measures the distance to every point in a scene with extreme precision. We use this data to sense what is near the guide dog and what is far, and give the user audio output directing them where to go. Additionally, the dog can describe its surroundings with a high degree of accuracy, allowing the user to be guided thoroughly and walk with confidence.
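To make the near/far idea concrete, here is a minimal sketch of how a depth frame could drive audio guidance. It assumes the pyrealsense2 depth stream; the 1-meter threshold and the speak() placeholder are illustrative, not our exact implementation.

```python
import numpy as np
import pyrealsense2 as rs

NEAR_METERS = 1.0  # assumed cutoff: anything closer counts as an obstacle

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
profile = pipeline.start(config)
depth_scale = profile.get_device().first_depth_sensor().get_depth_scale()

def speak(text):
    """Placeholder for the audio channel that guides the user."""
    print(text)

def median_depth(region):
    """Median distance in meters, ignoring invalid (zero) pixels."""
    valid = region[region > 0]
    return float(np.median(valid)) if valid.size else 0.0

try:
    while True:
        frames = pipeline.wait_for_frames()
        depth = frames.get_depth_frame()
        if not depth:
            continue
        # Convert raw depth units to meters, then split the view into
        # left / center / right thirds.
        meters = np.asanyarray(depth.get_data()) * depth_scale
        left, center, right = np.array_split(meters, 3, axis=1)
        if median_depth(center) > NEAR_METERS:
            speak("Path clear, continue forward.")
        elif median_depth(left) > median_depth(right):
            speak("Obstacle ahead, move left.")
        else:
            speak("Obstacle ahead, move right.")
finally:
    pipeline.stop()
```

In practice the announcements would be rate-limited rather than spoken every frame, but the core loop is this simple: compare region depths, pick the clearest direction.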

How we built it

We divided the workload equally into robotics design, component integration, and A.I. implementation. Although we worked in relatively separate domains for the initial part of the project, our work melded together toward the end of the hackathon. The robot dog itself was built from a number of 3D-printed parts, with a Raspberry Pi 5 and an Arduino Mini at its core. The Pi 5 managed the more complex functions, while the Arduino acted as our primary interface to the dog's motors. To connect the controller, we used the Pi 5's integrated Bluetooth and communicated with the Arduino Mini over serial. Using the Intel RealSense camera, we automatically adjust the dog's path to avoid obstacles. We also used an OpenAI API key to turn what the dog sees in front of it into spoken descriptions via text-to-speech.
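The Pi-to-Arduino link boils down to a small serial bridge. A rough sketch follows; the device path, baud rate, and one-byte command protocol are assumptions for illustration, since the real command set lives in the Arduino firmware.

```python
import serial

# The Arduino Mini appears as a serial device; the exact path varies by setup.
arduino = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)

def drive(command: str) -> None:
    """Send a single-byte motor command, e.g. 'F' forward, 'L' left,
    'R' right, 'S' stop."""
    arduino.write(command.encode("ascii"))

# Controller input, received over the Pi's Bluetooth, is mapped to these
# commands before being forwarded down the wire.
drive("F")
```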

HARDWARE:
- Intel RealSense Camera
- Raspberry Pi 5
- Raspberry Pi 4B+
- Arduino Mini
- PS4 Controller
- 4 Motors
- 3D-Printed Bed

SOFTWARE:
- OpenCV
- ChatGPT
- pyrealsense2
- Python
- Flask
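The scene-description feature mentioned above pairs a camera frame with OpenAI's vision and text-to-speech endpoints. The sketch below shows the shape of that pipeline; the model names, prompt, and output path are illustrative choices, not necessarily what we shipped.

```python
import base64

import cv2
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def describe_and_speak(frame) -> None:
    """Describe a BGR camera frame aloud for the user."""
    ok, jpg = cv2.imencode(".jpg", frame)
    if not ok:
        return
    b64 = base64.b64encode(jpg.tobytes()).decode("ascii")

    # Ask a vision-capable model for a short, pedestrian-friendly description.
    chat = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "In one sentence, describe this scene for a blind pedestrian."},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{b64}"}},
            ],
        }],
    )
    description = chat.choices[0].message.content

    # Convert the description to speech and save it for playback.
    speech = client.audio.speech.create(model="tts-1", voice="alloy",
                                        input=description)
    speech.write_to_file("scene.mp3")
```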

Challenges we ran into

We went into this challenge blind. We brought everything we could fit in the car and hoped for the best. Our initial ideation phase went swimmingly, but we soon realized that our productivity was suboptimal and our coordination was poor. Over the course of the hackathon, we refined our roles and cemented our scope. We faced extremely difficult software and hardware problems, such as recompiling pyrealsense2 from source for ARM and frying one of our Raspberry Pis. Even though we overcame those major setbacks, tragically, our only motor driver burned out twenty minutes before the submission deadline, forcing us to submit an unfinished product.

Accomplishments that we're proud of

We're proud of our tech stack: it spans 3D-printed hardware, embedded motor control, depth sensing, and A.I., and debugging across all of those layers took our highest effort.

What we learned

Hackathons are a team effort: if even one of us is lazy or unproductive, it brings down the whole team. In the beginning our coordination was poor, and we paid the price later in the hackathon when we had to bring everything together.

Don't burn your motor driver!

What's next for MoonlightRobota?

The project could have gone more smoothly. We wasted time at several points and fell into many pitfalls.

Despite our tribulations, we remain adamant about creating accessible hardware and solving real-world problems. We will carry what we accomplished here into future hackathons and into our careers.
