Inspiration
In support of the U.S. Senate Committee on Health, Education, Labor and Pensions' public hearing at Gallaudet University, it was reported that "Across all age groups... .22% of the population are deaf".
Oftentimes, people with disabilities are left in the dust, with limited opportunities to advance their careers. With Hack rover.tech, we aim to provide inclusive opportunities for the Deaf community, a marginalized group.
What it does
Through computer vision, our rover can detect hand signs, specifically American Sign Language (ASL). By recognizing specific hand patterns, the rover follows the user's inputs and traverses its environment.
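As a rough illustration (a minimal sketch, not our exact code), the detect-then-drive loop could look like the following, assuming a webcam and a trained Keras model saved as gesture_model.h5; the file name, input size, and command labels are all placeholder assumptions:

```python
import cv2
import numpy as np
import tensorflow as tf

model = tf.keras.models.load_model("gesture_model.h5")   # assumed file name
COMMANDS = ["FORWARD", "BACK", "LEFT", "RIGHT", "STOP"]   # illustrative labels

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # Preprocess each frame to match the model's training input (shape assumed).
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    small = cv2.resize(gray, (64, 64)).astype("float32") / 255.0
    probs = model.predict(small[None, ..., None], verbose=0)[0]
    command = COMMANDS[int(np.argmax(probs))]
    print(command)  # on the real rover, this would drive the motors instead
    cv2.imshow("camera", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```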
The overall goal of this project was to enable space exploration while also letting the Deaf community take full advantage of the experience.
In addition, with the implementation of SONR, each rover is self-contained and the fleet supports "Mesh Healing" as the rovers traverse different geographical environments, such as the far side of the Moon. Through network identification, the rovers will be able to communicate with one another, map out geographical regions, and transmit stored files.
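As a conceptual sketch only (our assumption of the behavior, not the SONR API), two uniquely identified rovers could merge their locally mapped regions whenever they come into contact:

```python
import uuid

class MappingRover:
    def __init__(self):
        self.rover_id = str(uuid.uuid4())   # unique network identity
        self.local_map = {}                 # (x, y) -> terrain reading

    def record(self, x, y, terrain):
        self.local_map[(x, y)] = terrain

    def sync_with(self, peer):
        """Exchange map data when two rovers come into radio range."""
        merged = {**self.local_map, **peer.local_map}
        self.local_map = dict(merged)
        peer.local_map = dict(merged)

a, b = MappingRover(), MappingRover()
a.record(0, 0, "regolith")
b.record(5, 2, "crater rim")
a.sync_with(b)
assert a.local_map == b.local_map   # both now hold the combined map
```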
Lastly, with our website implementation, users can see which rovers are online and offline, along with each rover's location, current task, storage capacity, focused goals, and ranking!
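For illustration, the per-rover record behind that dashboard view could look like the sketch below; the field names mirror the list above but are our own naming, not a fixed SONR schema:

```python
from dataclasses import dataclass

@dataclass
class RoverStatus:
    rover_id: str
    online: bool                       # shown as online/offline on the site
    location: tuple[float, float]      # (latitude, longitude) or grid coords
    current_task: str
    storage_used_mb: float
    storage_capacity_mb: float
    focused_goal: str
    ranking: int

status = RoverStatus(
    rover_id="rover-07", online=True, location=(41.7, -86.2),
    current_task="terrain scan", storage_used_mb=512.0,
    storage_capacity_mb=2048.0, focused_goal="map crater rim", ranking=3,
)
```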
How we built it
Tears and Blood
We began building the physical rover with a toy car kit; however, due to a lack of materials, we decided to lean on a computer vision approach. Using Python and TensorFlow, we implemented hand gesture recognition, training a deep learning model to recognize specific hand gestures. These gestures are then used to mobilize a specific rover.
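A minimal sketch of the kind of model this describes: a small convolutional network over cropped hand images, with one output class per gesture. The layer sizes, input shape, and gesture count are illustrative assumptions rather than our exact architecture:

```python
import tensorflow as tf

NUM_GESTURES = 5  # e.g. forward, back, left, right, stop (assumed set)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 64, 1)),            # grayscale hand crops
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(NUM_GESTURES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_images, train_labels, epochs=10, validation_split=0.1)
# model.save("gesture_model.h5")  # consumed by the control-loop sketch above
```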
Built with Sonr
Sonr is an ecosystem of decentralized peer-to-peer applications built on top of a Cosmos-powered blockchain. The ecosystem includes tools that empower developers to build decentralized applications.
Challenges we ran into
Conceptualizing our Idea
Due to constraints in utilizing the SONR SDK, we focused on creating a theory of how to utilize the entire IoT network as a whole. This included networking several rovers running on the SONR blockchain to create their own secure mesh network, enabling our rovers to operate independently even if the application connected to them goes down. Using uniquely ID'd rovers, we could daisy-chain them to extend the range of the network across the ground and send data just about anywhere a rover could traverse, as sketched below.
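A hedged sketch of that daisy-chain idea, with each uniquely ID'd rover forwarding a message one hop at a time toward its destination; the class names and TTL scheme are illustrative and not part of the SONR SDK:

```python
from dataclasses import dataclass, replace

@dataclass
class Message:
    src: str
    dst: str
    payload: bytes
    ttl: int = 8                       # max hops before the message is dropped

class MeshRover:
    def __init__(self, rover_id):
        self.rover_id = rover_id
        self.neighbors = []            # rovers currently in radio range
        self.inbox = []

    def receive(self, msg):
        if msg.dst == self.rover_id:
            self.inbox.append(msg)     # message reached its destination
        elif msg.ttl > 0:
            for peer in self.neighbors:
                if peer.rover_id != msg.src:
                    # Flood one hop further down the chain with a lower TTL.
                    peer.receive(replace(msg, ttl=msg.ttl - 1))

# Three rovers in a line: r1 can only reach r3 by relaying through r2.
r1, r2, r3 = MeshRover("r1"), MeshRover("r2"), MeshRover("r3")
r1.neighbors, r2.neighbors, r3.neighbors = [r2], [r1, r3], [r2]
r1.receive(Message(src="r1", dst="r3", payload=b"terrain scan"))
assert r3.inbox  # the scan arrived via the daisy chain
```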
Having enough time to test all of our data
Due to the time constraints of the weekend, we couldn't run enough machine learning experiments to create fully customized hand gestures.
Updating the SONR SDK in real-time
Unfortunately, the SDK's build did not support the responsiveness we needed to accurately update a dashboard in real time. This limited us to creating a conceptual dashboard with data, documentation, and user authentication, as well as the ability to see the status of each individual rover.
Accomplishments that we're proud of
Training a deep learning neural network that uses computer vision to detect American Sign Language (ASL) was a great accomplishment, as our team is relatively new to this!
What we learned
We learned how to implement neural networks and achieved 90%+ accuracy with our American Sign Language neural network.
What's next for Hack rover.tech
We would absolutely love to finish the rover and hardware component of the hack! With multiple rovers, we could demonstrate the full implementation on specific terrains!
Built With
- deep-learning
- go
- google-cloud
- hardware
- image-processing
- javascript
- led
- machine-learning
- node.js
- python
- raspberry-pi
- react
- sonr
- tensorflow