MonkeySign: The ASL Learning Arcade Game
In our pursuit of expanding our skill set and exploring new territories, our team embarked on an ambitious project that combined machine learning, frontend development, and American Sign Language (ASL). Despite the challenges we faced, our passion and enthusiasm for the idea carried us through to a working implementation of this unique and engaging concept!
Project Overview
MonkeySign is an innovative, game-like platform designed to teach American Sign Language (ASL) in a fun and interactive manner. Inspired by MonkeyType, a popular touch-typing training tool, we sought to create a similarly engaging experience for ASL learners, emphasizing education and social good.
Core Features
MonkeySign utilizes real-time hand gesture detection to recognize ASL letters signed by the user. As the user successfully signs each letter, they progress through the game, enhancing their ASL skills in an enjoyable and captivating environment.
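To make that flow concrete, here is a minimal sketch of the progression logic, assuming the recognizer hands us one predicted letter at a time; the names here (SigningRound, on_prediction) are illustrative rather than our actual game code:

```python
# Minimal sketch of the letter-by-letter progression described above.
# Class and method names are hypothetical, for illustration only.
class SigningRound:
    def __init__(self, target_word: str):
        self.target_word = target_word.upper()
        self.position = 0  # index of the next letter the player must sign

    def on_prediction(self, letter: str) -> bool:
        """Advance when the recognized letter matches the next target letter."""
        if (self.position < len(self.target_word)
                and letter.upper() == self.target_word[self.position]):
            self.position += 1
        # True once every letter in the word has been signed in order.
        return self.position == len(self.target_word)

# Example: round = SigningRound("MONKEY"); round.on_prediction("M") -> False
# until all letters have been signed in sequence.
```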
Technical Details
Frontend
We built the frontend using Vite and TailwindCSS, ensuring a smooth and responsive user interface.
Backend
Our backend leverages Flask and socket.io for seamless communication between the frontend and the machine learning model.
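As a rough illustration of how that communication could look on the server side, here is a minimal Flask + Flask-SocketIO sketch; the event names ("frame", "prediction"), the payload shape, and the classify_letter placeholder are assumptions for illustration, not our exact code:

```python
# Minimal sketch of a Flask-SocketIO server that receives webcam crops from
# the frontend and replies with a predicted letter. Event names and payload
# format are assumptions for illustration.
import base64
import io

import numpy as np
from flask import Flask
from flask_socketio import SocketIO, emit
from PIL import Image

app = Flask(__name__)
socketio = SocketIO(app, cors_allowed_origins="*")

def classify_letter(arr: np.ndarray) -> str:
    # Placeholder: in the real project this would run the trained CNN.
    return "A"

@socketio.on("frame")
def handle_frame(data):
    # Decode the base64-encoded hand crop sent by the browser.
    img = Image.open(io.BytesIO(base64.b64decode(data["image"]))).convert("L")
    arr = np.asarray(img.resize((28, 28)), dtype="float32") / 255.0
    emit("prediction", {"letter": classify_letter(arr)})

if __name__ == "__main__":
    socketio.run(app, port=5000)
```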
Detection Algorithms
To accurately detect and recognize ASL letters, we employed Handtrack.js for hand detection and a custom Convolutional Neural Network (CNN) model, specifically trained on ASL data for letter recognition.
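For reference, below is a minimal TensorFlow/Keras sketch of the kind of letter-classification CNN described here; the input size, layer sizes, and the 24-class assumption (static ASL letters, excluding the motion-based J and Z) are illustrative rather than the exact architecture we trained:

```python
# Sketch of an ASL letter-classification CNN. Layer sizes and the 28x28
# grayscale input are assumptions, not the trained model's exact config.
import tensorflow as tf
from tensorflow.keras import layers

def build_asl_cnn(num_classes: int = 24) -> tf.keras.Model:
    model = tf.keras.Sequential([
        layers.Input(shape=(28, 28, 1)),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# model = build_asl_cnn()
# model.fit(train_images, train_labels, epochs=10)  # trained on ASL letter data
```

In this setup, Handtrack.js supplies the hand bounding box on the frontend, and the cropped hand image is what gets passed to the letter classifier.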
Challenges & Solutions
Throughout the development process, we encountered several obstacles, including:
- Raspberry Pi Camera issues: Initially, our project involved using a Raspberry Pi Camera, which malfunctioned during the hackathon. We pivoted our idea to create a more engaging arcade-like experience instead.
- Model training difficulties: Our team spent 12 hours attempting to train a single object detection model. Eventually, we opted to use two separate models, one for hand detection and another for letter recognition.
- Bounding box coordination: Integrating bounding boxes with webcam data on the frontend proved to be challenging but was successfully implemented after multiple iterations.
Proud Accomplishments
We are particularly proud of our ability to integrate the frontend, backend, and Raspberry Pi components, resulting in a cohesive and engaging ASL learning arcade game.
Lessons Learned
Throughout this project, we gained valuable experience in training object detection machine learning models, working with live data, and employing computer vision techniques.
Future Development for MonkeySign
Our vision for MonkeySign includes several exciting enhancements:
- Multiplayer functionality: We aim to create a more collaborative and enjoyable learning experience by incorporating multiplayer features.
- Raspberry Pi integration: We plan to fix the Raspberry Pi issues to deliver a unique arcade-like booth experience for our users.
- Model improvements: We will continue refining the machine learning model to enhance its accuracy and adaptability under various conditions.
Built With
- fast-api
- flask
- javascript
- pip
- python
- react
- socket.io
- tensorflow
- vscode