Inspiration
Communication is a basic human need, yet millions of deaf and mute individuals face daily challenges expressing themselves. GloveCom was inspired by the desire to bridge this communication gap using affordable technology.
What it does
GloveCom is a smart wearable glove that translates sign language gestures into real-time text and speech. Flex sensors detect finger movements, an onboard microcontroller processes the readings with embedded logic and machine learning, and the recognized output is sent to a mobile app over Bluetooth.
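To make the pipeline concrete, here is a minimal sketch of the gesture-matching idea, assuming five flex sensors on analog pins and a small set of stored templates; the pin numbers, template values, and labels are placeholders, and the real glove combines this kind of embedded logic with machine learning rather than fixed templates.

```cpp
// Minimal sketch (Arduino / ESP32): classify a gesture from five flex sensor
// readings by nearest match against stored templates.
// Pin numbers, template values, and gesture labels are illustrative
// assumptions, not the project's actual calibration data.
#include <Arduino.h>
#include <limits.h>

const int FLEX_PINS[5] = {32, 33, 34, 35, 36};  // assumed ADC-capable GPIOs
const int NUM_GESTURES = 3;

// Expected 12-bit ADC reading per finger for each gesture (placeholders).
const int TEMPLATES[NUM_GESTURES][5] = {
  {3500, 3500, 3500, 3500, 3500},  // "HELLO"  (open hand)
  {1200, 1200, 1200, 1200, 1200},  // "YES"    (closed fist)
  {3500, 1200, 1200, 1200, 3500},  // "THANKS"
};
const char* LABELS[NUM_GESTURES] = {"HELLO", "YES", "THANKS"};

const char* classifyGesture(const int reading[5]) {
  long bestDist = LONG_MAX;
  int best = 0;
  for (int g = 0; g < NUM_GESTURES; g++) {
    long dist = 0;
    for (int f = 0; f < 5; f++) {
      long d = reading[f] - TEMPLATES[g][f];
      dist += d * d;  // squared Euclidean distance to this template
    }
    if (dist < bestDist) { bestDist = dist; best = g; }
  }
  return LABELS[best];
}

void setup() {
  Serial.begin(115200);
}

void loop() {
  int reading[5];
  for (int f = 0; f < 5; f++) reading[f] = analogRead(FLEX_PINS[f]);
  Serial.println(classifyGesture(reading));
  delay(200);  // sample roughly five times per second
}
```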
How we built it
The glove is built using an ESP32 microcontroller and flex sensors to capture hand gestures. The data is transmitted via Bluetooth Low Energy (BLE) to a Flutter-based mobile application, where it is displayed as text and converted into speech using text-to-speech technology.
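Below is a rough sketch of the BLE side on the ESP32, using the standard Arduino BLE library to publish the recognized text as a notifying characteristic that the Flutter app can subscribe to; the UUIDs, device name, and the recognizeGesture() helper are illustrative placeholders rather than GloveCom's actual values.

```cpp
// Minimal sketch (ESP32 Arduino BLE): expose the recognized gesture text as a
// notifying BLE characteristic. UUIDs, device name, and recognizeGesture()
// are assumptions for illustration only.
#include <BLEDevice.h>
#include <BLEServer.h>
#include <BLEUtils.h>
#include <BLE2902.h>

#define SERVICE_UUID        "4fafc201-1fb5-459e-8fcc-c5c9c331914b"  // placeholder
#define CHARACTERISTIC_UUID "beb5483e-36e1-4688-b7f5-ea07361b26a8"  // placeholder

BLECharacteristic* gestureChar;

String recognizeGesture() {
  // Placeholder for the glove's actual recognition pipeline.
  return "HELLO";
}

void setup() {
  BLEDevice::init("GloveCom");                     // advertised device name (assumed)
  BLEServer* server = BLEDevice::createServer();
  BLEService* service = server->createService(SERVICE_UUID);

  gestureChar = service->createCharacteristic(
      CHARACTERISTIC_UUID,
      BLECharacteristic::PROPERTY_READ | BLECharacteristic::PROPERTY_NOTIFY);
  gestureChar->addDescriptor(new BLE2902());       // lets the client enable notifications

  service->start();
  server->getAdvertising()->start();
}

void loop() {
  String text = recognizeGesture();
  gestureChar->setValue((uint8_t*)text.c_str(), text.length());  // update value
  gestureChar->notify();                                          // push to the app
  delay(500);
}
```

On the Flutter side, the app only needs to subscribe to this characteristic, display incoming strings, and hand them to a text-to-speech plugin.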
Challenges we ran into
Calibrating flex sensors accurately, handling gesture variations, and ensuring stable Bluetooth communication were major challenges. Optimizing real-time response was also difficult.
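One way to approach the calibration problem is a short per-user routine that records each sensor's minimum and maximum reading while the hand opens and closes, then normalizes raw values to a percent-bent scale so gesture thresholds become user-independent. The sketch below illustrates the idea; the pin numbers and the 5-second calibration window are assumptions, not the values used in the project.

```cpp
// Minimal calibration sketch (Arduino / ESP32): record each finger's min/max
// flex reading during a short open/close routine, then map raw ADC values to
// a 0-100 "percent bent" scale. Pins and timing are illustrative assumptions.
#include <Arduino.h>

const int FLEX_PINS[5] = {32, 33, 34, 35, 36};  // assumed ADC-capable GPIOs
int flexMin[5], flexMax[5];

void setup() {
  Serial.begin(115200);
  for (int f = 0; f < 5; f++) { flexMin[f] = 4095; flexMax[f] = 0; }

  // Calibration window: the user opens and closes the hand for ~5 seconds
  // while we track the extremes each sensor actually produces.
  Serial.println("Calibrating: open and close your hand...");
  unsigned long start = millis();
  while (millis() - start < 5000) {
    for (int f = 0; f < 5; f++) {
      int raw = analogRead(FLEX_PINS[f]);
      flexMin[f] = min(flexMin[f], raw);
      flexMax[f] = max(flexMax[f], raw);
    }
    delay(10);
  }
  Serial.println("Calibration done.");
}

void loop() {
  // Report each finger as a percentage of its own calibrated range.
  for (int f = 0; f < 5; f++) {
    int raw = analogRead(FLEX_PINS[f]);
    int pct = constrain(map(raw, flexMin[f], flexMax[f], 0, 100), 0, 100);
    Serial.printf("F%d:%3d%%  ", f, pct);
  }
  Serial.println();
  delay(200);
}
```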
What we learned
We gained hands-on experience in embedded systems, IoT, Bluetooth communication, Flutter development, and assistive technology design.
What's next for GloveCom
Future plans include adding AI-based gesture recognition, multi-language support, cloud syncing, and making the glove more compact and affordable.
Built With
- arduino
- bluetooth-low-energy-(ble)
- dart
- esp32
- flex-sensors
- flutter
- machine-learning
- text-to-speech-(tts)