FlexSonic

Design a smart glove that translates hand gestures into audible speech, empowering communication for the speech-impaired.

Project Domains: ESP-IDF, Machine Learning, Embedded C, PCB Designing
Project Mentors: Bhavesh Phundhkar, Yash Suthar, Swanand Patil
Project Difficulty: Hard

Project Description

The FlexSonic project challenges you to build a wearable device that converts sign language and hand gestures into audible speech. This smart glove will be equipped with flex sensors to detect finger movements and an Inertial Measurement Unit (IMU) to capture wrist and hand orientation.
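As an illustration of the sensing layer, here is a minimal ESP-IDF (v5.x) sketch that samples five flex sensors through the ESP32's ADC and reads raw accelerometer data from the MPU6050 over I2C. The pin assignments, ADC channels, and sample rate are assumptions for illustration, not fixed requirements of the project.

```c
// Minimal sensing sketch. Assumes five flex sensors wired as voltage
// dividers on ADC1 channels 0/3/4/5/6, and an MPU6050 on I2C0
// (SDA = GPIO21, SCL = GPIO22). All pins and channels are illustrative.
#include <stdio.h>
#include "freertos/FreeRTOS.h"
#include "freertos/task.h"
#include "driver/i2c.h"
#include "esp_adc/adc_oneshot.h"

#define MPU6050_ADDR      0x68
#define MPU6050_PWR_MGMT1 0x6B
#define MPU6050_ACCEL_X_H 0x3B
#define NUM_FLEX          5

static adc_oneshot_unit_handle_t s_adc;
static const adc_channel_t s_flex_ch[NUM_FLEX] = {
    ADC_CHANNEL_0, ADC_CHANNEL_3, ADC_CHANNEL_4, ADC_CHANNEL_5, ADC_CHANNEL_6
};

static void sensors_init(void)
{
    // ADC1 in one-shot mode for the flex sensors.
    adc_oneshot_unit_init_cfg_t unit_cfg = { .unit_id = ADC_UNIT_1 };
    adc_oneshot_new_unit(&unit_cfg, &s_adc);
    adc_oneshot_chan_cfg_t ch_cfg = {
        .atten = ADC_ATTEN_DB_11,          // full 0-3.3 V input range
        .bitwidth = ADC_BITWIDTH_DEFAULT,
    };
    for (int i = 0; i < NUM_FLEX; i++) {
        adc_oneshot_config_channel(s_adc, s_flex_ch[i], &ch_cfg);
    }

    // I2C master for the MPU6050, then wake it out of sleep mode.
    i2c_config_t i2c_cfg = {
        .mode = I2C_MODE_MASTER,
        .sda_io_num = 21,
        .scl_io_num = 22,
        .sda_pullup_en = GPIO_PULLUP_ENABLE,
        .scl_pullup_en = GPIO_PULLUP_ENABLE,
        .master.clk_speed = 400000,
    };
    i2c_param_config(I2C_NUM_0, &i2c_cfg);
    i2c_driver_install(I2C_NUM_0, I2C_MODE_MASTER, 0, 0, 0);
    uint8_t wake[] = { MPU6050_PWR_MGMT1, 0x00 };
    i2c_master_write_to_device(I2C_NUM_0, MPU6050_ADDR, wake, sizeof(wake),
                               pdMS_TO_TICKS(100));
}

// Fills flex[0..4] with raw ADC counts and accel[0..2] with raw
// accelerometer readings (X, Y, Z) from registers 0x3B-0x40.
static void sensors_read(int flex[NUM_FLEX], int16_t accel[3])
{
    for (int i = 0; i < NUM_FLEX; i++) {
        adc_oneshot_read(s_adc, s_flex_ch[i], &flex[i]);
    }
    uint8_t reg = MPU6050_ACCEL_X_H, buf[6];
    i2c_master_write_read_device(I2C_NUM_0, MPU6050_ADDR, &reg, 1,
                                 buf, sizeof(buf), pdMS_TO_TICKS(100));
    for (int i = 0; i < 3; i++) {
        accel[i] = (int16_t)((buf[2 * i] << 8) | buf[2 * i + 1]);
    }
}

void app_main(void)
{
    sensors_init();
    int flex[NUM_FLEX];
    int16_t accel[3];
    while (1) {
        sensors_read(flex, accel);
        printf("flex: %d %d %d %d %d  accel: %d %d %d\n",
               flex[0], flex[1], flex[2], flex[3], flex[4],
               accel[0], accel[1], accel[2]);
        vTaskDelay(pdMS_TO_TICKS(20));   // ~50 Hz sample rate
    }
}
```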
An ESP32 microcontroller will process the resulting stream of sensor data in real time. By leveraging machine learning, the system will learn to recognize specific gestures and map them to predefined words or phrases. The final output will be synthesized speech, played through a speaker, creating a seamless gesture-to-speech translation system.
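To make the gesture-to-speech step concrete, the sketch below shows one deliberately simple approach: a nearest-centroid classifier over the 8-value feature vector produced by the sensing sketch above, with each recognized gesture mapped to a pre-recorded phrase played by a DFPlayer module over UART. The centroid values, UART pins, and track numbers are hypothetical placeholders, and a trained ML model would replace the toy classifier in the real project.

```c
// Toy gesture-to-speech pipeline (ESP-IDF). The centroids, UART pins, and
// track numbers are illustrative placeholders; a trained model would
// replace the nearest-centroid classifier.
#include <stdint.h>
#include <float.h>
#include "driver/uart.h"

#define DFPLAYER_UART   UART_NUM_2
#define DFPLAYER_TX_PIN 17          // ESP32 TX -> DFPlayer RX (assumed wiring)
#define DFPLAYER_RX_PIN 16
#define FEATURES        8           // 5 flex values + 3 accel values
#define NUM_GESTURES    3

// One centroid per gesture, in the same order and scale as the feature
// vector. These numbers stand in for (hypothetical) calibration recordings.
static const float s_centroid[NUM_GESTURES][FEATURES] = {
    { 3000, 3000, 3000, 3000, 3000, 0, 0, 16384 },  // open palm   -> "hello"
    {  500,  500,  500,  500,  500, 0, 0, 16384 },  // closed fist -> "yes"
    {  500, 3000, 3000,  500,  500, 0, 0, 16384 },  // point       -> "help"
};
static const uint16_t s_track[NUM_GESTURES] = { 1, 2, 3 }; // MP3 track per gesture

// Returns the index of the closest centroid (squared Euclidean distance).
static int classify(const float f[FEATURES])
{
    int best = 0;
    float best_d = FLT_MAX;
    for (int g = 0; g < NUM_GESTURES; g++) {
        float d = 0;
        for (int i = 0; i < FEATURES; i++) {
            float e = f[i] - s_centroid[g][i];
            d += e * e;
        }
        if (d < best_d) { best_d = d; best = g; }
    }
    return best;
}

static void dfplayer_init(void)
{
    uart_config_t cfg = {
        .baud_rate = 9600,                   // DFPlayer default baud rate
        .data_bits = UART_DATA_8_BITS,
        .parity = UART_PARITY_DISABLE,
        .stop_bits = UART_STOP_BITS_1,
        .flow_ctrl = UART_HW_FLOWCTRL_DISABLE,
    };
    uart_param_config(DFPLAYER_UART, &cfg);
    uart_set_pin(DFPLAYER_UART, DFPLAYER_TX_PIN, DFPLAYER_RX_PIN,
                 UART_PIN_NO_CHANGE, UART_PIN_NO_CHANGE);
    uart_driver_install(DFPLAYER_UART, 256, 0, 0, NULL, 0);
}

// Sends a DFPlayer 10-byte serial frame; command 0x03 = "play track N".
static void dfplayer_play(uint16_t track)
{
    uint8_t f[10] = { 0x7E, 0xFF, 0x06, 0x03, 0x00,
                      (uint8_t)(track >> 8), (uint8_t)track, 0, 0, 0xEF };
    uint16_t sum = 0;
    for (int i = 1; i < 7; i++) sum += f[i];
    uint16_t chk = (uint16_t)(0 - sum);      // two's-complement checksum
    f[7] = (uint8_t)(chk >> 8);
    f[8] = (uint8_t)chk;
    uart_write_bytes(DFPLAYER_UART, (const char *)f, sizeof(f));
}
```

Wired into the sensing loop above, the glove would build a feature vector each cycle, call `classify()`, and only call `dfplayer_play(s_track[g])` once a predicted gesture has been stable for several consecutive samples, so a held pose does not trigger repeated playback.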


Resources

MPU6050
Flex Sensor Datasheet
DFPlayer