Abstract: Hand gestures are a form of non-verbal communication that can be used in many contexts, such as communication for deaf and mute people. The automatic interpretation of sign language is a research area that has received relatively little attention, even though sign language is the primary mode of communication for hearing-impaired and speech-impaired people and is essential to their living independent lives. Numerous methods and algorithms have been developed in this field with the help of artificial intelligence and image processing. Any system that understands sign language must undergo considerable training to recognize the signs and translate them into the required patterns. Such a system helps deaf people communicate with the outside world easily, and the proposed technique helps vocally disabled people communicate.

Keywords: Human-Machine Interaction, Gesture Recognition, Machine Learning, Neural Networks, Convolutional Neural Network.
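
For illustration, the sketch below shows one common way such a system is built: a small convolutional neural network trained on labelled hand-gesture images. This is a minimal, hedged example rather than the authors' exact model; the input size (64x64 grayscale), the 26 output classes (e.g. fingerspelled letters), and the layer sizes are all assumptions made for the sketch.

```python
# Minimal illustrative sketch (not the authors' exact model): a small CNN
# that classifies static hand-gesture images into sign classes, assuming
# 64x64 grayscale inputs and 26 output classes (e.g. fingerspelled letters).
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 26           # assumption: one class per fingerspelled letter
INPUT_SHAPE = (64, 64, 1)  # assumption: 64x64 grayscale gesture images

def build_sign_cnn():
    """Convolution + pooling blocks followed by a dense classifier."""
    model = models.Sequential([
        layers.Input(shape=INPUT_SHAPE),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # Random dummy data stands in for a real labelled gesture dataset.
    x_train = np.random.rand(32, 64, 64, 1).astype("float32")
    y_train = np.random.randint(0, NUM_CLASSES, size=32)
    model = build_sign_cnn()
    model.fit(x_train, y_train, epochs=1, batch_size=8, verbose=1)
    preds = model.predict(x_train[:1])
    print("Predicted class:", int(np.argmax(preds)))
```

In practice the dummy arrays would be replaced by a preprocessed sign-language image dataset, and the predicted class index would be mapped back to the corresponding sign or letter.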


DOI: 10.17148/IJARCCE.2023.12567
