TY - JOUR
AU - Muthu Mariappan H
AU - Dr Gomathi V
PY - 2021/06/16
Y2 - 2024/03/29
TI - Indian Sign Language Recognition through Hybrid ConvNet-LSTM Networks
JF - EMITTER International Journal of Engineering Technology
JA - EMITTER Int'l J. of Engin. Technol.
VL - 9
IS - 1
SE - Articles
DO - 10.24003/emitter.v9i1.613
UR - https://emitter.pens.ac.id/index.php/emitter/article/view/613
AB - Dynamic hand gesture recognition is a challenging task in Human-Computer Interaction (HCI) and Computer Vision. Potential application areas of gesture recognition include sign language translation, video gaming, video surveillance, robotics, and gesture-controlled home appliances. In the proposed research, gesture recognition is applied to recognize sign language words from real-time videos. Classifying actions from video sequences requires both spatial and temporal features. The proposed system handles the former with a Convolutional Neural Network (CNN), which is the core of several computer vision solutions, and the latter with a Recurrent Neural Network (RNN), which is more efficient at handling sequences of movements. Thus, a real-time Indian Sign Language (ISL) recognition system is developed using the hybrid CNN-RNN architecture. The system is trained on the proposed CasTalk-ISL dataset. The ultimate purpose of the presented research is to deploy a real-time sign language translator that removes the hurdles in communication between hearing-impaired people and hearing people. The developed system achieves 95.99% top-1 accuracy and 99.46% top-3 accuracy on the test dataset. The obtained results outperform existing approaches that use various deep models on different datasets.
ER -