Volume - 4 | Issue - 1 | March 2022
DOI: 10.36548/jaicn.2022.1.004
Published: 16 April 2022
This paper presents a sign language recognition system for the low-resource Sinhala Sign Language (SSL) using a Leap Motion (LM) controller and Deep Neural Networks (DNN). The system extracts static and dynamic features of SSL hand movements from the LM controller, which captures the position of the palm, the radius of the hand sphere, and the positions of the five fingers; it is evaluated on 24 selected letters and 6 words. The experimental results show that the proposed DNN model, with an average testing accuracy of 89.2%, outperforms a Naïve Bayes model (73.3% testing accuracy) and a Support Vector Machine (SVM) based model (81.2% testing accuracy). The proposed system, which combines a 3D non-contact LM controller with a machine learning model, therefore has great potential as an affordable solution for people with hearing impairment communicating with hearing people in day-to-day life across all service sectors.
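The abstract describes a pipeline that maps Leap Motion features (palm position, hand-sphere radius, and five fingertip positions) to one of 30 classes (24 letters + 6 words) via a feed-forward DNN. The sketch below illustrates that classification step with a minimal NumPy forward pass; the layer sizes, weights, and feature ordering are assumptions for illustration, as the abstract does not specify the architecture.

```python
import numpy as np

# Hypothetical Leap Motion feature vector (illustrative values only):
# palm position (x, y, z), hand-sphere radius, and 5 fingertip positions
# -> 3 + 1 + 5*3 = 19 features.
rng = np.random.default_rng(0)
features = rng.normal(size=19)

def relu(x):
    return np.maximum(0.0, x)

def softmax(x):
    e = np.exp(x - x.max())          # subtract max for numerical stability
    return e / e.sum()

# Assumed small DNN head: 19 -> 64 hidden units -> 30 classes
# (24 letters + 6 words). Weights are random placeholders; a real
# system would learn them from labeled SSL gesture data.
W1 = rng.normal(scale=0.1, size=(64, 19)); b1 = np.zeros(64)
W2 = rng.normal(scale=0.1, size=(30, 64)); b2 = np.zeros(30)

hidden = relu(W1 @ features + b1)    # hidden-layer activations
probs = softmax(W2 @ hidden + b2)    # class probabilities over 30 signs
pred = int(np.argmax(probs))         # index of the predicted sign
```

In a trained model, `pred` would index the recognized letter or word; here it only demonstrates the shape of the computation from a 19-dimensional feature vector to a 30-way prediction.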
Keywords: Sinhala Sign Language, Deep Neural Networks, Naïve Bayes, Support Vector Machine, Hand Gesture Recognition, Sign Language Classification, Leap Motion Controller