Abstract
EchoGesture Communication enables differently-abled individuals to interact through hand gestures. People with disabilities often face difficulty using conventional electronic devices. The proposed study combines sensors, a microcontroller, computer vision, and machine learning to recognize hand gestures in real time, facilitating effective communication. A Convolutional Neural Network (CNN) is employed to achieve accurate gesture recognition, allowing individuals with disabilities to communicate effectively using hand gestures.
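To illustrate the CNN-based recognition the abstract describes, the following is a minimal sketch of a CNN forward pass over a preprocessed hand image, implemented in plain NumPy. All names and parameters here are assumptions for illustration only: the paper does not specify the network architecture, the input resolution, the kernel size, or the number of gesture classes, and the weights below are random rather than trained.

```python
import numpy as np

def conv2d(img, kernel):
    """Valid 2-D cross-correlation of a single-channel image with one kernel."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(x, size=2):
    """Non-overlapping max pooling; trims edges that do not fit the window."""
    h, w = x.shape
    h, w = h - h % size, w - w % size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def classify_gesture(img, kernel, weights, bias):
    """Forward pass: convolution -> ReLU -> max pool -> flatten -> dense -> softmax."""
    feat = np.maximum(conv2d(img, kernel), 0.0)  # feature map with ReLU activation
    pooled = max_pool(feat)                      # 2x2 spatial downsampling
    flat = pooled.ravel()                        # flatten for the dense layer
    return softmax(flat @ weights + bias)        # per-class gesture probabilities

# Hypothetical setup: 16x16 input, 3x3 kernel, 5 gesture classes.
rng = np.random.default_rng(0)
img = rng.random((16, 16))                # stand-in for a preprocessed hand image
kernel = rng.standard_normal((3, 3))
n_classes = 5
flat_dim = ((16 - 3 + 1) // 2) ** 2       # 14x14 feature map pooled to 7x7 = 49
weights = rng.standard_normal((flat_dim, n_classes))
bias = np.zeros(n_classes)

probs = classify_gesture(img, kernel, weights, bias)
```

In a real system the kernel, weights, and bias would be learned from labeled gesture images via backpropagation (typically with a framework such as TensorFlow or PyTorch), and the argmax of `probs` would select the recognized gesture.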
