A Survey on Deep Learning Based Eye Gaze Estimation Methods

How to Cite

Sangeetha, S. K. B. 2021. “A Survey on Deep Learning Based Eye Gaze Estimation Methods”. Journal of Innovative Image Processing 3 (3): 190-207. https://doi.org/10.36548/jiip.2021.3.003.

Keywords

  • Computer vision
  • Deep learning systems
  • Eye gaze tracking
  • Infrared image sensors

Abstract

In recent years, deep-learning systems have made remarkable progress, particularly in the disciplines of computer vision and pattern recognition. Deep-learning technology enables inference models to perform real-time object detection and recognition. Using deep-learning-based designs, eye-tracking systems can determine the position of the eyes or pupils regardless of whether visible-light or near-infrared image sensors are used. For emerging electronic vehicle systems, such as driver-monitoring systems and new touch screens, accurate and reliable eye-gaze estimation is critical. Such systems must operate efficiently and at reasonable cost in demanding, unconstrained, low-power environments. A thorough examination of the different deep-learning approaches is required to account for all of the limitations and opportunities of eye-gaze tracking. The goal of this survey is to review the history of eye-gaze tracking and to show how deep learning has contributed to computer-vision-based tracking. Finally, it presents a generalized system model for deep-learning-driven eye-gaze direction diagnostics, along with a comparison of several approaches.
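To make the pipeline described above concrete, the sketch below shows a toy forward pass of the kind of CNN classifier the surveyed methods use: a grayscale eye patch is convolved with a bank of filters, pooled into a feature vector, and mapped to one of four coarse gaze directions. The patch size, layer shapes, and random weights are all assumptions for illustration; this is not the architecture of any specific cited paper.

```python
import numpy as np

# Illustrative sketch only: a minimal CNN-style forward pass mapping a
# grayscale eye patch to a coarse gaze direction. Layer sizes and the
# random weights are assumptions for demonstration, not a trained model.

DIRECTIONS = ["left", "right", "up", "down"]

def conv2d(image, kernel):
    """Valid 2-D cross-correlation of a single-channel image."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def classify_gaze(eye_patch, rng):
    """Toy pipeline: conv -> ReLU -> global average pool -> linear -> softmax."""
    kernels = rng.standard_normal((8, 3, 3)) * 0.1   # 8 assumed 3x3 filters
    feats = np.array([relu(conv2d(eye_patch, k)).mean() for k in kernels])
    w = rng.standard_normal((4, 8)) * 0.1            # linear head, 4 directions
    b = np.zeros(4)
    probs = softmax(w @ feats + b)
    return DIRECTIONS[int(np.argmax(probs))], probs

rng = np.random.default_rng(0)
patch = rng.random((24, 36))          # assumed 24x36 grayscale eye crop
label, probs = classify_gaze(patch, rng)
print(label, probs.round(3))
```

In a real system the weights would of course be learned from labeled eye images rather than drawn at random; the point here is only the data flow from sensor patch to direction probabilities.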
