
Pupil detection for head-mounted eye tracking in the wild: an evaluation of the state of the art

Published: 01 November 2016

Abstract

Robust and accurate detection of the pupil position is a key building block for head-mounted eye tracking and a prerequisite for applications built on top of it, such as gaze-based human–computer interaction or attention analysis. Despite a large body of work, detecting the pupil in images recorded under real-world conditions is challenging given significant variability in eye appearance (e.g., illumination, reflections, occlusions), individual differences in eye physiology, as well as other sources of noise, such as contact lenses or make-up. In this paper we review six state-of-the-art pupil detection methods, namely ElSe (Fuhl et al. in Proceedings of the ninth biennial ACM symposium on eye tracking research & applications, ACM, New York, NY, USA, pp 123–130, 2016), ExCuSe (Fuhl et al. in Computer analysis of images and patterns, Springer, New York, pp 39–51, 2015), Pupil Labs (Kassner et al. in Adjunct proceedings of the 2014 ACM international joint conference on pervasive and ubiquitous computing (UbiComp), pp 1151–1160, 2014. doi:10.1145/2638728.2641695), SET (Javadi et al. in Front Neuroeng 8, 2015), Starburst (Li et al. in IEEE Computer Society conference on computer vision and pattern recognition workshops (CVPR workshops), IEEE, pp 79–79, 2005), and Świrski (Świrski et al. in Proceedings of the symposium on eye tracking research and applications (ETRA), ACM, pp 173–176, 2012. doi:10.1145/2168556.2168585). We compare their performance on a large-scale data set consisting of 225,569 annotated eye images taken from four publicly available data sets. Our experimental results show that ElSe (Fuhl et al. 2016) outperforms the other pupil detection methods by a large margin, thus offering robust and accurate pupil positions on challenging everyday eye images.
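Evaluations of this kind typically report a detection rate: the fraction of eye images in which the detected pupil centre lies within a fixed Euclidean pixel distance of the hand-annotated ground truth. A minimal sketch of that metric is shown below; the function name and the 5-pixel default tolerance are illustrative assumptions, not code from the paper:

```python
import math

def detection_rate(predictions, ground_truth, max_error_px=5.0):
    """Fraction of frames whose predicted pupil centre lies within
    max_error_px pixels (Euclidean distance) of the annotated centre.
    Frames with no prediction (None) count as misses."""
    hits = 0
    for pred, gt in zip(predictions, ground_truth):
        if pred is None:
            continue  # detector produced no pupil candidate
        if math.dist(pred, gt) <= max_error_px:
            hits += 1
    return hits / len(ground_truth)

# Example: 3 of 4 frames detected within tolerance
preds = [(100, 80), (101, 82), None, (300, 200)]
gts   = [(100, 80), (104, 86), (50, 50), (301, 201)]
print(detection_rate(preds, gts))  # → 0.75
```

Sweeping `max_error_px` from 0 upward yields the detection-rate curves commonly used to compare such algorithms across error tolerances.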

References

[1]
Braunagel, C., Kasneci, E., Stolzmann, W., Rosenstiel, W.: Driver-activity recognition in the context of conditionally autonomous driving. In: 2015 IEEE 18th International Conference on Intelligent Transportation Systems, pp. 1652–1657 (2015)
[2]
Bulling, A., Ward, J.A., Gellersen, H., Tröster, G.: Eye movement analysis for activity recognition using electrooculography. IEEE Trans. Pattern Anal. Mach. Intell. 33(4), 741–753 (2011)
[3]
Bulling, A., Weichel, C., Gellersen, H.: EyeContext: recognition of high-level contextual cues from human visual behaviour. In: Proceedings of the 31st SIGCHI International Conference on Human Factors in Computing Systems (CHI), pp. 305–308 (2013)
[4]
Douglas, D.H., Peucker, T.K.: Algorithms for the reduction of the number of points required to represent a digitized line or its caricature. Cartogr. Int. J. Geogr. Inf. Geovisualization 10(2), 112–122 (1973)
[5]
Fuhl, W., Kübler, T., Sippel, K., Rosenstiel, W., Kasneci, E.: ExCuSe: robust pupil detection in real-world scenarios. In: Azzopardi, G., Petkov, N. (eds.) Computer Analysis of Images and Patterns, pp. 39–51. Springer, New York (2015)
[6]
Fuhl, W., Santini, T.C., Kübler, T., Kasneci, E.: ElSe: ellipse selection for robust pupil detection in real-world environments. In: Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications, ETRA '16, pp. 123–130. ACM, New York, NY, USA (2016)
[7]
Goñi, S., Echeto, J., Villanueva, A., Cabeza, R.: Robust algorithm for pupil-glint vector detection in a video-oculography eyetracking system. In: Proceedings of the 17th International Conference on Pattern Recognition (ICPR 2004). IEEE (2004)
[8]
Javadi, A.H., Hakimi, Z., Barati, M., Walsh, V., Tcheang, L.: SET: a pupil detection method using sinusoidal approximation. Front. Neuroeng. 8, 4 (2015)
[9]
Jian, M., Lam, K.M.: Simultaneous hallucination and recognition of low-resolution faces based on singular value decomposition. IEEE Trans. Circuits Syst. Video Technol. 25(11), 1761–1772 (2015)
[10]
Jian, M., Lam, K.M., Dong, J.: A novel face-hallucination scheme based on singular value decomposition. Pattern Recognit. 46(11), 3091–3102 (2013)
[11]
Jian, M., Lam, K.M., Dong, J.: Facial-feature detection and localization based on a hierarchical scheme. Inf. Sci. 262, 1–14 (2014)
[12]
Kasneci, E.: Towards the automated recognition of assistance need for drivers with impaired visual field. PhD thesis, University of Tübingen, Tübingen (2013). http://tobias-lib.uni-tuebingen.de/volltexte/2013/7033
[13]
Kasneci, E., Sippel, K., Aehling, K., Heister, M., Rosenstiel, W., Schiefer, U., Papageorgiou, E.: Driving with binocular visual field loss? A study on a supervised on-road parcours with simultaneous eye and head tracking. PLoS One 9(2), e87470 (2014)
[14]
Kasneci, E., Sippel, K., Heister, M., Aehling, K., Rosenstiel, W., Schiefer, U., Papageorgiou, E.: Homonymous visual field loss and its impact on visual exploration: a supermarket study. TVST 3(6), 2 (2014)
[15]
Kassner, M., Patera, W., Bulling, A.: Pupil: an open source platform for pervasive eye tracking and mobile gaze-based interaction. In: Adjunct Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp), pp. 1151–1160 (2014)
[16]
Kasneci, E., Kasneci, G., Kübler, T.C., Rosenstiel, W.: Online recognition of fixations, saccades, and smooth pursuits for automated analysis of traffic hazard perception. In: Artificial Neural Networks: Methods and Applications in Bio-Neuroinformatics, pp. 411–434. Springer International Publishing (2015)
[17]
Keil, A., Albuquerque, G., Berger, K., Magnor, M.A.: Real-time gaze tracking with a consumer-grade video camera. In: Václav, S. (ed.) WSCG'2010, February 1–4, 2010. UNION Agency–Science Press, Plzen (2010)
[18]
Li, D., Winfield, D., Parkhurst, D.J.: Starburst: a hybrid algorithm for video-based eye tracking combining feature-based and model-based approaches. In: IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPR Workshops), pp. 79–79. IEEE (2005)
[19]
Lin, L., Pan, L., Wei, L., Yu, L.: A robust and accurate detection of pupil images. In: 3rd International Conference on Biomedical Engineering and Informatics (BMEI), vol. 1, pp. 70–74. IEEE (2010)
[20]
Liu, X., Xu, F., Fujimura, K.: Real-time eye detection and tracking for driver observation under various light conditions. In: Intelligent Vehicle Symposium, vol. 2, pp. 344–351. IEEE (2002)
[21]
Long, X., Tonguz, O.K., Kiderman, A.: A high speed eye tracking system with robust pupil center estimation algorithm. In: 29th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBS 2007). IEEE (2007)
[22]
Majaranta, P., Bulling, A.: Eye Tracking and Eye-Based Human-Computer Interaction. Advances in Physiological Computing. Springer, London (2014)
[23]
Mohammed, G.J., Hong, B.R., Jarjes, A.A.: Accurate pupil features extraction based on new projection function. Comput. Inf. 29(4), 663–680 (2012)
[24]
Pérez, A., Cordoba, M.L., Garcia, A., Méndez, R., Munoz, M.L., Pedraza, J.L., Sanchez, F.: A precise eye-gaze detection and tracking system. In: Václav, S. (ed.) WSCG'2003: Posters: The 11th International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision, pp. 105–108. UNION Agency, Plzen (2003)
[25]
Schnipke, S.K., Todd, M.W.: Trials and tribulations of using an eye-tracking system. In: CHI'00 Extended Abstracts on Human Factors in Computing Systems. ACM (2000)
[26]
Sippel, K., Kasneci, E., Aehling, K., Heister, M., Rosenstiel, W., Schiefer, U., Papageorgiou, E.: Binocular glaucomatous visual field loss and its impact on visual exploration: a supermarket study. PLoS One 9(8), e106089 (2014)
[27]
Stellmach, S., Dachselt, R.: Look & touch: gaze-supported target acquisition. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 2981–2990. ACM (2012)
[28]
Sugano, Y., Bulling, A.: Self-calibrating head-mounted eye trackers using egocentric visual saliency. In: Proceedings of the 28th ACM Symposium on User Interface Software and Technology (UIST), pp. 363–372 (2015)
[29]
Suzuki, S., et al.: Topological structural analysis of digitized binary images by border following. Comput. Vis. Graph. Image Process. 30(1), 32–46 (1985)
[30]
Świrski, L., Bulling, A., Dodgson, N.: Robust real-time pupil tracking in highly off-axis images. In: Proceedings of the Symposium on Eye Tracking Research & Applications (ETRA), pp. 173–176. ACM (2012)
[31]
Tafaj, E., Kübler, T., Kasneci, G., Rosenstiel, W., Bogdan, M.: Online classification of eye tracking data for automated analysis of traffic hazard perception. In: Artificial Neural Networks and Machine Learning (ICANN 2013), vol. 8131, pp. 442–450. Springer, Berlin, Heidelberg (2013)
[32]
Tonsen, M., Zhang, X., Sugano, Y., Bulling, A.: Labelled pupils in the wild: a dataset for studying pupil detection in unconstrained environments. In: Proceedings of the ACM International Symposium on Eye Tracking Research & Applications (ETRA), pp. 139–142 (2016)
[33]
Trösterer, S., Meschtscherjakov, A., Wilfinger, D., Tscheligi, M.: Eye tracking in the car: challenges in a dual-task scenario on a test track. In: Proceedings of the 6th AutomotiveUI. ACM (2014)
[34]
Turner, J., Bulling, A., Alexander, J., Gellersen, H.: Cross-device gaze-supported point-to-point content transfer. In: Proceedings of the ACM International Symposium on Eye Tracking Research & Applications (ETRA), pp. 19–26 (2014)
[35]
Valenti, R., Gevers, T.: Accurate eye center location through invariant isocentric patterns. IEEE Trans. Pattern Anal. Mach. Intell. 34(9), 1785–1798 (2012)
[36]
Viola, P., Jones, M.: Rapid object detection using a boosted cascade of simple features. In: IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR 2001), vol. 1, pp. I-511. IEEE (2001)
[37]
Wood, E., Bulling, A.: EyeTab: model-based gaze estimation on unmodified tablet computers. In: Proceedings of the 8th Symposium on Eye Tracking Research & Applications (ETRA), pp. 207–210 (2014)
[38]
Zhu, D., Moore, S.T., Raphan, T.: Robust pupil center detection using a curvature algorithm. Comput. Methods Progr. Biomed. 59(3), 145–157 (1999)



Published In

Machine Vision and Applications, Volume 27, Issue 8
November 2016
237 pages

Publisher

Springer-Verlag

Berlin, Heidelberg


Author Tags

  1. Computer vision
  2. Data set
  3. Head-mounted eye tracking
  4. Image processing
  5. Pupil detection

Cited By

  • (2024) EyeTrAES: Fine-grained, Low-Latency Eye Tracking via Adaptive Event Slicing. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8(4), 1–32. doi:10.1145/3699745
  • (2024) Using Deep Learning to Increase Eye-Tracking Robustness, Accuracy, and Precision in Virtual Reality. Proceedings of the ACM on Computer Graphics and Interactive Techniques 7(2), 1–16. doi:10.1145/3654705
  • (2024) Zero-Shot Segmentation of Eye Features Using the Segment Anything Model (SAM). Proceedings of the ACM on Computer Graphics and Interactive Techniques 7(2), 1–16. doi:10.1145/3654704
  • (2024) Deep face profiler (DeFaP). Expert Systems with Applications 254. doi:10.1016/j.eswa.2024.124425
  • (2024) Eye detection and coarse localization of pupil for video-based eye tracking systems. Expert Systems with Applications 236. doi:10.1016/j.eswa.2023.121316
  • (2023) Usability of Chat and Forum Discussion Tools in Higher Education. Proceedings of the 35th Australian Computer-Human Interaction Conference, pp. 504–517. doi:10.1145/3638380.3638392
  • (2023) V-ir-Net: A Novel Neural Network for Pupil and Corneal Reflection Detection trained on Simulated Light Distributions. Proceedings of the 25th International Conference on Mobile Human-Computer Interaction, pp. 1–7. doi:10.1145/3565066.3608690
  • (2023) Pupil centre's localization with transformer without real pupil. Multimedia Tools and Applications 82(16), 25467–25484. doi:10.1007/s11042-023-14403-3
  • (2021) A Novel Gaze Gesture Sensor for Smart Glasses Based on Laser Self-Mixing. Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems, pp. 1–6. doi:10.1145/3411763.3451621
  • (2021) Robust and accurate pupil detection for head-mounted eye tracking. Computers and Electrical Engineering 93. doi:10.1016/j.compeleceng.2021.107193
