Pupil detection for head-mounted eye tracking in the wild: an evaluation of the state of the art

  • Special Issue Paper
  • Published:
Machine Vision and Applications

Abstract

Robust and accurate detection of the pupil position is a key building block for head-mounted eye tracking and a prerequisite for applications built on top of it, such as gaze-based human–computer interaction or attention analysis. Despite a large body of work, detecting the pupil in images recorded under real-world conditions is challenging, given the significant variability in eye appearance (e.g., illumination, reflections, or occlusions), individual differences in eye physiology, and other sources of noise, such as contact lenses or make-up. In this paper, we review six state-of-the-art pupil detection methods, namely ElSe (Fuhl et al. in Proceedings of the ninth biennial ACM symposium on eye tracking research & applications, ACM, New York, NY, USA, pp 123–130, 2016), ExCuSe (Fuhl et al. in Computer analysis of images and patterns. Springer, New York, pp 39–51, 2015), Pupil Labs (Kassner et al. in Adjunct proceedings of the 2014 ACM international joint conference on pervasive and ubiquitous computing (UbiComp), pp 1151–1160, 2014. doi:10.1145/2638728.2641695), SET (Javadi et al. in Front Neuroeng 8, 2015), Starburst (Li et al. in Computer vision and pattern recognition-workshops, 2005. IEEE Computer society conference on CVPR workshops. IEEE, pp 79–79, 2005), and Świrski (Świrski et al. in Proceedings of the symposium on eye tracking research and applications (ETRA). ACM, pp 173–176, 2012. doi:10.1145/2168556.2168585). We compare their performance on a large-scale data set consisting of 225,569 annotated eye images taken from four publicly available data sets. Our experimental results show that the algorithm ElSe (Fuhl et al. 2016) outperforms the other pupil detection methods by a large margin, thus offering robust and accurate pupil positions on challenging everyday eye images.
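Benchmark studies of this kind typically score each method by its detection rate: the fraction of eye images in which the estimated pupil center lies within a fixed pixel distance of the hand-annotated ground truth (a 5-pixel threshold is common in this line of work). A minimal sketch of that metric; the function name and toy data below are illustrative, not taken from the paper:

```python
import math

def detection_rate(estimates, ground_truth, max_error_px=5.0):
    """Fraction of frames whose estimated pupil center lies within
    max_error_px pixels of the annotated center.

    estimates / ground_truth: lists of (x, y) tuples; an estimate of
    None counts as a miss (no pupil detected in that frame).
    """
    hits = 0
    for est, gt in zip(estimates, ground_truth):
        if est is None:
            continue  # detector reported no pupil: counts as a miss
        if math.dist(est, gt) <= max_error_px:
            hits += 1
    return hits / len(ground_truth)

# Toy example: three frames; one hit, one non-detection,
# and one detection off by more than 5 px.
est = [(100.0, 80.0), None, (42.0, 40.0)]
gt = [(101.0, 81.0), (60.0, 55.0), (50.0, 40.0)]
print(detection_rate(est, gt))  # 0.333... (1 of 3 frames within 5 px)
```

Reporting the rate as a function of the error threshold (rather than at a single cutoff) shows how gracefully each detector degrades, which is how such comparisons are usually visualized.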


Figures 1–14 are available in the full text of the article.

Notes

  1. https://www.ti.uni-tuebingen.de/Pupil-detection.1827.0.html?&L=1.

  2. http://thirtysixthspan.com/openEyes/software.html.

  3. https://sites.google.com/site/eyegoeyetracker/.

  4. https://www.cl.cam.ac.uk/research/rainbow/projects/pupiltracking/.

  5. https://github.com/pupil-labs/pupil.

  6. https://www.ti.uni-tuebingen.de/Pupil-detection.1827.0.html?&L=1.

References

  1. Braunagel, C., Kasneci, E., Stolzmann, W., Rosenstiel, W.: Driver-activity recognition in the context of conditionally autonomous driving. In: 2015 IEEE 18th International Conference on Intelligent Transportation Systems, pp 1652–1657 (2015). doi:10.1109/ITSC.2015.268

  2. Bulling, A., Ward, J.A., Gellersen, H., Tröster, G.: Eye movement analysis for activity recognition using electrooculography. IEEE Trans. Pattern Anal. Mach. Intell. 33(4), 741–753 (2011). doi:10.1109/TPAMI.2010.86

  3. Bulling, A., Weichel, C., Gellersen, H.: Eyecontext: recognition of high-level contextual cues from human visual behaviour. In: Proceedings of the 31st SIGCHI International Conference on Human Factors in Computing Systems (CHI), pp. 305–308 (2013). doi:10.1145/2470654.2470697

  4. Douglas, D.H., Peucker, T.K.: Algorithms for the reduction of the number of points required to represent a digitized line or its caricature. Cartogr. Int. J. Geogr. Inf. Geovisualization 10(2), 112–122 (1973)

  5. Fuhl, W., Kübler, T., Sippel, K., Rosenstiel, W., Kasneci, E.: ExCuSe: robust pupil detection in real-world scenarios. In: Azzopardi, G., Petkov, N. (eds.) Computer Analysis of Images and Patterns, Springer, New York, pp. 39–51 (2015)

  6. Fuhl, W., Santini, T.C., Kübler, T., Kasneci, E.: ElSe: ellipse selection for robust pupil detection in real-world environments. In: Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications, ETRA '16, pp. 123–130. ACM, New York, NY, USA (2016)

  7. Goni, S., Echeto, J., Villanueva, A., Cabeza, R.: Robust algorithm for pupil-glint vector detection in a video-oculography eyetracking system. In: Proceedings of the 17th International Conference on Pattern Recognition (ICPR 2004). IEEE (2004)

  8. Javadi, A.H., Hakimi, Z., Barati, M., Walsh, V., Tcheang, L.: SET: a pupil detection method using sinusoidal approximation. Front. Neuroeng. 8, 4 (2015)

  9. Jian, M., Lam, K.M.: Simultaneous hallucination and recognition of low-resolution faces based on singular value decomposition. IEEE Trans. Circuits Syst. Video Technol. 25(11), 1761–1772 (2015)

  10. Jian, M., Lam, K.M., Dong, J.: A novel face-hallucination scheme based on singular value decomposition. Pattern Recognit. 46(11), 3091–3102 (2013)

  11. Jian, M., Lam, K.M., Dong, J.: Facial-feature detection and localization based on a hierarchical scheme. Inf. Sci. 262, 1–14 (2014)

  12. Kasneci, E.: Towards the automated recognition of assistance need for drivers with impaired visual field. PhD thesis, University of Tübingen, Tübingen (2013). http://tobias-lib.uni-tuebingen.de/volltexte/2013/7033

  13. Kasneci, E., Sippel, K., Aehling, K., Heister, M., Rosenstiel, W., Schiefer, U., Papageorgiou, E.: Driving with binocular visual field loss? A study on a supervised on-road parcours with simultaneous eye and head tracking. PLoS One 9(2), e87470 (2014)

  14. Kasneci, E., Sippel, K., Heister, M., Aehling, K., Rosenstiel, W., Schiefer, U., Papageorgiou, E.: Homonymous visual field loss and its impact on visual exploration: A supermarket study. TVST 3(6), 2 (2014)

  15. Kassner, M., Patera, W., Bulling, A.: Pupil: an open source platform for pervasive eye tracking and mobile gaze-based interaction. In: Adjunct Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp), pp. 1151–1160 (2014). doi:10.1145/2638728.2641695

  16. Kasneci, E., Kasneci, G., Kübler, T.C., Rosenstiel, W.: Online recognition of fixations, saccades, and smooth pursuits for automated analysis of traffic hazard perception. In: Artificial Neural Networks: Methods and Applications in Bio-/Neuroinformatics, pp. 411–434. Springer International Publishing (2015). doi:10.1007/978-3-319-09903-3_20

  17. Keil, A., Albuquerque, G., Berger, K., Magnor, M.A.: Real-time gaze tracking with a consumer-grade video camera. In: Vaclav, S. (ed.) WSCG'2010, February 1–4, 2010. UNION Agency–Science Press, Plzen (2010)

  18. Li, D., Winfield, D., Parkhurst, D.J.: Starburst: A hybrid algorithm for video-based eye tracking combining feature-based and model-based approaches. In: Computer Vision and Pattern Recognition-Workshops, 2005. IEEE Computer Society Conference on CVPR Workshops. IEEE, pp. 79–79 (2005)

  19. Lin, L., Pan, L., Wei, L., Yu, L.: A robust and accurate detection of pupil images. In: 3rd International Conference on Biomedical Engineering and Informatics (BMEI), 2010, IEEE, vol. 1, pp. 70–74 (2010)

  20. Liu, X., Xu, F., Fujimura, K.: Real-time eye detection and tracking for driver observation under various light conditions. In: Intelligent Vehicle Symposium, 2002, IEEE, vol. 2, pp. 344–351. IEEE (2002)

  21. Long, X., Tonguz, O.K., Kiderman, A.: A high speed eye tracking system with robust pupil center estimation algorithm. In: Engineering in Medicine and Biology Society, 2007. EMBS 2007. 29th Annual International Conference of the IEEE. IEEE (2007)

  22. Majaranta, P., Bulling, A.: Eye Tracking and Eye-Based Human-Computer Interaction. Advances in Physiological Computing. Springer, London (2014). doi:10.1007/978-1-4471-6392-3_3

  23. Mohammed, G.J., Hong, B.R., Jarjes, A.A.: Accurate pupil features extraction based on new projection function. Comput. Inf. 29(4), 663–680 (2012)

  24. Pérez, A., Cordoba, M.L., Garcia, A., Méndez, R., Munoz, M.L., Pedraza, J.L., Sanchez, F.: A precise eye-gaze detection and tracking system. In: Václav, S. (ed.) WSCG '2003: Posters: The 11th International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision 2003, pp. 105–108. UNION Agency, Plzen (2003)

  25. Schnipke, S.K., Todd, M.W.: Trials and tribulations of using an eye-tracking system. In: CHI’00 Extended Abstracts on Human Factors in Computing Systems. ACM (2000)

  26. Sippel, K., Kasneci, E., Aehling, K., Heister, M., Rosenstiel, W., Schiefer, U., Papageorgiou, E.: Binocular glaucomatous visual field loss and its impact on visual exploration—a supermarket study. PLoS One 9(8), e106089 (2014). doi:10.1371/journal.pone.0106089

  27. Stellmach, S., Dachselt, R.: Look & touch: gaze-supported target acquisition. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 2981–2990. ACM (2012)

  28. Sugano, Y., Bulling, A.: Self-calibrating head-mounted eye trackers using egocentric visual saliency. In: Proceedings of the 28th ACM Symposium on User Interface Software and Technology (UIST), pp. 363–372 (2015). doi:10.1145/2807442.2807445

  29. Suzuki, S., et al.: Topological structural analysis of digitized binary images by border following. Comput. Vis. Graphics Image Process. 30(1), 32–46 (1985)

  30. Świrski, L., Bulling, A., Dodgson, N.: Robust real-time pupil tracking in highly off-axis images. In: Proceedings of the Symposium on Eye Tracking Research & Applications (ETRA), pp. 173–176. ACM (2012). doi:10.1145/2168556.2168585

  31. Tafaj, E., Kübler, T., Kasneci, G., Rosenstiel, W., Bogdan, M.: Online classification of eye tracking data for automated analysis of traffic hazard perception. In: Artificial Neural Networks and Machine Learning, ICANN 2013, vol. 8131, pp. 442–450. Springer, Berlin, Heidelberg (2013)

  32. Tonsen, M., Zhang, X., Sugano, Y., Bulling, A.: Labelled pupils in the wild: a dataset for studying pupil detection in unconstrained environments. In: Proceedings of the ACM International Symposium on Eye Tracking Research & Applications (ETRA), pp. 139–142 (2016). doi:10.1145/2857491.2857520

  33. Trösterer, S., Meschtscherjakov, A., Wilfinger, D., Tscheligi, M.: Eye tracking in the car: challenges in a dual-task scenario on a test track. In: Proceedings of the 6th AutomotiveUI. ACM (2014)

  34. Turner, J., Bulling, A., Alexander, J., Gellersen, H.: Cross-device gaze-supported point-to-point content transfer. In: Proceedings of the ACM International Symposium on Eye Tracking Research & Applications (ETRA), pp. 19–26 (2014). doi:10.1145/2578153.2578155

  35. Valenti, R., Gevers, T.: Accurate eye center location through invariant isocentric patterns. Trans. Pattern Anal. Mach. Intell. 34(9), 1785–1798 (2012)

  36. Viola, P., Jones, M.: Rapid object detection using a boosted cascade of simple features. In: Computer Vision and Pattern Recognition, 2001. Proceedings of the 2001 IEEE Computer Society Conference on CVPR 2001, vol. 1, pp. I–511. IEEE (2001)

  37. Wood, E., Bulling, A.: Eyetab: Model-based gaze estimation on unmodified tablet computers. In: Proceedings of the 8th Symposium on Eye Tracking Research & Applications (ETRA), pp. 207–210 (2014). doi:10.1145/2578153.2578185

  38. Zhu, D., Moore, S.T., Raphan, T.: Robust pupil center detection using a curvature algorithm. Comput. Methods Progr. Biomed. 59(3), 145–157 (1999)


Acknowledgments

This work was funded, in part, by the Cluster of Excellence on Multimodal Computing and Interaction (MMCI) at Saarland University as well as a JST CREST research grant.

Corresponding author

Correspondence to Wolfgang Fuhl.

Cite this article

Fuhl, W., Tonsen, M., Bulling, A. et al. Pupil detection for head-mounted eye tracking in the wild: an evaluation of the state of the art. Machine Vision and Applications 27, 1275–1288 (2016). https://doi.org/10.1007/s00138-016-0776-4
