Research Article | Open Access

Using Deep Learning to Increase Eye-Tracking Robustness, Accuracy, and Precision in Virtual Reality

Published: 17 May 2024

Abstract

Algorithms for estimating gaze direction from mobile and video-based eye trackers typically track a feature of the eye that moves through the eye camera image in a way that covaries with shifting gaze direction, such as the center or boundaries of the pupil. Tracking these features with traditional computer vision techniques can be difficult due to partial occlusion and environmental reflections. Although recent efforts to use machine learning (ML) for pupil tracking have demonstrated superior results when evaluated using standard measures of segmentation performance, little is known about how these networks affect the quality of the final gaze estimate. This work provides an objective assessment of the impact of several contemporary ML-based methods for eye feature tracking when the subsequent gaze estimate is produced using either feature-based or model-based methods. Metrics include the accuracy and precision of the gaze estimate, as well as the drop-out rate.


Cited By

  • (2024) Optic flow density modulates corner-cutting in a virtual steering task for younger and older adults. Scientific Reports 14(1). https://doi.org/10.1038/s41598-024-78645-3. Online publication date: 12-Nov-2024.



Published In

Proceedings of the ACM on Computer Graphics and Interactive Techniques, Volume 7, Issue 2
May 2024
101 pages
EISSN: 2577-6193
DOI: 10.1145/3665652
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. eye tracking
    2. gaze estimation
    3. neural networks
    4. virtual reality

    Qualifiers

    • Research-article
    • Research
    • Refereed


    Bibliometrics & Citations

    Bibliometrics

    Article Metrics

• Downloads (last 12 months): 369
    • Downloads (last 6 weeks): 77
    Reflects downloads up to 10 Dec 2024.

