DOI: 10.1145/3025453.3025599
CHI Conference Proceedings
Research article (Open access)

Toward Everyday Gaze Input: Accuracy and Precision of Eye Tracking and Implications for Design

Published: 02 May 2017

Abstract

For eye tracking to become a ubiquitous part of our everyday interaction with computers, we first need to understand its limitations outside rigorously controlled labs, and develop robust applications that can be used by a broad range of users and in various environments. Toward this end, we collected eye tracking data from 80 people in a calibration-style task, using two different trackers in two lighting conditions. We found that accuracy and precision can vary more than six-fold between users and targets, and report on differences between lighting conditions, trackers, and screen regions. We show how such data can be used to determine appropriate target sizes and to optimize the parameters of commonly used filters. We conclude with design recommendations and examples of how our findings and methodology can inform the design of error-aware adaptive applications.
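
The accuracy and precision measures the abstract refers to have standard operational definitions: accuracy is the mean offset between recorded gaze and the true target position, and precision is the dispersion of successive samples, often reported as the RMS of sample-to-sample distances. A minimal Python sketch of these definitions follows; the synthetic data, the function names, and the closing target-size heuristic are illustrative assumptions, not the authors' exact procedure.

    # Minimal sketch (not the authors' code) of standard gaze data quality
    # measures, with positions in degrees of visual angle.
    import numpy as np

    def accuracy(gaze_xy, target_xy):
        """Accuracy: mean Euclidean offset of gaze samples from the known target."""
        return np.linalg.norm(gaze_xy - target_xy, axis=1).mean()

    def precision_rms(gaze_xy):
        """Precision: RMS of distances between successive samples (dispersion)."""
        step = np.diff(gaze_xy, axis=0)
        return np.sqrt((np.linalg.norm(step, axis=1) ** 2).mean())

    # Illustrative data: 60 noisy samples around a target at (10, 5) degrees,
    # with a constant (0.5, -0.2) degree offset standing in for calibration error.
    rng = np.random.default_rng(0)
    target = np.array([10.0, 5.0])
    samples = target + np.array([0.5, -0.2]) + rng.normal(scale=0.4, size=(60, 2))

    acc = accuracy(samples, target)
    prec = precision_rms(samples)
    print(f"accuracy = {acc:.2f} deg, precision (RMS) = {prec:.2f} deg")

    # Hypothetical sizing heuristic (an assumption, not from the paper): a gaze
    # target should at least cover the systematic offset plus twice the noise.
    print(f"suggested minimum target radius = {acc + 2 * prec:.2f} deg")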

Supplementary Material

ZIP File (pn1714-file4.zip)





    Published In

    CHI '17: Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems
    May 2017
    7138 pages
    ISBN: 9781450346559
    DOI: 10.1145/3025453
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 02 May 2017


    Badges

    • Honorable Mention

    Author Tags

    1. adaptive interfaces
    2. eye tracking
    3. gaze filters
    4. sensor noise

    Qualifiers

    • Research-article

    Conference

    CHI '17

    Acceptance Rates

    CHI '17 Paper Acceptance Rate: 600 of 2,400 submissions, 25%
    Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%



    Article Metrics

    • Downloads (Last 12 months): 1,101
    • Downloads (Last 6 weeks): 95

    Reflects downloads up to 11 Jan 2025

    Cited By

    • (2025) The fundamentals of eye tracking part 4: Tools for conducting an eye tracking study. Behavior Research Methods 57:1. DOI: 10.3758/s13428-024-02529-7. Online publication date: 6-Jan-2025.
    • (2024) Interacting with Smart Virtual Assistants for Individuals with Dysarthria: A Comparative Study on Usability and User Preferences. Applied Sciences 14:4 (1409). DOI: 10.3390/app14041409. Online publication date: 8-Feb-2024.
    • (2024) Comparison of Unencumbered Interaction Technique for Head-Mounted Displays. Proceedings of the ACM on Human-Computer Interaction 8:ISS (500-516). DOI: 10.1145/3698146. Online publication date: 24-Oct-2024.
    • (2024) Combination of Augmented Reality Based Brain-Computer Interface and Eye Tracking for Control of a Multi-Robot System. Proceedings of the 2024 4th International Conference on Robotics and Control Engineering (90-95). DOI: 10.1145/3674746.3674760. Online publication date: 27-Jun-2024.
    • (2024) Shifting Focus with HCEye: Exploring the Dynamics of Visual Highlighting and Cognitive Load on User Attention and Saliency Prediction. Proceedings of the ACM on Human-Computer Interaction 8:ETRA (1-18). DOI: 10.1145/3655610. Online publication date: 28-May-2024.
    • (2024) GazeSwitch: Automatic Eye-Head Mode Switching for Optimised Hands-Free Pointing. Proceedings of the ACM on Human-Computer Interaction 8:ETRA (1-20). DOI: 10.1145/3655601. Online publication date: 28-May-2024.
    • (2024) 40 Years of Eye Typing: Challenges, Gaps, and Emergent Strategies. Proceedings of the ACM on Human-Computer Interaction 8:ETRA (1-19). DOI: 10.1145/3655596. Online publication date: 28-May-2024.
    • (2024) The Effect of Degraded Eye Tracking Accuracy on Interactions in VR. Proceedings of the 2024 Symposium on Eye Tracking Research and Applications (1-7). DOI: 10.1145/3649902.3656369. Online publication date: 4-Jun-2024.
    • (2024) Enhancing User Gaze Prediction in Monitoring Tasks: The Role of Visual Highlights. Proceedings of the 2024 Symposium on Eye Tracking Research and Applications (1-3). DOI: 10.1145/3649902.3655652. Online publication date: 4-Jun-2024.
    • (2024) Evaluation of Eye Tracking Signal Quality for Virtual Reality Applications: A Case Study in the Meta Quest Pro. Proceedings of the 2024 Symposium on Eye Tracking Research and Applications (1-8). DOI: 10.1145/3649902.3653347. Online publication date: 4-Jun-2024.
