DOI: 10.1145/2858036.2858404
research-article

Building a Personalized, Auto-Calibrating Eye Tracker from User Interactions

Published: 07 May 2016

Abstract

We present PACE, a Personalized, Automatically Calibrating Eye-tracking system that identifies and collects data unobtrusively from user interaction events on standard computing systems, without the need for specialized equipment. PACE relies on eye/facial analysis of webcam data based on a set of robust geometric gaze features and a two-layer data validation mechanism that identifies good training samples from daily interaction data. The design of the system is founded on an in-depth investigation of the relationship between gaze patterns and interaction cues, and takes into consideration user preferences and habits. The result is an adaptive, data-driven approach that continuously recalibrates, adapts, and improves with additional use. Quantitative evaluation on 31 subjects across different interaction behaviors shows that training instances identified by the PACE data collection have higher gaze-point/interaction-cue consistency than those identified by conventional approaches. An in-situ study using real-life tasks on a diverse set of interactive applications demonstrates that the PACE gaze estimation achieves an average error of 2.56°, which is comparable to the state of the art, but without the need for explicit training or calibration. This demonstrates the effectiveness of both the gaze estimation method and the corresponding data collection mechanism.
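The core idea in the abstract can be sketched in a few lines: treat each mouse click as a weak label ("the user was looking at the cursor"), validate the candidate sample against the current model before absorbing it, and recalibrate continuously as samples accumulate. The sketch below is illustrative only and is not the authors' implementation: webcam feature extraction is out of scope, `gaze_features` stands in for PACE's geometric features as a hypothetical normalized pupil-offset pair `(hx, hy)`, the validation rule is reduced to a single agreement threshold rather than the paper's two-layer mechanism, and the regressor is a simple per-axis linear fit.

```python
# Illustrative sketch of interaction-driven auto-calibration (NOT the PACE code).
# Assumption: gaze features are a normalized pupil-offset pair (hx, hy).

def _fit_line(u, v):
    """Least-squares slope/intercept for v ~ slope * u + intercept."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    var = sum((a - mu) ** 2 for a in u)
    slope = sum((a - mu) * (b - mv) for a, b in zip(u, v)) / var
    return slope, mv - slope * mu

class AutoCalibratingGaze:
    def __init__(self, agreement_px=150.0, min_samples=5):
        self.samples = []            # accepted (features, click_point) pairs
        self.model = None            # ((ax, bx), (ay, by)) per-axis fit
        self.agreement_px = agreement_px
        self.min_samples = min_samples

    def predict(self, features):
        (ax, bx), (ay, by) = self.model
        return ax * features[0] + bx, ay * features[1] + by

    def on_click(self, features, click_point):
        """Absorb one interaction-derived sample if it passes validation."""
        if self.model is not None:
            px, py = self.predict(features)
            err = ((px - click_point[0]) ** 2 + (py - click_point[1]) ** 2) ** 0.5
            if err > self.agreement_px:
                return False         # reject an inconsistent gaze/cue pair
        self.samples.append((features, click_point))
        if len(self.samples) >= self.min_samples:
            # Recalibrate: refit screen-x on hx and screen-y on hy.
            hx = [f[0] for f, _ in self.samples]
            hy = [f[1] for f, _ in self.samples]
            tx = [p[0] for _, p in self.samples]
            ty = [p[1] for _, p in self.samples]
            self.model = (_fit_line(hx, tx), _fit_line(hy, ty))
        return True
```

After enough clicks the model maps features to screen coordinates, and later samples that disagree strongly with the current prediction (e.g., clicking without looking at the cursor) are filtered out, echoing the consistency-based validation the abstract describes.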

Supplementary Material

Supplemental video: suppl.mov (pn1836-file3.mp4)
MP4 file: p5169-huang.mp4




      Published In

      CHI '16: Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems
May 2016, 6108 pages
ISBN: 9781450333627
DOI: 10.1145/2858036

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Badges

      • Best Paper

      Author Tags

      1. data validation
      2. gaze estimation
      3. gaze-interaction correspondence
      4. implicit modeling

      Qualifiers

      • Research-article

      Funding Sources

      • Hong Kong Research Grants Council

      Conference

CHI '16: CHI Conference on Human Factors in Computing Systems
May 7-12, 2016
San Jose, California, USA

      Acceptance Rates

CHI '16 Paper Acceptance Rate: 565 of 2,435 submissions, 23%
Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%

      Cited By

• (2024) CalibRead: Unobtrusive Eye Tracking Calibration from Natural Reading Behavior. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8(4), 1-30. DOI: 10.1145/3699737
• (2024) PrivatEyes: Appearance-based Gaze Estimation Using Federated Secure Multi-Party Computation. Proceedings of the ACM on Human-Computer Interaction 8(ETRA), 1-23. DOI: 10.1145/3655606
• (2024) GazeSwitch: Automatic Eye-Head Mode Switching for Optimised Hands-Free Pointing. Proceedings of the ACM on Human-Computer Interaction 8(ETRA), 1-20. DOI: 10.1145/3655601
• (2024) Instant interaction driven adaptive gaze control interface. Scientific Reports 14(1). DOI: 10.1038/s41598-024-62365-9
• (2024) A review on visible-light eye-tracking methods based on a low-cost camera. Journal of Ambient Intelligence and Humanized Computing 15(4), 2381-2397. DOI: 10.1007/s12652-024-04760-8
• (2024) Using eye-tracking in education: review of empirical research and technology. Educational Technology Research and Development. DOI: 10.1007/s11423-024-10342-4
• (2024) Using Cockpit Interactions for Implicit Eye-Tracking Calibration in a Flight Simulator. Computer Vision, Imaging and Computer Graphics Theory and Applications, 256-270. DOI: 10.1007/978-3-031-66743-5_12
• (2023) An End-to-End Review of Gaze Estimation and its Interactive Applications on Handheld Mobile Devices. ACM Computing Surveys 56(2), 1-38. DOI: 10.1145/3606947
• (2023) Realization of Human Eye Pupil Detection System using Canny Edge Detector and Circular Hough Transform Technique. 2023 2nd International Conference on Applied Artificial Intelligence and Computing (ICAAIC), 861-865. DOI: 10.1109/ICAAIC56838.2023.10140671
• (2023) Audio-Driven Gaze Estimation for MOOC Video. 2023 IEEE 6th International Conference on Big Data and Artificial Intelligence (BDAI), 118-123. DOI: 10.1109/BDAI59165.2023.10256826
