Self-Calibrating Head-Mounted Eye Trackers Using Egocentric Visual Saliency

Published: 05 November 2015

Abstract

Head-mounted eye tracking has significant potential for gaze-based applications such as life logging, mental health monitoring, or the quantified self. A neglected challenge for the long-term recordings required by these applications is that drift in the initial person-specific eye tracker calibration, for example caused by physical activity, can severely impact gaze estimation accuracy and thus system performance and user experience. We first analyse calibration drift on a new dataset of natural gaze data recorded using synchronised video-based and Electrooculography-based eye trackers of 20 users performing everyday activities in a mobile setting. Based on this analysis we present a method to automatically self-calibrate head-mounted eye trackers based on a computational model of bottom-up visual saliency. Through evaluations on the dataset we show that our method 1) is effective in reducing calibration drift in calibrated eye trackers and 2) given sufficient data, can achieve gaze estimation accuracy competitive with that of a calibrated eye tracker, without any manual calibration.
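
The paper itself details the full method; purely as an illustration of the core idea (not the authors' implementation), the sketch below treats the maximum of each scene frame's saliency map as a pseudo ground-truth gaze target and fits an affine pupil-to-scene mapping by least squares. The function names, the affine mapping model, and the use of raw saliency maxima are simplifying assumptions for illustration.

```python
import numpy as np

def saliency_peak(saliency_map):
    # Location (x, y) of the strongest response in a 2D saliency map.
    row, col = np.unravel_index(np.argmax(saliency_map), saliency_map.shape)
    return np.array([col, row], dtype=float)

def fit_affine(pupil_xy, scene_xy):
    # Least-squares affine fit: [px, py, 1] @ M ~ [sx, sy], M of shape (3, 2).
    A = np.hstack([np.asarray(pupil_xy, dtype=float),
                   np.ones((len(pupil_xy), 1))])
    M, *_ = np.linalg.lstsq(A, np.asarray(scene_xy, dtype=float), rcond=None)
    return M

def self_calibrate(pupil_samples, saliency_maps):
    # Saliency maxima serve as pseudo ground-truth gaze targets, standing in
    # for the points a user would fixate during a manual calibration.
    targets = np.stack([saliency_peak(s) for s in saliency_maps])
    return fit_affine(pupil_samples, targets)

def estimate_gaze(M, pupil_xy):
    # Map a pupil-center position into scene-camera pixel coordinates.
    return np.append(np.asarray(pupil_xy, dtype=float), 1.0) @ M
```

In practice one would aggregate many fixations, use the full saliency distribution rather than a single peak, and reject outliers before fitting; the paper evaluates such saliency-based self-calibration against a manually calibrated baseline.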

Supplementary Material

MP4 File (p363.mp4)

      Published In

      UIST '15: Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology
      November 2015
      686 pages
      ISBN: 9781450337793
      DOI: 10.1145/2807442

      Publisher

      Association for Computing Machinery, New York, NY, United States

      Author Tags

      1. calibration drift
      2. electrooculography
      3. mobile eye tracking
      4. user calibration
      5. visual saliency

      Acceptance Rates

      UIST '15 Paper Acceptance Rate: 70 of 297 submissions, 24%
      Overall Acceptance Rate: 561 of 2,567 submissions, 22%

      Upcoming Conference

      UIST '25
      The 38th Annual ACM Symposium on User Interface Software and Technology
      September 28 - October 1, 2025
      Busan , Republic of Korea

      Contributors

      Other Metrics

      Bibliometrics & Citations

      Bibliometrics

      Article Metrics

      • Downloads (Last 12 months)72
      • Downloads (Last 6 weeks)7
      Reflects downloads up to 17 Dec 2024

      Other Metrics

      Citations
