ETRA '20 short paper · DOI: 10.1145/3379156.3391365

Positional head-eye tracking outside the lab: an open-source solution

Published: 02 June 2020

Abstract

Simultaneous head and eye tracking has traditionally been confined to the laboratory, and real-world motion tracking has been limited to measuring linear acceleration and angular velocity. Recently available mobile devices such as the Pupil Core eye tracker and the Intel RealSense T265 motion tracker promise accurate measurements outside the lab. Here, the authors propose a hardware and software framework that combines both devices into a robust, usable, low-cost head and eye tracking system. The software is open source, and the required hardware modifications can be 3D printed. The authors demonstrate the system's ability to measure head and eye movements in two tasks: an eyes-fixed head rotation task eliciting the vestibulo-ocular reflex inside the laboratory, and a natural locomotion task in which a subject walks around a building outside the laboratory. The resulting head and eye movements are discussed, as well as future extensions of this system.
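The core operation in a combined system like the one described is expressing the head-fixed gaze direction from the eye tracker in world coordinates using the head pose reported by the motion tracker. The sketch below is not the authors' implementation; the quaternion convention (w, x, y, z), the frame alignment between the two devices, and the sample values are illustrative assumptions:

```python
import numpy as np

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    u = np.array([x, y, z])
    # Efficient rotation formula for unit quaternions:
    # v' = v + 2 * u x (u x v + w * v)
    return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

# Head orientation from the motion tracker (illustrative: 90 deg yaw about +y).
head_q = np.array([np.cos(np.pi / 4), 0.0, np.sin(np.pi / 4), 0.0])

# Gaze direction in the head frame from the eye tracker
# (illustrative: looking straight ahead, -z forward).
gaze_head = np.array([0.0, 0.0, -1.0])

# Gaze direction in world coordinates.
gaze_world = quat_rotate(head_q, gaze_head)
```

During an eyes-fixed head rotation task such as the VOR task described above, the world-frame gaze vector should stay roughly constant while the head-frame gaze counter-rotates, which gives a quick sanity check of device synchronization and calibration.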





Published In

ETRA '20 Short Papers: ACM Symposium on Eye Tracking Research and Applications
June 2020
305 pages
ISBN:9781450371346
DOI:10.1145/3379156
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. Eye tracking
  2. gaze estimation
  3. head tracking
  4. locomotion
  5. mobile
  6. open source
  7. simultaneous localization and mapping

Qualifiers

  • Short-paper
  • Research
  • Refereed limited

Funding Sources

  • NIH
  • NSF

Conference

ETRA '20

Acceptance Rates

Overall Acceptance Rate 69 of 137 submissions, 50%

Article Metrics

  • Downloads (last 12 months): 57
  • Downloads (last 6 weeks): 7
Reflects downloads up to 06 Jan 2025.


Cited By

  • (2024) The visual experience dataset: Over 200 recorded hours of integrated eye movement, odometry, and egocentric video. Journal of Vision 24, 11 (6). DOI: 10.1167/jov.24.11.6
  • (2024) Gaze and behavioural metrics in the refractive correction of presbyopia. Ophthalmic and Physiological Optics 44, 4, 774–786. DOI: 10.1111/opo.13310
  • (2023) Noise estimation for head-mounted 3D binocular eye tracking using Pupil Core eye-tracking goggles. Behavior Research Methods 56, 1, 53–79. DOI: 10.3758/s13428-023-02150-0
  • (2023) GFIE: A dataset and baseline for gaze-following from 2D to 3D in indoor environments. 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 8907–8916. DOI: 10.1109/CVPR52729.2023.00860
  • (2023) Eye tracking in virtual reality: A broad review of applications and challenges. Virtual Reality 27, 2, 1481–1505. DOI: 10.1007/s10055-022-00738-z
  • (2023) Gaze tracking: A survey of devices, libraries and applications. Modelling and Development of Intelligent Systems, 18–41. DOI: 10.1007/978-3-031-27034-5_2
  • (2022) High-fidelity eye, head, body, and world tracking with a wearable device. Behavior Research Methods 56, 1, 32–42. DOI: 10.3758/s13428-022-01888-3
  • (2022) Consider the head movements! Saccade computation in mobile eye-tracking. 2022 Symposium on Eye Tracking Research and Applications, 1–7. DOI: 10.1145/3517031.3529624
  • (2021) ARETT: Augmented Reality Eye Tracking Toolkit for head mounted displays. Sensors 21, 6 (2234). DOI: 10.3390/s21062234
  • (2021) Pupil tracking under direct sunlight. ACM Symposium on Eye Tracking Research and Applications, 1–4. DOI: 10.1145/3450341.3458490
