DOI: 10.1145/3450341.3458488
Short paper
Open access

Integrating High Fidelity Eye, Head and World Tracking in a Wearable Device

Published: 25 May 2021

Abstract

A challenge in mobile eye tracking is balancing the quality of the data collected with the ability of a subject to move freely and naturally through their environment. This challenge is exacerbated when an experiment requires multiple data streams to be recorded simultaneously and in high fidelity. Given these constraints, previous devices have had limited spatial and temporal resolution, as well as compression artifacts. To address this, we have designed a wearable device that records a subject’s body, head, and eye positions simultaneously with RGB and depth data from the subject’s visual environment, all at high spatial and temporal resolution. The sensors include a binocular eye tracker, an RGB-D scene camera, a high-frame-rate scene camera, and two visual odometry sensors, which we synchronize and record from, with a total incoming data rate of over 700 MB/s. All sensors are operated by a mini-PC optimized for fast data collection and powered by a small battery pack. The headset weighs only 1.4 kg and the remainder of the system just 3.9 kg, which can be comfortably worn by the subject in a small backpack, allowing full mobility.
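To make the quoted aggregate bandwidth concrete, the sketch below gives a minimal back-of-the-envelope estimate of how several uncompressed sensor streams can sum to more than 700 MB/s. Every resolution, frame rate, bit depth, and stream count in the script is an assumption chosen for illustration; the abstract does not specify these figures, and the actual device configuration may differ.

```python
# Illustrative estimate of aggregate raw data rate for a multi-sensor rig
# like the one described in the abstract. All sensor specifications below
# are ASSUMED values for illustration only, not figures from the paper.

SENSORS = {
    # name: (width, height, bytes_per_pixel, frames_per_second, num_streams)
    "binocular_eye_tracker":      (400, 400, 1, 200, 2),   # assumed: two IR eye cameras
    "rgbd_scene_camera_rgb":      (1920, 1080, 3, 30, 1),  # assumed: 1080p color stream
    "rgbd_scene_camera_depth":    (1280, 720, 2, 30, 1),   # assumed: 16-bit depth stream
    "high_framerate_scene_camera": (1920, 1080, 1, 200, 1),  # assumed: mono, 200 fps
    "visual_odometry_fisheye":    (848, 800, 1, 30, 4),    # assumed: 2 devices x 2 cameras
}

def stream_rate_mb_s(width, height, bytes_per_pixel, fps, streams):
    """Uncompressed data rate of one sensor entry in MB/s (1 MB = 1e6 bytes)."""
    return width * height * bytes_per_pixel * fps * streams / 1e6

total = 0.0
for name, spec in SENSORS.items():
    rate = stream_rate_mb_s(*spec)
    total += rate
    print(f"{name:30s} {rate:8.1f} MB/s")

print(f"{'total (uncompressed)':30s} {total:8.1f} MB/s")
```

Under these assumed specifications the uncompressed total comes to roughly 800 MB/s, which illustrates why sustained disk throughput, rather than any single sensor, becomes the limiting factor for a recording system of this kind.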

Published In

ETRA '21 Adjunct: ACM Symposium on Eye Tracking Research and Applications
May 2021
78 pages
ISBN: 9781450383578
DOI: 10.1145/3450341
This work is licensed under a Creative Commons Attribution International 4.0 License.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 25 May 2021

Author Tags

  1. data collection hardware
  2. human-computer interaction
  3. mobile eye tracking
  4. natural scenes
  5. open-source
  6. wearable eye-tracker

Qualifiers

  • Short-paper
  • Research
  • Refereed limited

Funding Sources

  • National Science Foundation
  • UC Berkeley Center for Innovation in Vision and Optics
  • National Defense Science and Engineering Graduate Fellowship

Conference

ETRA '21

Acceptance Rates

Overall acceptance rate: 69 of 137 submissions (50%)

Article Metrics

  • Total Citations: 0
  • Total Downloads: 767
  • Downloads (Last 12 months): 237
  • Downloads (Last 6 weeks): 33

Reflects downloads up to 21 Dec 2024
