Research article
DOI: 10.1145/3208031.3208035

Making stand-alone PS-OG technology tolerant to the equipment shifts

Published: 15 June 2018

Abstract

Tracking users' gaze in virtual reality headsets allows natural and intuitive interaction with virtual avatars and virtual objects. Moreover, a technique known as foveated rendering can save computational resources and enable high-resolution yet lightweight virtual reality technologies. Eye-tracking hardware in modern VR headsets predominantly consists of infrared camera(s) and LEDs. Such hardware, together with the image-processing software, consumes a substantial amount of energy and, when high-speed gaze detection is needed, can be very expensive. A promising technique for overcoming these issues is photo-sensor oculography (PS-OG), which allows eye tracking with a high sampling rate and low power consumption. The main limitation of previous PS-OG systems, however, is their inability to compensate for equipment shifts. In this study, we employ a simple multi-layer perceptron neural network to map raw sensor data to gaze locations and report its performance for shift compensation. Modeling and evaluation are done via simulation.
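
As a rough sketch of the mapping described above, the snippet below trains a small multi-layer perceptron (scikit-learn's MLPRegressor, used here as a stand-in) to regress 2D gaze coordinates from raw photo-sensor readings and shows how an uncompensated equipment shift degrades accuracy. The sensor layout, the Gaussian fall-off forward model, the shift magnitudes, and the network size are illustrative assumptions, not details taken from the paper.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)

    # Assumed 3x3 grid of photo-sensors; the real count and layout are not
    # specified in the abstract.
    SENSORS = np.array([[x, y] for x in (-1.0, 0.0, 1.0) for y in (-1.0, 0.0, 1.0)])

    def simulate_sensors(gaze_xy, shift=(0.0, 0.0)):
        # Toy PS-OG forward model: each sensor's reading decays with the
        # distance between the gaze point and the (possibly shifted) sensor.
        d = np.linalg.norm(gaze_xy[:, None, :] - (SENSORS + np.asarray(shift))[None, :, :], axis=-1)
        return np.exp(-d ** 2) + 0.01 * rng.standard_normal(d.shape)

    # Train on gaze targets recorded with the headset in its nominal position.
    gaze_train = rng.uniform(-1.0, 1.0, size=(2000, 2))
    mlp = MLPRegressor(hidden_layer_sizes=(16,), activation="relu",
                       max_iter=2000, random_state=0)
    mlp.fit(simulate_sensors(gaze_train), gaze_train)

    # Compare accuracy with no shift versus a small simulated equipment shift.
    gaze_test = rng.uniform(-1.0, 1.0, size=(500, 2))
    for shift in ((0.0, 0.0), (0.1, 0.05)):
        pred = mlp.predict(simulate_sensors(gaze_test, shift=shift))
        err = np.linalg.norm(pred - gaze_test, axis=1).mean()
        print(f"shift={shift}: mean gaze error = {err:.3f} (arbitrary units)")

Making the network tolerant to such shifts presumably requires training data that covers shifted sensor positions (e.g., simulated translations of the sensor assembly); the exact training protocol is not given in the abstract, so this sketch only illustrates the mapping and the error introduced by an uncompensated shift.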

Published In

PETMEI '18: Proceedings of the 7th Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction
June 2018
50 pages
ISBN: 9781450357890
DOI: 10.1145/3208031
Publisher

Association for Computing Machinery, New York, NY, United States

Publication History

Published: 15 June 2018

Author Tags

  1. photo-sensor oculography
  2. sensor shift correction
  3. virtual reality

Qualifiers

  • Research-article

Funding Sources

  • Michigan State University

Conference

ETRA '18

Cited By

  • (2023) Multi-Rate Sensor Fusion for Unconstrained Near-Eye Gaze Estimation. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, 1-8. DOI: 10.1145/3588015.3588407. Online publication date: 30-May-2023.
  • (2020) Optical Gaze Tracking with Spatially-Sparse Single-Pixel Detectors. 2020 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 117-126. DOI: 10.1109/ISMAR50242.2020.00033. Online publication date: Nov-2020.
  • (2019) Get a grip. Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications, 1-10. DOI: 10.1145/3314111.3319835. Online publication date: 25-Jun-2019.
  • (2019) Power-efficient and shift-robust eye-tracking sensor for portable VR headsets. Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications, 1-8. DOI: 10.1145/3314111.3319821. Online publication date: 25-Jun-2019.
  • (2019) Assessment of Shift-Invariant CNN Gaze Mappings for PS-OG Eye Movement Sensors. 2019 IEEE/CVF International Conference on Computer Vision Workshop (ICCVW), 3651-3659. DOI: 10.1109/ICCVW.2019.00450. Online publication date: Oct-2019.
  • (2018) Developing photo-sensor oculography (PS-OG) system for virtual reality headsets. Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, 1-3. DOI: 10.1145/3204493.3208341. Online publication date: 14-Jun-2018.
