
EyeMR: low-cost eye-tracking for rapid-prototyping in head-mounted mixed reality

Published: 14 June 2018
DOI: 10.1145/3204493.3208336

Abstract

Mixed Reality devices can either augment reality (AR) or create completely virtual realities (VR). Combined with head-mounted devices and eye-tracking, they enable users to interact with these systems in novel ways. However, current eye-tracking systems are expensive and offer only limited support for interaction with virtual content. In this paper, we present EyeMR, a low-cost system (below $100) that enables researchers to rapidly prototype new techniques for eye and gaze interaction. Our system supports mono- and binocular tracking (using Pupil Capture) and includes a Unity framework to support the fast development of new interaction techniques. We argue for the usefulness of EyeMR based on the results of a user evaluation with HCI experts.
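
As a rough illustration of the data pipeline such a system builds on, the sketch below subscribes to gaze samples published by Pupil Capture over its ZMQ/msgpack network API. The localhost address, the default Pupil Remote port (50020), and the idea of mapping the normalized gaze point onto a camera viewport on the Unity side are assumptions made for illustration; the actual bridge used by EyeMR's Unity framework is described in the project documentation and repository [7, 8] and may differ.

# Minimal sketch (assumptions: Pupil Capture running locally, Pupil Remote on
# its default port 50020; not EyeMR's actual implementation).
import zmq
import msgpack

ctx = zmq.Context()

# Ask Pupil Remote for the port on which gaze/pupil data is published.
remote = ctx.socket(zmq.REQ)
remote.connect("tcp://127.0.0.1:50020")
remote.send_string("SUB_PORT")
sub_port = remote.recv_string()

# Subscribe to gaze messages; binocular setups additionally publish per-eye pupil topics.
sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{sub_port}")
sub.setsockopt_string(zmq.SUBSCRIBE, "gaze.")

while True:
    topic, payload = sub.recv_multipart()
    gaze = msgpack.unpackb(payload, raw=False)
    # norm_pos is the gaze point in normalized [0, 1] coordinates; a Unity-side
    # consumer could map it onto the camera viewport and raycast into the scene.
    print(topic.decode(), gaze["confidence"], gaze["norm_pos"])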

Supplementary Material

MP4 File (a90-stratmann.mp4)

References

[1] A. Bulling and H. Gellersen. 2010. Toward Mobile Eye-Based Human-Computer Interaction. IEEE Pervasive Computing 9, 4 (October 2010), 8--12.
[2] Andrew T. Duchowski. 2007. Eye Tracking Methodology: Theory and Practice. Springer.
[3] Google. 2014. Google Cardboard. (June 2014). Retrieved April 25, 2018 from https://vr.google.com/cardboard
[4] Julie A. Jacko (Ed.). 2009. Human-Computer Interaction. New Trends: 13th International Conference, HCI International 2009, San Diego, CA, USA, July 19--24, 2009, Proceedings. Vol. 5610. Springer Science & Business Media.
[5] Moritz Kassner, William Patera, and Andreas Bulling. 2014. Pupil: An Open Source Platform for Pervasive Eye Tracking and Mobile Gaze-based Interaction. In Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct Publication (UbiComp '14 Adjunct). ACM, New York, NY, USA, 1151--1160.
[6] Ken Pfeuffer, Benedikt Mayer, Diako Mardanbegi, and Hans Gellersen. 2017. Gaze + Pinch Interaction in Virtual Reality. In Proceedings of the 5th Symposium on Spatial User Interaction (SUI '17). ACM, New York, NY, USA, 99--108.
[7] Project Group Bull's Eye. 2017a. EyeMR Documentation. (Nov. 2017). Retrieved April 25, 2018 from https://bullseye.uol.de
[8] Project Group Bull's Eye. 2017b. EyeMR GitHub Repository. (Nov. 2017). Retrieved April 25, 2018 from https://github.com/PGBullsEye
[9] PTC Inc. 2015. Vuforia Augmented Reality SDK. (2015). Retrieved April 25, 2018 from https://www.vuforia.com
[10] Pupil Labs. 2014. Pupil Capture. (2014). Retrieved April 25, 2018 from https://pupil-labs.com
[11] Unity Technologies. 2014. Unity 5. (March 2014). Retrieved April 25, 2018 from https://unity3d.com
[12] Mélodie Vidal, Andreas Bulling, and Hans Gellersen. 2013. Pursuits: Spontaneous Interaction with Displays Based on Smooth Pursuit Eye Movement and Moving Targets. In Proceedings of the 2013 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp '13). ACM, New York, NY, USA, 439--448.

Cited By

  • (2021) Hands-free interaction in immersive virtual reality: A systematic review. IEEE Transactions on Visualization and Computer Graphics 27, 5 (May 2021), 2702--2713. https://doi.org/10.1109/TVCG.2021.3067687


Published In

ETRA '18: Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications
June 2018
595 pages
ISBN:9781450357067
DOI:10.1145/3204493
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.


Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. augmented reality
  2. eye-tracking
  3. head-mounted
  4. mixed reality
  5. rapid prototyping
  6. virtual reality

Qualifiers

  • Abstract

Conference

ETRA '18

Acceptance Rates

Overall Acceptance Rate 69 of 137 submissions, 50%


Bibliometrics & Citations

Article Metrics

  • Downloads (last 12 months): 10
  • Downloads (last 6 weeks): 0
Reflects downloads up to 21 Dec 2024

