DOI: 10.1145/2807442.2807479
Research article

GazeProjector: Accurate Gaze Estimation and Seamless Gaze Interaction Across Multiple Displays

Published: 05 November 2015

Abstract

Mobile gaze-based interaction with multiple displays may occur from arbitrary positions and orientations. However, maintaining high gaze estimation accuracy in such situations remains a significant challenge. In this paper, we present GazeProjector, a system that combines (1) natural feature tracking on displays to determine the mobile eye tracker's position relative to a display with (2) accurate point-of-gaze estimation. GazeProjector allows for seamless gaze estimation and interaction on multiple displays of arbitrary sizes, independently of the user's position and orientation to the display. In a user study with 12 participants, we compare GazeProjector to established methods (here: visual on-screen markers and a state-of-the-art video-based motion capture system). We show that our approach is robust to varying head poses, orientations, and distances to the display, while still providing high gaze estimation accuracy across multiple displays without re-calibration for each variation. Our system represents an important step towards the vision of pervasive gaze-based interfaces.
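To make the abstract's pipeline concrete, the sketch below shows one plausible reading of it: the head-mounted tracker's scene camera observes a display, natural features matched between the camera frame and the display's known screen content yield a homography that encodes the tracker's current pose relative to that display, and the calibrated gaze point is projected through this homography into display coordinates. This is a minimal illustration using OpenCV; the ORB features, function name, and parameters are assumptions for illustration, not the paper's actual implementation.

import cv2
import numpy as np

def project_gaze_to_display(scene_frame, screen_content, gaze_px):
    """Hypothetical sketch, not the authors' code: map a gaze point
    (pixels in the scene-camera image) into display pixel coordinates,
    or return None when the display cannot be matched."""
    # Detect binary features in the scene-camera frame and in the
    # display's current framebuffer image (its "natural features").
    orb = cv2.ORB_create(nfeatures=1000)
    kp_s, des_s = orb.detectAndCompute(scene_frame, None)
    kp_c, des_c = orb.detectAndCompute(screen_content, None)
    if des_s is None or des_c is None:
        return None

    # Cross-checked Hamming matching discards asymmetric matches.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_s, des_c)
    if len(matches) < 8:
        return None
    src = np.float32([kp_s[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_c[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # A RANSAC homography absorbs outlier matches; H maps scene-camera
    # pixels to display pixels for the tracker's current pose, which is
    # why no re-calibration per position or orientation is required.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None:
        return None

    # Project the (once-calibrated) gaze point into display space.
    pt = np.float32([[gaze_px]])  # shape (1, 1, 2)
    return tuple(cv2.perspectiveTransform(pt, H)[0, 0])

In a multi-display environment, running this match against each display's content and keeping the display with the most RANSAC inliers would give the seamless display switching the abstract describes.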

Supplementary Material

MP4 File (p395.mp4)


    Published In

    UIST '15: Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology
    November 2015
    686 pages
    ISBN:9781450337793
    DOI:10.1145/2807442
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 05 November 2015

    Author Tags

    1. calibration
    2. eye tracking
    3. gaze estimation
    4. large displays
    5. multi-display environments
    6. natural feature tracking

    Qualifiers

    • Research-article

    Conference

    UIST '15

    Acceptance Rates

    UIST '15: 70 of 297 submissions accepted, 24%
    Overall: 561 of 2,567 submissions accepted, 22%

    Cited By

    • (2024) Gesture combinations during collaborative decision-making at wall displays. i-com 23(1), 57-69. DOI: 10.1515/icom-2023-0037. Online: 25-Mar-2024
    • (2024) The Elephant in the Room: Expert Experiences Designing, Developing and Evaluating Data Visualizations on Large Displays. Proceedings of the ACM on Human-Computer Interaction 8(ISS), 301-329. DOI: 10.1145/3698139. Online: 24-Oct-2024
    • (2024) I've Got the Data in My Pocket! - Exploring Interaction Techniques with Everyday Objects for Cross-Device Data Transfer. Proceedings of Mensch und Computer 2024, 242-255. DOI: 10.1145/3670653.3670778. Online: 1-Sep-2024
    • (2024) Interactive Visualization on Large High-Resolution Displays: A Survey. Computer Graphics Forum 43(6). DOI: 10.1111/cgf.15001. Online: 30-Apr-2024
    • (2023) GazeCast: Using Mobile Devices to Allow Gaze-based Interaction on Public Displays. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, 1-8. DOI: 10.1145/3588015.3589663. Online: 30-May-2023
    • (2022) Understanding and Creating Spatial Interactions with Distant Displays Enabled by Unmodified Off-The-Shelf Smartphones. Multimodal Technologies and Interaction 6(10), 94. DOI: 10.3390/mti6100094. Online: 19-Oct-2022
    • (2022) Unsupervised Detection of Dynamic Hand Gestures from Leap Motion Data. Image Analysis and Processing – ICIAP 2022, 414-424. DOI: 10.1007/978-3-031-06427-2_35. Online: 15-May-2022
    • (2020) EMO. Proceedings of the 18th International Conference on Mobile Systems, Applications, and Services, 448-461. DOI: 10.1145/3386901.3388917. Online: 15-Jun-2020
    • (2020) Eye Tracking for Target Acquisition in Sparse Visualizations. ACM Symposium on Eye Tracking Research and Applications, 1-5. DOI: 10.1145/3379156.3391834. Online: 2-Jun-2020
    • (2020) Screen-Light Decomposition Framework for Point-of-Gaze Estimation Using a Single Uncalibrated Camera and Multiple Light Sources. Journal of Mathematical Imaging and Vision. DOI: 10.1007/s10851-020-00947-8. Online: 15-Feb-2020
