Reanalyzing Effective Eye-related Information for Developing User’s Intent Detection Systems

Published: 30 May 2023

Abstract

Studies on gaze-based interaction have utilized natural eye-related information to detect user intent. Most adopt a machine learning-based approach to minimize the cost of choosing appropriate eye-related information. While these studies demonstrated the effectiveness of intent detection systems, understanding which eye-related information is actually useful for interaction remains important. In this paper, we reanalyze how eye-related information affected the detection performance reported in a previous study, with the aim of developing better intent detection systems in the future. Specifically, we analyzed two aspects: dimensionality reduction and adaptation to different tasks. The results showed that saccade and fixation features are not always useful, and that the direction of gaze movement could potentially cause overfitting.
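The kind of reanalysis the abstract describes — asking which eye-related features actually separate intentional from unintentional gaze — can be sketched as a simple per-feature separability ranking. This is a minimal, hypothetical illustration only: the feature names, the synthetic samples, and the standardized-mean-difference score are invented for this sketch and are not the paper's actual method or data.

```python
# Hypothetical sketch: ranking eye-related features by how well each one
# separates "intent" from "no-intent" samples. All feature names and
# numbers below are illustrative, not from the paper.
from statistics import mean, pstdev

# Each sample: (fixation_duration_ms, saccade_amplitude_deg, gaze_direction_deg)
FEATURES = ["fixation_duration", "saccade_amplitude", "gaze_direction"]
intent    = [(420, 2.1,  15), (455, 1.8, 200), (390, 2.4,  90), (470, 1.9, 310)]
no_intent = [(210, 5.3, 120), (180, 6.1,  40), (250, 4.8, 275), (205, 5.6, 350)]

def separability(idx):
    """Absolute standardized mean difference between the two classes
    for one feature column (higher = more discriminative)."""
    a = [s[idx] for s in intent]
    b = [s[idx] for s in no_intent]
    pooled = pstdev(a + b)
    return abs(mean(a) - mean(b)) / pooled if pooled else 0.0

ranking = sorted(FEATURES, key=lambda f: separability(FEATURES.index(f)),
                 reverse=True)
for name in ranking:
    print(f"{name}: {separability(FEATURES.index(name)):.2f}")
```

In this toy data, gaze direction scores lowest because its values are scattered similarly in both classes — consistent with the abstract's caution that a model given direction as a feature may fit noise rather than intent, i.e. overfit.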



    Published In

    ETRA '23: Proceedings of the 2023 Symposium on Eye Tracking Research and Applications
    May 2023
    441 pages
    ISBN:9798400701504
    DOI:10.1145/3588015

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. Gaze-based interaction
    2. eye information
    3. feature selection
    4. machine-learning

    Qualifiers

    • Short-paper
    • Research
    • Refereed limited

    Conference

    ETRA '23

    Acceptance Rates

    Overall Acceptance Rate 69 of 137 submissions, 50%

