DOI: 10.1145/2957265.2965016
MobileHCI conference proceedings, extended abstract

Inferring user action with mobile gaze tracking

Published: 06 September 2016

Abstract

Gaze tracking in psychological, cognitive, and user interaction studies has recently evolved toward mobile solutions, as they enable direct assessment of users' visual attention in natural environments and in augmented and virtual reality (AR/VR) applications. Productive approaches to analyzing and predicting user actions from gaze data require a multidisciplinary effort with experts in cognitive and behavioral sciences, machine vision, and machine learning. This workshop brings together a cross-domain group of individuals to (i) discuss and contribute to the problem of using mobile gaze tracking to infer user action, (ii) advance the sharing of data and analysis algorithms as well as device solutions, and (iii) increase understanding of behavioral aspects of gaze-action sequences in natural environments and AR/VR applications.


Cited By

  • (2020) Inferring Intent and Action from Gaze in Naturalistic Behavior. Cognitive Analytics, 1464-1482. DOI: 10.4018/978-1-7998-2460-2.ch074. Online publication date: 2020.
  • (2017) Inferring Intent and Action from Gaze in Naturalistic Behavior. International Journal of Mobile Human Computer Interaction 9(4), 41-57. DOI: 10.5555/3213394.3213398. Online publication date: 1 Oct 2017.
  • (2017) Inferring Intent and Action from Gaze in Naturalistic Behavior. International Journal of Mobile Human Computer Interaction 9(4), 41-57. DOI: 10.4018/IJMHCI.2017100104. Online publication date: 1 Oct 2017.


Published In

MobileHCI '16: Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct
September 2016, 664 pages
ISBN: 9781450344135
DOI: 10.1145/2957265

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. action inference
    2. augmented reality
    3. behavioral analysis
    4. gaze tracking algorithms
    5. machine learning
    6. mobile gaze tracking
    7. natural environments
    8. virtual reality

    Qualifiers

• Extended abstract

Conference

MobileHCI '16
    Acceptance Rates

    Overall Acceptance Rate 202 of 906 submissions, 22%
