DOI: 10.1145/2957265.2965014
Extended abstract

Classifying mobile eye tracking data with hidden Markov models

Published: 06 September 2016

Abstract

Naturalistic eye movement behavior has been measured in a variety of scenarios [15], and eye movement patterns appear indicative of task demands [16]. However, systematic task classification of eye movement data is a relatively recent development [1,3,7], and prior work has focused on classifying eye movements made while viewing 2D screen-based imagery. In the current study, eye movements from eight participants were recorded with a mobile eye tracker. Participants performed five everyday tasks: making a sandwich, transcribing a document, walking in an office, walking on a city street, and playing catch with a flying disc [14]. Using only saccadic direction and amplitude time-series data, we trained a hidden Markov model for each task and classified unlabeled data by calculating the probability that each model could generate the observed sequence. We present accuracy and time-to-recognize results, demonstrating better-than-chance performance.
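The classification step described above — scoring an observation sequence under each task's HMM and assigning it to the most likely model — can be sketched with the forward algorithm. The two-state toy models and four-bin quantization of saccade direction below are illustrative assumptions, not the authors' trained parameters:

```python
import numpy as np

def hmm_log_likelihood(obs, start, trans, emit):
    """Scaled forward algorithm: log P(obs | HMM) for a discrete-emission HMM.

    obs   : sequence of symbol indices (e.g. quantized saccade-direction bins)
    start : (S,) initial state distribution
    trans : (S, S) state transition matrix
    emit  : (S, K) per-state emission distribution over K symbols
    """
    alpha = start * emit[:, obs[0]]           # forward probabilities at t = 0
    log_lik = np.log(alpha.sum())
    alpha /= alpha.sum()                      # rescale to avoid underflow
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]  # propagate, then weight by emission
        log_lik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return log_lik

def classify(obs, models):
    """Assign obs to the task whose HMM gives it the highest log-likelihood."""
    return max(models, key=lambda task: hmm_log_likelihood(obs, *models[task]))

# Hypothetical 2-state models over 4 saccade-direction bins
# (0=right, 1=up, 2=left, 3=down); all parameters are made up for illustration.
models = {
    "transcribe": (np.array([0.5, 0.5]),
                   np.array([[0.9, 0.1], [0.1, 0.9]]),
                   np.array([[0.70, 0.10, 0.10, 0.10],    # mostly rightward reading saccades
                             [0.10, 0.10, 0.70, 0.10]])), # occasional leftward return sweeps
    "walk":       (np.array([0.5, 0.5]),
                   np.array([[0.8, 0.2], [0.2, 0.8]]),
                   np.array([[0.25, 0.25, 0.25, 0.25],    # scanning: no dominant direction
                             [0.10, 0.40, 0.10, 0.40]])), # vertical checks of the path
}

seq = [0, 0, 0, 2, 0, 0, 0, 2, 0, 0]  # reading-like: rightward runs with return sweeps
print(classify(seq, models))          # this sequence fits the "transcribe" model best
```

In the study's setup, the emission model would cover both saccade direction and amplitude, and the per-task parameters would be estimated from labeled training data (e.g. via Baum-Welch) rather than specified by hand as here.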

References

[1] Boccignone, G. (2015). Advanced statistical methods for eye movement analysis and modeling: A gentle introduction. arXiv preprint arXiv:1506.07194.
[2] Borji, A., & Itti, L. (2014). Defending Yarbus: Eye movements reveal observers' task. Journal of Vision, 14(3), 29.
[3] Bulling, A., Ward, J. A., Gellersen, H., & Tröster, G. (2009, September). Eye movement analysis for activity recognition. In Proceedings of the 11th International Conference on Ubiquitous Computing (pp. 41--50). ACM.
[4] Coco, M. I., & Keller, F. (2014). Classification of visual and linguistic tasks using eye-movement features. Journal of Vision, 14(3), 11.
[5] George, A., & Routray, A. (2015). A score level fusion method for eye movement biometrics. Pattern Recognition Letters.
[6] Greene, M. R., Liu, T., & Wolfe, J. M. (2012). Reconsidering Yarbus: A failure to predict observers' task from eye movement patterns. Vision Research, 62, 1--8.
[7] Haass, M. J., Matzen, L. E., Butler, K. M., & Armenta, M. (2016, March). A new method for categorizing scanpaths from eye tracking data. In Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications (pp. 35--38). ACM.
[8] Haji-Abolhassani, A., & Clark, J. J. (2014). An inverse Yarbus process: Predicting observers' task from eye movement patterns. Vision Research, 103, 127--142.
[9] Kanan, C., Ray, N. A., Bseiso, D. N., Hsiao, J. H., & Cottrell, G. W. (2014, March). Predicting an observer's task using multi-fixation pattern analysis. In Proceedings of the Symposium on Eye Tracking Research and Applications (pp. 287--290). ACM.
[10] Lenz, R. (2014). Saccadic eye movements and the generalized Pareto distribution. arXiv preprint arXiv:1406.6201.
[11] Murphy, K. P. (2012). Machine Learning: A Probabilistic Perspective. The MIT Press.
[12] Puolamäki, K., Salojärvi, J., Savia, E., Simola, J., & Kaski, S. (2005, August). Combining eye movements and collaborative filtering for proactive information retrieval. In Proceedings of the 28th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval (pp. 146--153). ACM.
[13] Rabiner, L. R. (1990). A tutorial on hidden Markov models and selected applications in speech recognition. Readings in Speech Recognition, 53(3), 267--296.
[14] Sullivan, B., Ghahghaei, S., & Walker, L. (2015, January). Real world eye movement statistics. AVA Christmas Meeting 2014. Perception, 44(4), 467. Pion Ltd.
[15] Tatler, B. W., Hayhoe, M. M., Land, M. F., & Ballard, D. H. (2011). Eye guidance in natural vision: Reinterpreting salience. Journal of Vision, 11(5), 5.
[16] Yarbus, A. (1967). Eye Movements and Vision. New York: Plenum Press. (Translated from the Russian edition by B. Haigh.)
[17] McCallum, A. (2003). Efficiently inducing features of conditional random fields. In Proceedings of the 19th Conference on Uncertainty in Artificial Intelligence.



    Published In

    MobileHCI '16: Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct
    September 2016
    664 pages
    ISBN:9781450344135
    DOI:10.1145/2957265
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. machine learning
    2. mobile eye tracking
    3. natural tasks
    4. task classification

    Qualifiers

    • Extended-abstract

    Conference

    MobileHCI '16

    Acceptance Rates

    Overall Acceptance Rate 202 of 906 submissions, 22%

    Article Metrics

    • Downloads (last 12 months): 5
    • Downloads (last 6 weeks): 2
    Reflects downloads up to 11 Jan 2025.

    Cited By

    • (2024) Eye tracking data set of academics making an omelette: An egg-breaking work. Proceedings of the 2024 Symposium on Eye Tracking Research and Applications, pp. 1-3. DOI: 10.1145/3649902.3655657. Online publication date: 4-Jun-2024.
    • (2023) What Is Hidden in Clear Sight and How to Find It—A Survey of the Integration of Artificial Intelligence and Eye Tracking. Information, 14(11), 624. DOI: 10.3390/info14110624. Online publication date: 20-Nov-2023.
    • (2023) Eye Tracking, Usability, and User Experience: A Systematic Review. International Journal of Human–Computer Interaction, 40(17), 4484-4500. DOI: 10.1080/10447318.2023.2221600. Online publication date: 18-Jun-2023.
    • (2023) What we see is what we do: a practical Peripheral Vision-Based HMM framework for gaze-enhanced recognition of actions in a medical procedural task. User Modeling and User-Adapted Interaction, 33(4), 939-965. DOI: 10.1007/s11257-022-09352-9. Online publication date: 4-Jan-2023.
    • (2020) Inferring Intent and Action from Gaze in Naturalistic Behavior. Cognitive Analytics, pp. 1464-1482. DOI: 10.4018/978-1-7998-2460-2.ch074. Online publication date: 2020.
    • (2020) A Simple Model of Reading Eye Movement Based on Deep Learning. IEEE Access, 8, 193757-193767. DOI: 10.1109/ACCESS.2020.3033382. Online publication date: 2020.
    • (2019) Application of hidden Markov models to eye tracking data analysis of visual quality inspection operations. Central European Journal of Operations Research. DOI: 10.1007/s10100-019-00628-x. Online publication date: 7-Jun-2019.
    • (2017) Inferring Intent and Action from Gaze in Naturalistic Behavior. International Journal of Mobile Human Computer Interaction, 9(4), 41-57. DOI: 10.5555/3213394.3213398. Online publication date: 1-Oct-2017.
    • (2017) Inferring Intent and Action from Gaze in Naturalistic Behavior. International Journal of Mobile Human Computer Interaction, 9(4), 41-57. DOI: 10.4018/IJMHCI.2017100104. Online publication date: 1-Oct-2017.
