
Multimodal integration of natural gaze behavior for intention recognition during object manipulation

Published: 02 November 2009
DOI: 10.1145/1647314.1647350

Abstract

Gaze is naturally used for visual perception of our environment, and gaze movements are largely controlled subconsciously. Forcing users to consciously diverge from this natural gaze behavior for interaction purposes causes high cognitive workload and destroys the information contained in natural gaze movements. Instead of proposing a new gaze-based interaction technique, we analyze natural gaze behavior during an object manipulation task and show how it can be used for intention recognition, which provides a universal basis for integrating gaze into multimodal interfaces for different applications. We propose a model for the multimodal integration of natural gaze behavior and evaluate it for two use cases: improving the robustness of other, potentially noisy input cues, and designing proactive interaction techniques.
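The model itself is not reproduced on this page, but the fusion idea behind both use cases can be made concrete. The following Python sketch is a hypothetical illustration, not the authors' implementation: fixations are detected with a standard dispersion-threshold method, each fixation is scored as noisy evidence for a candidate target object, and the gaze evidence is fused with a second, noisy cue (a simulated hand-gesture classifier) under a naive-Bayes assumption. All object names, coordinates, thresholds, and distributional choices are invented for illustration.

    import math

    # Hypothetical 2-D object locations, in the same coordinate frame
    # as the gaze samples.
    OBJECTS = {"cup": (0.20, 0.30), "book": (0.60, 0.40), "phone": (0.80, 0.75)}

    def detect_fixations(samples, dispersion=0.05, min_len=6):
        """Dispersion-threshold fixation detection: consecutive samples
        whose summed x/y spread stays below `dispersion` form one
        fixation, reported as the window centroid."""
        def centroid(pts):
            return (sum(x for x, _ in pts) / len(pts),
                    sum(y for _, y in pts) / len(pts))

        fixations, window = [], []
        for p in samples:
            window.append(p)
            xs = [x for x, _ in window]
            ys = [y for _, y in window]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > dispersion:
                if len(window) - 1 >= min_len:        # window minus the outlier
                    fixations.append(centroid(window[:-1]))
                window = [p]                           # start a new window
        if len(window) >= min_len:
            fixations.append(centroid(window))
        return fixations

    def gaze_likelihoods(fixations, sigma=0.1):
        """P(fixations | intended object): each fixation is treated as
        noisy evidence centred on the intended object (isotropic Gaussian)."""
        scores = {}
        for name, (ox, oy) in OBJECTS.items():
            log_lik = sum(-((fx - ox) ** 2 + (fy - oy) ** 2) / (2 * sigma ** 2)
                          for fx, fy in fixations)
            scores[name] = math.exp(log_lik)
        return scores

    def fuse(gaze_scores, gesture_scores):
        """Naive-Bayes fusion of the two cues under a uniform prior."""
        fused = {k: gaze_scores[k] * gesture_scores.get(k, 1e-9) for k in OBJECTS}
        z = sum(fused.values()) or 1.0
        return {k: v / z for k, v in fused.items()}

    if __name__ == "__main__":
        # Gaze already dwells on the cup, well before the hand arrives.
        gaze = [(0.20 + 0.005 * (i % 3), 0.30 + 0.004 * (i % 2)) for i in range(12)]
        # A noisy gesture classifier mildly (and wrongly) favours the book.
        gesture = {"cup": 0.35, "book": 0.45, "phone": 0.20}
        posterior = fuse(gaze_likelihoods(detect_fixations(gaze)), gesture)
        print(max(posterior, key=posterior.get))   # -> "cup"

In this toy run, a sustained fixation near the cup outweighs a gesture classifier that mildly favors the wrong object, illustrating the robustness use case; thresholding the same posterior before the hand movement completes would likewise support a proactive interaction technique.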





    Information

    Published In

    ICMI-MLMI '09: Proceedings of the 2009 international conference on Multimodal interfaces
    November 2009
    374 pages
    ISBN:9781605587721
    DOI:10.1145/1647314
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.


    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Author Tags

    1. gaze-based interaction
    2. intention recognition
    3. model

    Qualifiers

    • Research-article

    Conference

    ICMI-MLMI '09

    Acceptance Rates

    Overall Acceptance Rate 453 of 1,080 submissions, 42%


    Bibliometrics & Citations

    Article Metrics

    • Downloads (last 12 months): 15
    • Downloads (last 6 weeks): 2
    Reflects downloads up to 03 Jan 2025


    Cited By
    • (2024) Bi-Directional Gaze-Based Communication: A Review. Multimodal Technologies and Interaction, 8(12), 108. DOI: 10.3390/mti8120108. Online publication date: 4-Dec-2024.
    • (2024) Gaze-Based Intention Estimation: Principles, Methodologies, and Applications in HRI. ACM Transactions on Human-Robot Interaction, 13(3), 1-30. DOI: 10.1145/3656376. Online publication date: 26-Sep-2024.
    • (2022) Integrating Gaze and Speech for Enabling Implicit Interactions. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1-14. DOI: 10.1145/3491102.3502134. Online publication date: 29-Apr-2022.
    • (2022) Human action prediction in collaborative environments based on shared-weight LSTMs with feature dimensionality reduction. Applied Soft Computing, 126(C). DOI: 10.1016/j.asoc.2022.109245. Online publication date: 1-Sep-2022.
    • (2022) Ensemble of LSTMs and Feature Selection for Human Action Prediction. Intelligent Autonomous Systems 16, 429-441. DOI: 10.1007/978-3-030-95892-3_33. Online publication date: 8-Apr-2022.
    • (2021) Intention Recognition in Human Robot Interaction Based on Eye Tracking. Human-Computer Interaction – INTERACT 2021, 428-437. DOI: 10.1007/978-3-030-85613-7_29. Online publication date: 26-Aug-2021.
    • (2020) ManiGaze: a Dataset for Evaluating Remote Gaze Estimator in Object Manipulation Situations. ACM Symposium on Eye Tracking Research and Applications, 1-5. DOI: 10.1145/3379156.3391369. Online publication date: 2-Jun-2020.
    • (2019) Virtual Object Manipulation by Combining Touch and Head Interactions for Mobile Augmented Reality. Applied Sciences, 9(14), 2933. DOI: 10.3390/app9142933. Online publication date: 22-Jul-2019.
    • (2019) Prediction of interaction intention based on eye movement gaze feature. 2019 IEEE 8th Joint International Information Technology and Artificial Intelligence Conference (ITAIC), 378-383. DOI: 10.1109/ITAIC.2019.8785737. Online publication date: May-2019.
    • (2018) Gaze and motion information fusion for human intention inference. International Journal of Intelligent Robotics and Applications, 2(2), 136-148. DOI: 10.1007/s41315-018-0051-0. Online publication date: 19-Mar-2018.
