Making use of drivers' glances onto the screen for explicit gaze-based interaction

Published: 11 November 2010

Abstract

Interaction with communication and infotainment systems in the car is common while driving. Our research investigates modalities and techniques that enable interaction with in-car applications while driving without compromising safety. In this paper we present the results of an experiment in which we use eye-gaze tracking in combination with a button on the steering wheel as explicit input, substituting for interaction on a touch screen. This approach retains the advantages of direct interaction on visual displays while avoiding the drawbacks of touch screens: in particular, the screen can be placed freely (even out of the user's reach) and both hands remain on the steering wheel. The results show that this interaction modality is slightly slower and more distracting than a touch screen, but significantly faster than automated speech interaction.




    Published In

    AutomotiveUI '10: Proceedings of the 2nd International Conference on Automotive User Interfaces and Interactive Vehicular Applications
    November 2010
    160 pages
    ISBN:9781450304375
    DOI:10.1145/1969773

    Sponsors

• Carnegie Mellon University


    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Author Tags

    1. automotive
    2. eye tracking
    3. modality choice
    4. timing

    Qualifiers

    • Research-article


    Acceptance Rates

    Overall Acceptance Rate 248 of 566 submissions, 44%

    Article Metrics

• Downloads (last 12 months): 10
• Downloads (last 6 weeks): 0
Reflects downloads up to 08 Mar 2025


    Cited By

• (2023) Multimodal Gaze-Based Interaction in Cars: Are Mid-Air Gestures with Haptic Feedback Safer Than Buttons? In: Design, User Experience, and Usability, 10.1007/978-3-031-35702-2_24, pp. 333-352. Online publication date: 9-Jul-2023.
• (2022) Investigating the Influence of Gaze- and Context-Adaptive Head-up Displays on Take-Over Requests. In: Proceedings of the 14th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 10.1145/3543174.3546089, pp. 108-118. Online publication date: 17-Sep-2022.
• (2022) GazeScale: Towards General Gaze-Based Interaction in Public Places. In: Proceedings of the 2022 International Conference on Multimodal Interaction, 10.1145/3536221.3556588, pp. 591-596. Online publication date: 7-Nov-2022.
• (2021) Why Do I Have to Take Over Control? Evaluating Safe Handovers with Advance Notice and Explanations in HAD. In: Proceedings of the 2021 International Conference on Multimodal Interaction, 10.1145/3462244.3479884, pp. 308-317. Online publication date: 18-Oct-2021.
• (2021) How to Increase Automated Vehicles' Acceptance through In-Vehicle Interaction Design: A Review. In: International Journal of Human–Computer Interaction, 37(4), 10.1080/10447318.2020.1860517, pp. 308-330. Online publication date: 1-Jan-2021.
• (2019) Using gaze-based interactions in automated vehicles for increased road safety. In: Proceedings of the 11th International Conference on Automotive User Interfaces and Interactive Vehicular Applications: Adjunct Proceedings, 10.1145/3349263.3351910, pp. 321-326. Online publication date: 21-Sep-2019.
• (2019) Usability and UX of a Gaze Interaction Tool for Front Seat Passengers. In: Proceedings of Mensch und Computer 2019, 10.1145/3340764.3344890, pp. 677-681. Online publication date: 8-Sep-2019.
• (2019) Interactive gaze and finger controlled HUD for cars. In: Journal on Multimodal User Interfaces, 10.1007/s12193-019-00316-9. Online publication date: 23-Nov-2019.
• (2018) Performance Evaluation Strategies for Eye Gaze Estimation Systems with Quantitative Metrics and Visualizations. In: Sensors, 18(9), 10.3390/s18093151, article 3151. Online publication date: 18-Sep-2018.
• (2018) Eye Gaze Controlled Projected Display in Automotive and Military Aviation Environments. In: Multimodal Technologies and Interaction, 2(1), 10.3390/mti2010001, article 1. Online publication date: 17-Jan-2018.
