Tap, swipe, or move: attentional demands for distracted smartphone input

Published: 21 May 2012

Abstract

Smartphones are frequently used in environments where the user is distracted by another task, for example by walking or by driving. While the typical interface for smartphones involves hardware and software buttons and surface gestures, researchers have recently posited that, for distracted environments, benefits may exist in using motion gestures to execute commands. In this paper, we examine the relative cognitive demands of motion gestures and surface taps and gestures in two specific distracted scenarios: a walking scenario, and an eyes-free seated scenario. We show, first, that there is no significant difference in reaction time for motion gestures, taps, or surface gestures on smartphones. We further show that motion gestures result in significantly less time looking at the smartphone during walking than does tapping on the screen, even with interfaces optimized for eyes-free input. Taken together, these results show that, despite somewhat lower throughput, there may be benefits to making use of motion gestures as a modality for distracted input on smartphones.
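
The study compares motion gestures against on-screen taps and surface gestures, so it may help to make concrete what a motion-gesture command looks like in code. The sketch below is a minimal, hypothetical magnitude-threshold detector built on Android's standard SensorManager/SensorEventListener APIs; the class name, listener interface, threshold value, and debounce interval are illustrative assumptions and are not the recognizer evaluated in the paper.

import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

/**
 * Hypothetical magnitude-threshold motion-gesture detector (illustrative only,
 * not the recognizer evaluated in the paper). A sharp jerk of the handset
 * fires a command callback without requiring the user to look at the screen.
 */
public class MotionGestureDetector implements SensorEventListener {

    /** Invoked when the assumed motion gesture is recognized. */
    public interface OnGestureListener {
        void onMotionGesture();
    }

    private static final float THRESHOLD_MS2 = 12.0f; // assumed jerk threshold, m/s^2
    private static final long DEBOUNCE_MS = 500;      // assumed refractory period

    private final OnGestureListener listener;
    private long lastFiredMs;

    public MotionGestureDetector(Context context, OnGestureListener listener) {
        this.listener = listener;
        SensorManager sm =
                (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        Sensor accelerometer = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        sm.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_GAME);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float x = event.values[0], y = event.values[1], z = event.values[2];
        // Acceleration magnitude with gravity (~9.81 m/s^2) subtracted out.
        double net = Math.sqrt(x * x + y * y + z * z) - SensorManager.GRAVITY_EARTH;
        long nowMs = event.timestamp / 1_000_000L; // sensor timestamp is in ns
        if (net > THRESHOLD_MS2 && nowMs - lastFiredMs > DEBOUNCE_MS) {
            lastFiredMs = nowMs;
            // The mapped command executes eyes-free, in contrast to visually
            // locating and tapping an on-screen target.
            listener.onMotionGesture();
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Accuracy changes are irrelevant for this sketch.
    }
}

A bare threshold like this would also fire on the jolts of normal walking, which is exactly the kind of distracted setting the study targets; a deployable recognizer would need a more selective gesture model, but the sketch is enough to show why such input can be executed without looking at the screen.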



    Published In

    AVI '12: Proceedings of the International Working Conference on Advanced Visual Interfaces
    May 2012
    846 pages
    ISBN:9781450312875
    DOI:10.1145/2254556

    Sponsors

    • Consulta Umbria SRL
    • University of Salerno

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. eyes-free interaction
    2. motion gestures
    3. smartphones

    Qualifiers

    • Research-article

    Conference

    AVI'12
    Sponsor:
    • University of Salerno

    Acceptance Rates

    Overall Acceptance Rate: 128 of 490 submissions, 26%


