Abstract
Using smartphones while moving is challenging and can be dangerous. Eyes-free input gestures offer a way to use smartphones without demanding the user's visual attention. In this study, we investigated the effect of different moving speeds (standing, walking, or jogging) and different phone locations (held freely in the hand, or placed inside a shoulder bag) on eyes-free gesture input with a smartphone. Our results from 12 male participants showed that gesture entry duration is not affected by moving speed or phone location; however, other gesture features, such as length, height, width, area, and phone orientation, are largely affected by one or both factors. Eyes-free gesture features therefore vary significantly with contextual factors such as moving speed and phone location, and designers should take these factors into account.
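To relate the reported effects to concrete measurements, the sketch below illustrates how the gesture features named in the abstract (entry duration, path length, bounding-box width, height, and area) could be computed from a sequence of timestamped touch samples. The exact feature definitions used in the study are not given in this excerpt, so the definitions here (e.g., area taken as the bounding-box area) are assumptions for illustration; phone orientation would come from the device's motion sensors and is omitted.

```python
# Hypothetical sketch of the gesture features named in the abstract,
# computed from a stroke recorded as (x, y, timestamp) touch samples.
# Exact definitions (e.g., "area" = bounding-box area) are assumptions.
from dataclasses import dataclass
from math import hypot
from typing import List, Tuple

Sample = Tuple[float, float, float]  # (x, y, timestamp in seconds)

@dataclass
class GestureFeatures:
    duration: float  # time from first to last touch sample
    length: float    # total path length of the stroke
    width: float     # bounding-box width
    height: float    # bounding-box height
    area: float      # bounding-box area (assumed definition)

def compute_features(samples: List[Sample]) -> GestureFeatures:
    xs = [s[0] for s in samples]
    ys = [s[1] for s in samples]
    ts = [s[2] for s in samples]
    # Sum of distances between consecutive samples gives the path length.
    length = sum(
        hypot(x1 - x0, y1 - y0)
        for (x0, y0, _), (x1, y1, _) in zip(samples, samples[1:])
    )
    width = max(xs) - min(xs)
    height = max(ys) - min(ys)
    return GestureFeatures(
        duration=ts[-1] - ts[0],
        length=length,
        width=width,
        height=height,
        area=width * height,
    )
```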
Acknowledgements
This project has received funding from the European Union’s Horizon 2020 research and innovation program under the Marie Skłodowska-Curie grant agreement No 860114. Alexandru Dancu acknowledges support from project no. PN-III-P4-ID-PCE-2020-0434 (PCE29/2021), within PNCDI III.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Jamalzadeh, M., Rekik, Y., Grisoni, L., Vatavu, R.-D., Volpe, G., Dancu, A.: Effects of moving speed and phone location on eyes-free gesture input with mobile devices. In: Abdelnour Nocera, J., Kristín Lárusdóttir, M., Petrie, H., Piccinno, A., Winckler, M. (eds.) Human-Computer Interaction – INTERACT 2023. LNCS, vol. 14142. Springer, Cham (2023). https://doi.org/10.1007/978-3-031-42280-5_30
DOI: https://doi.org/10.1007/978-3-031-42280-5_30
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-42279-9
Online ISBN: 978-3-031-42280-5
eBook Packages: Computer Science, Computer Science (R0)