
µGeT: Multimodal eyes-free text selection technique combining touch interaction and microgestures

Published: 09 October 2023 Publication History

Abstract

We present µGeT, a novel multimodal eyes-free text selection technique that combines touch interaction with microgestures. µGeT is especially suited for People with Visual Impairments (PVI): it expands the input bandwidth of touchscreen devices, thereby shortening the interaction paths for routine tasks. To do so, µGeT extends touch interaction (left/right and up/down flicks) with two simple microgestures: the thumb touching either the index or the middle finger. For text selection, the multimodal technique allows users to directly modify the positioning of the two selection handles and the granularity of text selection. Two user studies, one with 9 PVI and one with 8 blindfolded sighted people, compared µGeT with a common baseline technique (VoiceOver-like, as on iPhone). Despite large variability in performance, both studies showed that µGeT is globally faster and yields fewer errors than VoiceOver. A detailed analysis of the interaction trajectories highlights the different strategies adopted by the participants. Beyond text selection, this research shows the potential of combining touch interaction and microgestures for improving the accessibility of touchscreen devices for PVI.

Supplemental Material

MP4 File
Demo video of the two interaction techniques (baseline and µGeT).

Published In

ICMI '23: Proceedings of the 25th International Conference on Multimodal Interaction
October 2023, 858 pages
ISBN: 9798400700552
DOI: 10.1145/3577190

Publication rights licensed to ACM. ACM acknowledges that this contribution was authored or co-authored by an employee, contractor or affiliate of a national government. As such, the Government retains a nonexclusive, royalty-free right to publish or reproduce this article, or to allow others to do so, for Government purposes only.

Publisher

Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. Microgesture
      2. accessibility
      3. touch
      4. visual impairment

      Qualifiers

      • Research-article
      • Research
      • Refereed limited

      Funding Sources

      • LabEx PERSYVAL-Lab
      • ANR, project MIC

Conference

ICMI '23

Acceptance Rates

Overall Acceptance Rate: 453 of 1,080 submissions, 42%
