
Ally: Understanding Text Messaging to Build a Better Onscreen Keyboard for Blind People

Published: 22 October 2022

Abstract

Millions of people worldwide use smartphones every day, but the standard-issue QWERTY keyboard is poorly optimized for non-sighted input. In this article, we document the variety of methods blind people use to enter text on their smartphones and focus on one particular need: sending text messages. We analyze two modern corpora of text messages and contrast them with an older text-message corpus, as well as with corpora gathered from news articles, chat rooms, and books. We then present Ally, a virtual keyboard for blind people optimized for composing text messages. To evaluate Ally, we conducted two user studies with blind participants. The first found entry speeds increasing with practice, and the second found that half of the participants reached speeds comparable to QWERTY, suggesting Ally may be a viable replacement. We conclude with a discussion of future work on non-sighted text entry of text messages.
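The corpus analysis the abstract describes presumably rests on character-frequency statistics of the kind keyboard-layout optimization typically uses. As a rough illustration only (the `char_frequencies` helper and the toy corpus below are hypothetical, not taken from the article), here is a minimal sketch of counting letter and letter-bigram frequencies over a set of text messages:

```python
from collections import Counter

def char_frequencies(messages):
    """Count letter and letter-bigram frequencies across a corpus of
    text messages (lowercased; non-letter characters skipped, so
    bigrams here span word boundaries)."""
    unigrams, bigrams = Counter(), Counter()
    for msg in messages:
        letters = [c for c in msg.lower() if c.isalpha()]
        unigrams.update(letters)
        bigrams.update(zip(letters, letters[1:]))
    return unigrams, bigrams

# Toy stand-in for a real SMS dataset such as the NUS SMS corpus.
corpus = ["omw, see u soon", "lol ok", "text me when u get home"]
uni, bi = char_frequencies(corpus)
print(uni.most_common(3))
```

A layout optimizer would then weight key size and placement by counts like these; the actual analysis and optimization procedure used for Ally is described in the full article.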


Cited By

  • (2024) "Party Face Congratulations! Exploring Design Ideas to Help Sighted Users with Emoji Accessibility when Messaging with Screen Reader Users." Proceedings of the ACM on Human-Computer Interaction 8, CSCW1, 1–31. DOI: 10.1145/3641014. Online publication date: 26 April 2024.
  • (2023) "A Review of Design and Evaluation Practices in Mobile Text Entry for Visually Impaired and Blind Persons." Multimodal Technologies and Interaction 7, 2 (22). DOI: 10.3390/mti7020022. Online publication date: 17 February 2023.
  • (2023) "Signage Detection Based on Adaptive SIFT." Intelligent Data Engineering and Analytics, 141–152. DOI: 10.1007/978-981-99-6706-3_13. Online publication date: 26 November 2023.

        Published In

        ACM Transactions on Accessible Computing, Volume 15, Issue 4
        December 2022, 302 pages
        ISSN: 1936-7228
        EISSN: 1936-7236
        DOI: 10.1145/3561961

        Publisher

        Association for Computing Machinery, New York, NY, United States

        Publication History

        Published: 22 October 2022
        Online AM: 04 May 2022
        Accepted: 25 April 2022
        Revised: 20 April 2022
        Received: 14 October 2020
        Published in TACCESS Volume 15, Issue 4


        Author Tags

        1. Soft keyboard
        2. onscreen keyboard
        3. smartphone keyboard
        4. mobile text-entry
        5. blind
        6. vision-impaired

        Qualifiers

        • Research-article
        • Refereed

