Abstract
Completely Automated Public Turing test to tell Computers and Humans Apart (CAPTCHA) is a test used on websites to distinguish human users from automated bots. On smartphone platforms, CAPTCHAs pose considerable difficulty for users with visual or hearing impairments. For deaf users, speech as a CAPTCHA input method is not usable in noisy environments. For blind users, typing a letter requires searching for its key on the virtual keyboard, which is time-consuming. Gesture input is therefore a suitable solution for smartphone applications, as it requires neither vision nor hearing. This research developed two new CAPTCHAs, HearAct and SeeAct, which are suitable for blind, deaf, and sighted users. These CAPTCHAs provide two ways of presenting universally accessible challenges: (1) users see an image, identify an object in the image, and then identify a word that corresponds to that object (SeeAct); (2) users listen to an object’s sound (the “sound-maker”, e.g., a dog) and identify the sound-maker (HearAct). HearAct and SeeAct can present both the audio and an image of the sound-maker simultaneously. In both, users analyze the identified word to determine whether it contains a stated letter and answer with a gesture: if the word contains the letter, the user taps; if not, the user swipes. With respect to usability and accessibility, compared with common audio CAPTCHAs (used by blind users) and text CAPTCHAs (used by deaf and sighted users), the results indicate that both HearAct and SeeAct significantly improve accessibility. The gesture-based HearAct and SeeAct CAPTCHAs achieve a higher success rate, are more accessible, require less solving time, and are preferred by all user groups over the traditional CAPTCHA methods.
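To illustrate the response rule described above (tap when the identified word contains the stated letter, swipe otherwise), the following Kotlin sketch shows one way the check could be expressed. It is a minimal illustration only; the class and function names (GestureCaptchaChallenge, expectedGesture, verify) are hypothetical and not taken from the paper.

```kotlin
// Hypothetical sketch of the tap/swipe rule from the abstract.
// Names are illustrative, not from the HearAct/SeeAct implementation.

enum class Gesture { TAP, SWIPE }

data class GestureCaptchaChallenge(
    val answerWord: String,   // word the user identifies (e.g., "dog")
    val statedLetter: Char    // letter the challenge asks about (e.g., 'o')
) {
    // Expected gesture: TAP if the stated letter occurs in the word, SWIPE otherwise.
    fun expectedGesture(): Gesture =
        if (answerWord.contains(statedLetter, ignoreCase = true)) Gesture.TAP
        else Gesture.SWIPE

    // A response passes the challenge only if the submitted gesture matches.
    fun verify(response: Gesture): Boolean = response == expectedGesture()
}

fun main() {
    val challenge = GestureCaptchaChallenge(answerWord = "dog", statedLetter = 'o')
    println(challenge.verify(Gesture.TAP))   // true: "dog" contains 'o'
    println(challenge.verify(Gesture.SWIPE)) // false
}
```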
Acknowledgements
This research was supported by a Grant for Scientific Research from Taif University, Kingdom of Saudi Arabia (Project No. 1-439-6126).
Cite this article
Alnfiai, M. Evaluating the accessibility and usability of a universal CAPTCHA based on gestures for smartphones. Univ Access Inf Soc 20, 817–831 (2021). https://doi.org/10.1007/s10209-020-00730-x