Abstract
Sign languages enable effective communication between deaf and hearing people. Despite years of pedagogical research, learning a sign language online still presents difficulties that many students find frustrating. Most existing approaches rely heavily on learning resources hosted on websites, assuming that users will consult them frequently; in practice, this can feel tedious and uninspiring. To address this issue, several researchers have begun to explore game-based sign language learning. However, the majority of this work still relies on website-based designs, only a few proposed systems provide an immersive virtual environment, and no user studies have compared website-based and immersive virtual environments. In this paper, we present an immersive environment for learning the numbers 0–9 in American Sign Language (ASL). Our hypothesis is that an immersive virtual environment provides a better learning experience and elicits a higher level of engagement than website-based learning. We conducted a questionnaire-based user survey, and our initial findings suggest that users prefer learning in an immersive virtual environment.
About this paper
Cite this paper
Wang, J., Ivrissimtzis, I., Li, Z., Zhou, Y., Shi, L. (2023). Exploring the Potential of Immersive Virtual Environments for Learning American Sign Language. In: Viberg, O., Jivet, I., Muñoz-Merino, P., Perifanou, M., Papathoma, T. (eds) Responsive and Sustainable Educational Futures. EC-TEL 2023. Lecture Notes in Computer Science, vol 14200. Springer, Cham. https://doi.org/10.1007/978-3-031-42682-7_31