Abstract
This paper presents a socially interactive humanoid-robot-assisted system that tutors sign language (SL) to children with communication impairments through imitation-based interaction games. In this study, the five-fingered humanoid platform Robovie R3 expresses a chosen set of Turkish Sign Language (TSL) words using hand and body movements combined with facial expressions. The robot recognizes signs through an RGB-D camera and gives vocal, visual, and motional (signed) feedback. The proposed game consists of an introductory phase, in which participants are introduced to the robot and the signs; an imitation-based learning phase, in which participants are motivated to imitate the signs demonstrated by the robot; and a test phase, in which the signs taught in the previous phases are tested within a guessing game. This paper presents results from studies with three different test groups. The humanoid robot acts as an assistive social companion within the game, using sign language and visual cues to interact with the children, and is evaluated according to the participants' sign recognition ability under different setups. The results indicate that the robot has a significant effect on the participants' sign learning performance.
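The three-phase game flow described above can be summarized schematically. The following Python sketch is only an illustration of that flow under stated assumptions, not the authors' implementation: the class and function names (SignTutorGame, demonstrate_sign, recognize_sign, give_feedback) are hypothetical stand-ins, and the RGB-D recognition and robot feedback are stubbed with simulated values.

```python
# Minimal sketch of the three-phase tutoring game described in the abstract.
# All names are hypothetical; perception and robot actions are simulated.
import random

SIGNS = ["mother", "father", "house"]  # placeholder subset of the TSL vocabulary


class SignTutorGame:
    def __init__(self, signs):
        self.signs = signs

    def demonstrate_sign(self, sign):
        # Robot performs the sign with hands, body and facial expression (stubbed).
        print(f"[robot] demonstrating sign: {sign}")

    def recognize_sign(self):
        # Stand-in for RGB-D based recognition of the child's imitation.
        return random.choice(self.signs)

    def give_feedback(self, correct):
        # Multimodal feedback: vocal, visual (flashcard) and motional (signed).
        print("[robot] feedback:", "well done!" if correct else "let's try again")

    def introduction_phase(self):
        # Phase 1: introduce the robot and the signs.
        for sign in self.signs:
            self.demonstrate_sign(sign)

    def imitation_phase(self):
        # Phase 2: the child imitates each demonstrated sign and gets feedback.
        for sign in self.signs:
            self.demonstrate_sign(sign)
            observed = self.recognize_sign()
            self.give_feedback(observed == sign)

    def test_phase(self):
        # Phase 3: guessing game over the signs taught in the previous phases.
        target = random.choice(self.signs)
        self.demonstrate_sign(target)
        guess = random.choice(self.signs)  # stand-in for the child's answer
        self.give_feedback(guess == target)


if __name__ == "__main__":
    game = SignTutorGame(SIGNS)
    game.introduction_phase()
    game.imitation_phase()
    game.test_phase()
```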
Acknowledgments
We would like to thank the managers, teachers and students of Ferahevler Primary School, Dosteller Secondary School for Hearing-Impaired Children and the Turkish Hearing-Impaired Association for their voluntary participation and for their criticism of and contributions to this study. We would also like to thank the TSL tutors Sumru Özsoy and Feride Korkmaz for their guidance. This research was supported by the Scientific and Technological Research Council of Turkey under contract TUBITAK KARIYER 111E283.
Appendix
The sample set of TSL words, their demonstrations by the robot, and the flashcards representing the signs are shown in Fig. 7.