Abstract
This paper presents interactive games for sign language tutoring assisted by humanoid robots. The games are designed specifically for children with communication impairments. In this study, two humanoid robot platforms, Nao H25 and Robovie R3, are used to express a selected set of Turkish Sign Language signs through hand and arm movements. Two games are designed, one involving a physically embodied robot and one involving a virtually embodied robot. In the game with the physically embodied robot, the robot communicates with the participant by recognizing colored flashcards through a camera-based system and, in return, performing a selected subset of signs accompanied by motivational facial gestures. A mobile version of the game is also implemented for use in children's education and therapy to teach signs. In both games the humanoid robot acts as a social peer and assistant that motivates the child, teaches a selected set of signs, evaluates the child's effort, and gives appropriate feedback to improve the children's learning and recognition rates. The present paper reports results from a preliminary study with different test groups in which children played with the physical robot platform, R3, and with a mobile game incorporating videos of the robot performing the signs; the effect of the assistive robot's embodiment is thus analyzed across these games. The results indicate that physical embodiment plays a significant role in improving the children's performance, engagement, and motivation.
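To make the flashcard interaction concrete, the following is a minimal, illustrative sketch of how a camera-based colored-flashcard detector could work. It is not the authors' implementation: the HSV color ranges, card names, area threshold, and camera index are all assumptions introduced here for illustration only.

```python
# Hypothetical sketch of colour-based flashcard detection with OpenCV.
# All colour ranges and names below are assumed, not taken from the paper.
import cv2
import numpy as np

# Assumed HSV (lower, upper) ranges for three example flashcard colours.
CARD_COLOURS = {
    "red":   (np.array([0, 120, 70]),   np.array([10, 255, 255])),
    "green": (np.array([40, 70, 70]),   np.array([80, 255, 255])),
    "blue":  (np.array([100, 150, 0]),  np.array([140, 255, 255])),
}

def detect_flashcard(frame, min_area=5000):
    """Return the name of the dominant coloured card in the frame, or None."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    best_name, best_area = None, min_area
    for name, (lower, upper) in CARD_COLOURS.items():
        mask = cv2.inRange(hsv, lower, upper)      # pixels within this colour range
        area = int(cv2.countNonZero(mask))         # how much of the frame matches
        if area > best_area:
            best_name, best_area = name, area
    return best_name

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)           # default camera (assumed index)
    ok, frame = cap.read()
    if ok:
        print(detect_flashcard(frame))  # e.g. "red", or None if no card is seen
    cap.release()
```

In a tutoring game of this kind, the detected card label could then trigger the robot to perform the corresponding sign and give feedback; that mapping is application-specific and not shown here.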
Acknowledgments
This research was supported by the Scientific and Technological Research Council of Turkey under contract TUBITAK KARIYER 111E283.
About this article
Cite this article
Köse, H., Uluer, P., Akalın, N. et al. The Effect of Embodiment in Sign Language Tutoring with Assistive Humanoid Robots. Int J of Soc Robotics 7, 537–548 (2015). https://doi.org/10.1007/s12369-015-0311-1