Abstract
This paper presents a framework that allows users to interact with and navigate a humanoid robot using body gestures. The first part of the paper describes a study to define intuitive gestures for eleven navigational commands, based on an analysis of 385 gestures performed by 35 participants. From the study results, we present a taxonomy of the user-defined gesture sets, agreement scores for the gesture sets, and the time performance of the gesture motions. The second part of the paper presents a full-body interaction system for recognizing the user-defined gestures. We evaluate the system with 22 participants to assess its recognition accuracy. The results show that most of the defined gestures can be successfully recognized with a precision between 86 and 100 % and an accuracy between 73 and 96 %. We discuss the limitations of the system and present improvements for future work.
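For readers unfamiliar with the measures named above: the agreement scores follow the guessability methodology of Wobbrock et al. (see References), in which, for each referent (here, one of the eleven navigational commands), the gestures proposed by participants are grouped into subsets of identical gestures. Assuming that standard formulation, the overall agreement score is

\[
A \;=\; \frac{1}{|R|} \sum_{r \in R} \; \sum_{P_i \subseteq P_r} \left( \frac{|P_i|}{|P_r|} \right)^{2}
\]

where \(R\) is the set of referents, \(P_r\) the set of gestures proposed for referent \(r\), and \(P_i\) the subsets of identical gestures within \(P_r\).

The reported precision and accuracy are the usual confusion-matrix measures. The sketch below is purely illustrative (the function and variable names are ours, not from the paper); it shows how both figures are derived per gesture class from recognition counts:

def per_class_metrics(confusion, c):
    """confusion[i][j] = count of gestures of true class i recognized as class j."""
    n = len(confusion)
    total = sum(sum(row) for row in confusion)
    tp = confusion[c][c]                              # class c correctly recognized
    fp = sum(confusion[i][c] for i in range(n)) - tp  # other classes recognized as c
    fn = sum(confusion[c]) - tp                       # class c recognized as something else
    tn = total - tp - fp - fn                         # all remaining outcomes
    precision = tp / (tp + fp) if (tp + fp) else 0.0  # TP / (TP + FP)
    accuracy = (tp + tn) / total if total else 0.0    # (TP + TN) / all
    return precision, accuracy

Under these definitions, a gesture that is recognized whenever it is performed but is also occasionally triggered by other motions would score high accuracy but lower precision.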
Notes
We term a user who is experienced with robots and/or gesture tracking as Technical.
Max Planck Institute for Psycholinguistics, Nijmegen, The Netherlands (http://www.lat-mpi.eu/tools/elan/).
It is important to note that the NT user-defined gesture set is used to demonstrate the various parts of the recognition system and its accuracy; the system itself is not limited to the NT gesture set.
References
Kistler F, Endrass B, Damian I, Dang C, André E (2012) Natural interaction with culturally adaptive virtual characters. J Multimodal User Interfaces 6:39–47
Suma EA, Lange B, Rizzo A, Krum DM, Bolas M (2011) FAAST: the flexible action and articulated skeleton toolkit. In: Proceedings of the IEEE virtual reality conference, Singapore, pp 247–248
Stiefelhagen R, Fugen C, Gieselmann R, Holzapfel H, Nickel K, Waibel A (2004) Natural human-robot interaction using speech, head pose and gestures. In: Proceedings of the IEEE/RSJ international conference on intelligent robots and systems (IROS 2004), vol 3, pp 2422–2427
Suay HB, Chernova S (2011) Humanoid robot control using depth camera. In: Proceedings of the 6th international conference on human-robot interaction, HRI ’11. ACM, New York, pp 401–402
Wobbrock JO, Morris MR, Wilson AD (2009) User-defined gestures for surface computing. In: Proceedings of the 27th international conference on human factors in computing systems, CHI ’09. ACM, New York, pp 1083–1092
Kurdyukova E, Redlin M, André E (2012) Studying user-defined iPad gestures for interaction in multi-display environment. In: International conference on intelligent user interfaces, ACM, New York, pp 1–6
Häring M, Eichberg J, André E (2012) Studies on grounding with gaze and pointing gestures in human-robot-interaction. In: Ge SS, Khatib O, Cabibihan J-J, Simmons R, Williams M-A (eds) Social robotics. Lecture notes in computer science, vol 7621. Springer, Berlin Heidelberg, pp 378–387
Salem M, Rohlfing K, Kopp S, Joublin F (2011) A friendly gesture: investigating the effect of multimodal robot behavior in human-robot interaction. In: IEEE RO-MAN, Atlanta, pp 247–252
Sidner CL, Lee C, Kidd CD, Lesh N, Rich C (2005) Explorations in engagement for humans and robots. Artif Intell 166(1–2):140–164
Salem M, Eyssel F, Rohlfing K, Kopp S, Joublin F (2013) To err is human(-like): effects of robot gesture on perceived anthropomorphism and likability. Int J Soc Robot 5(3):313–323
Salem M, Kopp S, Wachsmuth I, Rohlfing K, Joublin F (2012) Generation and evaluation of communicative robot gesture. Int J Soc Robot 4(2):201–217
Efron D (1941) Gesture and Environment. King’s Crown Press, Morningside Heights, New York
Ekman P, Friesen W (1969) The repertoire of nonverbal behavior: categories, origins, usage, and coding. Semiotica 1:49–98
McNeill D (1985) So you think gestures are nonverbal? Psychol Rev 92(3):350–371
McNeill D (1992) Hand and mind: what gestures reveal about thought. University of Chicago Press, Chicago
McNeill D (2005) Gesture and thought. University of Chicago Press, Chicago
Saffer D (2009) Designing gestural interfaces. O’Reilly Media, Sebastopol
Ruiz J, Li Y, Lank E (2011) User-defined motion gestures for mobile interaction. In: Proceedings of the 2011 annual conference on human factors in computing systems, CHI ’11. ACM, New York, pp 197–206
Kray C, Nesbitt D, Dawson J, Rohs M (2010) User-defined gestures for connecting mobile phones, public displays, and tabletops. In: Proceedings of the 12th international conference on human computer interaction with mobile devices and services, MobileHCI ’10. ACM, New York, pp 239–248
Zhang L, Huang Q, Liu Q, Liu T, Li D, Lu Y (2005) A teleoperation system for a humanoid robot with multiple information feedback and operational modes. In: IEEE international conference on robotics and biomimetics (ROBIO), pp 290–294
Kechavarzi BD, Sabanovic S, Weisman K (2012) Evaluation of control factors affecting the operator’s immersion and performance in robotic teleoperation. In: IEEE, RO-MAN, pp 608–613
Sian NE, Yokoi K, Kajita S, Kanehiro F, Tanie K (2002) Whole body teleoperation of a humanoid robot - development of a simple master device using joysticks. In: IEEE/RSJ international conference on intelligent robots and systems, vol. 3, pp 2569–2574
McColl D, Zhang Z, Nejat G (2011) Human body pose interpretation and classification for social human-robot interaction. Int J Soc Robot 3(3):313–332
Sakagami Y, Watanabe R, Aoyama C, Matsunaga S, Higaki N, Fujimura K (2002) The intelligent ASIMO: system overview and integration. In: IEEE/RSJ international conference on intelligent robots and systems, vol. 3, pp 2478–2483
Yorita A, Kubota N (2011) Cognitive development in partner robots for information support to elderly people. IEEE Trans Auton Ment Dev 3(1):64–73
Ju Z, Liu H (2010) Recognizing hand grasp and manipulation through empirical copula. Int J Soc Robot 2(3):321–328
Fujimoto I, Matsumoto T, Silva PRS, Kobayashi M, Higashi M (2011) Mimicking and evaluating human motion to improve the imitation skill of children with autism through a robot. Int J Soc Robot 3(4):349–357
Yun S-S, Kim M, Choi MT (2013) Easy interface and control of tele-education robots. Int J Soc Robot 5(3):335–343
Waldherr S, Romero R, Thrun S (2000) A gesture based interface for human-robot interaction. Auton Robot 9(2):151–173
Nguyen-Duc-Thanh N, Stonier D, Lee SY, Kim DH (2011) A new approach for human-robot interaction using human body language. In: Proceedings of the 5th international conference on convergence and hybrid information technology, ICHIT’11. Springer, Berlin, pp 762–769
Broccia G, Livesu M, Scateni R (2011) Gestural interaction for robot motion control. In: EuroGraphics Italian chapter, pp 61–66
Cabibihan J-J, So W-C, Pramanik S (2012) Human-recognizable robotic gestures. IEEE Trans Auton Ment Dev 4(4):305–314
Strobel M, Illmann J, Kluge B, Marrone F (2002) Using spatial context knowledge in gesture recognition for commanding a domestic service robot. In: Proceedings of the 11th IEEE international workshop on robot and human interactive communication, pp 468–473
Sato E, Yamaguchi T, Harashima F (2007) Natural interface using pointing behavior for human-robot gestural interaction. IEEE Trans Ind Electron 54(2):1105–1112
Sato E, Nakajima A, Yamaguchi T, Harashima F (2005) Humatronics (1): natural interaction between human and networked robot using human motion recognition. In: IEEE/RSJ international conference on intelligent robots and systems (IROS 2005), pp 930–935
Hu C, Meng MQ, Liu PX, Wang X (2003) Visual gesture recognition for human-machine interface of robot teleoperation. In: Proceedings of the IEEE/RSJ international conference on intelligent robots and systems (IROS 2003), vol 2, pp 1560–1565
Konda KR, Königs A, Schulz H, Schulz D (2012) Real time interaction with mobile robots using hand gestures. In: Proceedings of the seventh annual ACM/IEEE international conference on human-robot interaction, HRI ’12. ACM, New York, pp 177–178
Dillmann R (2004) Teaching and learning of robot tasks via observation of human performance. Robot Auton Syst 47(2–3):109–116
Breazeal C, Scassellati B (2002) Robots that imitate humans. Trends Cogn Sci 6(11):481–487
Barattini P, Morand C, Robertson NM (2012) A proposed gesture set for the control of industrial collaborative robots. In: IEEE RO-MAN, pp 132–137
Ende T, Haddadin S, Parusel S, Wusthoff T, Hassenzahl M, Albu-Schaffer A (2011) A human-centered approach to robot gesture based communication within collaborative working processes. In: IEEE/RSJ international conference on intelligent robots and systems (IROS), pp 3367–3374
Gleeson B, MacLean K, Haddadi A, Croft E, Alcazar J (2013) Gestures for industry: intuitive human-robot communication from human observation. In: 8th ACM/IEEE international conference on human-robot interaction (HRI), pp 349–356
Bodiroža S, Stern HI, Edan Y (2012) Dynamic gesture vocabulary design for intuitive human-robot dialog. In: 7th ACM/IEEE international conference on human-robot interaction (HRI), pp 111–112
Wobbrock JO, Aung HH, Rothrock B, Myers BA (2005) Maximizing the guessability of symbolic input. In: CHI ’05 extended abstracts on human factors in computing systems, CHI EA ’05. ACM, New York, pp 1869–1872
Kang SK, Nam MY, Rhee PK (2008) Color based hand and finger detection technology for user interaction. In: International conference on convergence and hybrid information technology, ICHIT ’08, pp 229–236
Kita S (2009) Cross-cultural variation of speech-accompanying gesture: a review. Lang Cogn Process 24(2):145–167
Bartneck C, Nomura T, Kanda T, Suzuki T, Kato K (2005) Cultural differences in attitudes towards robots. In: Proceedings of the symposium on robot companions: hard problems and open challenges in robot-human interaction
Bartneck C, Suzuki T, Kanda T, Nomura T (2007) The influence of people’s culture and prior experiences with aibo on their attitude towards robots. AI Soc 21:217–230
Nomura T, Suzuki T, Kanda T, Han J, Shin N, Burke J, Kato K (2008) What people assume about humanoid and animal-type robots: cross-cultural analysis between Japan, Korea, and the United States. Int J Hum Robot 5(1):25–46
Acknowledgments
This work was partially funded by the European Commission within the 7th Framework Program under grant agreement eCute (FP7-ICT-257666).
Cite this article
Obaid, M., Kistler, F., Häring, M. et al. A Framework for User-Defined Body Gestures to Control a Humanoid Robot. Int J of Soc Robotics 6, 383–396 (2014). https://doi.org/10.1007/s12369-014-0233-3