
A Framework for User-Defined Body Gestures to Control a Humanoid Robot

Published in: International Journal of Social Robotics

Abstract

This paper presents a framework that allows users to interact with and navigate a humanoid robot using body gestures. The first part of the paper describes a study to define intuitive gestures for eleven navigational commands, based on an analysis of 385 gestures performed by 35 participants. From the study results, we present a taxonomy of the user-defined gesture sets, agreement scores for the gesture sets, and time performances of the gesture motions. The second part of the paper presents a full-body interaction system for recognizing the user-defined gestures. We evaluate the system with 22 participants to test the accuracy of the proposed system. The results show that most of the defined gestures can be successfully recognized, with a precision of 86–100 % and an accuracy of 73–96 %. We discuss the limitations of the system and present future improvements.
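
As a rough illustration of the two kinds of numbers reported above, the sketch below computes an agreement score for one command and precision/accuracy for one recognized gesture class. It is a minimal sketch, not the authors' code: the agreement formula is Wobbrock et al.'s widely used guessability metric, which studies of user-defined gesture sets commonly adopt, and all input data are hypothetical placeholders.

```python
# Minimal sketch (not the authors' code) of the two metric types reported
# in the abstract; all input data below are hypothetical placeholders.
from collections import Counter

def agreement_score(proposals):
    """Agreement for one command, following Wobbrock et al.'s metric:
    the sum over groups of identical proposals of (group size / total)^2.
    Ranges from 1/n (all proposals differ) to 1 (perfect agreement)."""
    n = len(proposals)
    return sum((c / n) ** 2 for c in Counter(proposals).values())

def precision_accuracy(tp, fp, fn, tn):
    """Per-class precision and accuracy from confusion-matrix counts."""
    precision = tp / (tp + fp)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return precision, accuracy

# Hypothetical proposals for one command ("walk forward") from six users.
proposals = ["lean_forward", "lean_forward", "point_ahead",
             "lean_forward", "point_ahead", "wave"]
print(f"agreement = {agreement_score(proposals):.2f}")  # 0.39

# Hypothetical recognizer counts for one gesture class.
p, a = precision_accuracy(tp=43, fp=3, fn=7, tn=47)
print(f"precision = {p:.1%}, accuracy = {a:.1%}")  # 93.5%, 90.0%
```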


Notes

  1. We term a user who is experienced with robots and/or gesture tracking as Technical.

  2. http://www.aldebaran-robotics.com.

  3. Max Planck Institute for Psycholinguistics, Nijmegen, The Netherlands (http://www.lat-mpi.eu/tools/elan/).

  4. It is important to note that the NT user-defined gesture set is used to demonstrate the various parts of the recognition system and its accuracy; the system is not limited to the NT gesture set.

  5. http://hcm-lab.de/fubi.html.

  6. http://www.openni.org.

  7. http://www.kinectforwindows.org.

  8. http://www.hcm-lab.de/fubi.html.
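
Footnotes 5–8 above point to the skeleton-tracking stack used for recognition (the FUBI framework running on OpenNI or the Kinect for Windows SDK). As a hypothetical sketch of the kind of joint-relation rule such full-body frameworks let a developer declare (this is not FUBI's actual API; joint names, units, and thresholds are assumptions), the code below tests a simple static posture against tracked joint positions:

```python
# Hypothetical joint-relation rule in the style of full-body gesture
# frameworks such as FUBI (not FUBI's real API; names are illustrative):
# detect a "both hands above head" posture from tracked joint positions.
from dataclasses import dataclass

@dataclass
class Joint:
    x: float  # metres, in the sensor's coordinate frame
    y: float  # y grows upward
    z: float

def hands_above_head(skeleton: dict[str, Joint], margin: float = 0.05) -> bool:
    """True when both hands are at least `margin` metres above the head."""
    head = skeleton["head"]
    return (skeleton["left_hand"].y > head.y + margin and
            skeleton["right_hand"].y > head.y + margin)

# One hypothetical frame of skeleton data, e.g. as delivered per frame
# by a depth-sensor tracking pipeline.
frame = {"head": Joint(0.0, 1.60, 2.0),
         "left_hand": Joint(-0.3, 1.75, 2.0),
         "right_hand": Joint(0.3, 1.78, 2.0)}
print(hands_above_head(frame))  # True
```

In practice, frameworks of this kind compose such per-frame joint relations into sequences over time windows to recognize dynamic gestures, rather than single static postures.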


Acknowledgments

This work was partially funded by the European Commission within the 7th Framework Programme under the eCute grant agreement (FP7-ICT-257666).

Author information


Correspondence to Mohammad Obaid.


About this article

Cite this article

Obaid, M., Kistler, F., Häring, M. et al. A Framework for User-Defined Body Gestures to Control a Humanoid Robot. Int J of Soc Robotics 6, 383–396 (2014). https://doi.org/10.1007/s12369-014-0233-3


Keywords

Navigation