Abstract
People use imitation to encourage each other during conversation. We conducted an experiment to investigate how imitation by a robot affects people’s perceptions of their conversation with it. The robot operated in one of three modes: full head gesture mimicking, partial head gesture mimicking (nodding), and non-mimicking (blinking). Participants rated how satisfied they were with the interaction. We hypothesized that participants in the full head gesture condition would rate their interaction most positively, followed by those in the partial and non-mimicking conditions. We also performed a gesture analysis to see whether the groups differed, and found that men made significantly more gestures than women while interacting with the robot. Finally, we interviewed participants to gain further insight into their feelings of rapport with the robot; these interviews revealed a number of valuable findings.
Additional information
This research is sponsored by the Qualcomm Research Studentship.
Cite this article
Riek, L.D., Paul, P.C. & Robinson, P. When my robot smiles at me: Enabling human-robot rapport via real-time head gesture mimicry. J Multimodal User Interfaces 3, 99–108 (2010). https://doi.org/10.1007/s12193-009-0028-2