
Strutting Hero, Sneaking Villain: Utilizing Body Motion Cues to Predict the Intentions of Others

Published: 20 October 2015

Abstract

A better understanding of how intentions and traits are perceived from body movements is required for the design of more effective virtual characters that behave in a socially realistic manner. For this purpose, realistic body motion, captured from human movements, is increasingly used to create characters with natural animations in games and entertainment. However, it is not always clear to programmers and designers which specific motion parameters best convey specific information such as certain emotions, intentions, or traits. We conducted two experiments to investigate whether the perceived traits of actors could be determined from their body motion, and whether these traits were associated with their perceived intentions. We first recorded body motions from 26 professional actors, who were instructed to move in a “hero”-like or a “villain”-like manner. In the first experiment, 190 participants viewed individual video recordings of these actors and rated the body motion stimuli along a series of cognitive dimensions (intentions, attractiveness, dominance, trustworthiness, and distinctiveness). Ratings were highly consistent across observers, suggesting that social traits are readily determined from body motion. Moreover, correlational analyses between these ratings revealed consistent associations across traits; for example, perceived “good” intentions were associated with higher ratings of attractiveness and dominance. Experiment 2 was designed to elucidate the qualitative body motion cues that were critical for determining specific intentions and traits from the hero- and villain-like body movements. The results revealed distinct body motions that were readily associated with the perception of either “good” or “bad” intentions. Moreover, regression analyses revealed that these ratings accurately predicted the perception of the portrayed character type. These findings indicate that intentions and social traits are communicated effectively via specific sets of body motion features, and they have important implications for designing the motion of virtual characters to convey desired social information.
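To make the analysis concrete, the sketch below illustrates the kind of regression described in the abstract: predicting the portrayed character type (hero vs. villain) from per-clip trait ratings. This is a minimal illustration, not the authors' code; the rating scale, column names, number of clips, and randomly generated data are assumptions for demonstration only.

# Minimal sketch (not the authors' pipeline) of predicting portrayed character
# type from observers' trait ratings. All data and column names are hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_clips = 52  # assumed: 26 actors x 2 portrayals

# Hypothetical mean ratings per clip on the dimensions used in Experiment 1.
ratings = pd.DataFrame({
    "intention":       rng.uniform(1, 7, n_clips),   # 1 = bad, 7 = good (assumed scale)
    "attractiveness":  rng.uniform(1, 7, n_clips),
    "dominance":       rng.uniform(1, 7, n_clips),
    "trustworthiness": rng.uniform(1, 7, n_clips),
    "distinctiveness": rng.uniform(1, 7, n_clips),
})
character = rng.integers(0, 2, n_clips)  # 0 = villain, 1 = hero (placeholder labels)

# Logistic regression: how well do the trait ratings predict the portrayed type?
model = LogisticRegression(max_iter=1000)
accuracy = cross_val_score(model, ratings.values, character, cv=5).mean()
print(f"Cross-validated prediction accuracy: {accuracy:.2f}")

# Fitted coefficients indicate which rated dimensions carry the most weight.
model.fit(ratings.values, character)
for name, coef in zip(ratings.columns, model.coef_[0]):
    print(f"{name:>15}: {coef:+.2f}")

With real per-clip ratings in place of the random placeholders, the cross-validated accuracy and coefficient signs would show whether, and through which dimensions, trait ratings predict the hero/villain portrayal.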




      Published In

ACM Transactions on Applied Perception, Volume 13, Issue 1
December 2015
112 pages
ISSN: 1544-3558
EISSN: 1544-3965
DOI: 10.1145/2837040

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 20 October 2015
      Accepted: 01 June 2015
      Revised: 01 May 2015
      Received: 01 January 2015
      Published in TAP Volume 13, Issue 1


      Author Tags

      1. “Effort Shape” analysis
      2. Body motion
      3. cognitive dimensions
      4. intentions
      5. social inferences
      6. traits
      7. virtual humans

      Qualifiers

      • Research-article
      • Research
      • Refereed

      Funding Sources

• Science Foundation Ireland Principal Investigator grant (C. O'Sullivan)

