DOI: 10.5555/3306127.3331759

What do we express without knowing?: Emotion in Gesture

Published: 08 May 2019

Abstract

Emotional gestural expression is an aspect of nonverbal communication that is critical for social agents. The way we perform gestures provides both intentional and unintentional information, leaking our emotional state. In this paper, we analyze the perception of a group of base gestures under a wide set of modifications to understand how users perceive the emotional content of the movement, according to the valence and arousal dimensions of Russell's Circumplex model. From these results, we extract the perceived emotional quality of the base gesture forms and how the modifications shift that perception. An analysis is provided of the impact of different performance modifications of the gesture.


Cited By

  • (2023) Listen, Denoise, Action! Audio-Driven Motion Synthesis with Diffusion Models. ACM Transactions on Graphics 42(4), 1-20. DOI: 10.1145/3592458. Online publication date: 26-Jul-2023.
  • (2022) Tunable tension for gesture animation. Proceedings of the 22nd ACM International Conference on Intelligent Virtual Agents, 1-8. DOI: 10.1145/3514197.3549631. Online publication date: 6-Sep-2022.
  • (2021) Evaluating Study Design and Strategies for Mitigating the Impact of Hand Tracking Loss. ACM Symposium on Applied Perception 2021, 1-12. DOI: 10.1145/3474451.3476235. Online publication date: 16-Sep-2021.
  • (2021) Speech2AffectiveGestures: Synthesizing Co-Speech Gestures with Generative Adversarial Affective Expression Learning. Proceedings of the 29th ACM International Conference on Multimedia, 2027-2036. DOI: 10.1145/3474085.3475223. Online publication date: 17-Oct-2021.
  • (2020) Understanding the Predictability of Gesture Parameters from Speech and their Perceptual Importance. Proceedings of the 20th ACM International Conference on Intelligent Virtual Agents, 1-8. DOI: 10.1145/3383652.3423882. Online publication date: 20-Oct-2020.



Published In

AAMAS '19: Proceedings of the 18th International Conference on Autonomous Agents and MultiAgent Systems
May 2019
2518 pages
ISBN: 9781450363099

Publisher

International Foundation for Autonomous Agents and Multiagent Systems, Richland, SC


Author Tags

  1. embodied computational agents
  2. emotional expression
  3. gesture analysis

Qualifiers

  • Research-article

Funding Sources

  • UC MEXUS - CONACYT

Conference

AAMAS '19

Acceptance Rates

AAMAS '19 Paper Acceptance Rate: 193 of 793 submissions, 24%
Overall Acceptance Rate: 1,155 of 5,036 submissions, 23%

Article Metrics

  • Downloads (Last 12 months): 35
  • Downloads (Last 6 weeks): 3
Reflects downloads up to 03 Jan 2025


