Research Article
DOI: 10.1109/FG.2018.00072

Reverse Engineering Psychologically Valid Facial Expressions of Emotion into Social Robots

Published: 15 May 2018

Abstract

Social robots are now part of human society, destined for schools, hospitals, and homes, where they will perform a variety of tasks. To engage their human users, social robots must be equipped with the essential social skill of facial expression communication. Yet even state-of-the-art social robots are limited in this ability, because they often rely on a restricted set of facial expressions derived from theory, with well-known limitations such as a lack of naturalistic dynamics. With no agreed methodology to objectively engineer a broader variance of more psychologically impactful facial expressions into social robots' repertoires, human-robot interactions remain restricted. Here, we address this challenge with new methodologies that can reverse-engineer dynamic facial expressions into a social robot head. Our data-driven, user-centered approach, which combines human perception with psychophysical methods, produced highly recognizable and human-like dynamic facial expressions of the six classic emotions that generally outperformed state-of-the-art social robot facial expressions. Our data demonstrate the feasibility of applying our method to social robotics and highlight the benefits of a data-driven approach that puts human users at the center of deriving facial expressions for social robots. We also discuss future work to reverse-engineer a wider range of socially relevant facial expressions, including conversational messages (e.g., interest, confusion) and personality traits (e.g., trustworthiness, attractiveness). Together, our results highlight the key role that psychology must continue to play in the design of social robots.
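
A common psychophysical technique for this kind of data-driven derivation is reverse correlation: random combinations of facial action units (AUs) are animated, a human observer categorizes each animation by emotion, and aggregating responses reveals which AUs drive each emotion percept. The sketch below is a minimal illustration of that general workflow only; the AU list, stimulus parameters, trial count, and observer callback are illustrative assumptions, not the authors' actual pipeline.

# A minimal reverse-correlation sketch (Python), under stated assumptions:
# the AU set, amplitude/timing ranges, and trial count are illustrative,
# not taken from the paper.
import random
from collections import defaultdict

ACTION_UNITS = ["AU1", "AU2", "AU4", "AU6", "AU9",
                "AU12", "AU15", "AU20", "AU25", "AU26"]

def random_stimulus(max_aus=4):
    """Sample a random AU subset, each with a random peak amplitude and
    peak latency, defining one dynamic facial animation."""
    chosen = random.sample(ACTION_UNITS, k=random.randint(1, max_aus))
    return {au: {"amplitude": random.uniform(0.2, 1.0),
                 "peak_time": random.uniform(0.2, 0.8)}  # fraction of clip
            for au in chosen}

def reverse_correlate(observer, n_trials=2400):
    """Show random animations; whenever the observer labels one with an
    emotion, credit the AUs that were present. AUs with high average
    amplitude for a label are the ones driving that emotion percept."""
    totals = defaultdict(lambda: defaultdict(float))
    counts = defaultdict(int)
    for _ in range(n_trials):
        stim = random_stimulus()
        label = observer(stim)  # e.g. "happy" ... "sad", or None for "other"
        if label is None:
            continue
        counts[label] += 1
        for au, params in stim.items():
            totals[label][au] += params["amplitude"]
    # Per-emotion AU model: mean amplitude of each AU given each response.
    return {emo: {au: s / counts[emo] for au, s in aus.items()}
            for emo, aus in totals.items()}

The per-emotion AU models recovered this way could then be transferred to a robot head by mapping AU amplitudes and timings onto the corresponding actuators, which is the transfer step the abstract describes as reverse-engineering expressions into the robot.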

Cited By

  • Investigating perceptually based models to predict importance of facial blendshapes. Proceedings of the 13th ACM SIGGRAPH Conference on Motion, Interaction and Games (2020), 1–6. https://doi.org/10.1145/3424636.3426904. Online publication date: 16-Oct-2020.
  • Building Culturally-Valid Dynamic Facial Expressions for a Conversational Virtual Agent Using Human Perception. Proceedings of the 20th ACM International Conference on Intelligent Virtual Agents (2020), 1–3. https://doi.org/10.1145/3383652.3423913. Online publication date: 20-Oct-2020.
  • Improving Emotional Expression Recognition of Robots Using Regions of Interest from Human Data. Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction (2020), 142–144. https://doi.org/10.1145/3371382.3378359. Online publication date: 23-Mar-2020.

Published In

2018 13th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2018)
May 2018, 823 pages
Publisher: IEEE Press
