DOI: 10.1145/2157689.2157814
HRI Conference Proceedings · Research article

The illusion of robotic life: principles and practices of animation for robots

Published: 05 March 2012

Abstract

This paper describes our approach to developing the expression of emotions on a robot with constrained facial expressions. We adapted principles and practices of animation from Disney and other animators for use with robots, and applied them to the development of emotional expressions for the EMYS robot. Our work shows that applying animation principles to robots helps humans understand the robots' emotions.

Supplementary Material

JPG File (hri272.jpg)
MP4 File (hri272.mp4)




Published In
HRI '12: Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction
March 2012
518 pages
ISBN:9781450310635
DOI:10.1145/2157689
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


In-Cooperation

  • IEEE-RAS: Robotics and Automation

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. animation
  2. emotions
  3. facial expressions

Qualifiers

  • Research-article

Conference

HRI '12: International Conference on Human-Robot Interaction
March 5-8, 2012
Boston, Massachusetts, USA

Acceptance Rates

Overall Acceptance Rate 268 of 1,124 submissions, 24%


Cited By

  • (2023) Investigating the Role of Robot Voices and Sounds in Shaping Perceived Intentions. Proceedings of the 11th International Conference on Human-Agent Interaction, pp. 425-427. DOI: 10.1145/3623809.3623949. Online publication date: 4-Dec-2023.
  • (2023) Exploring the Design Space of Extra-Linguistic Expression for Robots. Proceedings of the 2023 ACM Designing Interactive Systems Conference, pp. 2689-2706. DOI: 10.1145/3563657.3595968. Online publication date: 10-Jul-2023.
  • (2023) Multiple Roles of Multimodality Among Interacting Agents. ACM Transactions on Human-Robot Interaction, 12(2), pp. 1-13. DOI: 10.1145/3549955. Online publication date: 15-Mar-2023.
  • (2023) Humorous Robotic Behavior as a New Approach to Mitigating Social Awkwardness. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, pp. 1-16. DOI: 10.1145/3544548.3580821. Online publication date: 19-Apr-2023.
  • (2023) Investigating the Role of Multi-modal Social Cues in Human-Robot Collaboration in Industrial Settings. International Journal of Social Robotics, 15(7), pp. 1169-1179. DOI: 10.1007/s12369-023-01018-9. Online publication date: 11-Jun-2023.
  • (2023) Expanded linear dynamic affect-expression model for lingering emotional expression in social robot. Intelligent Service Robotics, 16(5), pp. 619-631. DOI: 10.1007/s11370-023-00483-5. Online publication date: 27-Sep-2023.
  • (2022) Interactive Narrative and Story-telling. The Handbook on Socially Interactive Agents, pp. 463-492. DOI: 10.1145/3563659.3563674. Online publication date: 27-Oct-2022.
  • (2022) An Emotional Respiration Speech Dataset. Companion Publication of the 2022 International Conference on Multimodal Interaction, pp. 70-78. DOI: 10.1145/3536220.3558803. Online publication date: 7-Nov-2022.
  • (2022) "I See You!": A Design Framework for Interface Cues about Agent Visual Perception from a Thematic Analysis of Videogames. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, pp. 1-22. DOI: 10.1145/3491102.3517699. Online publication date: 29-Apr-2022.
  • (2022) The LMA12-O Framework for Emotional Robot Eye Gestures. 2022 31st IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), pp. 442-449. DOI: 10.1109/RO-MAN53752.2022.9900752. Online publication date: 29-Aug-2022.
