Open access

Robot Expressive Motions: A Survey of Generation and Evaluation Methods

Published: 15 November 2019

Abstract

Robots of different forms and capabilities are used in a wide variety of situations; however, one point common to all robots interacting with humans is their ability to communicate with them. In addition to verbal communication and purely communicative movements, robots can also use their embodiment to generate expressive movements while achieving a task, conveying additional information to their human partners. This article surveys state-of-the-art techniques that generate whole-body expressive movements in robots and robot avatars. We consider different embodiments, such as wheeled, legged, and flying systems, as well as the different metrics used to evaluate the generated movements. Finally, we discuss future areas of improvement and the difficulties that must be overcome to develop truly expressive motions in artificial agents.
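
As a concrete illustration of the kind of technique the survey covers, the sketch below layers an affect-dependent style on top of a functional trajectory. It is a minimal, hypothetical Python example: the stylize_trajectory function and the mapping from valence and arousal to amplitude and speed are assumptions made for illustration, not methods drawn from the surveyed works.

# Minimal, hypothetical sketch (not from the surveyed works): modulate a
# functional joint-space trajectory with a simple valence/arousal style layer.
import numpy as np

def stylize_trajectory(waypoints, valence, arousal):
    """Scale amplitude and replay speed of a (T x DOF) trajectory.

    Higher arousal plays the motion faster; higher valence exaggerates
    excursions around the mean pose. Both mappings are illustrative guesses.
    """
    waypoints = np.asarray(waypoints, dtype=float)
    amplitude = 1.0 + 0.3 * valence   # exaggerate or dampen the path shape
    speed = 1.0 + 0.5 * arousal       # compress or stretch execution time
    mean_pose = waypoints.mean(axis=0)
    scaled = mean_pose + amplitude * (waypoints - mean_pose)
    # Resample in time to change speed while keeping the spatial path.
    t_old = np.linspace(0.0, 1.0, len(scaled))
    t_new = np.linspace(0.0, 1.0, max(2, int(len(scaled) / speed)))
    return np.stack(
        [np.interp(t_new, t_old, scaled[:, d]) for d in range(scaled.shape[1])],
        axis=1,
    )

# Example: render a 2-DOF reaching motion as "excited" (high valence and arousal).
base = np.linspace([0.0, 0.0], [1.0, 0.5], num=50)
excited = stylize_trajectory(base, valence=0.8, arousal=0.9)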

Information & Contributors

Information

Published In

ACM Transactions on Human-Robot Interaction, Volume 8, Issue 4
Survey Paper and Special Issue on Representation Learning for Human and Robot Cognition
December 2019, 108 pages
EISSN: 2573-9522
DOI: 10.1145/3372354

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 15 November 2019
Accepted: 01 July 2019
Revised: 01 May 2019
Received: 01 June 2018
Published in THRI Volume 8, Issue 4

Author Tags

  1. Robotics
  2. affect
  3. expressive behavior
  4. motion control
  5. motion generation

Qualifiers

  • Tutorial
  • Research
  • Refereed

