DOI: 10.1145/1349822.1349832

Construction and evaluation of a model of natural human motion based on motion diversity

Published: 12 March 2008

Abstract

Natural human-robot communication is supported by a person directing interpersonal behavior toward the robot, and the conditions that elicit such behavior are thought to reflect the mechanisms underlying natural communication. In the present study, we hypothesize that motion diversity, i.e., variation produced independently of a subject's intention, contributes to the human-like quality of the motions of an android that closely resembles a human being. To verify this hypothesis, we construct a model of motion diversity from observations of human motion, specifically a touching motion. Psychological experiments show that the presence of motion diversity in the android's motion influences the impression the android makes.
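The idea of motion diversity can be sketched in code. The following is a minimal, hypothetical illustration (not the authors' actual model): each trial generates a touching motion along a minimum-jerk position profile, a standard model of human reaching, and adds small random perturbations to the endpoint and duration so that repeated motions differ slightly even though the "intention" (start, goal) is fixed. All function names and noise parameters here are assumptions for illustration.

```python
# Hypothetical sketch of "motion diversity": a minimum-jerk reach whose
# endpoint and duration vary slightly from trial to trial. The noise
# magnitudes (sigma_x, sigma_t) are illustrative assumptions, not values
# from the paper.
import random

def minimum_jerk(x0, xf, T, n=50):
    """Sample a 1-D minimum-jerk trajectory from x0 to xf over T seconds.

    Returns (times, positions) at n uniformly spaced instants.
    """
    times, positions = [], []
    for i in range(n):
        tau = i / (n - 1)                      # normalized time in [0, 1]
        s = 10*tau**3 - 15*tau**4 + 6*tau**5   # minimum-jerk shape function
        times.append(tau * T)
        positions.append(x0 + (xf - x0) * s)
    return times, positions

def diverse_touch(x0, xf, T, sigma_x=0.01, sigma_t=0.05):
    """One touching trial with unintended endpoint/duration variation."""
    xf_trial = xf + random.gauss(0.0, sigma_x)  # endpoint jitter (m)
    T_trial = max(0.1, T + random.gauss(0.0, sigma_t))  # duration jitter (s)
    return minimum_jerk(x0, xf_trial, T_trial)

# Five repetitions of the "same" touch differ slightly in where and
# when the hand arrives, even though the intended target is fixed.
trials = [diverse_touch(0.0, 0.3, 1.0) for _ in range(5)]
```

Under this reading, the subject's intention fixes the nominal trajectory while the stochastic terms supply the diversity; the paper's experiments test whether adding such variation to android motion changes observers' impressions.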



Published In

HRI '08: Proceedings of the 3rd ACM/IEEE international conference on Human robot interaction
March 2008
402 pages
ISBN:9781605580173
DOI:10.1145/1349822

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. android
  2. human motion model
  3. natural motion

Qualifiers

  • Research-article

Conference

HRI '08
HRI '08: International Conference on Human Robot Interaction
March 12 - 15, 2008
Amsterdam, The Netherlands

Acceptance Rates

Overall Acceptance Rate 268 of 1,124 submissions, 24%

Article Metrics

  • Downloads (Last 12 months)5
  • Downloads (Last 6 weeks)0
Reflects downloads up to 17 Jan 2025


Cited By

  • (2024) Charting User Experience in Physical Human–Robot Interaction. ACM Transactions on Human-Robot Interaction, 13(2):1–29. DOI: 10.1145/3659058. Online publication date: 28-Jun-2024.
  • (2023) Conventional, Heuristic and Learning-Based Robot Motion Planning: Reviewing Frameworks of Current Practical Significance. Machines, 11(7):722. DOI: 10.3390/machines11070722. Online publication date: 7-Jul-2023.
  • (2022) RoboGroove: Creating Fluid Motion for Dancing Robotic Arms. Proceedings of the 8th International Conference on Movement and Computing, pages 1–9. DOI: 10.1145/3537972.3537985. Online publication date: 22-Jun-2022.
  • (2018) Computational Human-Robot Interaction. Foundations and Trends in Robotics, 4(2–3):105–223. DOI: 10.1561/2300000049. Online publication date: 13-Dec-2018.
  • (2018) Character Actor. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 1(4):1–23. DOI: 10.1145/3161407. Online publication date: 8-Jan-2018.
  • (2018) Human-Like Motion Planning Based on Game Theoretic Decision Making. International Journal of Social Robotics. DOI: 10.1007/s12369-018-0487-2. Online publication date: 11-Jul-2018.
  • (2017) Learning bowing gesture with motion diversity by dynamic movement primitives. 2017 14th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI), pages 165–166. DOI: 10.1109/URAI.2017.7992701. Online publication date: Jun-2017.
  • (2012) The role of appearance and motion in action prediction. Psychological Research, 76(4):388–394. DOI: 10.1007/s00426-012-0426-z. Online publication date: 28-Feb-2012.
  • (2010) Perception of affect elicited by robot motion. Proceedings of the 5th ACM/IEEE international conference on Human-robot interaction, pages 53–60. DOI: 10.5555/1734454.1734473. Online publication date: 2-Mar-2010.
  • (2010) Perception of affect elicited by robot motion. 2010 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pages 53–60. DOI: 10.1109/HRI.2010.5453269. Online publication date: Mar-2010.
