
Multimodal approach to affective human-robot interaction design with children

Published: 31 October 2011

Abstract

Two studies examined different features of humanoid robots and their influence on children's affective behavior. The first study looked at interaction styles and general robot features; the second examined how a robot's attention influences children's behavior and engagement. Using activities familiar to young children (e.g., table setting, storytelling), the first study found that a cooperative interaction style elicited more oculesic behavior and social engagement. The second study found that the quality of attention, the type of attention, and the length of interaction all influence affective behavior and engagement. For quality of attention, Wizard-of-Oz (WoZ) control elicited the most affective behavior, though automatic attention worked as well as WoZ when the interaction was short. Shifting the type of attention from nonverbal to verbal increased children's oculesic behavior, utterances, and physiological responses. Affective interaction did not appear to depend on any single mechanism, but on a well-chosen confluence of technical features.



Published In

ACM Transactions on Interactive Intelligent Systems, Volume 1, Issue 1
October 2011
150 pages
ISSN:2160-6455
EISSN:2160-6463
DOI:10.1145/2030365
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 31 October 2011
Accepted: 01 August 2011
Revised: 01 August 2011
Received: 01 December 2010
Published in TIIS Volume 1, Issue 1


Author Tags

  1. human-robot interaction
  2. affective interaction
  3. collaboration
  4. human-robot communication
  5. social robots
  6. young children

Qualifiers

  • Research-article
  • Research
  • Refereed


Cited By

  • (2024) The Child Factor in Child–Robot Interaction: Discovering the Impact of Developmental Stage and Individual Characteristics. International Journal of Social Robotics 16(8), 1879–1900. DOI: 10.1007/s12369-024-01121-5. Online publication date: 14-Aug-2024.
  • (2022) Dialogue breakdowns in robot-assisted L2 learning. Computer Assisted Language Learning, 1–22. DOI: 10.1080/09588221.2022.2158203. Online publication date: 19-Dec-2022.
  • (2022) Children's facial expressions during collaborative coding: Objective versus subjective performances. International Journal of Child-Computer Interaction 34, 100536. DOI: 10.1016/j.ijcci.2022.100536. Online publication date: Dec-2022.
  • (2022) Sensing technologies and child–computer interaction. International Journal of Child-Computer Interaction 30(C). DOI: 10.1016/j.ijcci.2021.100331. Online publication date: 9-Apr-2022.
  • (2020) Robot Role Design for Implementing Social Facilitation Theory in Musical Instruments Practicing. Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, 253–260. DOI: 10.1145/3319502.3374787. Online publication date: 9-Mar-2020.
  • (2019) Child–Robot Relationship Formation: A Narrative Review of Empirical Research. International Journal of Social Robotics 12(2), 325–344. DOI: 10.1007/s12369-019-00569-0. Online publication date: 3-Jul-2019.
  • (2019) Interactions with an Empathic Robot Tutor in Education: Students' Perceptions Three Years Later. Artificial Intelligence and Inclusive Education, 77–99. DOI: 10.1007/978-981-13-8161-4_5. Online publication date: 14-Jun-2019.
  • (2017) A study on the potential roles of a robot peer in socio-emotional development of children. International Journal of Computational Vision and Robotics 7(3), 335–343. DOI: 10.1504/IJCVR.2017.083447. Online publication date: 1-Jan-2017.
  • (2017) Robot education peers in a situated primary school study: Personalisation promotes child learning. PLOS ONE 12(5), e0178126. DOI: 10.1371/journal.pone.0178126. Online publication date: 23-May-2017.
  • (2017) Robots educate in style: The effect of context and non-verbal behaviour on children's perceptions of warmth and competence. 2017 26th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), 449–455. DOI: 10.1109/ROMAN.2017.8172341. Online publication date: Aug-2017.
