research-article
DOI: 10.1145/2556288.2557274
CHI Conference Proceedings

Robot gestures make difficult tasks easier: the impact of gestures on perceived workload and task performance

Published: 26 April 2014

Abstract

Gestures are important non-verbal signals in human communication. Research with virtual agents and robots has begun to add to the scientific knowledge about gestures, but many questions regarding the use of gestures in human-computer interaction remain open. This paper investigates the influence of robot gestures on users' perceived workload and task performance (i.e., information recall) in a direction-giving task. We conducted a 2 × 2 (robot gestures vs. no robot gestures × easy vs. difficult task) experiment. The results indicate that robot gestures increased user performance and decreased perceived workload in the difficult task, but not in the easy task. Robot gestures are thus a promising means of improving human-robot interaction, particularly in challenging tasks.
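The 2 × 2 between-subjects design described above can be sketched in code. All recall scores below are invented for illustration only (they are not the paper's data); the sketch merely shows how the gesture effect and the gesture × difficulty interaction would be computed from cell means.

```python
# Hypothetical sketch of a 2 x 2 design (gesture vs. no gesture x easy
# vs. difficult task). Scores are invented, not the paper's data.
from statistics import mean

# Invented recall scores (route elements remembered) per condition cell.
recall = {
    ("gesture", "easy"):         [7, 8, 7, 8],
    ("no_gesture", "easy"):      [7, 7, 8, 7],
    ("gesture", "difficult"):    [6, 7, 6, 7],
    ("no_gesture", "difficult"): [3, 4, 4, 3],
}

cell_means = {cell: mean(scores) for cell, scores in recall.items()}

# Simple effect of gesture within each level of task difficulty.
for task in ("easy", "difficult"):
    effect = cell_means[("gesture", task)] - cell_means[("no_gesture", task)]
    print(f"{task}: gesture advantage = {effect:+.2f}")

# Interaction contrast: is the gesture effect larger in the difficult task?
interaction = (
    (cell_means[("gesture", "difficult")] - cell_means[("no_gesture", "difficult")])
    - (cell_means[("gesture", "easy")] - cell_means[("no_gesture", "easy")])
)
print(f"interaction contrast = {interaction:+.2f}")
```

A positive interaction contrast here corresponds to the paper's pattern: gestures help mainly when the task is difficult.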




Published In

CHI '14: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
April 2014
4206 pages
ISBN: 9781450324731
DOI: 10.1145/2556288

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. gestures
    2. human-robot interaction
    3. perceived workload
    4. task performance


    Conference

    CHI '14
    Sponsor:
    CHI '14: CHI Conference on Human Factors in Computing Systems
    April 26 - May 1, 2014
    Toronto, Ontario, Canada

    Acceptance Rates

    CHI '14 paper acceptance rate: 465 of 2,043 submissions (23%)
    Overall acceptance rate: 6,199 of 26,314 submissions (24%)



    Cited By

    • (2024) ComPeer: A Generative Conversational Agent for Proactive Peer Support. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, pp. 1-22. DOI: 10.1145/3654777.3676430. Online publication date: 13-Oct-2024.
    • (2024) Effects of Feedback Styles on Performance and Preference for an Exercise Coach. 2024 33rd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), pp. 1516-1523. DOI: 10.1109/RO-MAN60168.2024.10731249. Online publication date: 26-Aug-2024.
    • (2023) Data-Driven Communicative Behaviour Generation: A Survey. ACM Transactions on Human-Robot Interaction. DOI: 10.1145/3609235. Online publication date: 16-Aug-2023.
    • (2023) Nonverbal Cues in Human–Robot Interaction: A Communication Studies Perspective. ACM Transactions on Human-Robot Interaction 12(2), pp. 1-21. DOI: 10.1145/3570169. Online publication date: 15-Mar-2023.
    • (2023) Reaching Across the Communication Gap: Evaluating How Augmented Reality Shared Gestural Spaces Impact Gesture and Language Usage. 2023 IEEE Frontiers in Education Conference (FIE), pp. 1-6. DOI: 10.1109/FIE58773.2023.10343016. Online publication date: 18-Oct-2023.
    • (2023) The Impacts of Social Humanoid Robot's Nonverbal Communication on Perceived Personality Traits. International Journal of Human–Computer Interaction, pp. 1-13. DOI: 10.1080/10447318.2023.2295696. Online publication date: 27-Dec-2023.
    • (2023) Deep Learning-Based Assessment of Facial Periodic Affect in Work-Like Settings. Computer Vision – ECCV 2022 Workshops, pp. 307-322. DOI: 10.1007/978-3-031-25072-9_20. Online publication date: 18-Feb-2023.
    • (2022) The Design and Observed Effects of Robot-performed Manual Gestures: A Systematic Review. ACM Transactions on Human-Robot Interaction 12(1), pp. 1-62. DOI: 10.1145/3549530. Online publication date: 19-Jul-2022.
    • (2022) Affective Robot Behavior Improves Learning in a Sorting Game. 2022 31st IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), pp. 436-441. DOI: 10.1109/RO-MAN53752.2022.9900654. Online publication date: 29-Aug-2022.
    • (2022) Gesture-Based Feedback in Human-Robot Interaction for Object Manipulation. Technological Innovation for Digitalization and Virtualization, pp. 122-132. DOI: 10.1007/978-3-031-07520-9_12. Online publication date: 21-Jun-2022.
