DOI: 10.1145/2157689.2157742

Tracking aggregate vs. individual gaze behaviors during a robot-led tour simplifies overall engagement estimates

Published: 05 March 2012

Abstract

As an early behavioral study of the non-verbal features a robot tour guide could use to analyze a crowd, personalize an interaction, and/or maintain high levels of engagement, we analyze participant gaze statistics in response to a robot tour guide's deictic gestures. Thirty-seven participants took part, split into nine groups of three to five people each. In the groups with the lowest engagement levels, aggregate gaze responses to the robot's deictic gestures involved the fewest total glance shifts, the least time spent looking at the indicated object, and no intra-participant gaze. Our diverse participants had overlapping engagement ratings within their groups, and we found that a robot tracking group rather than individual analytics could capture less noisy and often stronger trends relating gaze features to self-reported engagement scores. Thus we have found indications that aggregate group analysis captures more salient and accurate assessments of overall human-robot interactions, even with lower-resolution features.
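The distinction the abstract draws, per-individual gaze features versus a single group-level aggregate, can be sketched in a few lines. This is a purely illustrative sketch, not code or data from the paper: the feature names (glance shifts, time on the indicated object) follow the abstract, but the numbers and function names are hypothetical.

```python
# Illustrative sketch (not from the paper): per-individual vs. group-aggregated
# gaze features in response to one deictic gesture. Values are hypothetical.
from statistics import mean

# Hypothetical per-participant responses for one group of four:
# (glance_shifts, seconds_looking_at_indicated_object)
group = [(3, 4.25), (1, 1.5), (4, 5.0), (2, 2.75)]

def individual_features(group):
    """Per-participant feature vectors: higher resolution, but noisier."""
    return [{"glances": g, "time_on_object": t} for g, t in group]

def aggregate_features(group):
    """One group-level summary: lower resolution, less noisy."""
    glances = [g for g, _ in group]
    times = [t for _, t in group]
    return {
        "total_glances": sum(glances),
        "mean_time_on_object": mean(times),
    }

print(individual_features(group))
print(aggregate_features(group))
# → {'total_glances': 10, 'mean_time_on_object': 3.375}
```

The paper's finding is that regressing self-reported engagement on the aggregate summary (one row per group) yields stronger, less noisy trends than using the per-participant rows.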


Cited By

  • (2015) Characterizing Task-Based Human–Robot Collaboration Safety in Manufacturing. IEEE Transactions on Systems, Man, and Cybernetics: Systems 45(2), 260–275. DOI: 10.1109/TSMC.2014.2337275. Online publication date: Feb 2015.


      Published In

      HRI '12: Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction
      March 2012, 518 pages
      ISBN: 9781450310635
      DOI: 10.1145/2157689

      In-Cooperation

      • IEEE-RAS: Robotics and Automation

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. human tracking
      2. humans-robot interaction
      3. low resolution sensing
      4. social dynamics modeling

      Qualifiers

      • Abstract

      Conference

      HRI '12: International Conference on Human-Robot Interaction
      March 5–8, 2012
      Boston, Massachusetts, USA

      Acceptance Rates

      Overall Acceptance Rate 268 of 1,124 submissions, 24%
