DOI: 10.1109/IROS45743.2020.9341119
Research article

An Augmented Reality Human-Robot Physical Collaboration Interface Design for Shared, Large-Scale, Labour-Intensive Manufacturing Tasks

Published: 24 October 2020

Abstract

This paper investigates the potential use of augmented reality (AR) for physical human-robot collaboration in large-scale, labour-intensive manufacturing tasks. While AR has been shown to increase task efficiency in teleoperation and robot-programming tasks involving smaller-scale robots, its use for physical human-robot collaboration in shared workspaces and large-scale manufacturing tasks has not been well studied. With the eventual goal of applying our AR system to collaborative aircraft-body manufacturing, we conduct a user study comparing the AR interface we developed with a standard joystick for human-robot collaboration in an experimental task simulating an industrial carbon-fibre-reinforced-polymer manufacturing procedure. Results show that use of AR reduces task time and physical demand while increasing robot utilization.


Cited By

  • (2023) "Usability Evaluation of an Augmented Reality System for Collaborative Fabrication between Multiple Humans and Industrial Robots," Proceedings of the 2023 ACM Symposium on Spatial User Interaction, pp. 1–10. DOI: 10.1145/3607822.3614528. Online publication date: 13 Oct 2023.
  • (2022) "Augmented Reality and Robotics: A Survey and Taxonomy for AR-enhanced Human-Robot Interaction and Robotic Interfaces," Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, pp. 1–33. DOI: 10.1145/3491102.3517719. Online publication date: 29 Apr 2022.


          Published In

2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Oct 2020, 5642 pages

Publisher: IEEE Press

