research-article | IUI Conference Proceedings
DOI: 10.1145/3377325.3377497

Detecting errors in pick and place procedures: detecting errors in multi-stage and sequence-constrained manual retrieve-assembly procedures

Published: 17 March 2020

Abstract

Many human activities, such as manufacturing and assembly, are sequence-constrained procedural tasks (SPTs): they consist of a series of steps that must be executed in a specific spatial and temporal order. These tasks can be error-prone: steps can be missed, executed out of order, or repeated. The ability to automatically predict whether a person is about to commit an error could greatly help in these cases; the prediction could be used, for example, to provide feedback that prevents mistakes or mitigates their effects. In this paper, we present a novel approach to real-time error prediction for multi-step sequence tasks that uses a minimal viable set of behavioural signals. We make three main contributions. First, we present an architecture for real-time error prediction based on task tracking and intent prediction. Second, we explore the effectiveness of hand-position and eye-gaze tracking for task tracking. We confirm that eye gaze is more effective for intent prediction, that hand tracking is more accurate for task tracking, and that combining the two provides the best overall response. Using hand and gaze tracking data, we can predict selection/placement errors with an F1 score of 97%, approximately 300 ms before the error would occur. Finally, we discuss the application of this hand-gaze error-detection architecture, in conjunction with head-mounted AR displays, to support industrial manual assembly.
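The two-stage pipeline the abstract describes (a task tracker driven by hand position plus an intent predictor driven by eye gaze) can be sketched minimally as follows. This is an illustrative sketch only, not the authors' implementation: the class name, bin identifiers, and the simple gaze-mismatch rule are all assumptions, and the paper's actual intent predictor is learned (the author tags mention long short-term memory) rather than rule-based.

```python
from dataclasses import dataclass, field


@dataclass
class ErrorPredictor:
    """Sketch of a two-stage error-prediction pipeline: hand position
    drives the task state; eye gaze anticipates the next selection.
    Hypothetical names and rules, for illustration only."""
    sequence: list = field(default_factory=list)  # required order of part-bin IDs
    step: int = 0                                 # index of the next expected pick

    def update_task_state(self, hand_bin):
        # Task tracking: the hand entering the expected bin advances
        # the procedure to the next step.
        if self.step < len(self.sequence) and hand_bin == self.sequence[self.step]:
            self.step += 1

    def predict_error(self, gaze_bin):
        # Intent prediction: gaze settling on a bin other than the one
        # expected next signals an imminent selection error, before the
        # hand arrives. (The paper learns this mapping from data; a
        # fixed rule is used here only to make the data flow concrete.)
        if self.step >= len(self.sequence) or gaze_bin is None:
            return False
        return gaze_bin != self.sequence[self.step]
```

For example, with `sequence=["A", "B", "C"]`, a gaze sample on bin "B" before bin "A" has been picked would be flagged at fixation time, i.e. before the hand reaches the wrong bin, which is the window the paper exploits for its roughly 300 ms lead time.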




Published In

IUI '20: Proceedings of the 25th International Conference on Intelligent User Interfaces
March 2020, 607 pages
ISBN: 9781450371186
DOI: 10.1145/3377325

          Publisher

          Association for Computing Machinery

          New York, NY, United States


          Author Tags

          1. error prediction
          2. human-centered design
          3. intelligent assistive systems
4. long short-term memory
          5. manual assembly procedures
          6. user intent prediction

          Qualifiers

          • Research-article

          Funding Sources

          • European Union

          Conference

IUI '20

          Acceptance Rates

          Overall Acceptance Rate 746 of 2,811 submissions, 27%


          Cited By

• (2024) PrISM-Observer: Intervention Agent to Help Users Perform Everyday Procedures Sensed using a Smartwatch. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1-16. DOI: 10.1145/3654777.3676350. 13-Oct-2024.
• (2024) Communication breakdown: Gaze-based prediction of system error for AI-assisted robotic arm simulated in VR. Proceedings of the 2024 Symposium on Eye Tracking Research and Applications, 1-7. DOI: 10.1145/3649902.3653339. 4-Jun-2024.
• (2024) Zero-Shot Learning to Enable Error Awareness in Data-Driven HRI. Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, 592-601. DOI: 10.1145/3610977.3634940. 11-Mar-2024.
• (2024) Effect of Changes in the Sequence of Assembly Operations on Error Rates: A Case Study From the Car Manufacturing Industry. IEEE Access, 12, 34644-34655. DOI: 10.1109/ACCESS.2024.3371980. 2024.
• (2024) How does the status of errant robot affect our desire for contact? The moderating effect of team interdependence. Ergonomics, 67(11), 1683-1701. DOI: 10.1080/00140139.2024.2348672. 23-May-2024.
• (2024) Eye-tracking support for analyzing human factors in human-robot collaboration during repetitive long-duration assembly processes. Production Engineering. DOI: 10.1007/s11740-024-01294-y. 20-Jun-2024.
• (2023) RetroLens: A Human-AI Collaborative System for Multi-step Retrosynthetic Route Planning. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-20. DOI: 10.1145/3544548.3581469. 19-Apr-2023.
• (2023) Assembling Method-based Toolboxes for the Implementation of Industry 4.0 Technologies in the Digital Lean Manufacturing World. 2023 IEEE International Conference on Engineering, Technology and Innovation (ICE/ITMC), 1-7. DOI: 10.1109/ICE/ITMC58018.2023.10332367. 19-Jun-2023.
• (2022) Gaze as an Indicator of Input Recognition Errors. Proceedings of the ACM on Human-Computer Interaction, 6(ETRA), 1-18. DOI: 10.1145/3530883. 13-May-2022.
• (2022) Modeling Human Response to Robot Errors for Timely Error Detection. 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 676-683. DOI: 10.1109/IROS47612.2022.9981726. 23-Oct-2022.
