DOI: 10.1145/3316782.3321523
Research article

A multi-sensor algorithm for activity and workflow recognition in an industrial setting

Published: 05 June 2019

Abstract

With the recent revival of human labour in industry, and the subsequent push to optimally combine the strengths of man and machine in industrial processes, there is an increased need for methods that allow machines to understand and interpret the actions of their users. An important aspect of this is understanding and evaluating the progress of the workflows to be executed. This requires both an appropriate choice of sensors and algorithms capable of quickly and efficiently evaluating activity and workflow progress.
In this paper we present such an algorithm, which performs activity and workflow recognition using both depth and RGB cameras as input. The algorithm's main purpose is to be used in an industrial training station, allowing novice workers to learn the steps required to assemble Nordic ski products without the need for human supervision. We describe how the algorithm recognizes predefined workflows in the sensor data, and present a comprehensive evaluation of its performance on a real-world recording of operators performing their work in an industrial setting. We show that the algorithm fulfills the necessary requirements and is ready to be implemented in the training station application.
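The abstract does not detail the algorithm itself, but the core idea it describes, matching a stream of recognized activities against a predefined workflow and tracking progress through it, can be illustrated with a minimal sketch. Everything below is hypothetical: the `WorkflowTracker` class and the ski-assembly step names are illustrative assumptions, not the authors' method.

```python
from dataclasses import dataclass, field

@dataclass
class WorkflowTracker:
    """Illustrative tracker for progress through an ordered, predefined workflow."""
    steps: list                     # expected activity labels, in order
    position: int = 0               # index of the next expected step
    completed: list = field(default_factory=list)

    def observe(self, activity: str) -> bool:
        """Consume one recognized activity; advance only if it matches the next step."""
        if self.position < len(self.steps) and activity == self.steps[self.position]:
            self.completed.append(activity)
            self.position += 1
            return True
        return False  # out-of-order or unrelated activity is ignored

    @property
    def progress(self) -> float:
        return self.position / len(self.steps)

    @property
    def finished(self) -> bool:
        return self.position == len(self.steps)

# Hypothetical ski-assembly workflow; step names are made up for illustration.
tracker = WorkflowTracker(steps=["pick_binding", "align_binding", "screw_binding", "inspect"])
for act in ["pick_binding", "align_binding", "reach_tool", "screw_binding", "inspect"]:
    tracker.observe(act)
print(tracker.finished, tracker.progress)  # → True 1.0
```

In a real system the incoming activity labels would come from classifiers over the depth and RGB streams, and a robust matcher would also have to tolerate misdetections rather than requiring exact label matches.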


Cited By

  • Robust Feature Representation Using Multi-Task Learning for Human Activity Recognition. Sensors 24, 2 (2024), 681. doi:10.3390/s24020681
  • An assembly sequence monitoring method based on workflow modeling for human–robot collaborative assembly. The International Journal of Advanced Manufacturing Technology 133, 1-2 (2024), 99--114. doi:10.1007/s00170-024-13735-0
  • Attention-based encoder-decoder networks for workflow recognition. Multimedia Tools and Applications (2021). doi:10.1007/s11042-021-10633-5
  • Human/machine/robot: technologies for cognitive processes. e & i Elektrotechnik und Informationstechnik (2019). doi:10.1007/s00502-019-00740-5


Published In

PETRA '19: Proceedings of the 12th ACM International Conference on PErvasive Technologies Related to Assistive Environments
June 2019
655 pages
ISBN:9781450362320
DOI:10.1145/3316782
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. activity recognition
  2. industrial assistance systems
  3. pattern recognition
  4. pervasive computing
  5. sensor fusion

Qualifiers

  • Research-article

Funding Sources

  • FFG

Conference

PETRA '19


