Abstract
Excavator productivity significantly influences overall progress, making it a major concern for construction project managers, particularly during initial-stage earthwork activities. The ability to track excavator performance autonomously in real time is essential for monitoring and managing earthwork effectively. This study introduces a novel vision-based framework for autonomous excavator action recognition and productivity measurement that integrates the You Only Watch Once (YOWO) deep learning method to recognize excavator actions, which are then used to calculate action times and the average cycle time. The average cycle time is in turn used to calculate the productivity of the excavation work. The excavator action images used to construct the model were extracted from video recorded at a construction site. The algorithm recognized excavator actions with an F1 score of 87.6% and a mAP of 81.6%. These outputs were then used in the proposed framework to measure excavator productivity. Omitting outliers/misclassifications (detections with < 95% confidence) before calculating average action times yielded a cycle-time accuracy of 99.7%, significantly higher than the 81.59% achieved with the unadjusted mean. The successful implementation of the proposed framework demonstrates the feasibility of autonomous productivity measurement in construction as a more economical, faster, and real-time measure of excavator productivity.
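The post-processing pipeline the abstract describes (per-frame action detections, outlier filtering at a 95% confidence threshold, average cycle time, then hourly productivity) can be sketched as follows. This is a minimal illustration rather than the authors' released code: the function names, the frame rate, and the productivity coefficients (bucket capacity, fill factor, job efficiency, following the textbook formula Q = 3600·q·K·E / Cm) are assumptions introduced here for clarity.

```python
# Sketch of the productivity-measurement stage: YOWO detections -> filtered
# action durations -> average cycle time -> excavation productivity (m^3/h).
from collections import defaultdict

CONF_THRESHOLD = 0.95   # detections below this are treated as outliers/misclassifications
FPS = 30                # assumed video frame rate

def action_durations(detections):
    """detections: iterable of (frame_idx, action_label, confidence).
    Returns seconds observed per action after dropping low-confidence frames."""
    frames_per_action = defaultdict(int)
    for _frame_idx, label, conf in detections:
        if conf >= CONF_THRESHOLD:          # omit outliers before averaging
            frames_per_action[label] += 1
    return {label: n / FPS for label, n in frames_per_action.items()}

def average_cycle_time(durations, n_cycles):
    """Average cycle time (s): total recognized action time divided by cycles observed."""
    return sum(durations.values()) / n_cycles

def productivity(cycle_time_s, bucket_capacity_m3=1.0, fill_factor=0.9, efficiency=0.83):
    """Hourly productivity from the standard formula Q = 3600*q*K*E / Cm.
    Coefficient values here are placeholders, not the paper's site parameters."""
    return 3600.0 * bucket_capacity_m3 * fill_factor * efficiency / cycle_time_s

# Usage with dummy detections covering one digging cycle
dets = [(i, "dig", 0.97) for i in range(0, 150)] + \
       [(i, "swing", 0.96) for i in range(150, 300)] + \
       [(i, "dump", 0.55) for i in range(300, 330)]   # below threshold -> ignored
dur = action_durations(dets)
ct = average_cycle_time(dur, n_cycles=1)
print(f"cycle time: {ct:.1f} s, productivity: {productivity(ct):.1f} m3/h")
```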
Ethics declarations
Conflict of interest
All authors wish to confirm that there are no known conflicts of interest associated with this publication and there has been no significant financial support for this work that could have influenced its outcome.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Cheng, M.-Y., Cao, M.-T. & Nuralim, C.K. Computer vision-based deep learning for supervising excavator operations and measuring real-time earthwork productivity. J Supercomput 79, 4468–4492 (2023). https://doi.org/10.1007/s11227-022-04803-x