
Computer vision-based deep learning for supervising excavator operations and measuring real-time earthwork productivity

The Journal of Supercomputing

Abstract

Excavator productivity significantly influences overall progress and is therefore a major concern of construction project managers, particularly during initial-stage earthwork activities. The ability to track excavator performance autonomously in real time is essential to monitoring and managing earthwork effectively. This study introduces a novel, vision-based framework for autonomous excavator action recognition and productivity measurement that integrates the You Only Watch Once (YOWO) deep learning method to recognize excavator actions, which are then used to calculate action times and the average cycle time. The average cycle time is in turn used to calculate the productivity of the excavation work. The excavator action images used to build the model were extracted from video recorded at a construction site. The algorithm recognized excavator actions with an F1 score of 87.6% and an mAP of 81.6%. These outputs were then fed into the proposed framework to measure excavator productivity. Omitting outliers/misclassifications (< 95% confidence) before calculating average action times yielded a cycle-time accuracy of 99.7%, significantly higher than the 81.59% achieved using the unadjusted mean value. The successful implementation of the proposed framework demonstrates the feasibility of autonomous productivity measurement in construction as a more economical, faster, real-time alternative for measuring excavator productivity.
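To illustrate the downstream computation the abstract describes, the following minimal sketch (not the authors' implementation, which is not published here) shows one way to turn per-frame action detections from a recognizer such as YOWO into an average cycle time and an hourly productivity figure. The 95% confidence cutoff follows the abstract; the Detection structure, the heuristic that a new cycle begins each time the digging action reappears, the frame rate, the bucket volume, and the job-efficiency factor are illustrative assumptions.

```python
# Minimal sketch, assuming per-frame action labels with confidence scores
# are available from an action-recognition model (e.g., YOWO).
from dataclasses import dataclass
from typing import List


@dataclass
class Detection:
    frame: int          # frame index in the video
    action: str         # recognized action label, e.g., "dig", "swing", "dump"
    confidence: float   # model confidence for this detection


def average_cycle_time(dets: List[Detection],
                       fps: float,
                       cycle_start: str = "dig",
                       min_conf: float = 0.95) -> float:
    """Average time (seconds) between successive cycle-start actions,
    ignoring low-confidence detections (treated as outliers/misclassifications)."""
    # Keep only confident detections, in temporal order.
    kept = sorted((d for d in dets if d.confidence >= min_conf),
                  key=lambda d: d.frame)
    # A new cycle is assumed to begin each time the cycle-start action
    # reappears after a different action.
    starts = []
    prev_action = None
    for d in kept:
        if d.action == cycle_start and prev_action != cycle_start:
            starts.append(d.frame)
        prev_action = d.action
    if len(starts) < 2:
        raise ValueError("Need at least two complete cycles")
    gaps = [(b - a) / fps for a, b in zip(starts, starts[1:])]
    return sum(gaps) / len(gaps)


def productivity_m3_per_hour(cycle_time_s: float,
                             bucket_volume_m3: float,
                             efficiency: float = 0.83) -> float:
    """Standard estimate: bucket volume x cycles per hour x job efficiency."""
    return bucket_volume_m3 * (3600.0 / cycle_time_s) * efficiency


# Illustrative usage (hypothetical numbers): 24 fps video, 0.8 m^3 bucket.
# cycle = average_cycle_time(detections, fps=24.0)
# print(productivity_m3_per_hour(cycle, bucket_volume_m3=0.8))
```

The design choice mirrors the workflow in the abstract: filter out low-confidence detections first, average the remaining cycle durations, and only then convert cycle time into an hourly production rate.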





Author information

Corresponding author

Correspondence to Minh-Tu Cao.

Ethics declarations

Conflict of interest

All authors confirm that there are no known conflicts of interest associated with this publication and that there has been no significant financial support for this work that could have influenced its outcome.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Cheng, MY., Cao, MT. & Nuralim, C.K. Computer vision-based deep learning for supervising excavator operations and measuring real-time earthwork productivity. J Supercomput 79, 4468–4492 (2023). https://doi.org/10.1007/s11227-022-04803-x

