Deep Learning for Weed Detection and Segmentation in Agricultural Crops Using Images Captured by an Unmanned Aerial Vehicle
"> Figure 1
<p>The study area is located in the central-western region of Brazil (<b>a</b>), in the state of Goiás (<b>b</b>), at the Instituto Federal Goiano—Campus Ceres, in the Ceres municipality (<b>c</b>). The experimental area was planted with common bean (<span class="html-italic">Phaseolus vulgaris</span>) and soybean (<span class="html-italic">Glycine max</span>) (<b>d</b>), and a flight plan was used to cover the entire experimental area (<b>e</b>).</p> "> Figure 2
<p>Training process flow for weed segmentation. Steps: (1) images labeled and resized using the Roboflow tool; (2) images augmented and trained on U-Net, YOLO, and Mask R-CNN (framework Detectron2) segmentation models; (3) validation metrics used in the trained models; (4) field testing the model with greater efficiency.</p> "> Figure 3
<p>Isolation of objects from the image set.</p> "> Figure 4
<p>Resizing of objects from the image set to 640 × 640 pixels and 512 × 512 pixels.</p> "> Figure 5
<p>Implementation of image set augmentation.</p> "> Figure 6
<p>RGB images (512 × 512) labeled for mask creation from annotated and resized RGB images (<b>a</b>) and final mask created (<b>b</b>). The white color represents the weeds, and the black color represents the absence of weeds in the image.</p> "> Figure 7
<p>Network structure of the YOLOv8 model.</p> "> Figure 8
<p>Detectron2 architecture of the Mask R-CNN model.</p> "> Figure 9
<p>Architecture of the U-Net model.</p> "> Figure 10
<p>Comparison of predicted and ground truth segmentation masks between instance segmentation algorithms and their variants: (<b>a</b>) YOLOv8s, YOLOv7, and YOLOv5s; (<b>b</b>) Mask R-CNN (Detectron2) and its different backbones.</p> "> Figure 11
<p>Training results for the instance segmentation models: (<b>a</b>) YOLOv8s, YOLOv7, and YOLOv5s in 500 epochs; (<b>b</b>) Mask R-CNN (Detectron2) and its variants in 20,000 iterations.</p> ">
Abstract
1. Introduction
2. Materials and Methods
2.1. Field Experiment
2.2. Weed Processing and Segmentation Steps
2.3. Acquisition of RGB Images
2.3.1. Dataset Annotation
2.3.2. Dataset Resizing
2.3.3. Application of the Data Augmentation Technique
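The resizing (Section 2.3.2) and augmentation (Section 2.3.3) steps can be reproduced outside Roboflow. Below is a minimal OpenCV sketch; the folder paths and the particular flip/rotation transforms are illustrative assumptions rather than the exact operations applied in the study.

```python
import glob
import os

import cv2

SRC_DIR = "dataset/raw"           # hypothetical input folder of UAV images
DST_DIR = "dataset/preprocessed"  # hypothetical output folder
TARGET = (640, 640)               # 512 x 512 was used for the U-Net masks

os.makedirs(DST_DIR, exist_ok=True)

for path in glob.glob(os.path.join(SRC_DIR, "*.png")):
    img = cv2.imread(path)
    if img is None:
        continue
    name = os.path.splitext(os.path.basename(path))[0]

    # "Stretch" resize: aspect ratio is ignored, as in the Roboflow preset.
    resized = cv2.resize(img, TARGET, interpolation=cv2.INTER_LINEAR)
    cv2.imwrite(os.path.join(DST_DIR, f"{name}.png"), resized)

    # Simple geometric augmentations; the exact transform set applied
    # in the study is not reproduced here.
    cv2.imwrite(os.path.join(DST_DIR, f"{name}_flip.png"), cv2.flip(resized, 1))
    cv2.imwrite(os.path.join(DST_DIR, f"{name}_rot90.png"),
                cv2.rotate(resized, cv2.ROTATE_90_CLOCKWISE))
```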
2.3.4. Generation of Dataset Masks
Algorithm 1: Set Pixel to White in Images (Generate Mask)
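Algorithm 1 rasterizes the annotations into binary masks in which weed pixels are set to white (255) and all other pixels remain black (0). A minimal sketch, assuming the weed labels are available as polygon vertex lists (the `polygons` example below is hypothetical):

```python
import cv2
import numpy as np

def generate_mask(image_shape, weed_polygons):
    """Rasterize weed polygons into a binary mask: white = weed, black = background."""
    mask = np.zeros(image_shape[:2], dtype=np.uint8)
    for poly in weed_polygons:
        pts = np.asarray(poly, dtype=np.int32).reshape(-1, 1, 2)
        cv2.fillPoly(mask, [pts], color=255)  # set annotated weed pixels to white
    return mask

# Hypothetical usage: one polygon exported from the annotation tool
polygons = [[(120, 80), (160, 85), (150, 140), (110, 130)]]
mask = generate_mask((512, 512, 3), polygons)
cv2.imwrite("mask.png", mask)
```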
2.4. Models
2.4.1. You Only Look Once (YOLO)
2.4.2. Detectron2
2.4.3. U-Net
2.5. Models and Parameters
2.6. Validation Metrics
Model Performance Calculations
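A minimal sketch of the core calculations behind the reported metrics (precision, recall, F1-score, and the Jaccard/IoU overlap underlying the mAP thresholds). mAP itself additionally averages precision over recall levels and, for mAP0.5:0.95, over IoU thresholds, and in practice is read from each framework's evaluator.

```python
import numpy as np

def precision_recall_f1(tp, fp, fn):
    """Detection metrics from true positive, false positive, and false negative counts."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1

def iou(pred_mask, gt_mask):
    """Intersection over Union (Jaccard index) between two binary masks."""
    pred, gt = pred_mask.astype(bool), gt_mask.astype(bool)
    union = np.logical_or(pred, gt).sum()
    return np.logical_and(pred, gt).sum() / union if union else 0.0
```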
3. Results
3.1. Performance of Different Backbones and Model Variants
3.2. Performance of Different Training Epochs and Iterations
4. Discussion
4.1. Performance of Different Backbones and Model Variants
4.2. Performance of Different Training Epochs
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
Crops | Month | Day | Year | Principal Growth Stage | BBCH Identification Key |
---|---|---|---|---|---|
Bean (Phaseolus vulgaris)/Soybean (Glycine max) | 11 | 19 | 2022 | 1: Leaf development | 10 |
Bean (Phaseolus vulgaris)/Soybean (Glycine max) | 11 | 23 | 2022 | 1: Leaf development | 13 |
Bean (Phaseolus vulgaris)/Soybean (Glycine max) | 11 | 25 | 2022 | 1: Leaf development | 19 |
Bean (Phaseolus vulgaris)/Soybean (Glycine max) | 12 | 12 | 2022 | 2: Formation of side shoots | 21 |
Bean (Phaseolus vulgaris)/Soybean (Glycine max) | 12 | 14 | 2022 | 2: Formation of side shoots | 23 |
Bean (Phaseolus vulgaris)/Soybean (Glycine max) | 12 | 19 | 2022 | 2: Formation of side shoots | 28 |
Bean (Phaseolus vulgaris)/Soybean (Glycine max) | 12 | 20 | 2022 | 2: Formation of side shoots | 28 |
Bean (Phaseolus vulgaris)/Soybean (Glycine max) | 12 | 28 | 2022 | 5: Inflorescence emergence | 51 |
Bean (Phaseolus vulgaris)/Soybean (Glycine max) | 1 | 4 | 2023 | 5: Inflorescence emergence | 57 |
Bean (Phaseolus vulgaris)/Soybean (Glycine max) | 1 | 11 | 2023 | 6: Flowering | 64 |
Bean (Phaseolus vulgaris)/Soybean (Glycine max) | 1 | 18 | 2023 | 7: Development of fruit | 73 |
Bean (Phaseolus vulgaris)/Soybean (Glycine max) | 1 | 24 | 2023 | 7: Development of fruit | 78 |
Bean (Phaseolus vulgaris)/Soybean (Glycine max) | 2 | 1 | 2023 | 8: Ripening of fruit and seed | 81 |
Bean (Phaseolus vulgaris)/Soybean (Glycine max) | 2 | 6 | 2023 | 8: Ripening of fruit and seed | 85 |
Bean (Phaseolus vulgaris)/Soybean (Glycine max) | 2 | 15 | 2023 | 8: Ripening of fruit and seed | 88 |
Characteristic | Bean/Soybean Dataset |
---|---|
Number of images | 793 |
Number of instances | 16,113 |
Average weeds per image | 8.92
Total number of weeds | 7074 |
Number of bean plants | 4532 |
Number of soybean plants | 4507 |
Number of Images Collected by UAV | Number of Images After Preprocessing | Original Resolution (Pixels) | Resize (Stretch) | Number of Classes | Number of Images After Augmentation | Training | Validation | Testing |
---|---|---|---|---|---|---|---|---|
793 | 1886 | 4000 × 3000 | 640 × 640 / 512 × 512 | 3 | 3021 | 2270 | 370 | 381
Name | Parameters/Version |
---|---|
Operating System | Windows 10
CPU | AMD Ryzen 7 6800H
GPU | NVIDIA Tesla T4
RAM | 16 GB (8 GB × 2)
Python | v3.12
PyTorch | v2.1
OpenCV | v4.9.0
CUDA | v12.2
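Before training, the environment in the table above can be verified with a short check that the reported library versions and the GPU are actually visible to PyTorch:

```python
import cv2
import torch

# Sanity check against the reported environment (PyTorch 2.1, OpenCV 4.9.0, CUDA 12.2)
print("PyTorch:", torch.__version__)
print("OpenCV:", cv2.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
```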
Parameters | Values |
---|---|
Optimizer | SGD (stochastic gradient descent)
Learning rate | 1 × 10⁻²
Optimizer momentum | 0.937
Base weight decay | 0.0005
Batch size per image | 16
Annotation format | PyTorch TXT
Pretrained on | MS COCO dataset
Image format | PNG
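With the Ultralytics API, a YOLOv8s segmentation run using these parameters could look like the sketch below; `weeds.yaml` is a hypothetical dataset file, and YOLOv7, which lives in a separate repository, is not covered by this sketch.

```python
from ultralytics import YOLO

# Segmentation weights pretrained on MS COCO, as listed in the table
model = YOLO("yolov8s-seg.pt")
model.train(
    data="weeds.yaml",     # assumed dataset YAML: image paths + 3 classes
    epochs=500,
    imgsz=640,
    batch=16,
    optimizer="SGD",
    lr0=0.01,              # learning rate 1e-2
    momentum=0.937,
    weight_decay=0.0005,
)
```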
Parameters | Values |
---|---|
Architectures | mask_rcnn_R_101_DC5_3x, mask_rcnn_R_101_FPN_3x, mask_rcnn_X_101_32x8d_FPN_3x
Max iterations | 20,000
Evaluation period | 200
Learning rate | 0.001
Number of classes (classes + 1) | 4
Batch size per image | 64
Annotation format | COCO
Image format | PNG
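A corresponding Detectron2 configuration sketch for one of the listed backbones; the COCO-format dataset registration under the names `weeds_train` and `weeds_val` is an assumption and must be done beforehand.

```python
from detectron2 import model_zoo
from detectron2.config import get_cfg
from detectron2.engine import DefaultTrainer

cfg = get_cfg()
cfg.merge_from_file(model_zoo.get_config_file(
    "COCO-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_3x.yaml"))
cfg.MODEL.WEIGHTS = model_zoo.get_checkpoint_url(
    "COCO-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_3x.yaml")
cfg.DATASETS.TRAIN = ("weeds_train",)  # assumed registered COCO-format datasets
cfg.DATASETS.TEST = ("weeds_val",)
cfg.SOLVER.BASE_LR = 0.001
cfg.SOLVER.MAX_ITER = 20000
cfg.TEST.EVAL_PERIOD = 200
cfg.MODEL.ROI_HEADS.BATCH_SIZE_PER_IMAGE = 64
cfg.MODEL.ROI_HEADS.NUM_CLASSES = 4  # follows the table's "classes + 1" convention

trainer = DefaultTrainer(cfg)
trainer.resume_or_load(resume=False)
trainer.train()
```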
Parameters | Values |
---|---|
Architecture | ResNet50 |
Optimizer | SGD
Learning rate | 1 × 10⁻²
Batch size per image | 4
Number of classes | 3
Image format | TIFF
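One common way to pair a U-Net decoder with a ResNet50 encoder in PyTorch is the `segmentation_models_pytorch` package; the paper does not state which implementation was used, so the sketch below only illustrates the tabulated configuration under that assumption.

```python
import torch
import segmentation_models_pytorch as smp  # assumed library; not confirmed by the paper

# U-Net with a ResNet50 encoder, matching the architecture row in the table
model = smp.Unet(
    encoder_name="resnet50",
    encoder_weights="imagenet",
    in_channels=3,  # RGB input
    classes=3,      # number of classes, as in the table
)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
loss_fn = torch.nn.CrossEntropyLoss()

# One illustrative training step on a batch of four 512 x 512 RGB tiles
images = torch.randn(4, 3, 512, 512)
targets = torch.randint(0, 3, (4, 512, 512))  # per-pixel class labels
optimizer.zero_grad()
loss = loss_fn(model(images), targets)
loss.backward()
optimizer.step()
```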
Model | Precision | Recall | mAP0.5 (M) | mAP0.5:0.95 (M) |
---|---|---|---|---|
YOLOv8n | 0.991 | 0.989 | 0.959 | 0.956 |
YOLOv8s | 0.997 | 0.990 | 0.970 | 0.968 |
YOLOv8m | 0.994 | 0.991 | 0.974 | 0.972 |
YOLOv8l | 0.993 | 0.990 | 0.974 | 0.972 |
Model | Backbone | Precision | Recall | mAP0.5 (M) | mAP0.5 (B) | mAP0.5:0.95 (M) | mAP0.5:0.95 (B) |
---|---|---|---|---|---|---|---|
YOLOv8s | CSPDarkNet53 | 0.997 | 0.990 | 0.970 | 0.970 | 0.968 | 0.965 |
YOLOv7 | CSPDarkNet53 | 0.983 | 0.981 | 0.954 | 0.954 | 0.946 | 0.944 |
YOLOv5s | CSPDarkNet53 | 0.968 | 0.954 | 0.945 | 0.929 | 0.934 | 0.904 |
Model | Backbone | Acc_weed | mAP_weed | mAP Score | F1-Score |
---|---|---|---|---|---|
YOLOv8s | CSPDarkNet53 | 0.980 | 0.987 | 0.985 | 0.964 |
YOLOv7 | CSPDarkNet53 | 0.990 | 0.990 | 0.988 | 0.951 |
YOLOv5s | CSPDarkNet53 | 0.980 | 0.987 | 0.991 | 0.960 |
Number of Epochs | YOLOv8s Acc | YOLOv8s mAP0.5 (M) | YOLOv8s F1-Score | YOLOv7 Acc | YOLOv7 mAP0.5 (M) | YOLOv7 F1-Score | YOLOv5s Acc | YOLOv5s mAP0.5 (M) | YOLOv5s F1-Score |
---|---|---|---|---|---|---|---|---|---|
100 | 0.943 | 0.933 | 0.956 | 0.930 | 0.890 | 0.950 | 0.877 | 0.816 | 0.910 |
300 | 0.950 | 0.956 | 0.961 | 0.937 | 0.933 | 0.953 | 0.940 | 0.904 | 0.965 |
500 | 0.957 | 0.970 | 0.964 | 0.950 | 0.954 | 0.951 | 0.943 | 0.930 | 0.960 |
700 | 0.940 | 0.973 | 0.962 | 0.952 | 0.984 | 0.982 | 0.930 | 0.945 | 0.952 |
Number of Epochs | YOLOv8s Acc_weed | YOLOv8s Precision | YOLOv7 Acc_weed | YOLOv7 Precision | YOLOv5s Acc_weed | YOLOv5s Precision |
---|---|---|---|---|---|---|
100 | 1.000 | 0.990 | 0.990 | 0.992 | 0.980 | 0.958 |
300 | 0.970 | 0.998 | 0.990 | 0.994 | 0.980 | 0.981 |
500 | 0.980 | 0.997 | 0.990 | 0.983 | 0.980 | 0.968 |
700 | 0.990 | 0.995 | 0.989 | 0.982 | 0.990 | 0.970 |
Number of Iterations | R101-FPN Acc | R101-FPN Box AP_weed | R101-FPN Mask AP_weed | X101-FPN Acc | X101-FPN Box AP_weed | X101-FPN Mask AP_weed | R101-DC5 Acc | R101-DC5 Box AP_weed | R101-DC5 Mask AP_weed |
---|---|---|---|---|---|---|---|---|---|
5,000 | 0.986 | 0.993 | 0.890 | 0.987 | 0.992 | 0.897 | 0.989 | 0.986 | 0.892
20,000 | 0.988 | 0.977 | 0.888 | 0.991 | 0.995 | 0.914 | 0.989 | 0.995 | 0.906
Model | Backbone | Max Iterations/Epochs | AP (IoU = 0.50:0.95) | AP50 | AP75 | APl |
---|---|---|---|---|---|---|
Mask R-CNN | R101-FPN | 20,000 | 0.911 | 0.944 | 0.941 | 0.911
Mask R-CNN | X101-FPN | 20,000 | 0.924 | 0.953 | 0.949 | 0.924
Mask R-CNN | R101-DC5 | 20,000 | 0.933 | 0.965 | 0.960 | 0.932
Model | Number of Epochs | Train Acc | Train Loss | Valid Acc | Valid Loss |
---|---|---|---|---|---|
U-Net | 30 | 0.967 | 0.047 | 0.958 | 0.051 |
U-Net | 50 | 0.968 | 0.040 | 0.958 | 0.047 |
U-Net | 80 | 0.968 | 0.042 | 0.960 | 0.049 |
U-Net | 100 | 0.971 | 0.024 | 0.961 | 0.033 |
Model | Backbone | Max Iterations/Epochs | Acc | AP0.5:0.95/mAP0.5:0.95 | AP0.5/mAP0.5 | AR/Recall |
---|---|---|---|---|---|---|
Mask R-CNN | R101-FPN | 20,000 | 0.980 | 0.911 | 0.944 | 0.948
Mask R-CNN | X101-FPN | 20,000 | 0.989 | 0.924 | 0.953 | 0.924
Mask R-CNN | R101-DC5 | 20,000 | 0.991 | 0.933 | 0.965 | 0.964
YOLOv8s | CSPDarkNet53 | 500 | 0.957 | 0.965 | 0.970 | 0.990
YOLOv7 | CSPDarkNet53 | 500 | 0.950 | 0.944 | 0.954 | 0.981
YOLOv5s | CSPDarkNet53 | 500 | 0.943 | 0.904 | 0.945 | 0.954
U-Net | ResNet50 | 100 | 0.971 | - | - | -
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Silva, J.A.O.S.; Siqueira, V.S.d.; Mesquita, M.; Vale, L.S.R.; Marques, T.d.N.B.; Silva, J.L.B.d.; Silva, M.V.d.; Lacerda, L.N.; Oliveira-Júnior, J.F.d.; Lima, J.L.M.P.d.; et al. Deep Learning for Weed Detection and Segmentation in Agricultural Crops Using Images Captured by an Unmanned Aerial Vehicle. Remote Sens. 2024, 16, 4394. https://doi.org/10.3390/rs16234394