A Lightweight Model for Wheat Ear Fusarium Head Blight Detection Based on RGB Images
"> Figure 1
<p>Comparison between YOLOv4 and other state-of-the-art object detectors. (* AP refers to average accuracy. FPS refers to the number of frames processed per second. YOLOv4 is more faster than EfficientDet with the same AP value and improves YOLOv3’s AP and FPS by 10% and 12%, respectively.)</p> "> Figure 2
<p>Wheat images infected with FHB.</p> "> Figure 3
<p>The annotation results of diseased wheat ear image. (<b>a</b>) The original image. (<b>b</b>) PNG gray image.</p> "> Figure 4
<p>YOLO detection (YOLO algorithm models detection as a regression problem. It divides the image into an S × S grid and for each grid cell predicts bounding boxes, confidence for those boxes and class probabilities).</p> "> Figure 5
<p>(<b>a</b>) Standard convolution process and (<b>b</b>) Depth-wise separable convolution process.</p> "> Figure 6
<p>The proposed lightweight network structure based on the YOLOv4.</p> "> Figure 7
<p>The structure of bneck.</p> "> Figure 8
<p>The structure of PANet. (<b>a</b>) FPN backbone, (<b>b</b>) Bottom-up path augmentation, (<b>c</b>) Adaptive feature pooling, (<b>d</b>) Box branch, and (<b>e</b>) Fully-connected fusion.</p> "> Figure 9
<p>Examples of wheat disease images after image enhancement. (<b>a</b>) Original image, (<b>b</b>) horizontal flipping and cropping, (<b>c</b>) cropping and minor Gaussian blurring, and (<b>d</b>) cropping and contrast and brightness change.</p> "> Figure 10
<p>The loss values of YOLOv4 and the improved models.</p> "> Figure 11
<p>Detailed loss value between epoch 10 and 40.</p> "> Figure 12
<p>PR curves of YOLOv4 and improved models.</p> "> Figure 13
<p>Comparison between manual annotation and the proposed model. (<b>a</b>) Manual annotation and (<b>b</b>) detection results of the proposed model.</p> "> Figure 14
<p>Comparisons of FHB detection with different algorithms. (<b>a</b>) The original images. (<b>b</b>) YOLOv3. (<b>c</b>) YOLOv4. (<b>d</b>) YOLOv4-tiny. (<b>e</b>) YOLOv4 with ResNet50. (<b>f</b>) YOLOv4 with VGG. (<b>g</b>) Ours.</p> "> Figure 14 Cont.
<p>Comparisons of FHB detection with different algorithms. (<b>a</b>) The original images. (<b>b</b>) YOLOv3. (<b>c</b>) YOLOv4. (<b>d</b>) YOLOv4-tiny. (<b>e</b>) YOLOv4 with ResNet50. (<b>f</b>) YOLOv4 with VGG. (<b>g</b>) Ours.</p> "> Figure 15
<p>System architecture.</p> ">
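The parameter saving that motivates the depth-wise separable convolution in Figure 5 can be sketched numerically. This is an illustrative calculation, not taken from the paper; the example channel sizes are assumptions chosen to resemble the backbone table.

```python
# Parameter-count comparison: standard convolution vs. depth-wise separable
# convolution (the MobileNet building block illustrated in Figure 5).

def standard_conv_params(k, c_in, c_out):
    # One k x k kernel per (input channel, output channel) pair.
    return k * k * c_in * c_out

def depthwise_separable_params(k, c_in, c_out):
    # Depth-wise step: one k x k kernel per input channel,
    # followed by a 1 x 1 point-wise convolution that mixes channels.
    return k * k * c_in + c_in * c_out

k, c_in, c_out = 3, 112, 160  # illustrative sizes
std = standard_conv_params(k, c_in, c_out)        # 161280
dws = depthwise_separable_params(k, c_in, c_out)  # 18928
print(std, dws, dws / std)  # ratio equals 1/c_out + 1/k**2, here ~0.117
```

For a 3 × 3 kernel the separable form needs roughly 1/9 of the parameters of a standard convolution, which is the main source of the model-size reductions reported later.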
Abstract
1. Introduction
2. Related Work
2.1. Crop Detection Based on Deep Learning
2.2. Lightweight Model
2.3. The YOLO Algorithm
3. Materials and Methods
3.1. Data Collection and Preprocessing
3.2. Lightweight YOLOv4-Based Network Design
3.2.1. YOLOv4 Algorithm
3.2.2. MobileNet
3.2.3. The Proposed Lightweight Model
3.3. Network Training and Evaluation
3.4. Loss Function
4. Results
4.1. Loss Function of the Model
4.2. Network Parameters and PR Curve Analysis
4.3. Comparison of Different Detection Methods
4.4. FHB Detection
5. Discussion
6. Conclusions
- (1)
- This study focused solely on FHB detection in wheat ears and did not address FHB on wheat leaves. Wheat is also susceptible to a variety of other diseases. Further research should therefore extend the approach to wheat leaf diseases and other disease types;
- (2)
- Although the proposed model successfully detected FHB, incorrect detections of small objects were unavoidable. In addition, as shown in Figure 14, background noise caused the detection performance on the second image to be less accurate than that of other models. Future research should enhance the model's detection performance for extremely small objects in complex backgrounds;
- (3)
- This research focused only on lowering the number of parameters and did not account for differences between edge platforms. Future studies should consider designing platform-specific models to further investigate the model's generalization ability.
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Chen, P.F. Estimation of Winter Wheat Grain Protein Content Based on Multisource Data Assimilation. Remote Sens. 2020, 12, 20. [Google Scholar] [CrossRef]
- Zhang, L.; Zhang, W.S.; Cui, Z.L.; Schmidhalter, U.; Chen, X.P. Environmental, human health, and ecosystem economic performance of long-term optimizing nitrogen management for wheat production. J. Clean Prod. 2021, 311, 11. [Google Scholar] [CrossRef]
- Brandfass, C.; Karlovsky, P. Upscaled CTAB-Based DNA Extraction and Real-Time PCR Assays for Fusarium culmorum and F. graminearum DNA in Plant Material with Reduced Sampling Error. Int. J. Mol. Sci. 2008, 9, 2306–2321. [Google Scholar] [CrossRef] [Green Version]
- Anderson, J.A. Marker-assisted selection for Fusarium head blight resistance in wheat. Int. J. Food Micro-Biol. 2007, 119, 51–53. [Google Scholar] [CrossRef] [PubMed]
- Xu, S.J.; Wang, Y.X.; Hu, J.Q.; Chen, X.R.; Qiu, Y.F.; Shi, J.R.; Wang, G.; Xu, J.H. Isolation and characterization of Bacillus amyloliquefaciens MQ01, a bifunctional biocontrol bacterium with antagonistic activity against Fusarium graminearum and biodegradation capacity of zearalenone. Food Control. 2021, 130, 10. [Google Scholar] [CrossRef]
- Ma, H.Q.; Huang, W.J.; Dong, Y.Y.; Liu, L.Y.; Guo, A.T. Using UAV-Based Hyperspectral Imagery to Detect Winter Wheat Fusarium Head Blight. Remote Sens. 2021, 13, 16. [Google Scholar] [CrossRef]
- Wegulo, S.N. Factors Influencing Deoxynivalenol Accumulation in Small Grain Cereals. Toxins 2012, 4, 1157–1180. [Google Scholar] [CrossRef]
- Barbedo, J.G.A.; Tibola, C.S.; Lima, M.I.P. Deoxynivalenol screening in wheat kernels using hyperspectral imaging. Biosyst. Eng. 2017, 155, 24–32. [Google Scholar] [CrossRef]
- Su, J.Y.; Liu, C.J.; Hu, X.P.; Xu, X.M.; Guo, L.; Chen, W.H. Spatio-temporal monitoring of wheat yellow rust using UAV multispectral imagery. Comput. Electron. Agric. 2019, 167, 10. [Google Scholar] [CrossRef]
- Liu, L.Y.; Dong, Y.Y.; Huang, W.J.; Du, X.P.; Ma, H.Q. Monitoring Wheat Fusarium Head Blight Using Unmanned Aerial Vehicle Hyperspectral Imagery. Remote Sens. 2020, 12, 19. [Google Scholar] [CrossRef]
- Reder, S.; Mund, J.P.; Albert, N.; Wassermann, L.; Miranda, L. Detection of Windthrown Tree Stems on UAV-Orthomosaics Using U-Net Convolutional Networks. Remote Sens. 2022, 14, 25. [Google Scholar] [CrossRef]
- Cui, Z.Y.; Li, Q.; Cao, Z.J.; Liu, N.Y. Dense Attention Pyramid Networks for Multi-Scale Ship Detection in SAR Images. IEEE Trans. Geosci. Remote Sens. 2019, 57, 8983–8997. [Google Scholar] [CrossRef]
- Wu, F.Y.; Duan, J.L.; Ai, P.Y.; Chen, Z.Y.; Yang, Z.; Zou, X.J. Rachis detection and three-dimensional localization of cut off point for vision-based banana robot. Comput. Electron. Agric. 2022, 198, 12. [Google Scholar] [CrossRef]
- Gu, C.Y.; Wang, D.Y.; Zhang, H.H.; Zhang, J.; Zhang, D.Y.; Liang, D. Fusion of Deep Convolution and Shallow Features to Recognize the Severity of Wheat Fusarium Head Blight. Front. Plant Sci. 2021, 11, 14. [Google Scholar] [CrossRef] [PubMed]
- Su, W.H.; Zhang, J.J.; Yang, C.; Page, R.; Szinyei, T.; Hirsch, C.D.; Steffenson, B.J. Automatic Evaluation of Wheat Resistance to Fusarium Head Blight Using Dual Mask-RCNN Deep Learning Frameworks in Computer Vision. Remote Sens. 2021, 13, 20. [Google Scholar] [CrossRef]
- Yu, D.B.; Xiao, J.; Wang, Y. Efficient Lightweight Surface Reconstruction Method from Rock-Mass Point Clouds. Remote Sens. 2022, 14, 22. [Google Scholar] [CrossRef]
- Bian, Z.; Vong, C.M.; Wong, P.K.; Wang, S. Fuzzy KNN Method With Adaptive Nearest Neighbors. IEEE Transact. Cybernet. 2020, 52, 5380–5393. [Google Scholar] [CrossRef] [PubMed]
- Cao, L.J.; Keerthi, S.S.; Ong, C.J.; Uvaraj, P.; Fu, X.J.; Lee, H.P. Developing parallel sequential minimal optimization for fast training support vector machine. Neurocomputing 2006, 70, 93–104. [Google Scholar] [CrossRef]
- Zou, B.; Li, L.Q.; Xu, Z.B.; Luo, T.; Tang, Y.Y. Generalization Performance of Fisher Linear Discriminant Based on Markov Sampling. IEEE Trans. Neural Netw. Learn. Syst. 2013, 24, 288–300. [Google Scholar]
- Luo, C.W.; Wang, Z.F.; Wang, S.B.; Zhang, J.Y.; Yu, J. Locating Facial Landmarks Using Probabilistic Random Forest. IEEE Signal Process. Lett. 2015, 22, 2324–2328. [Google Scholar] [CrossRef]
- Hernandez, S.; Lopez, J.L. Uncertainty quantification for plant disease detection using Bayesian deep learning. Appl. Soft. Comput. 2020, 96, 9. [Google Scholar] [CrossRef]
- Picon, A.; Seitz, M.; Alvarez-Gila, A.; Mohnke, P.; Ortiz-Barredo, A.; Echazarra, J. Crop conditional Convolutional Neural Networks for massive multi-crop plant disease classification over cell phone acquired images taken on real field conditions. Comput. Electron. Agric. 2019, 167, 10. [Google Scholar] [CrossRef]
- Espejo-Garcia, B.; Mylonas, N.; Athanasakos, L.; Fountas, S.; Vasilakoglou, I. Towards weeds identification assistance through transfer learning. Comput. Electron. Agric. 2020, 171, 10. [Google Scholar] [CrossRef]
- Chen, J.D.; Chen, J.X.; Zhang, D.F.; Sun, Y.D.; Nanehkaran, Y.A. Using deep transfer learning for image-based plant disease identification. Comput. Electron. Agric. 2020, 173, 11. [Google Scholar] [CrossRef]
- Xiao, Y.X.; Dong, Y.Y.; Huang, W.J.; Liu, L.Y.; Ma, H.Q. Wheat Fusarium Head Blight Detection Using UAV-Based Spectral and Texture Features in Optimal Window Size. Remote Sens. 2021, 13, 19. [Google Scholar] [CrossRef]
- Li, Y.S.; Xie, W.Y.; Li, H.Q. Hyperspectral image reconstruction by deep convolutional neural network for classification. Pattern Recognit. 2017, 63, 371–383. [Google Scholar] [CrossRef]
- Jin, X.; Jie, L.; Wang, S.; Qi, H.J.; Li, S.W. Classifying Wheat Hyperspectral Pixels of Healthy Heads and Fusarium Head Blight Disease Using a Deep Neural Network in the Wild Field. Remote Sens. 2018, 10, 20. [Google Scholar] [CrossRef] [Green Version]
- Singh, P.; Verma, V.K.; Rai, P.; Namboodiri, V.P. Acceleration of Deep Convolutional Neural Networks Using Adaptive Filter Pruning. IEEE J. Sel. Top. Signal Process. 2020, 14, 838–847. [Google Scholar] [CrossRef]
- Wu, J.F.; Hua, Y.Z.; Yang, S.Y.; Qin, H.S.; Qin, H.B. Speech Enhancement Using Generative Adversarial Network by Dis-tilling Knowledge from Statistical Method. Appl. Sci. 2019, 9, 8. [Google Scholar]
- Peng, P.; You, M.Y.; Xu, W.S.; Li, J.X. Fully integer-based quantization for mobile convolutional neural network inference. Neurocomputing 2021, 432, 194–205. [Google Scholar] [CrossRef]
- Howard, A.G.; Zhu, M.; Chen, B.; Kalenichenko, D.; Wang, W.; Weyand, T.; Andreetto, M.; Adam, H. Mobilenets: Efficient convolutional neural networks for mobile vision applications. arXiv 2017, arXiv:1704.04861. [Google Scholar]
- Zhang, X.; Zhou, X.Y.; Lin, M.X.; Sun, R. ShuffleNet: An Extremely Efficient Convolutional Neural Network for Mobile Devices. In Proceedings of the 31st IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Salt Lake City, UT, USA, 18–23 June 2018; pp. 6848–6856. [Google Scholar]
- Ale, L.; Sheta, A.; Li, L.Z.; Wang, Y.; Zhang, N. Deep Learning based Plant Disease Detection for Smart Agriculture. In Proceedings of the IEEE Global Communications Conference (IEEE GLOBECOM), Waikoloa, HI, USA, 9–13 December 2019. [Google Scholar]
- Wang, C.; Zhou, J.; Wu, H.; Teng, G.; Zhao, C.; Li, J. Identification of vegetable leaf diseases based on improved multi-scale ResNet. Trans. Chin. Soc. Agricult. Eng. 2020, 36, 209–217. [Google Scholar]
- Toda, Y.; Okura, F. How Convolutional Neural Networks Diagnose Plant Disease. Plant Phenomics 2019, 2019, 14. [Google Scholar] [CrossRef] [PubMed]
- Tang, Z.; Yang, J.; Li, Z.; Qi, F. Grape disease image classification based on lightweight convolution neural networks and channelwise attention. Comput. Electron. Agric. 2020, 178, 105735. [Google Scholar] [CrossRef]
- Zhao, S.Y.; Peng, Y.; Liu, J.Z.; Wu, S. Tomato Leaf Disease Diagnosis Based on Improved Convolution Neural Network by Attention Module. Agriculture-Basel. 2021, 11, 15. [Google Scholar] [CrossRef]
- Chen, J.D.; Zhang, D.F.; Suzauddola, M.; Zeb, A. Identifying crop diseases using attention embedded MobileNet-V2 model. Appl. Soft. Comput. 2021, 113, 12. [Google Scholar] [CrossRef]
- Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You Only Look Once: Unified, Real-Time Object Detection. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 779–788. [Google Scholar]
- Redmon, J.; Farhadi, A.; IEEE. YOLO9000: Better, Faster, Stronger. In Proceedings of the 30th IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; pp. 6517–6525. [Google Scholar]
- Redmon, J.; Farhadi, A. Yolov3: An incremental improvement. arXiv 2018, arXiv:1804.02767. [Google Scholar]
- Zhu, L.Z.; Zhang, S.N.; Chen, K.Y.; Chen, S.; Wang, X.; Wei, D.X.; Zhao, H.C. Low-SNR Recognition of UAV-to-Ground Targets Based on Micro-Doppler Signatures Using Deep Convolutional Denoising Encoders and Deep Residual Learning. IEEE Trans. Geosci. Remote Sens. 2022, 60, 13. [Google Scholar] [CrossRef]
- Bochkovskiy, A.; Wang, C.-Y.; Liao, H.-Y.M. Yolov4: Optimal speed and accuracy of object detection. arXiv 2020, arXiv:2004.10934. [Google Scholar]
- Tian, Y.N.; Yang, G.D.; Wang, Z.; Wang, H.; Li, E.; Liang, Z.Z. Apple detection during different growth stages in orchards using the improved YOLO-V3 model. Comput. Electron. Agric. 2019, 157, 417–426. [Google Scholar] [CrossRef]
- Roy, A.M.; Bhaduri, J. Real-time growth stage detection model for high degree of occultation using DenseNet-fused YOLOv. Comput. Electron. Agric. 2022, 193, 14. [Google Scholar]
- Li, G.; Suo, R.; Zhao, G.A.; Gao, C.Q.; Fu, L.S.; Shi, F.X.; Dhupia, J.; Li, R.; Cui, Y.J. Real-time detection of kiwifruit flower and bud simultaneously in orchard using YOLOv4 for robotic pollination. Comput. Electron. Agric. 2022, 193, 8. [Google Scholar] [CrossRef]
- He, K.M.; Zhang, X.Y.; Ren, S.Q.; Sun, J. Spatial Pyramid Pooling in Deep Convolutional Networks for Visual Recognition. In Proceedings of the 13th European Conference on Computer Vision (ECCV), Zurich, Switzerland, 6–12 September 2014; pp. 346–361. [Google Scholar]
- Liu, S.; Qi, L.; Qin, H.F.; Shi, J.P.; Jia, J.Y. Path Aggregation Network for Instance Segmentation. In Proceedings of the 31st IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Salt Lake City, UT, USA, 18–23 June 2018; pp. 8759–8768. [Google Scholar]
- Wang, C.Y.; Liao, H.Y.M.; Wu, Y.H.; Chen, P.Y.; Hsieh, J.W.; Yeh, I.H. CSPNet: A New Backbone that can Enhance Learning Capability of CNN. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), virtual, 14–19 June 2020; pp. 1571–1580. [Google Scholar]
- Hayit, T.; Erbay, H.; Varcin, F.; Hayit, F.; Akci, N. Determination of the severity level of yellow rust disease in wheat by using convolutional neural networks. J. Plant Pathol. 2021, 103, 923–934. [Google Scholar] [CrossRef]
- Huang, L.S.; Wu, K.; Huang, W.J.; Dong, Y.Y.; Ma, H.Q.; Liu, Y.; Liu, L.Y. Detection of Fusarium Head Blight in Wheat Ears Using Continuous Wavelet Analysis and PSO-SVM. Agriculture 2021, 11, 13. [Google Scholar] [CrossRef]
- Jiang, Z.C.; Dong, Z.X.; Jiang, W.P.; Yang, Y.Z. Recognition of rice leaf diseases and wheat leaf diseases based on multi-task deep transfer learning. Comput. Electron. Agric. 2021, 186, 9. [Google Scholar] [CrossRef]
- Ramcharan, A.; McCloskey, P.; Baranowski, K.; Mbilinyi, N.; Mrisho, L.; Ndalahwa, M.; Legg, J.; Hughes, D.P. A Mobile-Based Deep Learning Model for Cassava Disease Diagnosis. Front. Plant Sci. 2019, 10, 8. [Google Scholar] [CrossRef] [Green Version]
- Gonzalez-Huitron, V.; Leon-Borges, J.A.; Rodriguez-Mata, A.E.; Amabilis-Sosa, L.E.; Ramirez-Pereda, B.; Rodriguez, H. Disease detection in tomato leaves via CNN with lightweight architectures implemented in Raspberry Pi. Comput. Electron. Agric. 2021, 181, 9. [Google Scholar]
- Liu, J.; Wang, X.W. Tomato Diseases and Pests Detection Based on Improved Yolo V3 Convolutional Neural Network. Front. Plant Sci. 2020, 11, 12. [Google Scholar] [CrossRef]
Input | Operation | Number of Channels | SE | HS | Stride | Output |
---|---|---|---|---|---|---|
416 × 416 × 3 | Conv, 3 × 3 | 16 | - | - | 2 | 208 × 208 × 16 |
208 × 208 × 16 | Bneck, 3 × 3 | 16 | - | - | 1 | 208 × 208 × 16 |
208 × 208 × 16 | Bneck, 3 × 3 | 24 | - | - | 2 | 104 × 104 × 24 |
104 × 104 × 24 | Bneck, 3 × 3 | 24 | - | - | 1 | 104 × 104 × 24 |
104 × 104 × 24 | Bneck, 5 × 5 | 40 | √ | - | 2 | 52 × 52 × 40 |
52 × 52 × 40 | Bneck, 5 × 5 | 40 | √ | - | 1 | 52 × 52 × 40 |
52 × 52 × 40 | Bneck, 5 × 5 | 40 | √ | - | 1 | 52 × 52 × 40 |
52 × 52 × 40 | Bneck, 3 × 3 | 80 | - | √ | 2 | 26 × 26 × 80 |
26 × 26 × 80 | Bneck, 3 × 3 | 80 | - | √ | 1 | 26 × 26 × 80 |
26 × 26 × 80 | Bneck, 3 × 3 | 80 | - | √ | 1 | 26 × 26 × 80 |
26 × 26 × 80 | Bneck, 3 × 3 | 80 | - | √ | 1 | 26 × 26 × 80 |
26 × 26 × 80 | Bneck, 3 × 3 | 112 | √ | √ | 1 | 26 × 26 × 112 |
26 × 26 × 112 | Bneck, 3 × 3 | 112 | √ | √ | 1 | 26 × 26 × 112 |
26 × 26 × 112 | Bneck, 5 × 5 | 160 | √ | √ | 2 | 13 × 13 × 160 |
13 × 13 × 160 | Bneck, 5 × 5 | 160 | √ | √ | 1 | 13 × 13 × 160 |
13 × 13 × 160 | Bneck, 5 × 5 | 160 | √ | √ | 1 | 13 × 13 × 160 |
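The spatial dimensions in the backbone table above can be sanity-checked from the stride column alone: each stride-2 stage halves the feature map, so a 416 × 416 input passes through five stride-2 stages to reach 13 × 13. A minimal sketch:

```python
# Walk the stride column of the backbone table (one entry per row)
# and track the feature-map side length, starting from a 416 x 416 input.
strides = [2, 1, 2, 1, 2, 1, 1, 2, 1, 1, 1, 1, 1, 2, 1, 1]

size = 416
sizes = []
for s in strides:
    size //= s          # a stride-2 layer halves the spatial resolution
    sizes.append(size)

print(sizes[0], sizes[-1])  # 208 13, matching the first and last table rows
```

The three resolutions 52, 26, and 13 produced along the way are the feature maps a YOLO-style head typically takes from the backbone.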
Network Model | Parameters (×10,000) |
---|---|
YOLOv4 | 5000–6000 |
MobileNetv1-YOLOv4 | 1269.20 |
MobileNetv2-YOLOv4 | 1080.12 |
MobileNetv3-YOLOv4 | 1172.91 |
Model | AP | F1 Score | Recall | Precision | Time/s | Size/MB |
---|---|---|---|---|---|---|
YOLOv3 | 76.44% | 58% | 41.81% | 86.08% | 0.1140 | 247 |
YOLOv4 | 81.28% | 78% | 68.43% | 90.72% | 0.1249 | 256 |
YOLOv4-tiny | 79.99% | 73% | 61.09% | 91.33% | 0.1004 | 24 |
YOLOv4 with ResNet50 | 80.81% | 76% | 65.36% | 92.07% | 0.1123 | 134 |
YOLOv4 with VGG | 80.41% | 75% | 62.63% | 92.68% | 0.1174 | 94 |
MobileNetv1-YOLOv4 | 79.50% | 74% | 61.43% | 92.31% | 0.1071 | 54 |
MobileNetv2-YOLOv4 | 76.64% | 72% | 59.22% | 92.04% | 0.1084 | 49 |
Ours | 80.17% | 76% | 63.31% | 93.69% | 0.1108 | 56 |
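The F1 scores in the table above are consistent with the listed recall and precision values via the harmonic mean, F1 = 2PR/(P + R). A quick check on two rows:

```python
# Reproduce the table's F1 column from its recall and precision columns.
def f1(recall, precision):
    # Harmonic mean of precision and recall.
    return 2 * recall * precision / (recall + precision)

print(round(f1(0.6843, 0.9072), 2))  # YOLOv4: 0.78
print(round(f1(0.6331, 0.9369), 2))  # Ours:   0.76
```

Both values match the table to two decimal places, confirming that the proposed model trades a few points of recall for the highest precision in the comparison.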
Reference | Plant | Model | Precision |
---|---|---|---|
Ramcharan et al. [53] | Cassava | MobileNet | 84.7% |
Gonzalez-Huitron et al. [54] | Tomato leaves | NASNetMobile | 84% |
Hayit et al. [50] | Wheat | Xception | 91% |
Liu et al. [55] | Tomato | Improved YOLOv3 | 92.39% |
Huang et al. [51] | Wheat ear | Continuous wavelet analysis and PSO-SVM | 93.5% |
Jiang et al. [52] | Wheat leaf | Multi-task deep learning transfer | 98.75% |
Proposed method | Wheat ear | MobileNetv3-YOLOv4 | 93.69% |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations. |
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Hong, Q.; Jiang, L.; Zhang, Z.; Ji, S.; Gu, C.; Mao, W.; Li, W.; Liu, T.; Li, B.; Tan, C. A Lightweight Model for Wheat Ear Fusarium Head Blight Detection Based on RGB Images. Remote Sens. 2022, 14, 3481. https://doi.org/10.3390/rs14143481