A Robust Deep-Learning-Based Detector for Real-Time Tomato Plant Diseases and Pests Recognition
Figure 1. Recognition and localization of plant diseases and pests: problem formulation. Our system aims to detect both the class (what) and the location (where) of the affected areas in the image.

Figure 2. Flow chart of the deep meta-architecture approach used in this work. We treat a deep meta-architecture as an open system to which different feature extractors can be adapted for our task. The system is trained and tested end-to-end on images captured in place. The outputs are the class and the location of the infected area in the image.

Figure 3. A representation of diseases and pests that affect tomato plants. (a) Gray mold; (b) Canker; (c) Leaf mold; (d) Plague; (e) Leaf miner; (f) Whitefly; (g) Low temperature; (h) Nutritional excess or deficiency; (i) Powdery mildew. The images were collected under varying environmental conditions. The patterns help to distinguish characteristic traits of each disease and pest.

Figure 4. System overview of the proposed deep-learning-based approach for plant disease and pest recognition. Our deep meta-architecture approach consists of several steps that use input images as the source of information and provide detection results in terms of the class and location of the infected area of the plant in the image.

Figure 5. Training loss curve of our proposed approach, with and without data augmentation. Our network learns the data efficiently, reaching a low error rate at about one hundred thousand iterations.

Figure 6. Detection results of diseases and pests that affect tomato plants with Faster R-CNN and a VGG-16 feature extractor. From left to right: the input image, the annotated image, and the predicted results. (a) Gray mold; (b) Canker; (c) Leaf mold; (d) Plague; (e) Leaf miner; (f) Whitefly; (g) Low temperature; (h) Nutritional excess or deficiency; (i) Powdery mildew.

Figure 7. Deep feature map visualization of diseases and pests. (a) Canker; (b) Gray mold; (c) Leaf mold; (d) Low temperature; (e) Miner; (f) Nutritional excess; (g) Plague; (h) Powdery mildew; (i) Whitefly. Each feature map illustrates how our network interprets a disease in context after classification by a softmax function.

Figure 8. Detection results showing inter- and intra-class variation of diseases and pests in the images. (a) Two classes affecting the same sample (powdery mildew and pest); (b) Three classes in the same sample (gray mold, low temperature, and miners); (c) Leaf mold affecting the back side of the leaf; (d) Leaf mold affecting the front side of the leaf; (e) Gray mold at an early stage; (f) Gray mold at the last stage; (g) Plague can also be detected on other parts of the plant, such as fruits or the stem; (h) Plague affecting tomato production.

Figure 9. Confusion matrix of the tomato disease and pest detection results (including background, a transversal class containing healthy parts of the plants and surrounding areas, such as parts of the greenhouse).

Figure 10. A representation of failure cases. (a) Intra-class variation makes recognition harder and results in a low recognition rate. (b) Misdetected class due to confusion at an earlier infection stage (e.g., the real class is leaf mold, but the system recognizes it as canker).
Abstract
1. Introduction
- Our system uses images of plant diseases and pests taken in place, so we avoid the process of collecting samples and analyzing them in the laboratory.
- It considers the possibility that a plant may be simultaneously affected by more than one disease or pest in the same sample.
- Our approach uses input images captured by camera devices with various resolutions, such as cell phones and other digital cameras.
- It can efficiently deal with different illumination conditions, object sizes, and background variations in the area surrounding the plant.
- It provides a practical real-time application that can be used in the field without any expensive or complex technology.
2. Related Works
2.1. Anomaly Detection in Plants
2.2. Deep Meta-Architectures for Object Detection
2.2.1. Faster Region-based Convolutional Neural Network (Faster R-CNN)
2.2.2. Single Shot Detector (SSD)
2.2.3. Region-based Fully Convolutional Network (R-FCN)
2.3. Feature Extractors
3. Deep Meta-Architectures-Based Plant Diseases and Pest Recognition
3.1. System Background
- Infection status: A plant shows different visual patterns as the infection progresses, according to the life cycle of the disease.
- Location of the symptom: It considers that diseases not only affect leaves, but also other parts of the plant such as stem or fruits.
- Patterns of the leaf: Symptoms of the diseases show visible variations either on the front side or the back side of the leaves.
- Type of fungus: Identifying the type of fungus can be an easy way to visibly differentiate between some diseases.
- Color and shape: Depending on the disease, the plant may show different colors or shapes at different infection stages.
3.2. System Overview
3.3. Data Collection
- Images with various resolutions.
- Samples at early, medium, and last infection status.
- Images containing different infected areas in the plant (e.g., stem, leaves, fruits, etc.).
- Different plant sizes.
- Objects surrounding the plant in the greenhouse, etc.
3.4. Data Annotation
3.5. Data Augmentation
3.6. Disease and Pest Detection
3.6.1. Faster R-CNN
3.6.2. SSD
3.6.3. R-FCN
4. Experimental Results
4.1. Tomato Diseases and Pests Dataset
4.2. Experimental Setup
4.3. Quantitative Results
4.4. Qualitative Results
4.5. Deep Network Visualization
4.6. Disease Effects on the Plant
4.7. Confusion Matrix
4.8. Failure Analysis and Discussion
5. Conclusions
Acknowledgments
Author Contributions
Conflicts of Interest
References
- Mabvakure, B.; Martin, D.P.; Kraberger, S.; Cloete, L.; Van Bruschot, S.; Geering, A.D.W.; Thomas, J.E.; Bananej, K.; Lett, J.; Lefeuvre, P.; et al. Ongoing geographical spread of Tomato yellow leaf curl virus. Virology 2016, 498, 257–264. [Google Scholar] [CrossRef] [PubMed]
- Canizares, M.C.; Rosas-Diaz, T.; Rodriguez-Negrete, E.; Hogenhout, S.A.; Bedford, I.D.; Bejarano, E.R.; Navas-Castillo, J.; Moriones, E. Arabidopsis thaliana, an experimental host for tomato yellow leaf curl disease-associated begomoviruses by agroinoculation and whitefly transmission. Plant Pathol. 2015, 64, 265–271. [Google Scholar] [CrossRef]
- The World Bank. Reducing Climate-Sensitive Risks. 2014, Volume 1. Available online: http://documents.worldbank.org/curated/en/486511468167944431/Reducing-climate-sensitive-disease-risks (accessed on 20 June 2017).
- Nutter, F.W., Jr.; Esker, P.D.; Coelho, R. Disease assessment concepts and the advancements made in improving the accuracy and precision of plant disease data. Eur. J. Plant Pathol. 2006, 115, 95–113. [Google Scholar] [CrossRef]
- Munyaneza, J.E.; Crosslin, J.M.; Buchman, J.L.; Sengoda, V.G. Susceptibility of Different Potato Plant Growth Stages of Purple Top Disease. Am. J. Potato Res. 2010, 87, 60–66. [Google Scholar] [CrossRef]
- Gilbertson, R.L.; Batuman, O. Emerging Viral and Other Diseases of Processing Tomatoes: Biology Diagnosis and Management. Acta Hortic. 2013, 1, 35–48. [Google Scholar] [CrossRef]
- Diaz-Pendon, J.A.; Canizares, M.C.; Moriones, E.; Bejarano, E.R.; Czosnek, H.; Navas-Castillo, J. Tomato yellow leaf curl viruses: Menage a trois between the virus complex, the plant and whitefly vector. Mol. Plant Pathol. 2010, 11, 414–450. [Google Scholar] [CrossRef] [PubMed]
- Coakley, S.M.; Scherm, H.; Chakraborty, S. Climate Change and Plant Disease Management. Annu. Rev. Phytopathol. 1999, 37, 399–426. [Google Scholar] [CrossRef] [PubMed]
- Food and Agriculture Organization of the United Nations. Plant Pests and Diseases. 2017. Available online: http://www.fao.org/emergencies/emergency-types/plant-pests-and-diseases/en/ (accessed on 20 June 2017).
- Fuentes, A.; Yoon, S.; Youngki, H.; Lee, Y.; Park, D.S. Characteristics of Tomato Plant Diseases—A study for tomato plant disease identification. Proc. Int. Symp. Inf. Technol. Converg. 2016, 1, 226–231. [Google Scholar]
- Food and Agriculture Organization of the United Nations. Value of Agricultural Production-Tomatoes. Food and Agriculture data. 2015. Available online: http://www.fao.org/faostat/en/#data/QV/visualize (accessed on 9 May 2017).
- Hanssen, I.; Lapidot, M.; Thomma, B. Emerging Viral Diseases of Tomato Crops. Mol. Plant Microbe Interact. 2010, 23, 539–548. [Google Scholar] [CrossRef] [PubMed]
- Sankaran, S.; Mishra, A.; Ehsani, R. A review of advanced techniques for detecting plant diseases. Comput. Electron. Agric. 2010, 72, 1–13. [Google Scholar] [CrossRef]
- Chaerani, R.; Voorrips, R.E. Tomato early blight (Alternaria solani): The pathogens, genetics, and breeding for resistance. J. Gen. Plant Pathol. 2006, 72, 335–347. [Google Scholar] [CrossRef]
- Alvarez, A.M. Integrated approaches for detection of plant pathogenic bacteria and diagnosis of bacterial diseases. Annu. Rev. Phytopathol. 2004, 42, 339–366. [Google Scholar] [CrossRef] [PubMed]
- Gutierrez-Aguirre, I.; Mehle, N.; Delic, D.; Gruden, K.; Mumford, R.; Ravnikar, M. Real-time quantitative PCR based sensitive detection and genotype discrimination of Pepino mosaic virus. J. Virol. Methods 2009, 162, 46–55. [Google Scholar] [CrossRef] [PubMed]
- Martinelli, F.; Scalenghe, R.; Davino, S.; Panno, S.; Scuderi, G.; Ruisi, P.; Villa, P.; Stropiana, D.; Boschetti, M.; Goudart, L.; et al. Advanced methods of plant disease detection. A review. Agron. Sust. Dev. 2015, 35, 1–25. [Google Scholar] [CrossRef]
- Bock, C.H.; Poole, G.H.; Parker, P.E.; Gottwald, T.R. Plant Disease Sensitivity Estimated Visually, by Digital Photography and Image Analysis, and by Hyperspectral Imaging. Crit. Rev. Plant Sci. 2007, 26, 59–107. [Google Scholar]
- Krizhevsky, A.; Sutskever, I.; Hinton, G. ImageNet classification with deep convolutional neural networks. In Proceedings of the Conference on Neural Information Processing Systems (NIPS), Lake Tahoe, NV, USA, 3–8 December 2012; pp. 1097–1105. [Google Scholar]
- Russakovsky, O.; Deng, J.; Su, H.; Krause, J.; Satheesh, S.; Ma, S.; Huang, Z.; Karpathy, A.; Khosla, A.; Bernstein, M.; et al. ImageNet Large Scale Visual Recognition Challenge. Int. J. Comput. Vis. 2015, 115, 211–252. [Google Scholar] [CrossRef]
- Lin, M.; Chen, Q.; Yan, S. Network in Network. arXiv 2013, arXiv:1312.4400. [Google Scholar]
- Simonyan, K.; Zisserman, A. Very deep convolutional networks for large-scale image recognition. arXiv 2014, arXiv:1409.1556. [Google Scholar]
- Szegedy, C.; Liu, W.; Jia, Y.; Sermanet, P.; Reed, S.; Anguelov, D.; Erhan, D.; Vanhoucke, V.; Rabinovich, A. Going deeper with convolutions. In Proceedings of the 2015 IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015; pp. 1–9. [Google Scholar] [CrossRef]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. In Proceedings of the 2016 IEEE Conference on Computer, Vision, Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778. [Google Scholar] [CrossRef]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Identity Mappings in deep residual networks. arXiv 2016, arXiv:1603.05027. [Google Scholar]
- Xie, S.; Girshick, R.; Dollár, P.; Tu, Z.; He, K. Aggregated Residual Transformations for Deep Neural Networks. arXiv 2017, arXiv:1611.05431. [Google Scholar]
- Zhang, K.; Sun, M.; Han, T.X.; Yuan, X.; Guo, L.; Liu, T. Residual Networks of Residual Networks: Multilevel Residual Networks. IEEE Trans. Circ. Syst. Video Technol. 2017, 99. [Google Scholar] [CrossRef]
- Zagoruyko, S.; Komodakis, N. Wide Residual Networks. arXiv 2016, arXiv:1605.07146. [Google Scholar]
- Huang, J.; Rathod, V.; Sun, C.; Zhu, M.; Korattikara, A.; Fathi, A.; Fischer, I.; Wojna, Z.; Song, Y.; Guadarrama, S.; et al. Speed/accuracy trade-offs for modern convolutional object detectors. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 22–25 July 2017. [Google Scholar]
- Ren, S.; He, K.; Girshick, R.; Sun, J. Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks. IEEE Trans. Pattern Anal. Mach. Intell. 2016, 39, 1137–1149. [Google Scholar] [CrossRef] [PubMed]
- Liu, W.; Anguelov, D.; Erhan, D.; Szegedy, C.; Reed, S.; Fu, C.; Berg, A.C. SSD: Single Shot MultiBox Detector. In Proceedings of the European Conference on Computer Vision—ECCV, Amsterdam, The Netherlands, 8–16 October 2016; pp. 21–37. [Google Scholar]
- Dai, J.; Li, Y.; He, K.; Sun, J. R-FCN: Object Detection via Region-based Fully Convolutional Networks. arXiv 2016, arXiv:1605.06409v2. [Google Scholar]
- Irudayaraj, J. Pathogen Sensors. Sensors 2009, 9, 8610–8612. [Google Scholar] [CrossRef] [PubMed]
- Meroni, M.; Rosini, M.; Picchi, V.; Panigada, C.; Cogliati, S.; Nali, C.; Colombo, R. Assessing Steady-state Fluorescence and PRI from Hyperspectral Proximal Sensing as Early Indicators of Plant Stress: The Case of Ozone Exposure. Sensors 2008, 8, 1740–1754. [Google Scholar] [CrossRef] [PubMed]
- Wah Liew, O.; Chong, P.; Li, B.; Asundi, K. Signature Optical Cues: Emerging Technologies for Monitoring Plant Health. Sensors 2008, 8, 3205–3239. [Google Scholar] [CrossRef] [PubMed]
- Mazarei, M.; Teplova, I.; Hajimorad, M.; Stewart, C. Pathogen Phytosensing: Plants to Report Plant Pathogens. Sensors 2008, 8, 2628–2641. [Google Scholar] [CrossRef] [PubMed]
- Ryant, P.; Dolezelova, E.; Fabrik, I.; Baloum, J.; Adam, V.; Babula, P.; Kizek, R. Electrochemical Determination of Low Molecular Mass Thiols Content in Potatoes (Solanum tuberosum) Cultivated in the Presence of Various Sulphur Forms and Infected by Late Blight (Phytophora infestans). Sensors 2008, 8, 3165–3182. [Google Scholar] [CrossRef] [PubMed]
- Dalal, N.; Triggs, B. Histograms of Oriented Gradients for Human Detection. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, San Diego, CA, USA, 20–25 June 2005. [Google Scholar] [CrossRef]
- Lowe, D. Distinctive Image Features from Scale-Invariant Keypoints. Int. J. Comput. Vis. 2004, 60, 91–110. [Google Scholar] [CrossRef]
- Cortes, C.; Vapnik, V. Support Vector Networks. Mach. Learn. 1995, 20, 293–297. [Google Scholar] [CrossRef]
- Schapire, R. A Brief Introduction to Boosting. In Proceedings of the Sixteenth International Joint Conference on Artificial Intelligence, Stockholm, Sweden, 31 July 1999; Volume 2, pp. 1401–1406. [Google Scholar]
- Pawara, P.; Okafor, E.; Surinta, O.; Schomaker, L.; Wiering, M. Comparing Local Descriptors and Bags of Visual Words to Deep Convolutional Neural Networks for Plant Recognition. In Proceedings of the 6th International Conference on Pattern Recognition Applications and Methods (ICPRAM 2017), Porto, Portugal, 24–26 February 2017; pp. 479–486. [Google Scholar] [CrossRef]
- Cugu, I.; Sener, E.; Erciyes, C.; Balci, B.; Akin, E.; Onal, I.; Oguz-Akyuz, A. Treelogy: A Novel Tree Classifier Utilizing Deep and Hand-crafted Representations. arXiv 2017, arXiv:1701.08291v1. [Google Scholar]
- Amara, J.; Bouaziz, B.; Algergawy, A. A Deep Learning-based Approach for Banana Leaf Diseases Classification. In Lecture Notes in Informatics (LNI); Gesellschaft für Informatik: Bonn, Germany, 2017. [Google Scholar]
- Johannes, A.; Picon, A.; Alvarez-Gila, A.; Echazarra, J.; Rodriguez-Vaamonde, S.; Diez-Navajas, A.; Ortiz-Barredo, A. Automatic plant disease diagnosis using mobile capture devices, applied on a wheat use case. Comput. Electron. Agric. 2017, 138, 200–209. [Google Scholar] [CrossRef]
- Fujita, E.; Kawasaki, Y.; Uga, H.; Kagiwada, S.; Iyatomi, H. Basic investigation on a robust and practical plant diagnostic system. In Proceedings of the 2016 15th IEEE International Conference on Machine Learning and Applications (ICMLA), Anaheim, CA, USA, 18–20 December 2016. [Google Scholar] [CrossRef]
- Kawasaki, Y.; Uga, H.; Kagiwada, S.; Iyatomi, H. Basic Study of Automated Diagnosis of Viral Plant Diseases Using Convolutional Neural Networks. In Advances in Visual Computing, Proceedings of the 11th International Symposium, ISVC 2015, Las Vegas, NV, USA, 14–16 December 2015; Bebis, G., Ed.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2015; Volume 9475, pp. 638–645. [Google Scholar]
- Owomugisha, G.; Mwebaze, E. Machine Learning for Plant Disease Incidence and Severity Measurements from Leaf Images. In Proceedings of the 2016 15th IEEE International Conference on Machine Learning and Applications (ICMLA), Anaheim, CA, USA, 18–20 December 2016. [Google Scholar] [CrossRef]
- Sladojevic, S.; Arsenovic, M.; Anderla, A.; Culibrk, D.; Stefanovic, D. Deep Neural Networks Based Recognition of Plant Diseases by Leaf Image Classification. Comput. Intell. Neurosci. 2016, 2016, 3289801. [Google Scholar] [CrossRef] [PubMed]
- Mohanty, S.P.; Hughes, D.; Salathe, M. Using Deep Learning for Image-Based Plant Disease Detection. Front. Plant Sci. 2016, 7, 1419. [Google Scholar] [CrossRef] [PubMed]
- Hughes, D.P.; Salathe, M. An open access repository of images on plant health to enable the development of mobile disease diagnostics. arXiv 2016, arXiv:1511.08060v2. [Google Scholar]
- Wang, G.; Sun, Y.; Wang, J. Automatic Image-Based Plant Disease Severity Estimation Using Deep Learning. Comput. Intell. Neurosci. 2017, 2017, 2917536. [Google Scholar] [CrossRef] [PubMed]
- Everingham, M.; Van Gool, L.; Williams, C.; Winn, J.; Zisserman, A. The Pascal Visual Object Classes (VOC) Challenge. Int. J. Comput. Vis. 2010, 88, 303–338. [Google Scholar] [CrossRef]
Feature Extractor | Parameters (M) | Number of Layers | Top-5 Error (%) |
---|---|---|---|
AlexNet [19] | 61 | 8 | 15.3 |
ZFNet | - | 8 | 14.8 |
VGG-16 [22] | 138 | 16 | 7.40 |
GoogLeNet [23] | 6.9 | 22 | 6.66 |
ResNet-50 [24] | 25 | 50 | 3.57 |
ResNet-101 [24] | 42.6 | 101 | - |
ResNeXt-101 [26] | 42.6 | 101 | 3.03 |
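The top-5 error reported above is the fraction of validation images whose ground-truth class is not among the model's five highest-scoring predictions. A minimal sketch of how it is computed (the toy logits below are illustrative, not from the paper):

```python
import numpy as np

def top_k_error(logits, labels, k=5):
    """Fraction of samples whose true label is NOT among the k top-scoring classes."""
    # Sort class scores descending per sample and keep the first k indices.
    topk = np.argsort(-logits, axis=1)[:, :k]
    hits = np.any(topk == labels[:, None], axis=1)
    return 1.0 - hits.mean()

# Toy check: 3 samples, 6 classes.
logits = np.array([
    [0.1, 0.9, 0.2, 0.3, 0.4, 0.5],  # true class 1 has the highest score -> hit
    [0.9, 0.1, 0.2, 0.3, 0.4, 0.5],  # true class 1 has the lowest score -> miss for k=5
    [0.5, 0.4, 0.3, 0.2, 0.1, 0.0],  # true class 0 has the highest score -> hit
])
labels = np.array([1, 1, 0])
print(top_k_error(logits, labels, k=5))  # 1 miss out of 3 -> 0.333...
```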
Meta-Architecture | Feature Extractor | Bounding-Box Loss Function |
---|---|---|
Faster R-CNN | VGG-16, ResNet-50, ResNet-101, ResNet-152, ResNeXt-50 | Smooth L1 |
SSD | ResNet-50 | Smooth L1 |
R-FCN | ResNet-50 | Smooth L1 |
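All three meta-architectures regress bounding-box offsets with a smooth L1 loss, which is quadratic near zero and linear for large residuals, making it less sensitive to outliers than plain L2. A minimal elementwise sketch (the `beta` transition point is a common convention, assumed here, not taken from the paper):

```python
import numpy as np

def smooth_l1(x, beta=1.0):
    """Elementwise smooth L1: 0.5*x^2/beta for |x| < beta, |x| - 0.5*beta otherwise."""
    ax = np.abs(x)
    return np.where(ax < beta, 0.5 * ax**2 / beta, ax - 0.5 * beta)

diff = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])  # predicted-minus-target box offsets
print(smooth_l1(diff))  # [1.5, 0.125, 0., 0.125, 1.5]
```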
Class | Number of Images in the Dataset 1 | Number of Annotated Samples (Bounding Boxes) 2 | Percentage of Bounding Box Samples (%) |
---|---|---|---|
Leaf mold | 1350 | 11,922 | 27.47 |
Gray mold | 335 | 2768 | 6.37 |
Canker | 309 | 2648 | 6.10 |
Plague | 296 | 2570 | 5.92 |
Miner | 339 | 2946 | 6.78 |
Low temperature | 55 | 477 | 1.09 |
Powdery mildew | 40 | 338 | 0.77 |
Whitefly | 49 | 404 | 0.93 |
Nutritional excess | 50 | 426 | 0.98 |
Background 3 | 2177 | 18,899 | 43.54 |
Total | 5000 | 43,398 | 100 |
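The percentage column in the dataset table follows directly from the bounding-box counts; a quick consistency check in Python:

```python
# Bounding-box counts per class, as listed in the dataset table.
boxes = {
    "Leaf mold": 11_922, "Gray mold": 2_768, "Canker": 2_648,
    "Plague": 2_570, "Miner": 2_946, "Low temperature": 477,
    "Powdery mildew": 338, "Whitefly": 404,
    "Nutritional excess": 426, "Background": 18_899,
}
total_boxes = sum(boxes.values())
print(total_boxes)  # 43398, matching the table's total

pct = {k: 100.0 * v / total_boxes for k, v in boxes.items()}
print(round(pct["Leaf mold"], 2))  # 27.47, matching the table
```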
Class / Meta-Architecture (Feature Extractor) | Faster R-CNN (VGG-16) | Faster R-CNN (ResNet-50) | Faster R-CNN (ResNet-101) | Faster R-CNN (ResNet-152) | Faster R-CNN (ResNeXt-50) | R-FCN (ResNet-50) | SSD (ResNet-50) |
---|---|---|---|---|---|---|---|
Leaf mold | 0.9060 | 0.8827 | 0.803 | 0.8273 | 0.840 | 0.8820 | 0.8510 |
Gray mold | 0.7968 | 0.6684 | 0.449 | 0.4499 | 0.620 | 0.7960 | 0.7620 |
Canker | 0.8569 | 0.7580 | 0.660 | 0.7154 | 0.738 | 0.8638 | 0.8326 |
Plague | 0.8762 | 0.7588 | 0.613 | 0.6809 | 0.742 | 0.8732 | 0.8409 |
Miner | 0.8046 | 0.7884 | 0.756 | 0.7793 | 0.767 | 0.8812 | 0.7963 |
Low temperature | 0.7824 | 0.6733 | 0.468 | 0.5221 | 0.623 | 0.7545 | 0.7892 |
Powdery mildew | 0.6556 | 0.5982 | 0.413 | 0.4928 | 0.505 | 0.7950 | 0.8014 |
Whitefly | 0.8301 | 0.8125 | 0.637 | 0.7001 | 0.720 | 0.9492 | 0.8402 |
Nutritional excess | 0.8971 | 0.7637 | 0.547 | 0.8109 | 0.814 | 0.9290 | 0.8553 |
Background | 0.9005 | 0.8331 | 0.624 | 0.7049 | 0.745 | 0.8644 | 0.8841 |
Total mean AP | 0.8306 | 0.7537 | 0.590 | 0.6683 | 0.711 | 0.8598 | 0.8253 |
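The "Total mean AP" row is the unweighted mean of the ten per-class average precisions. For example, for the Faster R-CNN/VGG-16 column:

```python
# Per-class AP values for Faster R-CNN with VGG-16, in table order.
ap_vgg16 = [0.9060, 0.7968, 0.8569, 0.8762, 0.8046,
            0.7824, 0.6556, 0.8301, 0.8971, 0.9005]
mean_ap = sum(ap_vgg16) / len(ap_vgg16)
print(round(mean_ap, 4))  # 0.8306, matching the table
```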
Class | Without Data Augmentation | With Data Augmentation |
---|---|---|
Leaf mold | 0.6070 | 0.9060 |
Gray mold | 0.5338 | 0.7968 |
Canker | 0.5741 | 0.8569 |
Plague | 0.5870 | 0.8762 |
Miner | 0.5390 | 0.8046 |
Low temperature | 0.5242 | 0.7824 |
Powdery mildew | 0.4392 | 0.6556 |
Whitefly | 0.5591 | 0.8301 |
Nutritional excess | 0.6010 | 0.8971 |
Background | 0.6033 | 0.9005 |
Total mean AP | 0.5564 | 0.8306 |
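The table above shows that data augmentation lifts the mean AP from roughly 0.56 to 0.83. The exact augmentation set used in the paper is not given in this excerpt; flips, 90-degree rotations, and brightness jitter below are assumed, illustrative examples of a typical image-level pipeline:

```python
import numpy as np

def augment(image, rng):
    """Apply one random geometric/photometric transform to an H x W x C image.
    The specific transforms here are assumed examples, not the paper's exact set."""
    image = np.asarray(image, dtype=np.float32)
    if rng.random() < 0.5:
        image = image[:, ::-1]                          # horizontal flip
    image = np.rot90(image, k=int(rng.integers(0, 4)))  # random multiple-of-90° rotation
    image = np.clip(image * rng.uniform(0.8, 1.2), 0, 255)  # brightness jitter
    return image

rng = np.random.default_rng(0)
img = np.zeros((64, 64, 3))          # placeholder square image
out = augment(img, rng)
print(out.shape)                     # (64, 64, 3): shape preserved for square inputs
```

Note that for a detection task the ground-truth bounding boxes must be transformed alongside the pixels; that bookkeeping is omitted here for brevity.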
© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Fuentes, A.; Yoon, S.; Kim, S.C.; Park, D.S. A Robust Deep-Learning-Based Detector for Real-Time Tomato Plant Diseases and Pests Recognition. Sensors 2017, 17, 2022. https://doi.org/10.3390/s17092022