UAV and Machine Learning Based Refinement of a Satellite-Driven Vegetation Index for Precision Agriculture
Figure 1. Graphical representation of the proposed RarefyNet model. The overall residual architecture is depicted in the top part of the figure, with a detailed overview of its inception modules. Input tensors are processed by two inception modules that build their representations on top of each other, concatenating the outputs of their different branches.
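The mechanism named in the RarefyNet caption (parallel branches whose outputs are concatenated channel-wise, two such modules stacked, inside a residual architecture) can be sketched in framework-free NumPy. All branch widths, the random "weights", and the `branch`/`inception_module` helpers below are illustrative stand-ins, not the paper's actual layer configuration:

```python
import numpy as np

def branch(x, out_ch, seed):
    """Stand-in for one convolutional branch: a random channel
    projection followed by a ReLU (weights illustrative, untrained)."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal((x.shape[-1], out_ch)) * 0.1
    return np.maximum(x @ w, 0.0)

def inception_module(x, seed):
    """Run parallel branches and concatenate their outputs channel-wise,
    as the caption describes for the inception modules."""
    b1 = branch(x, 8, seed)       # e.g. a 1x1-conv-like branch
    b2 = branch(x, 16, seed + 1)  # e.g. a 3x3-conv-like branch
    b3 = branch(x, 8, seed + 2)   # e.g. a pooled/projected branch
    return np.concatenate([b1, b2, b3], axis=-1)  # 8 + 16 + 8 = 32 channels

# A toy H x W x C input patch with 4 spectral channels.
x = np.random.default_rng(0).standard_normal((5, 5, 4))
h1 = inception_module(x, seed=1)    # first module:  4 -> 32 channels
h2 = inception_module(h1, seed=10)  # second module builds on the first
# Residual skip around the stacked modules, projected to match channels.
w_skip = np.random.default_rng(99).standard_normal((4, 32)) * 0.1
y = h2 + x @ w_skip
print(h1.shape, h2.shape, y.shape)  # (5, 5, 32) (5, 5, 32) (5, 5, 32)
```

The point of the concatenation is that branches with different receptive fields contribute side by side, so the next module sees all of them at once.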
Figure 2. (a) Selected test field located in Serralunga d’Alba (Piedmont, northwest Italy). The boundaries of the three considered parcels, named “Parcel-A”, “Parcel-B”, and “Parcel-C”, are marked with solid green polygons. The low-resolution satellite map and the high-resolution UAV map are shown together in false colors (near-infrared, red, and green channels). (b) Enlargement of the UAV imagery highlighted by the yellow square in Figure 2a.
Figure 3. Three-level vigor maps (a) X_raw^II, (b) X̂^II, and (c) Y_UAV^II of Parcel-B, derived from the raw Sentinel-2 NDVI map X_raw^II, the refined satellite NDVI map X̂^II, and the UAV-driven NDVI map Y_UAV^II, respectively. (d) Vigor map of Parcel-B from the expert’s in-field survey, V_field^II. Maps X_raw^II, X̂^II, and Y_UAV^II were obtained by the selected K-means based classifier.
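Since the three-level vigor maps above are produced by a K-means based classifier over NDVI values, the step can be illustrated with a minimal one-dimensional K-means that bins per-pixel NDVI into the classes “L”, “M”, and “H”. The NDVI values, the quantile initialization, and the `kmeans_1d` helper below are hypothetical, not the paper’s data or exact classifier:

```python
import numpy as np

def kmeans_1d(values, k=3, iters=50):
    """Plain 1-D K-means: returns ascending centroids and a label per value
    (0 = lowest-centroid cluster, k-1 = highest)."""
    centroids = np.quantile(values, np.linspace(0.1, 0.9, k))  # spread init
    for _ in range(iters):
        labels = np.argmin(np.abs(values[:, None] - centroids[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = values[labels == j].mean()
    order = np.argsort(centroids)           # relabel clusters by centroid
    remap = np.empty(k, dtype=int)
    remap[order] = np.arange(k)
    return centroids[order], remap[labels]

# Hypothetical per-pixel NDVI values for one parcel.
ndvi = np.array([0.21, 0.25, 0.48, 0.52, 0.55, 0.78, 0.81, 0.24, 0.50, 0.80])
centroids, labels = kmeans_1d(ndvi, k=3)
vigor = np.array(["L", "M", "H"])[labels]   # three-level vigor classes
print(centroids, vigor)
```

Sorting the centroids before relabeling makes the class indices meaningful (low/medium/high vigor) regardless of how K-means happens to order its clusters.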
Figure 4. Boxplots of pixel groups from the raw satellite-driven map X_raw^II, the refined satellite-driven map X̂^II, and the UAV-driven map Y_UAV^II, clustered according to the three vigor classes “L”, “M”, and “H” defined in map V_field^II. The boxplots are computed individually for each parcel (A, B, and C).
Figure 5. Boxplots of pixel groups from the refined satellite maps (X̂^I, X̂^II, X̂^III, and X̂^IV), clustered according to the three vigor classes “L”, “M”, and “H” defined in the UAV-driven clustered maps Y_UAV^I, Y_UAV^II, Y_UAV^III, and Y_UAV^IV, respectively. The boxplots are computed individually for each parcel (A, B, and C).
Abstract
1. Introduction
2. Methods
2.1. RarefyNet: Input, Output, and Architecture
2.2. RarefyNet: Training Phase
2.3. RarefyNet: Structure Optimization
2.4. Vigor Classifier
3. Experiments and Results
3.1. Satellite and UAV-Based Time Series Imagery
3.2. Experimental Settings
3.3. Results and Discussion
4. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
Variable | Definition
---|---
a | a vector
A | a matrix
**A** | a tensor
a_i | i-th element of a vector a
A_{i,j,k} | element (i, j, k) of a 3-D tensor **A**
A_{:,:,k} | 2-D slice of a 3-D tensor **A**
X | a set of elements/a map
x^(i) | i-th sample from a dataset
y^(i) | ground-truth associated with the i-th sample
Time Period | Dataset Name | Acquisition Date | Source
---|---|---|---
I | X_raw^I | 30 April 2017 | Sentinel-2
I | Y_UAV^I | 5 May 2017 | UAV
II | X_raw^II | 6 July 2017 | Sentinel-2
II | Y_UAV^II | 29 June 2017 | UAV
III | X_raw^III | 5 August 2017 | Sentinel-2
III | Y_UAV^III | 1 August 2017 | UAV
IV | X_raw^IV | 17 September 2017 | Sentinel-2
IV | Y_UAV^IV | 13 September 2017 | UAV
Datasets (Grouped by) | Parcel | Source | DF | SS | MS | F-Value | p-Value
---|---|---|---|---|---|---|---
X_raw^II (V_field^II) | Parcel-A | Classes | 2 | 0.3084 | 0.1541 | 3.4582 | 0.044081
 | | Error | 31 | 1.3821 | 0.0445 | |
 | | Total | 33 | 1.6905 | | |
 | Parcel-B | Classes | 2 | 0.3938 | 0.1969 | 4.8928 | 0.010587
 | | Error | 63 | 2.5353 | 0.0402 | |
 | | Total | 65 | 2.9291 | | |
 | Parcel-C | Classes | 2 | 0.1985 | 0.0992 | 1.4555 | 0.264401
 | | Error | 15 | 1.0228 | 0.0681 | |
 | | Total | 17 | 1.2213 | | |
X̂^II (V_field^II) | Parcel-A | Classes | 2 | 0.4749 | 0.2374 | 8.0112 | 0.001568
 | | Error | 31 | 0.9189 | 0.0296 | |
 | | Total | 33 | 1.3938 | | |
 | Parcel-B | Classes | 2 | 1.3735 | 0.6867 | 22.9984 | 3.17 × 10^−8
 | | Error | 63 | 1.8812 | 0.0298 | |
 | | Total | 65 | 3.2547 | | |
 | Parcel-C | Classes | 2 | 0.7071 | 0.3535 | 11.7444 | 0.000852
 | | Error | 15 | 0.4515 | 0.0301 | |
 | | Total | 17 | 1.1586 | | |
Y_UAV^II (V_field^II) | Parcel-A | Classes | 2 | 1.3608 | 0.6804 | 30.0925 | 5.46 × 10^−8
 | | Error | 31 | 0.7009 | 0.0226 | |
 | | Total | 33 | 2.0617 | | |
 | Parcel-B | Classes | 2 | 2.7135 | 1.3567 | 71.1664 | 6.87 × 10^−17
 | | Error | 63 | 1.2010 | 0.0190 | |
 | | Total | 65 | 3.9145 | | |
 | Parcel-C | Classes | 2 | 0.9447 | 0.4723 | 8.7803 | 0.002988
 | | Error | 15 | 0.8069 | 0.0537 | |
 | | Total | 17 | 1.7516 | | |

DF: degrees of freedom; SS: sum of squares; MS: mean square.
Datasets (Grouped by) | Parcel | Source | DF | SS | MS | F-Value | p-Value
---|---|---|---|---|---|---|---
X̂^I (Y_UAV^I) | Parcel-A | Classes | 2 | 0.7907 | 0.7907 | 32.4702 | 2.60 × 10^−6
 | | Error | 31 | 0.7792 | 0.0243 | |
 | | Total | 33 | 1.5699 | | |
 | Parcel-B | Classes | 2 | 1.39025 | 1.3902 | 78.7860 | 9.31 × 10^−13
 | | Error | 63 | 1.1293 | 0.0176 | |
 | | Total | 65 | 2.5196 | | |
 | Parcel-C | Classes | 2 | 1.1914 | 1.1917 | 113.4301 | 1.14 × 10^−8
 | | Error | 15 | 0.1681 | 0.0105 | |
 | | Total | 17 | 1.3596 | | |
X̂^II (Y_UAV^II) | Parcel-A | Classes | 2 | 0.6968 | 0.6968 | 31.9907 | 2.94 × 10^−6
 | | Error | 31 | 0.6970 | 0.0218 | |
 | | Total | 33 | 1.3939 | | |
 | Parcel-B | Classes | 2 | 1.5536 | 1.5536 | 58.4472 | 1.36 × 10^−10
 | | Error | 63 | 1.7012 | 0.0266 | |
 | | Total | 65 | 3.2548 | | |
 | Parcel-C | Classes | 2 | 0.7978 | 0.7978 | 35.3635 | 2.05 × 10^−5
 | | Error | 15 | 0.3609 | 0.0225 | |
 | | Total | 17 | 1.1587 | | |
X̂^III (Y_UAV^III) | Parcel-A | Classes | 2 | 0.4195 | 0.4194 | 13.4022 | 0.000898
 | | Error | 31 | 1.0015 | 0.0313 | |
 | | Total | 33 | 1.4210 | | |
 | Parcel-B | Classes | 2 | 0.6561 | 0.6560 | 29.8767 | 8.10 × 10^−7
 | | Error | 63 | 1.4054 | 0.0219 | |
 | | Total | 65 | 2.0614 | | |
 | Parcel-C | Classes | 2 | 0.1808 | 0.1808 | 2.1895 | 0.158372
 | | Error | 15 | 1.3218 | 0.0826 | |
 | | Total | 17 | 1.5026 | | |
X̂^IV (Y_UAV^IV) | Parcel-A | Classes | 2 | 0.2441 | 0.2441 | 4.6372 | 0.038924
 | | Error | 31 | 1.6846 | 0.0526 | |
 | | Total | 33 | 1.9287 | | |
 | Parcel-B | Classes | 2 | 0.6649 | 0.6649 | 20.8288 | 2.33 × 10^−5
 | | Error | 63 | 2.0431 | 0.0319 | |
 | | Total | 65 | 2.7081 | | |
 | Parcel-C | Classes | 2 | 0.8174 | 0.8173 | 25.5642 | 0.000117
 | | Error | 15 | 0.5116 | 0.0319 | |
 | | Total | 17 | 1.3289 | | |

DF: degrees of freedom; SS: sum of squares; MS: mean square.
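The DF/SS/MS/F columns in the ANOVA tables above follow the standard one-way decomposition: between-class and within-class sums of squares, mean squares obtained as SS/DF, and F as their ratio. A short sketch of that computation, using synthetic per-class samples rather than the parcel data:

```python
import numpy as np

def one_way_anova(groups):
    """One-way ANOVA over a list of 1-D sample arrays (the vigor classes).
    Returns (df_between, df_within, ss_between, ss_within, F)."""
    all_vals = np.concatenate(groups)
    grand_mean = all_vals.mean()
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_between = len(groups) - 1             # "Classes" DF, e.g. 2
    df_within = len(all_vals) - len(groups)  # "Error" DF
    ms_between = ss_between / df_between     # MS = SS / DF
    ms_within = ss_within / df_within
    return df_between, df_within, ss_between, ss_within, ms_between / ms_within

# Synthetic NDVI samples for three well-separated vigor classes.
rng = np.random.default_rng(0)
low = rng.normal(0.3, 0.05, 12)
mid = rng.normal(0.5, 0.05, 12)
high = rng.normal(0.7, 0.05, 12)
df_b, df_w, ss_b, ss_w, f_val = one_way_anova([low, mid, high])
print(df_b, df_w, round(f_val, 2))
```

A large F (class means far apart relative to within-class spread) is what distinguishes the refined and UAV-driven maps from the raw satellite map in the tables.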
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Mazzia, V.; Comba, L.; Khaliq, A.; Chiaberge, M.; Gay, P. UAV and Machine Learning Based Refinement of a Satellite-Driven Vegetation Index for Precision Agriculture. Sensors 2020, 20, 2530. https://doi.org/10.3390/s20092530