Distinguishing Planting Structures of Different Complexity from UAV Multispectral Images
Figure 1. Geographical location of the study areas.
Figure 2. UAV image mosaic results of the study areas: (A) study area 1; (B) study area 2; (C) study area 3.
Figure 3. Ground crop distribution maps: (A) study area 1; (B) study area 2; (C) study area 3.
Figure 4. Schematic diagram of the spectral curve collection work.
Figure 5. Workflow of planting structure extraction.
Figure 6. Image segmentation results of the study areas: (A) study area 1; (B) study area 2; (C) study area 3.
Figure 7. Classification results of the OB-RF (object-oriented random forest) model: (A) study area 1; (B) study area 2; (C) study area 3.
Figure 8. Classification results of the OB-SVM (object-oriented support vector machine) model: (A) study area 1; (B) study area 2; (C) study area 3.
Figure 9. Classification error maps of the OB-RF and OB-SVM models: (A) OB-RF, study area 1; (B) OB-SVM, study area 1; (C) OB-RF, study area 2; (D) OB-SVM, study area 2; (E) OB-RF, study area 3; (F) OB-SVM, study area 3.
Figure 10. Spectral curves of ground objects.
Figure A1. Relationship between accumulated multispectral bands and overall accuracy.
Figure A2. Relationship between accumulated vegetation indices and overall accuracy.
Figure A3. Relationship between accumulated textural features and overall accuracy.
Abstract
1. Introduction
2. Study Area and Data Preparation
2.1. Overview of the Study Area
2.2. The Collection of UAV Remote Sensing Data
2.3. The Collection of Ground Data
2.3.1. The Ground Distribution Data of Crops
2.3.2. Crop–Ground Spectral Curves
3. Research Procedure and Method
3.1. Sample Selection
3.2. Construction and Screening of Feature Parameters
3.2.1. Construction of Spectral Features and Texture Features
3.2.2. Screening of Characteristic Parameters
3.3. Multiresolution Segmentation
3.4. Classification Methods
3.4.1. RF
3.4.2. SVM
3.5. Classification Accuracy Assessment
4. Results
5. Discussion
5.1. Classification Error Analysis
5.2. Model Performance under Different Planting Structure Complexity
5.3. Classification Potential of UAV Multispectral Remote Sensing Technology under Complex Planting Structures
6. Conclusions
- (1) The classification accuracy of the OB-SVM model in the areas with low-, medium-, and high-complexity planting structures was 1.99%, 4.60%, and 8.22% higher, respectively, than that of the OB-RF model. As planting structure complexity increased, the advantage of the OB-SVM model became more pronounced. This indicates that the OB-SVM model achieves higher classification accuracy under land fragmentation and highly complex planting structures, and is better suited to the fine classification of farmland features in highly complex agricultural planting patterns;
- (2) Based on UAV multispectral remote sensing and the OB-SVM classification model, the overall accuracies of the study areas with low, medium, and high complexity reached 99.13%, 99.08%, and 97.21%, respectively, and the extraction accuracy (F-score) of each crop was at least 92.59%, 94.81%, and 85.65% in the three study areas, respectively. As planting structure complexity increased from low to high, the per-crop classification and extraction accuracies decreased, but the overall accuracy fell by only 1.92%. UAV multispectral remote sensing therefore has vast application potential for the fine classification of farmland features under highly complex planting structures. These conclusions offer a new approach to accurately mapping crop distributions in areas with complex planting structures and, in turn, technical support for safeguarding food security and rationally allocating water resources.
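To make the OB-RF/OB-SVM comparison in conclusion (1) concrete, the following is a minimal sketch in Python with scikit-learn. It assumes object-level feature vectors (segment-wise band means, vegetation indices, and GLCM textures) have already been extracted; the random data, variable names, split ratio, and hyperparameters are illustrative placeholders, not the pipeline or settings used in the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, cohen_kappa_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder object-level data: rows = image segments, columns =
# features (band means, vegetation indices, GLCM textures).
rng = np.random.default_rng(0)
X = rng.random((300, 16))
y = rng.integers(0, 4, 300)  # placeholder crop labels

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

models = {
    "OB-RF": RandomForestClassifier(n_estimators=500, random_state=0),
    # SVMs are scale-sensitive, so standardize features first.
    "OB-SVM": make_pipeline(StandardScaler(),
                            SVC(kernel="rbf", C=10, gamma="scale")),
}
for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    print(f"{name}: OA = {accuracy_score(y_te, pred):.4f}, "
          f"Kappa = {cohen_kappa_score(y_te, pred):.4f}")
```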
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A
Multispectral Bands | B1 | B2 | B3 | B4 | B5 |
---|---|---|---|---|---|
SA1 | Red band | Blue band | NIR | Green band | Red edge |
SA2 | NIR | Red edge | Green band | Red band | Blue band |
SA3 | Blue band | NIR | Red band | Red edge | Green band |
Vegetation Indices | B1 | B2 | B3 | B4 | B5 | B6 | B7 |
---|---|---|---|---|---|---|---|
SA1 | VDVI | ExG | DVI | RVI | NGBDI | NDVI | NGRDI |
SA2 | NGBDI | ExG | DVI | RVI | NGRDI | NDVI | VDVI |
SA3 | NGBDI | RVI | NDVI | ExG | DVI | VDVI | NGRDI |
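The ranking tables above (and the texture ranking that follows) list candidate features in descending order for each study area (SA1-SA3 denote study areas 1-3; B1 is the first-ranked feature), and pair with the accumulation curves in Figures A1-A3. Below is a hedged sketch of how such curves can be produced, assuming the ranking comes from random forest feature importance; the paper's exact screening criterion may differ, and the data here are placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Placeholder feature matrix and labels; in the study these would be
# the candidate bands/indices/textures and the crop classes.
rng = np.random.default_rng(1)
X = rng.random((300, 12))
y = rng.integers(0, 4, 300)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Rank features once by importance, then grow the subset one feature
# at a time and record overall accuracy (the shape of Figures A1-A3).
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
order = np.argsort(rf.feature_importances_)[::-1]

for k in range(1, len(order) + 1):
    cols = order[:k]
    oa = (RandomForestClassifier(n_estimators=200, random_state=0)
          .fit(X_tr[:, cols], y_tr)
          .score(X_te[:, cols], y_te))
    print(f"top-{k} features: OA = {oa:.3f}")
```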
Texture Features | SA1 | SA2 | SA3 |
---|---|---|---|
B1 | Red mean | Red contrast | Red homogeneity |
B2 | Green mean | Blue entropy | Green homogeneity |
B3 | Blue mean | Blue homogeneity | Blue dissimilarity |
B4 | Red correlation | Green dissimilarity | Red correlation |
B5 | Green homogeneity | Red second moment | Red contrast |
B6 | Blue correlation | Near-infrared dissimilarity | Near-infrared contrast |
B7 | Near-infrared homogeneity | Near-infrared correlation | Near-infrared homogeneity |
B8 | Near-infrared entropy | Blue correlation | Blue correlation |
B9 | Red variance | Red dissimilarity | Red variance |
B10 | Green dissimilarity | Red entropy | Green correlation |
B11 | Blue entropy | Red-edge second moment | Near-infrared second moment |
B12 | Green entropy | Green homogeneity | Red-edge second moment |
B13 | Green second moment | Green correlation | Green dissimilarity |
B14 | Red-edge mean | Red-edge dissimilarity | Near-infrared contrast |
B15 | Red-edge homogeneity | Red-edge homogeneity | Red-edge homogeneity |
B16 | Near-infrared variance | Blue dissimilarity | Near-infrared dissimilarity |
B17 | Red contrast | Red variance | Red mean |
B18 | Blue variance | Green entropy | Green second moment |
B19 | Near-infrared contrast | Near-infrared variance | Near-infrared mean |
B20 | Green correlation | Red correlation | Green contrast |
B21 | Green contrast | Blue mean | Green mean |
B22 | Red dissimilarity | Blue second moment | Near-infrared entropy |
B23 | Red-edge variance | Near-infrared mean | Near-infrared correlation |
B24 | Near-infrared second moment | Near-infrared homogeneity | Blue mean |
B25 | Red homogeneity | Red mean | Red entropy |
B26 | Red entropy | Green contrast | Blue entropy |
B27 | Near-infrared dissimilarity | Red variance | Near-infrared variance |
B28 | Red second moment | Green mean | Green variance |
B29 | Green variance | Blue variance | Blue homogeneity |
B30 | Red-edge contrast | Red-edge contrast | Red-edge second moment |
B31 | Red entropy | Red-edge second moment | Red-edge correlation |
B32 | Near-infrared mean | Near-infrared entropy | Blue variance |
B33 | Red dissimilarity | Red homogeneity | Red dissimilarity |
B34 | Red contrast | Blue contrast | Near-infrared variance |
B35 | Near-infrared correlation | Near-infrared contrast | Red-edge homogeneity |
B36 | Blue homogeneity | Green second moment | Green entropy |
B37 | Blue dissimilarity | Green mean | Blue contrast |
B38 | Red-edge second moment | Red-edge correlation | Red-edge dissimilarity |
B39 | Red-edge correlation | Red-edge entropy | Red-edge entropy |
B40 | Blue second moment | Red-edge mean | Blue entropy |
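The texture features listed above are standard grey-level co-occurrence matrix (GLCM) statistics (mean, variance, homogeneity, contrast, dissimilarity, entropy, second moment, correlation). A self-contained sketch computing them for one band with scikit-image follows; the distance, angle, and patch size are illustrative assumptions, not the study's settings.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(band8bit, distance=1, angle=0.0):
    """Standard GLCM statistics for one 8-bit band or image patch."""
    glcm = graycomatrix(band8bit, distances=[distance], angles=[angle],
                        levels=256, symmetric=True, normed=True)
    p = glcm[:, :, 0, 0]          # normalized co-occurrence matrix
    i, _ = np.indices(p.shape)
    mean = np.sum(i * p)          # GLCM mean along the row index
    return {
        "mean": mean,
        "variance": np.sum((i - mean) ** 2 * p),
        "homogeneity": graycoprops(glcm, "homogeneity")[0, 0],
        "contrast": graycoprops(glcm, "contrast")[0, 0],
        "dissimilarity": graycoprops(glcm, "dissimilarity")[0, 0],
        "entropy": -np.sum(p[p > 0] * np.log2(p[p > 0])),
        "second_moment": graycoprops(glcm, "ASM")[0, 0],
        "correlation": graycoprops(glcm, "correlation")[0, 0],
    }

# Toy usage on a random 8-bit patch (a real run would use one segment
# of one UAV band).
patch = (np.random.default_rng(2).random((64, 64)) * 255).astype(np.uint8)
print(glcm_features(patch))
```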
UAV Parameters | Values | Camera Parameters | Values
---|---|---|---
Wheelbase/mm | 900 | Camera model | MicaSense RedEdge-M
Takeoff mass/kg | 4.7–8.2 | Pixels | 1280 × 960
Payload/g | 820 | Number of bands | 5
Endurance time/min | 20 | Wavelength/nm | 400–900
Digital communication distance/km | 3 | Focal length/mm | 5.5
Battery capacity/mAh | 16,000 | Field of view/(°) | 47.2
Cruising speed/(m·s−1) | 5 | |
Date | Air Temperature (°C) | Air Humidity (%) | Illuminance (×10⁴ lux) | Wind Speed (m/s) | PM2.5 (μg/m³) | PM10 (μg/m³)
---|---|---|---|---|---|---
26 July 2020 | 25.43 | 67.08 | 23.28 | 2.20 | 15.00 | 16.50
29 July 2020 | 28.63 | 51.98 | 21.53 | 1.60 | 5.00 | 5.25
1 August 2020 | 28.65 | 58.13 | 23.48 | 1.50 | 13.00 | 14.00
Parameters | Values
---|---
Spectral range | 325–1075 nm
Spectral resolution | 3.5 nm at 700 nm
Sampling interval | 1.6 nm
Integration time | 2^n × 17 ms (n = 0, 1, …, 15)
Wavelength accuracy | ±1 nm
Noise-equivalent radiance | 5.0 × 10⁻⁹ W/cm²/nm/sr at 700 nm
Crops (SA1) | TS | VS | Crops (SA2) | TS | VS | Crops (SA3) | TS | VS
---|---|---|---|---|---|---|---|---
Corn | 35 | 15 | Corn | 33 | 11 | Sunflower | 48 | 17
Sunflower | 38 | 12 | Sunflower | 40 | 15 | Zucchini | 32 | 12
Zucchini | 40 | 17 | Zucchini | 42 | 15 | Hami melon | 12 | 5
Bare land | 20 | 8 | Hami melon | 25 | 8 | Pepper | 21 | 9
 | | | Pepper | 27 | 9 | Sapling | 12 | 6
 | | | Bare land | 18 | 6 | Watermelon | 14 | 6
 | | | | | | Cherry tomato | 23 | 8
 | | | | | | Tomato | 30 | 11
 | | | | | | Bare land | 25 | 9

Note: TS and VS denote the number of training and validation samples, respectively; SA1-SA3 denote study areas 1-3.
Vegetation Indices | Full Name | Formula
---|---|---
NDVI | Normalized difference vegetation index | (NIR − R)/(NIR + R)
RVI | Ratio vegetation index | NIR/R
DVI | Difference vegetation index | NIR − R
ExG | Excess green | 2G − R − B
VDVI | Visible-band difference vegetation index | (2G − R − B)/(2G + R + B)
NGBDI | Normalized green-blue difference index | (G − B)/(G + B)
NGRDI | Normalized green-red difference index | (G − R)/(G + R)
WI | Woebbecke index | (G − B)/(R − G)

Note: R, G, B, and NIR denote reflectance in the red, green, blue, and near-infrared bands; the formulas shown are the standard published definitions of these indices.
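A short sketch computing the listed indices from per-band reflectance arrays; the function and array names are illustrative, and a small epsilon is added to denominators as a numerical guard that is not part of the published definitions.

```python
import numpy as np

def vegetation_indices(R, G, B, NIR, eps=1e-6):
    """Listed indices from per-band reflectance arrays of equal shape.

    eps guards against zero denominators on dark pixels and is not
    part of the published definitions.
    """
    return {
        "NDVI": (NIR - R) / (NIR + R + eps),
        "RVI": NIR / (R + eps),
        "DVI": NIR - R,
        "ExG": 2 * G - R - B,
        "VDVI": (2 * G - R - B) / (2 * G + R + B + eps),
        "NGBDI": (G - B) / (G + B + eps),
        "NGRDI": (G - R) / (G + R + eps),
        "WI": (G - B) / (R - G + eps),
    }

# Toy usage with random reflectance arrays in [0, 1].
rng = np.random.default_rng(3)
R, G, B, NIR = (rng.random((4, 4)) for _ in range(4))
print(vegetation_indices(R, G, B, NIR)["NDVI"])
```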
Study Area | Feature Types | Feature Subset
---|---|---
SA1 | Multispectral bands | Red-band; Blue-band
 | Vegetation indices | VDVI; ExG
 | Texture features | Red-mean; Green-mean; Blue-mean; Red-correlation; Green-homogeneity; Blue-correlation; NIR-correlation
SA2 | Multispectral bands | Red-band; Green-band; Blue-band; NIR; Red-edge
 | Vegetation indices | NGBDI; ExG; RVI
 | Texture features | Blue-contrast; Blue-entropy; Blue-homogeneity; Green-dissimilarity; Red-second-moment; NIR-dissimilarity
SA3 | Multispectral bands | Red-band; Green-band; Blue-band; NIR; Red-edge
 | Vegetation indices | NGBDI; RVI; DVI; NGRDI
 | Texture features | Red-homogeneity; Green-homogeneity; Blue-dissimilarity; Red-correlation
Methods | Accuracy (%) | Zucchini | Corn | Sunflower | Bare Land
---|---|---|---|---|---
OB-RF | PA | 98.31 | 100.00 | 95.19 | 98.01
 | UA | 90.90 | 99.83 | 99.30 | 93.98
 | F | 94.46 | 99.91 | 97.20 | 95.95
 | Overall accuracy = 97.09%, Kappa = 0.95 | | | |
OB-SVM | PA | 99.41 | 99.87 | 99.87 | 87.50
 | UA | 99.37 | 98.99 | 99.10 | 98.32
 | F | 99.39 | 99.43 | 99.48 | 92.59
 | Overall accuracy = 99.13%, Kappa = 0.99 | | | |

Note: PA = producer's accuracy; UA = user's accuracy; F = F-score.
Methods | Accuracy (%) | Zucchini | Corn | Sunflower | Bare Land | Pepper | Hami Melon
---|---|---|---|---|---|---|---
OB-RF | PA | 84.37 | 99.62 | 99.53 | 92.63 | 99.12 | 86.11
 | UA | 99.68 | 98.93 | 99.94 | 98.43 | 73.85 | 67.45
 | F | 91.39 | 99.27 | 99.73 | 95.44 | 84.86 | 75.65
 | Overall accuracy = 92.61%, Kappa = 0.90 | | | | | |
OB-SVM | PA | 99.74 | 99.69 | 99.57 | 91.07 | 96.48 | 99.51
 | UA | 99.40 | 97.61 | 99.41 | 98.87 | 98.53 | 99.29
 | F | 99.57 | 98.64 | 99.49 | 94.81 | 97.49 | 99.40
 | Overall accuracy = 99.08%, Kappa = 0.99 | | | | | |
Methods | Accuracy (%) | Zucchini | Sunflower | Bare Land | Pepper | Hami Melon | Watermelon | Tomato | Cherry Tomato | Sapling
---|---|---|---|---|---|---|---|---|---|---
OB-RF | PA | 93.33 | 90.55 | 91.77 | 89.01 | 98.35 | 100.00 | 99.94 | 72.04 | 26.67
 | UA | 88.12 | 95.39 | 89.32 | 79.41 | 80.32 | 88.24 | 85.69 | 98.85 | 72.73
 | F | 90.65 | 92.91 | 90.53 | 83.94 | 88.43 | 93.75 | 92.27 | 83.34 | 39.03
 | Overall accuracy = 88.99%, Kappa = 0.86 | | | | | | | | |
OB-SVM | PA | 98.87 | 99.15 | 92.07 | 93.59 | 98.35 | 100.00 | 99.94 | 91.84 | 84.22
 | UA | 99.98 | 99.68 | 94.08 | 94.19 | 80.32 | 98.47 | 91.51 | 98.82 | 87.13
 | F | 99.42 | 99.41 | 93.06 | 93.89 | 88.43 | 99.23 | 95.54 | 95.20 | 85.65
 | Overall accuracy = 97.21%, Kappa = 0.97 | | | | | | | | |
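For reference, a minimal sketch of how the reported metrics (PA, UA, F, overall accuracy, and Kappa) follow from a confusion matrix; the example matrix is a placeholder, and the row/column convention is an assumption.

```python
import numpy as np

def accuracy_report(cm):
    """PA, UA, F, overall accuracy, and Cohen's kappa from a confusion
    matrix with rows = reference classes, columns = predicted classes
    (an assumed convention)."""
    cm = np.asarray(cm, dtype=float)
    total = cm.sum()
    pa = np.diag(cm) / cm.sum(axis=1)   # producer's accuracy (recall)
    ua = np.diag(cm) / cm.sum(axis=0)   # user's accuracy (precision)
    f = 2 * pa * ua / (pa + ua)         # per-class F-score
    oa = np.trace(cm) / total           # overall accuracy
    pe = np.sum(cm.sum(axis=0) * cm.sum(axis=1)) / total ** 2
    kappa = (oa - pe) / (1 - pe)        # Cohen's kappa
    return pa, ua, f, oa, kappa

# Toy 3-class confusion matrix, for illustration only.
cm = [[95, 3, 2],
      [4, 90, 6],
      [1, 5, 94]]
pa, ua, f, oa, kappa = accuracy_report(cm)
print(f"OA = {oa:.4f}, Kappa = {kappa:.4f}")
```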
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).