Forest Conservation with Deep Learning: A Deeper Understanding of Human Geography around the Betampona Nature Reserve, Madagascar
"> Figure 1
<p>(<b>a</b>) WorldView-3 0.3 m pansharpened RGB imagery with minimal cloud cover, over the study area—the Betampona Nature Reserve (BNR) (white boundary polygon), located on the (<b>b</b>) eastern coast of Madagascar, which is (<b>c</b>) an island to the southeast of Africa.</p> "> Figure 2
<p>In situ images of a few classes being mapped, collected during the ground reference data collection in 2018. Additional classes include invasive guava, residential, and open water.</p> "> Figure 3
<p>U-net architecture with the encoding (<b>left</b>) and decoding (<b>right</b>) sections that produce the characteristic U-shape of the architecture, producing pixel-wise classification of the input imagery. ResNet layers forms the encoder.</p> "> Figure 4
<p>Network architecture implemented for the Deep Neural Network (DNN) model along with the number of neurons that were optimized for each hidden layer. The output layer contains 11 neurons, corresponding to the number of classes to be classified.</p> "> Figure 5
<p><span class="html-italic">F</span>1 score (Equation (6)), for all classes and models, showing the superiority of the FCNN U-Net model.</p> "> Figure 6
<p>Comparison of confusion matrixes for (<b>a</b>) DNN where SWIR bands were included, (<b>b</b>) DNN where SWIR bands were excluded, (<b>c</b>) U-Net where SWIR bands were included, and (<b>d</b>) U-Net where SWIR bands were excluded. A greater number of <span class="html-italic">FP</span>s and <span class="html-italic">FN</span>s are seen in (<b>b</b>,<b>d</b>) because of the removal of the SWIR bands. The DNN and U-Net models are chosen for comparison because of the highest and lowest reduction in accuracy due to the removal of SWIR.</p> "> Figure 7
<p><span class="html-italic">F</span>1 scores (Equation (6)) for the U-Net and DNN models including (16 bands) and excluding (8 bands) SWIR bands across all classes. The U-Net and DNN models trained on 16 bands show a higher <span class="html-italic">F</span>1 score compared to the models trained on eight bands.</p> "> Figure 8
<p>Detailed and highly accurate classification map of the BNR and surrounding areas, created using WorldView-3 imagery and post-classification editing, showing the distribution of 11 classes.</p> "> Figure 9
<p>(<b>a</b>) The 2010 classification map and (<b>b</b>) the associated classification map created in 2019. The use of identical classes enabled the quantification of land cover change over time.</p> "> Figure 10
<p>Residential areas in 2019 (red boundary polygon) in the study area (extent shown in inset map) overlain over the 2010 classification map showing the increase in residential area extent and the emergence of new hamlets in 2019 for (<b>a</b>) a location east of the BNR and (<b>b</b>) northwest of the BNR.</p> "> Figure 11
<p>An increase in tree cover in residential areas in hectares from 2010 (gray diagonal bar) to 2019 (black solid bar) is attributed to maturing native trees and successful agroforestry efforts.</p> "> Figure 12
<p>Percentage of classes in 2010 that were converted to Shrubland in 2019, within the study area based on the classification map extent seen in <a href="#remotesensing-13-03495-f009" class="html-fig">Figure 9</a>. This information is useful for awareness-raising and conservation efforts.</p> "> Figure 13
<p>Evergreen Forest areas in 2010 (green boundary polygons) that were converted to Shrubland in 2019 located east of the BNR, overlaid over the 2019 classification map. The inset map shows the 2019 classification map.</p> "> Figure 14
<p>Percentage of classes in 2010 that were converted to Mixed Forest in 2019 within the BNR, shows a higher conversion of invasive plants—Molucca Raspberry, Madagascar Cardamom, and Guava—to Mixed Forest compared to Evergreen Forest.</p> "> Figure 15
<p>Invasive plant species within the BNR and the land cover types seen 100 m within the BNR boundary (gray line) and within the ZOP (100 m extending from the BNR boundary). Forest covers are observed along with a few scattered agricultural fields within the 200 m wide region on two sides of the boundary. Note that wider streams represented by the Open Water class are located further south of this map extent, which is not visible here. The Open Water and Streams classes should be treated as the same class.</p> "> Figure 16
<p>Classification map created using the SVM model, with an 88.6% overall accuracy, zoomed into a location on the western section of the BNR that shows the salt-and-pepper effect.</p> "> Figure 17
<p>Reflectance spectra of all classes, showing the overlapping reflectance spectra for vegetation classes—for example between Evergreen Forest, Grassland, and Mixed Forest.</p> "> Figure 18
<p><span class="html-italic">F</span>1 scores of all classification methods employed including (16 bands) and excluding SWIR bands (8 bands), showing the percent change due to the removal of SWIR bands for (<b>a</b>) Guava, (<b>b</b>) Madagascar Cardamom, (<b>c</b>) Evergreen Forest, (<b>d</b>) Mixed Forest, (<b>e</b>) Molucca Raspberry, (<b>f</b>) Row Crops, (<b>g</b>) Residential, (<b>h</b>) Fallow, (<b>i</b>) Shrubland, and (<b>j</b>) Grassland.</p> "> Figure 19
<p>Percent change of (<b>a</b>) Evergreen Forest and (<b>b</b>) Agriculture—including Row Crops and Fallow—over the BNR overlain over a DEM, showing areas of increased (red) and decreased (blue) extents based on a 10 m × 10 m grid cell. The BNR boundary is shown via the black line and the ZOP and 100 m region within the BNR used for change analysis is displayed through the gray polygon.</p> ">
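Several of the captions above cite Equation (6) for the F1 score without reproducing it, and the equation itself is not included in this excerpt. Assuming the standard per-class definition used in remote sensing accuracy assessment, where user's accuracy (UA) plays the role of precision and producer's accuracy (PA) the role of recall, it reads:

$$ F1 = \frac{2 \times \mathrm{UA} \times \mathrm{PA}}{\mathrm{UA} + \mathrm{PA}} $$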
Abstract
1. Introduction
2. Materials and Methods
2.1. Study Area
2.2. Data
2.2.1. Ground Truth Data Collection
2.2.2. Imagery Data
2.2.3. Training Samples
2.3. U-Net
2.4. Support Vector Machine (SVM)
2.5. Random Forest (RF)
2.6. Deep Neural Network
2.7. Accuracy Assessment
2.8. Land Cover and Land Use Change
3. Results
3.1. Classification Results
3.2. SWIR Bands
3.3. Spatial Distribution of LCLU
3.4. Land Cover and Land Use Change Analysis
3.4.1. Change in the Study Area
3.4.2. Change within the BNR
3.4.3. Change within the Zone of Protection (ZOP)
4. Discussion
4.1. Land Cover and Land Use Classification
4.2. Contribution of SWIR Bands
4.3. Accuracy of Classification Maps
4.4. Conservation Efforts in the BNR
4.5. Human Geography Implications
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
Classes | Description |
---|---|
Mixed Forest | Young and mixed forest or degraded forest |
Evergreen Forest | Pristine and degraded forest |
Residential | Built-up areas |
Molucca Raspberry | Invasive plant species |
Row Crops | Agricultural land |
Fallow | Area cleared prior to planting |
Shrubland | Shrubs dominating the land cover |
Open Water | Water bodies within the study area, i.e., rivers, streams |
Madagascar Cardamom | Plant species invasive to Betampona region |
Grassland | Grasses dominating the land cover |
Guava | Invasive plant species |
Accuracy Metric | Classification Class | U-Net | SVM | RF | DNN
---|---|---|---|---|---
Producer's Accuracy | Mixed Forest | 91.5 | 67.5 | 64.5 | 70.2
 | Evergreen Forest | 78.6 | 69.9 | 59.4 | 68.8
 | Residential | 100.0 | 97.8 | 98.4 | 88.2
 | Molucca Raspberry | 95.4 | 96.7 | 99.4 | 99.4
 | Row Crops | 96.9 | 97.6 | 90.3 | 92.8
 | Fallow | 100.0 | 95.4 | 91.0 | 99.0
 | Shrubland | 79.8 | 79.2 | 76.2 | 72.0
 | Open Water | 100.0 | 100.0 | 100.0 | 100.0
 | Madagascar Cardamom | 81.6 | 90.0 | 80.0 | 93.3
 | Grassland | 81.6 | 88.7 | 85.0 | 81.7
 | Guava | 89.8 | 82.0 | 80.6 | 81.0
User's Accuracy | Mixed Forest | 82.2 | 75.8 | 70.6 | 72.3
 | Evergreen Forest | 85.1 | 75.1 | 64.1 | 68.2
 | Residential | 99.9 | 95.1 | 89.8 | 99.0
 | Molucca Raspberry | 95.5 | 96.0 | 96.3 | 99.8
 | Row Crops | 100.0 | 98.9 | 95.4 | 94.2
 | Fallow | 100.0 | 98.2 | 98.7 | 91.0
 | Shrubland | 70.8 | 75.0 | 67.7 | 80.7
 | Open Water | 100.0 | 100.0 | 100.0 | 100.0
 | Madagascar Cardamom | 84.5 | 81.8 | 77.5 | 76.9
 | Grassland | 97.3 | 93.9 | 93.7 | 93.2
 | Guava | 81.3 | 72.9 | 71.2 | 73.4
Overall Accuracy (%) | | 90.9 | 88.6 | 84.8 | 86.6
Kappa Coefficient | | 0.901 | 0.875 | 0.835 | 0.854
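The table above reports per-class producer's and user's accuracy together with overall accuracy and the kappa coefficient. The following minimal Python sketch (illustrative only, not the authors' code; the function name and the row/column convention are assumptions) derives these metrics, plus the F1 score of Equation (6), from a confusion matrix whose rows are reference classes and whose columns are predicted classes:

```python
import numpy as np

def accuracy_metrics(cm: np.ndarray):
    """Producer's/user's accuracy, overall accuracy, kappa, and per-class F1.

    cm[i, j] = number of samples with reference class i and predicted class j.
    """
    cm = cm.astype(float)
    total = cm.sum()
    diag = np.diag(cm)
    producers = diag / cm.sum(axis=1)   # recall of each reference class
    users = diag / cm.sum(axis=0)       # precision of each predicted class
    overall = diag.sum() / total
    # Chance agreement for Cohen's kappa, from the row and column marginals.
    expected = (cm.sum(axis=1) * cm.sum(axis=0)).sum() / total ** 2
    kappa = (overall - expected) / (1.0 - expected)
    f1 = 2 * producers * users / (producers + users)  # Equation (6), per class
    return producers, users, overall, kappa, f1

# Toy 2-class example; the matrices compared in Figure 6 are 11 x 11.
cm = np.array([[90, 10],
               [ 5, 95]])
print(accuracy_metrics(cm))
```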
Accuracy Metric | U-Net | SVM | RF | DNN |
---|---|---|---|---|
Overall Accuracy (%) (16 bands) | 90.9 | 88.6 | 84.8 | 86.6 |
Overall Accuracy (%) (8 bands) | 89.2 | 83.8 | 82.1 | 79.2 |
Percent Change (%) | −1.87 | −5.42 | −3.18 | −8.55
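As a quick consistency check on the last row, each entry is the relative change between the 8-band and 16-band overall accuracies; for the DNN, for example,

$$ \frac{79.2 - 86.6}{86.6} \times 100 \approx -8.55\% . $$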
Classes | 2019 (%) | 2019 (ha) |
---|---|---|
Mixed Forest | 19.74 | 32,990.21 |
Evergreen Forest | 13.22 | 22,099.87 |
Residential | 0.38 | 650.62 |
Molucca Raspberry | 1.31 | 2190.60 |
Row Crops | 3.08 | 5147.37 |
Fallow | 4.97 | 8313.04 |
Shrubland | 48.55 | 81,148.99 |
Open Water | 1.39 | 2327.05 |
Madagascar Cardamom | 2.02 | 3386.98 |
Grassland | 3.81 | 6369.07 |
Guava | 1.49 | 2490.60 |
Classification Category | 2010 (%) | 2019 (%) | Percent Change (%) |
---|---|---|---|
Mixed Forest | 18.15 | 19.04 | 4.87 |
Evergreen Forest | 21.52 | 21.85 | 1.54 |
Residential | 0.23 | 0.33 | 44.31 |
Molucca Raspberry | 3.27 | 1.45 | −55.69 |
Row Crops | 6.43 | 2.82 | −56.13 |
Fallow | 6.03 | 4.16 | −31.01 |
Shrubland | 31.11 | 42.69 | 37.22 |
Open Water | 0.62 | 0.95 | 53.57 |
Madagascar Cardamom | 0.86 | 1.26 | 47.49 |
Grassland | 9.80 | 3.59 | −63.39 |
Guava | 1.98 | 1.86 | −6.10 |
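The 2010 versus 2019 comparison above rests on class-wise percent change computed from two co-registered classification maps that share the same class codes. A minimal sketch of that computation is shown below (illustrative only, not the authors' workflow; the function and variable names are assumptions, and classes absent in 2010 would need special handling):

```python
import numpy as np

def classwise_percent_change(map_2010: np.ndarray, map_2019: np.ndarray, class_ids):
    """Percent change in the areal share of each class between two label rasters."""
    change = {}
    for c in class_ids:
        share_2010 = (map_2010 == c).mean() * 100.0  # percent of pixels in 2010
        share_2019 = (map_2019 == c).mean() * 100.0  # percent of pixels in 2019
        change[c] = (share_2019 - share_2010) / share_2010 * 100.0
    return change

# Toy example with class codes 1 (e.g., Shrubland) and 2 (e.g., Grassland).
m10 = np.array([[1, 1, 2], [2, 2, 1]])
m19 = np.array([[1, 1, 1], [2, 1, 1]])
print(classwise_percent_change(m10, m19, class_ids=[1, 2]))
```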
Forest Cover Class | 2010 (%) | 2019 (%) | Percent Change (%) |
---|---|---|---|
Mixed Forest | 10.3 | 11.5 | 12.5 |
Evergreen Forest | 81.0 | 81.6 | 0.7 |
Molucca Raspberry | 1.8 | 1.3 | −28.4 |
Madagascar Cardamom | 2.5 | 2.2 | −12.8 |
Guava | 3.1 | 3.1 | −0.3 |
Classes | 2010 (%) | 2019 (%) | Percent Change (%) |
---|---|---|---|
Mixed Forest | 16.55 | 21.23 | 28.36 |
Evergreen Forest | 16.82 | 26.79 | 59.43 |
Residential | 0.15 | 0.12 | −17.91 |
Molucca Raspberry | 8.42 | 2.70 | −67.93 |
Row Crops | 11.27 | 2.69 | −76.13 |
Fallow | 1.51 | 1.43 | −4.93 |
Shrubland | 25.63 | 30.23 | 18.04 |
Open Water | 0.07 | 0.36 | 430.64 |
Madagascar Cardamom | 5.26 | 5.26 | 0.06 |
Grassland | 10.03 | 1.99 | −80.13 |
Guava | 4.29 | 7.21 | 68.09 |
Model Name | Accuracy (%) | Kappa Coeff. |
---|---|---|
SVM | 66.5 | 0.650 |
U-Net | 77.3 | 0.655 |
Publisher's Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).