Deep Fully Convolutional Networks for Cadastral Boundary Detection from UAV Images
Figure 1. Study areas. Two case study sites in Rwanda were selected, namely Busogo and Muhoza, representing a sub-urban and an urban setting, respectively.
Figure 2. The Unmanned Aerial Vehicle (UAV) images and boundary references for the selected tiles. TR1, TR2, TR3 and TS1 are tiles from Busogo; TR4, TR5, TR6 and TS2 are tiles from Muhoza. For each area, the first three tiles were used for training and the last one for testing the algorithms. The boundary references in TR1–TR6 are shown as yellow lines. In TS1 and TS2, the boundary references are separated into visible (green lines) and invisible (red lines).
Figure 3. Architecture of the proposed FCN.
Figure 4. Learning curves of the FCNs in Busogo (left) and Muhoza (right).
Figure 5. Reference and classification maps obtained by the investigated techniques. The visible boundary references are the green lines; the invisible ones are the red lines; and the detected boundaries are the yellow lines.
Figure 6. Error maps of the investigated techniques. Yellow lines are TP; red lines are FP; and green lines are FN.
Abstract
1. Introduction
2. Study Area
3. Materials and Methods
3.1. Data Preparation
3.2. Boundary Detection
3.2.1. Fully Convolutional Networks
3.2.2. Globalized Probability of Boundary (gPb)
3.2.3. Multi-Resolution Segmentation (MRS)
3.3. Accuracy Assessment
4. Results
5. Discussion
6. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
| Object class | Visible Cadastral Boundary |
|---|---|
| Input data | 0.1 m × 0.1 m UAV image |
| Reference frame | Coordinate system: WGS 1984 UTM zone 35S; Projection: Transverse Mercator; False easting: 500,000; False northing: 10,000,000; Central meridian: 27; Scale factor: 0.9996; Latitude of origin: 0.000; Units: meter |
| Definition | A visible cadastral boundary is a line of geographical features representing the limits of an entity considered to be a single area under homogeneous real property rights and unique ownership. |
| Identifying visible cadastral boundaries | (a) Strip of stone; (b) Water drainage; (c) Road ridges; (d) Fences; (e) Textural pattern transition; (f) Edge of rooftop |
| Extraction | (illustrative image examples; not reproduced here) |
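The reference frame listed above matches the standard WGS 84 / UTM zone 35S definition (EPSG:32735). As a small illustration only, not part of the original workflow and assuming the pyproj library is available, a longitude/latitude pair near the study area can be projected into this frame as follows (the sample coordinates are approximate and purely for demonstration):

```python
# Minimal sketch: reproject a WGS 84 lon/lat point into the reference frame
# listed above (WGS 1984 UTM zone 35S, i.e. EPSG:32735). Requires pyproj.
from pyproj import Transformer

# always_xy=True -> coordinates are passed as (longitude, latitude)
transformer = Transformer.from_crs("EPSG:4326", "EPSG:32735", always_xy=True)

# Illustrative point near Musanze, Rwanda (approximate, for demonstration only)
lon, lat = 29.63, -1.50
easting, northing = transformer.transform(lon, lat)
print(f"E = {easting:.1f} m, N = {northing:.1f} m")
```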
| | Positive Prediction | Negative Prediction |
|---|---|---|
| Positive Class | True Positive (TP) | False Negative (FN) |
| Negative Class | False Positive (FP) | True Negative (TN) |
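The precision (P), recall (R) and F-score (F) reported in the results table that follows are derived from these counts as P = TP/(TP + FP), R = TP/(TP + FN) and F = 2PR/(P + R). The snippet below is a minimal plain-Python sketch of that computation; the example counts are illustrative and not taken from the paper:

```python
# Minimal sketch: precision, recall and F-score from TP, FP, FN counts.
def precision_recall_f(tp: int, fp: int, fn: int):
    """Return (precision, recall, F-score); zero when a ratio is undefined."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f_score = (2 * precision * recall / (precision + recall)
               if (precision + recall) else 0.0)
    return precision, recall, f_score

# Illustrative counts that reproduce P = 0.75, R ~ 0.65, F ~ 0.70
# (cf. the FCN "visible" row for TS1 in the table below).
print(precision_recall_f(tp=75, fp=25, fn=40))
```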
| Algorithm | Reference | TS1: P | TS1: R | TS1: F | TS2: P | TS2: R | TS2: F |
|---|---|---|---|---|---|---|---|
| FCN | visible | 0.75 | 0.65 | 0.70 | 0.74 | 0.45 | 0.56 |
| FCN | invisible | 0.06 | 0.07 | 0.06 | 0.06 | 0.09 | 0.07 |
| FCN | all | 0.78 | 0.39 | 0.52 | 0.79 | 0.35 | 0.48 |
| gPb-owt-ucm | visible | 0.21 | 0.87 | 0.34 | 0.23 | 0.93 | 0.37 |
| gPb-owt-ucm | invisible | 0.03 | 0.19 | 0.06 | 0.04 | 0.39 | 0.07 |
| gPb-owt-ucm | all | 0.24 | 0.57 | 0.33 | 0.26 | 0.78 | 0.39 |
| MRS | visible | 0.19 | 0.82 | 0.31 | 0.18 | 0.90 | 0.30 |
| MRS | invisible | 0.05 | 0.27 | 0.08 | 0.04 | 0.56 | 0.08 |
| MRS | all | 0.23 | 0.57 | 0.33 | 0.22 | 0.80 | 0.35 |
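The TP/FP/FN categories shown in the error maps (Figure 6) compare thin detected and reference boundary lines, which in practice requires some spatial tolerance. The sketch below assumes a pixel-based comparison in which both line maps are rasterized and a small dilation buffer provides that tolerance; the buffer size and the exact matching rule used by the authors are not specified here, so this is an illustrative assumption rather than their implementation.

```python
# Sketch of a tolerance-based comparison between rasterized boundary maps.
# `detected` and `reference` are boolean 2-D arrays (True on boundary pixels).
# The tolerance (in pixels) is an assumed parameter, not taken from the paper.
import numpy as np
from scipy.ndimage import binary_dilation

def boundary_counts(detected: np.ndarray, reference: np.ndarray, tol: int = 3):
    """Count TP, FP, FN using a `tol`-pixel tolerance buffer around the lines."""
    structure = np.ones((2 * tol + 1, 2 * tol + 1), dtype=bool)
    ref_buffer = binary_dilation(reference, structure=structure)
    det_buffer = binary_dilation(detected, structure=structure)
    tp = np.count_nonzero(detected & ref_buffer)    # detections near a reference line
    fp = np.count_nonzero(detected & ~ref_buffer)   # detections far from any reference
    fn = np.count_nonzero(reference & ~det_buffer)  # reference pixels with no nearby detection
    return tp, fp, fn
```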
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Xia, X.; Persello, C.; Koeva, M. Deep Fully Convolutional Networks for Cadastral Boundary Detection from UAV Images. Remote Sens. 2019, 11, 1725. https://doi.org/10.3390/rs11141725