High-Quality UAV-Based Orthophotos for Cadastral Mapping: Guidance for Optimal Flight Configurations
"> Figure 1
<p>Conceptual framework.</p> "> Figure 2
<p>Overview of all datasets presented as orthomosaics (<b>a</b>) Bentelo, (<b>b</b>) Gerleve, (<b>c</b>) Mukingo, (<b>d</b>) Kajiado, (<b>e</b>) Kibonde, and (<b>f</b>) Muhoza (scales vary).</p> "> Figure 3
<p>Distribution of GCPs for experimental assessment of the spatial accuracy.</p> "> Figure 4
<p>Workflow to define reference lines and a search mask for lines representing concrete walls and rooftops.</p> "> Figure 5
<p>Workflow to compute and select MCG lines representing rooftops or walls and analytical tools to describe geometric characteristics of selected MCG lines.</p> "> Figure 6
<p>Standardized values of automatic tie-points using SIFT, AKAZE, and SURF as feature extraction, detection, and matching algorithm. The mean number of automatic tie-points per algorithm and land use class is reflected as bars. The <span class="html-italic">x</span>-axis represents land use classes as defined in <a href="#remotesensing-12-03625-t003" class="html-table">Table 3</a>.</p> "> Figure 7
<p>RMSE of checkpoint residuals measured in the DSM (vertical) and orthophoto (horizontal).</p> "> Figure 8
<p>Selected reference lines representing rooftops (green) and walls (red) for the area of interest in Kibonde.</p> "> Figure 9
<p>Example showing the differences of automatically extracted rooftops and walls separated according to flight configuration (forward overlap (%), side lap (%)) and flight pattern (CF = cross flight, noCF = no cross flight).</p> "> Figure 10
<p>Box-whisker plot of point distances to reference lines separated according to the reference wall and rooftop. Box represents the interquartile range (IQR) with the median; whisker represent 1.5 IQR, points represent outliers. <span class="html-italic">x</span>-axes label refers to flight parameter, e.g., 5060CF means 50% forward overlap, 60% side lap and cross-flight (CF) pattern. Distances reflect the length of perpendicular lines from points to reference lines. Points were created every 10 cm from a line geometry that was derived by feature extraction with the MCG algorithm.</p> "> Figure 11
<p>Scatterplot of error metrics for delineated rooftops and walls of orthophotos captured with different flight configurations. Absolute accuracy of the orthophoto is given on the <span class="html-italic">y</span>-axis with the RMSE of horizontal checkpoint residuals. Relative accuracy is shown on the <span class="html-italic">x</span>-axis displayed by the RMSE of point distances to reference lines. Note that both axes have different scales.</p> "> Figure 12
<p>Box-whisker plot of distances to reference lines separated according to the direction of walls (parallel or perpendicular to the flight direction). Box represents the interquartile range (IQR) with the median; whisker represent 1.5 IQR, points represent outliers.</p> ">
Abstract
1. Introduction
2. Materials and Methods
2.1. UAV and GNSS Data Collection
2.2. Estimating the Impact of Land Cover on the Number of Automatic Tie Points
2.3. Estimating the Impact of the Number of GCPs on the Final Geometric Accuracy
2.4. Estimating the Impact of Different Flight Plans on the Characteristics of Extracted Cadastral Features
3. Results
3.1. Image Matching: Image Correspondences
3.2. Absolute Accuracy: Checkpoint Residuals in DSM and Orthophotos
3.3. Relative Accuracy: Characteristics of Automatically Extracted Cadastral Features
4. Discussion
5. Conclusions
- Land use has a significant impact on the generation of tie points. Image scenes with a high percentage of vegetated areas, especially trees or forest, require image overlap settings of at least 80–90% to establish sufficient image correspondences (a minimal tie-point matching sketch follows this list).
- Independent of the size of the study area, the error level of planimetric and vertical residuals remains steady once seven equally distributed GCPs are used (following the scheme presented in Figure 3), given at least 70% forward overlap and 70% side lap. As the absolute accuracy does not improve significantly with additional GCPs, seven GCPs can be recommended as the optimal survey design.
- The quality of reconstructed thin cadastral objects, exemplified here by concrete walls, is highly sensitive to the flight configuration. A large image overlap, as well as a cross-flight pattern, has proven to enhance the reliability of the generated orthophoto, as quantified by the increased accuracy and completeness of automatically delineated walls (see the relative-accuracy sketch after this list). In contrast, the delineation results for rooftops showed less sensitivity to the flight configuration.
- Even though checkpoint residuals may indicate high absolute accuracy of an orthophoto, the reliability of reconstructed scene objects can vary, particularly in adverse conditions with large variations in the height component. We therefore recommend measuring checkpoint residuals in the generated orthophoto in addition to assessing them after the bundle block adjustment (BBA).
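To illustrate how such image correspondences can be established, the following minimal sketch matches SIFT features between two overlapping UAV frames with OpenCV and applies Lowe's ratio test. The file names and the ratio threshold are illustrative assumptions and do not reproduce the study's processing pipeline.

```python
# Minimal sketch: count candidate tie points between two overlapping UAV images
# with SIFT and brute-force matching (requires opencv-python >= 4.4).
# "frame_a.jpg" and "frame_b.jpg" are hypothetical file names, not study data.
import cv2

img_a = cv2.imread("frame_a.jpg", cv2.IMREAD_GRAYSCALE)
img_b = cv2.imread("frame_b.jpg", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp_a, des_a = sift.detectAndCompute(img_a, None)
kp_b, des_b = sift.detectAndCompute(img_b, None)

# Lowe's ratio test keeps only unambiguous correspondences.
matcher = cv2.BFMatcher(cv2.NORM_L2)
matches = matcher.knnMatch(des_a, des_b, k=2)
tie_points = [m for m, n in matches if m.distance < 0.7 * n.distance]

print(f"Candidate tie points for this image pair: {len(tie_points)}")
```

Repeating such pair-wise counts for image pairs with increasing overlap would reproduce the qualitative trend stated above: vegetated scenes need considerably more overlap before enough correspondences survive the ratio test.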
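The relative-accuracy metric referred to above (perpendicular distances of points sampled every 10 cm along an extracted line to the reference line, summarized as an RMSE) can be sketched as follows with Shapely. The coordinates are hypothetical placeholders, and the aggregation mirrors the error metric conceptually rather than the exact implementation used in the paper.

```python
# Minimal sketch of the relative-accuracy metric, assuming Shapely is installed.
# The line coordinates are hypothetical placeholders, not data from the study.
import math
from shapely.geometry import LineString

reference = LineString([(0.0, 0.0), (25.0, 0.3)])    # surveyed reference wall
extracted = LineString([(0.2, 0.15), (24.8, 0.55)])  # automatically delineated line (e.g., MCG output)

# Sample the extracted line every 0.10 m and take the shortest (perpendicular)
# distance from each sample point to the reference line.
step = 0.10
n_points = int(extracted.length // step) + 1
distances = [extracted.interpolate(i * step).distance(reference) for i in range(n_points)]

rmse = math.sqrt(sum(d * d for d in distances) / len(distances))
print(f"{len(distances)} sample points, RMSE of distances: {rmse:.3f} m")
```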
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
Dataset | Area (km2) | GSD (cm) | UAV | Camera | Sensor Size (mm) | Resolution (MP) |
---|---|---|---|---|---|---|
Muhoza | 0.98 | 2.1 | BirdEyeView FireFLY6 | SONY ILCE-6000 | 13.50 × 15.60 | 24.00 |
Mukingo | 0.50 | 2.2 | DJI Inspire 2 | DJI FC652 | 13.00 × 17.30 | 20.89 |
Kajiado | 8.70 | 5.8 | DJI Phantom 4 | DJI FC330 | 6.20 × 4.65 | 19.96 |
Kibonde | 0.15 | 3.0 | SenseFly eBee Plus | SenseFly S.O.D.A. | 12.70 × 8.50 | 19.96 |
Gerleve | 1.10 | 2.8 | DelairTech DT18 | DT 3Bands | 8.45 × 7.07 | 5.00 |
Bentelo | 0.14 | 2.7 | DJI Phantom 4 | DJI FC330 | 6.20 × 4.65 | 11.94 |
Dataset | GNSS Device | Measured Points | Original Datum | Target Datum |
---|---|---|---|---|
Muhoza | Leica CS10 | 17 | ITRF 2005 | WGS84 UTM35S |
Mukingo | Leica CS10 | 19 | ITRF 2005 | WGS84 UTM35S |
Kajiado | CHC X900+ | 16 | Cassini | WGS84 UTM37S |
Kibonde | Sokkia Stratus | 11 | Arc1960 | WGS84 UTM37S |
Gerleve | Trimble | 22 | ECEF | ETRS89 UTM32N |
Bentelo | Leica GS14 | 18 | Amersfoort | WGS84 UTM32N |
Land Use Class | Definition | Ben | Ger | Kaj | Kib | Muh | Muk |
---|---|---|---|---|---|---|---|
Forest | >70% covered by trees | 4 | 5 | ||||
Agriculture (cropland) | >70% cultivated agricultural fields | 5 | 5 | ||||
Agriculture (grassland or uncovered soil) | >70% bare soil or sparse grass vegetation | 5 | 5 | 5 | 5 | 5 | |
Rural context | <20% structures, a predominance of agricultural activities | 5 | 5 | 5 | |||
Peri-urban context | 20–70% structures | 5 | 5 | 5 | 5 | ||
Urban context | >70% structures, densely populated | 5 | 5 |
Land Use Class | 60% Overlap | 70% Overlap | 80% Overlap | 90% Overlap |
---|---|---|---|---|
Agriculture (not cultivated) | 289 | 666 | 2519 | n/a |
Rural | 83 | 116 | 291 | n/a |
Peri-urban | 18 | 302 | 326 | n/a |
Forest | 0 | 5 | 6 | 50 |
Number of GCPs | Bentelo h/v (GSD) | Gerleve h/v (GSD) | Kajiado h/v (GSD) | Muhoza h/v (GSD) | Mukingo h/v (GSD) |
---|---|---|---|---|---|
0 GCP | 0.39/−1.96 | −0.03/−0.84 | 0.05/−0.45 | 0.29/0.69 | 0.38/0.22 |
1 GCP | −0.11/−0.08 | 0.05/0.73 | −0.09/−0.10 | −0.01/3.79 | −0.31/−0.02 |
2 GCP | 0.37/0.07 | −0.02/−0.11 | −0.24/−0.18 | 0.01/2.72 | −0.28/−0.22 |
3 GCP | 0.05/−0.28 | −0.19/0.20 | 0.18/−0.57 | 0.72/1.07 | 0.24/0.16 |
4 GCP | 0.15/−0.52 | 2.11/−0.72 | 0.23/0.94 | 0.60/−4.12 | −0.27/0.49 |
5 GCP | 0.10/0.01 | 2.17/−1.09 | 0.44/−0.24 | 2.81/0.18 | 0.15/−0.34 |
6 GCP | 0.15/0.05 | −0.15/−0.55 | 0.58/−0.37 | 1.60/0.18 | 0.19/0.16 |
7 GCP | −0.13/−0.35 | −0.05/0.57 | 0.24/−0.70 | 1.42/−0.12 | 0.23/−0.23 |
8 GCP | −0.08/−0.22 | −0.03/1.63 | 0.35/−0.37 | −0.81/0.41 | 0.27/−0.07 |
9 GCP | −0.22/−0.31 | −1.55/−1.02 | 0.29/0.39 | −0.50/−4.89 | 0.11/−0.66 |
10 GCP | −0.11/−0.31 | 0.42/0.68 | 0.17/−0.02 | −0.24/−2.60 | 0.28/−0.05 |
Metric | W/R | 50%/60% no CF | 50%/60% CF | 50%/70% no CF | 50%/70% CF | 50%/80% no CF | 50%/80% CF | 75%/60% no CF | 75%/60% CF | 75%/70% no CF | 75%/70% CF | 75%/80% no CF | 75%/80% CF |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Candidate lines within 0.5 m buffer (count) | W | 144 | 146 | 121 | 133 | 177 | 157 | 158 | 122 | 165 | 133 | 129 | 134 |
 | R | 333 | 273 | 410 | 285 | 505 | 310 | 402 | 271 | 366 | 295 | 444 | 369 |
Selected line segments (count) | W | 73 | 42 | 76 | 63 | 67 | 50 | 78 | 44 | 75 | 57 | 54 | 40 |
 | R | 209 | 177 | 233 | 180 | 256 | 180 | 243 | 161 | 189 | 168 | 220 | 173 |
Mean length of line segments (m) | W | 3.22 | 4.68 | 3.50 | 3.39 | 3.32 | 4.96 | 3.31 | 3.71 | 3.05 | 3.86 | 4.37 | 5.47 |
 | R | 3.65 | 4.37 | 3.34 | 4.28 | 3.01 | 4.21 | 3.24 | 4.87 | 4.14 | 4.54 | 4.40 | 4.50 |
Correspondence with reference (%) | W | 71.5 | 85.8 | 79.0 | 90.1 | 82.5 | 87.8 | 88.5 | 83.6 | 93.0 | 87.6 | 91.9 | 93.0 |
 | R | 94.9 | 95.8 | 94.8 | 95.1 | 96.6 | 95.6 | 96.3 | 95.9 | 95.1 | 95.7 | 97.2 | 97.8 |
Sinuosity | W | 1.77 | 1.78 | 1.76 | 1.70 | 1.68 | 1.65 | 1.66 | 1.62 | 1.74 | 1.62 | 1.66 | 1.58 |
 | R | 1.58 | 1.59 | 1.58 | 1.60 | 1.59 | 1.60 | 1.60 | 1.61 | 1.58 | 1.59 | 1.61 | 1.58 |

Image overlap is given as forward overlap/side lap; CF = cross flight, no CF = no cross flight; W = wall, R = rooftop.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Stöcker, C.; Nex, F.; Koeva, M.; Gerke, M. High-Quality UAV-Based Orthophotos for Cadastral Mapping: Guidance for Optimal Flight Configurations. Remote Sens. 2020, 12, 3625. https://doi.org/10.3390/rs12213625