True Orthophoto Generation from Aerial Frame Images and LiDAR Data: An Update
Figure 1. Problem of double-mapped areas within the differential rectification method. DSM: digital surface model.
Figure 2. (a) Part of a perspective image where the red arrows indicate relief displacements along the facades, and (b) the corresponding orthophoto with double-mapped areas enclosed by the red outlines.
Figure 3. Principle of occlusion detection using the Z-buffer method.
Figure 4. Principle of occlusion detection using the height-based method.
Figure 5. Principle of occlusion detection using the height-gradient-based method.
Figure 6. Principle of occlusion detection using the angle-based method.
Figure 7. Conceptual details of occlusion extension within the angle-based method.
Figure 8. Principle of the radial sweep method for angle-based occlusion detection.
Figure 9. Conceptual details of DSM partitioning within the adaptive radial sweep method.
Figure 10. (a) False visibility reported within the adaptive radial sweep method, and (b) considering an overlap between two consecutive radial sections to resolve the reported false visibility.
Figure 11. Occlusion detection result for an image: (a) detected occlusions (orange areas) visualized on an orthophoto, and (b) the respective visibility map representing the visible DSM cells (magenta portions).
Figure 12. (a) Contribution of images to part of a true orthophoto generated using the closest-visible-image criterion, and (b) contribution of the same images in the inverse distance weighting (IDW)-based spectral balancing process.
Figure 13. As proposed by Hinks et al. [42], airborne LiDAR data suitable for urban modeling tasks can be acquired using: (a) two sets of parallel flight lines perpendicular to each other and oriented 45° from the street layout, and (b) a 67% overlap between adjacent strips.
Figure 14. Raster DSMs generated from LiDAR point data for (a) the Trinity College and (b) the Dawson Street study areas. Cell size is 10 cm in both DSMs.
Figure 15. Impact of occlusion extension on the angle-based occlusion detection results: (a) sample results before occlusion extension, and (b) after applying an occlusion extension of 0.1°.
Figure 16. (a) False visibility at the border of two radial sections within the adaptive radial sweep method, and (b) considering an overlap between the radial sections to resolve the false visibility.
Figure 17. Results of occlusion detection using the Z-buffer and angle-based methods: (a) perspective image, (b) differentially rectified orthophoto, (c) detected occlusions using the Z-buffer technique, and (d) detected occlusions using the angle-based method. Pixel size is 10 cm in the orthophotos.
Figure 18. Detected occlusions using the height-based and angle-based techniques: (a) perspective image, (b) differentially rectified orthophoto, (c) detected occlusions using the height-based method, and (d) detected occlusions using the angle-based technique. Pixel size is 10 cm in the orthophotos.
Figure 19. (a) Differentially rectified orthophoto, (b) true orthophoto with occluded areas shown in orange, (c) true orthophoto generated using the closest-visible-image criterion, and (d) contribution of images to the true orthophoto shown in part (c). Pixel size is 10 cm in all images.
Figure 20. (a,c) Two parts of a true orthophoto generated using the closest-visible-image criterion, and (b,d) true orthophotos produced using the IDW-based color balancing approach for the same areas. Pixel size is 10 cm in the orthophotos.
Figure 21. (a) True orthophoto for the Trinity College study area; (b) a closer look at the area enclosed by the red rectangle in part (a); (c) true orthophoto for the Dawson Street study area; and (d) a close-up of the area inside the red rectangle in part (c). Pixel size is 10 cm in the orthophotos.
Figure 22. (a) Part of the true orthophoto for the Trinity College study area, and (b) the digital terrain model (DTM)-based orthophoto for the same area; (c) a portion of the true orthophoto for the Dawson Street study area, and (d) the corresponding DTM-based orthophoto. Pixel size is 10 cm in the orthophotos.
Figure 23. (a) A portion of the LiDAR-based true orthophoto, and (b) the image-matching-based true orthophoto for the same area; (c) another part of the LiDAR-based true orthophoto, and (d) the corresponding image-matching-based true orthophoto. Pixel size is 10 cm in the orthophotos.
Figure 24. (a) Part of the LiDAR-based DSM for the Trinity College study area, and (b) the DSM generated from the image-matching-based point cloud for the same area; (c) another portion of the LiDAR-based DSM, and (d) the corresponding image-matching-based DSM. Cell size is 10 cm in the DSMs.
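The angle-based occlusion detection principle illustrated in Figures 6–8 can be sketched along a single radial DSM profile: sweeping outward from the nadir point, a cell is visible only if the off-nadir angle of the ray from the perspective center to that cell exceeds the largest angle encountered so far. The following is a minimal sketch of that idea, simplified to one 1D profile; the function name and inputs are illustrative assumptions, not the authors' implementation:

```python
import math

def visible_along_profile(heights, dists, pc_height):
    """Angle-based visibility along one radial DSM profile.

    heights[i] is the DSM elevation of the cell at horizontal distance
    dists[i] from the nadir point (dists must increase monotonically);
    pc_height is the elevation of the image's perspective center.
    """
    visible = []
    max_angle = -math.inf
    for h, d in zip(heights, dists):
        # Off-nadir angle of the ray from the perspective center to the cell.
        angle = math.atan2(d, pc_height - h)
        if angle > max_angle:
            visible.append(True)   # cell is seen; it raises the horizon
            max_angle = angle
        else:
            visible.append(False)  # occluded by a closer, taller cell
    return visible
```

In the full (adaptive) radial sweep method, such profiles are traced around the nadir to cover the whole DSM, and a small angular buffer (the occlusion extension of Figures 7 and 15, e.g. 0.1°) can be added to `max_angle` to suppress false visibility at building edges.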
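The IDW-based spectral balancing referenced in Figures 12 and 20 follows Shepard-style inverse distance weighting: instead of taking each pixel only from the closest visible image, the colors of all visible images are blended with weights that fall off with distance from each image's nadir. A minimal per-pixel sketch (the helper below is hypothetical and assumes the per-image nadir distances are already computed):

```python
def idw_blend(colors, nadir_dists, power=2.0):
    """Blend one true-orthophoto pixel from several visible images
    using inverse distance weighting (Shepard-style).

    colors[i] is the (R, G, B) value sampled from image i at the ground
    cell; nadir_dists[i] is the horizontal distance from the cell to
    image i's nadir point, so near-nadir images dominate the blend.
    """
    # Small epsilon guards against division by zero exactly at a nadir.
    weights = [1.0 / (d ** power + 1e-9) for d in nadir_dists]
    total = sum(weights)
    return tuple(
        sum(w * c[band] for w, c in zip(weights, colors)) / total
        for band in range(3)
    )
```

Because the weights vary smoothly across the mosaic, this removes the abrupt seams that the closest-visible-image criterion produces at image boundaries.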
Abstract
1. Introduction
2. Related Work
3. Angle-Based Methodology for True Orthophoto Generation
3.1. Angle-Based Occlusion Detection
3.2. Adaptive Radial Sweep Method
3.3. True Orthophoto Mosaic Generation
4. Experimental Results
4.1. Test Data and Study Areas
4.2. Impact of Modifications to the Angle-Based Method
4.3. Comparison of the Occlusion Detection Techniques
4.4. LiDAR-Based True Orthophotos
4.5. LiDAR-Based versus Image-Matching-Based Products
5. Conclusions
Acknowledgments
Author Contributions
Conflicts of Interest
References
- Hirschmüller, H. Stereo processing by semiglobal matching and mutual information. IEEE Trans. Pattern Anal. Mach. Intell. 2008, 30, 328–341. [Google Scholar] [CrossRef] [PubMed]
- Haala, N.; Rothermel, M. Image-Based 3D Data Capture in Urban Scenarios; Photogrammetric Week ’15; Wichmann/VDE Verlag: Berlin/Offenbach, Germany, 2015; pp. 119–130. [Google Scholar]
- Wehr, A.; Lohr, U. Airborne laser scanning—An introduction and overview. ISPRS J. Photogramm. Remote Sens. 1999, 54, 68–82. [Google Scholar] [CrossRef]
- Vosselman, G.; Maas, H.G. Airborne and Terrestrial Laser Scanning; Whittles Publishing: Dunbeath, UK, 2010. [Google Scholar]
- Rottensteiner, F.; Trinder, J.; Clode, S.; Kubik, K. Using the Dempster-Shafer method for the fusion of LIDAR data and multi-spectral images for building detection. Inf. Fus. 2005, 6, 283–300. [Google Scholar] [CrossRef]
- Kim, C.; Habib, A. Object-based Integration of Photogrammetric and LiDAR Data for Accurate Reconstruction and Visualization of Building Models. Sensors 2009, 9, 5679–5701. [Google Scholar] [CrossRef] [PubMed]
- Khoshelham, K.; Nardinocchi, C.; Frontoni, E.; Mancini, A.; Zingaretti, P. Performance evaluation of automated approaches to building detection in multi-source aerial data. ISPRS J. Photogramm. Remote Sens. 2010, 65, 123–133. [Google Scholar] [CrossRef]
- Kwak, E.; Habib, A. Automatic 3D Building Model Generation From Lidar and Image Data Using Sequential Minimum Bounding Rectangle. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci. 2012. [Google Scholar] [CrossRef]
- Awrangjeb, M.; Ravanbakhsh, M.; Fraser, C.S. Automatic detection of residential buildings using LIDAR data and multispectral imagery. ISPRS J. Photogramm. Remote Sens. 2010, 65, 457–467. [Google Scholar] [CrossRef]
- Awrangjeb, M.; Zhang, C.; Fraser, C.S. Automatic extraction of building roofs using LIDAR data and multispectral imagery. ISPRS J. Photogramm. Remote Sens. 2013, 83, 1–18. [Google Scholar] [CrossRef]
- Gilani, S.A.N.; Awrangjeb, M.; Lu, G. An Automatic Building Extraction and Regularisation Technique Using LiDAR Point Cloud Data and Orthoimage. Remote Sens. 2016, 8, 258. [Google Scholar] [CrossRef]
- Habib, A.F.; Zhai, R.; Kim, C. Generation of Complex Polyhedral Building Models by Integrating Stereo-Aerial Imagery and Lidar Data. Photogramm. Eng. Remote Sens. 2010, 76, 609–623. [Google Scholar] [CrossRef]
- Arefi, H.; Reinartz, P. Building reconstruction using DSM and orthorectified images. Remote Sens. 2013, 5, 1681–1703. [Google Scholar] [CrossRef]
- Guo, L.; Chehata, N.; Mallet, C.; Boukir, S. Relevance of airborne lidar and multispectral image data for urban scene classification using Random Forests. ISPRS J. Photogramm. Remote Sens. 2011, 66, 56–66. [Google Scholar] [CrossRef]
- Gerke, M.; Xiao, J. Fusion of airborne laserscanning point clouds and images for supervised and unsupervised scene classification. ISPRS J. Photogramm. Remote Sens. 2014, 87, 78–92. [Google Scholar] [CrossRef]
- Habib, A.; Jarvis, A.; Kersting, A.P.; Alghamdi, Y. Comparative analysis of georeferencing procedures using various sources of control data. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2008, 37, 1147–1152. [Google Scholar]
- Parmehr, E.G.; Fraser, C.S.; Zhang, C.; Leach, J. Automatic registration of optical imagery with 3d lidar data using local combined mutual information. ISPRS J. Photogramm. Remote Sens. 2014, 88, 28–40. [Google Scholar] [CrossRef]
- Habib, A.F.; Kim, E.M.; Kim, C.J. New Methodologies for True Orthophoto Generation. Photogramm. Eng. Remote Sens. 2007, 73, 25–36. [Google Scholar] [CrossRef]
- Mikhail, E.M.; Bethel, J.S.; McGlone, J.C. Introduction to Modern Photogrammetry; John Wiley & Sons, Inc.: New York, NY, USA, 2001. [Google Scholar]
- Günay, A.; Arefi, H.; Hahn, M. Semi-automatic true orthophoto production by using LIDAR data. In Proceedings of the International Geoscience and Remote Sensing Symposium (IGARSS), Barcelona, Spain, 23–28 July 2007; pp. 2873–2876. [Google Scholar]
- Barazzetti, L.; Brovelli, M.; Scaioni, M. Generation of true-orthophotos with lidar high resolution digital surface models. Photogramm. J. Finl. 2008, 21, 26–36. [Google Scholar]
- Novak, K. Rectification of Digital Imagery. Photogramm. Eng. Remote Sens. 1992, 58, 339–344. [Google Scholar]
- Skarlatos, D. Orthophotograph Production in Urban Areas. Photogramm. Rec. 1999, 16, 643–650. [Google Scholar] [CrossRef]
- Catmull, E.E. A Subdivision Algorithm for Computer Display of Curved Surfaces. Ph.D. Thesis, The University of Utah, Salt Lake City, UT, USA, 1974. [Google Scholar]
- Amhar, F.; Jansa, J.; Ries, C. The generation of true orthophotos using a 3D building model in conjunction with a conventional DTM. Int. Arch. Photogramm. Remote Sens. 1998, 32, 16–22. [Google Scholar]
- Rau, J.; Chen, N.Y.; Chen, L.C. True Orthophoto Generation of Built-Up Areas Using Multi-View Images. Photogramm. Eng. Remote Sens. 2002, 68, 581–588. [Google Scholar]
- Sheng, Y.; Gong, P.; Biging, G.S. True Orthoimage Production for Forested Areas from Large-Scale Aerial Photographs. Photogramm. Eng. Remote Sens. 2003, 69, 259–266. [Google Scholar] [CrossRef]
- Sheng, Y. Minimising algorithm-induced artefacts in true ortho-image generation: A direct method implemented in the vector domain. Photogramm. Rec. 2007, 22, 151–163. [Google Scholar] [CrossRef]
- Zhou, G.; Schickler, W.; Thorpe, A.; Song, P.; Chen, W.; Song, C. True orthoimage generation in urban areas with very tall buildings. Int. J. Remote Sens. 2004, 25, 5163–5180. [Google Scholar] [CrossRef]
- Chen, L.C.; Teo, T.A.; Wen, J.Y.; Rau, J.Y. Occlusion-Compensated True Orthorectification for High-Resolution Satellite Images. Photogramm. Rec. 2007, 22, 39–52. [Google Scholar] [CrossRef]
- Deng, F.; Kang, J.; Li, P.; Wan, F. Automatic true orthophoto generation based on three-dimensional building model using multiview urban aerial images. J. Appl. Remote Sens. 2015, 9, 095087. [Google Scholar] [CrossRef]
- Zhou, G.; Wang, Y.; Yue, T.; Ye, S.; Wang, W. Building Occlusion Detection from Ghost Images. IEEE Trans. Geosci. Remote Sens. 2017, 55, 1074–1084. [Google Scholar] [CrossRef]
- Habib, A.; Bang, K.; Kim, C.; Shin, S. True Ortho-Photo Generation from High Resolution Satellite Imagery; Springer: Berlin, Germany, 2006; pp. 641–656. [Google Scholar]
- De Oliveira, H.C.; Galo, M.; Dal Poz, A.P. Height-Gradient-Based Method for Occlusion Detection in True Orthophoto Generation. IEEE Geosci. Remote Sens. Lett. 2015, 12, 2222–2226. [Google Scholar] [CrossRef]
- De Oliveira, H.C.; Porfírio Dal Poz, A.; Galo, M.; Habib, A.F. Surface Gradient Approach for Occlusion Detection Based on Triangulated Irregular Network for True Orthophoto Generation. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2018, 11, 443–457. [Google Scholar] [CrossRef]
- Radwan, M.; Makarovic, B. Digital mono-plotting system-improvements and tests. ITC J. 1980, 3, 511–533. [Google Scholar]
- Haala, N. Multiray Photogrammetry and Dense Image Matching; Photogrammetric Week ’11; Wichmann/VDE Verlag: Berlin/Offenbach, Germany, 2011; pp. 185–195. [Google Scholar]
- Wang, Q.; Yan, L.; Sun, Y.; Cui, X.; Mortimer, H.; Li, Y. True orthophoto generation using line segment matches. Photogramm. Rec. 2018. [Google Scholar] [CrossRef]
- Shepard, D. A two-dimensional interpolation function for irregularly-spaced data. In Proceedings of the 1968 ACM National Conference, New York, NY, USA, 27–29 August 1968; pp. 517–524. [Google Scholar]
- OpenCV. Open Source Computer Vision Library. Available online: https://opencv.org/ (accessed on 8 April 2018).
- Laefer, D.; Abuwarda, S.; Vu, V.A.; Truong-Hong, L.; Gharibi, H. 2015 Aerial Laser and Photogrammetry Survey of Dublin City Collection Record; New York University: New York, NY, USA, 2017; Available online: https://geo.nyu.edu/catalog/nyu_2451_38684 (accessed on 8 April 2018).
- Hinks, T.; Carr, H.; Laefer, D.F. Flight Optimization Algorithms for Aerial LiDAR Capture for Urban Infrastructure Model Generation. J. Comput. Civ. Eng. 2009, 23, 330–339. [Google Scholar] [CrossRef]
- Hinks, T.; Carr, H.; Gharibi, H.; Laefer, D.F. Visualisation of urban airborne laser scanning data with occlusion images. ISPRS J. Photogramm. Remote Sens. 2015, 104, 77–87. [Google Scholar] [CrossRef]
- ASPRS. LAS Format. Available online: https://www.asprs.org/committee-general/laser-las-file-format-exchange-activities.html (accessed on 8 April 2018).
- Gharibi, H. Extraction and Improvement of Digital Surface Models from Dense Point Clouds. Master’s Thesis, University of Stuttgart, Stuttgart, Germany, 2012. [Google Scholar]
- Tarsha-Kurdi, F.; Landes, T.; Grussenmeyer, P.; Smigiel, E. New Approach for Automatic Detection of Buildings in Airborne Laser Scanner Data Using First Echo Only. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2006, XXXVI, 1–6. [Google Scholar]
- Trimble. Inpho. Available online: https://geospatial.trimble.com/products-and-solutions/inpho (accessed on 8 April 2018).
- Terrasolid. TerraPhoto. Available online: http://www.terrasolid.com/products/terraphotopage.php (accessed on 8 April 2018).
- Pix4D. Pix4Dmapper. Available online: https://pix4d.com/product/pix4dmapper-photogrammetry-software/ (accessed on 8 April 2018).
Items | Values |
---|---|
Laser scanner system | RIEGL LMS-Q680i |
GNSS/IMU system | POS AV 510 |
Flying height AGL | 300 m |
Scan angle | ( on each side) |
Strip overlap | More than 70% |
Swath/strip width | 350 m |
Horizontal point density | 300 pt/m² |
Total no. of points | Over 1.4 billion |
Items | Values |
---|---|
Imaging sensor | Leica RCD30 |
Flying height AGL | 300 m |
Frame resolution | 9000 × 6732 pixels |
GSD | 3.4 cm |
Along track overlap | More than 80% |
Across track overlap | More than 70% |
Total no. of images | 4210 |
Algorithm | Running Time (mm:ss) |
---|---|
Angle-based | 00:07 |
Height-based | 00:12 |
Z-buffer | 00:04 |
Study Area | No. of Images | Orthophoto (mm:ss) | True Orthophoto 1 (mm:ss) | True Orthophoto 2 (mm:ss) |
---|---|---|---|---|
Trinity College | 98 | 03:23 | 16:17 | 31:53 |
Dawson Street | 101 | 03:49 | 17:26 | 34:15 |
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Gharibi, H.; Habib, A. True Orthophoto Generation from Aerial Frame Images and LiDAR Data: An Update. Remote Sens. 2018, 10, 581. https://doi.org/10.3390/rs10040581