Automatic Registration of Optical Images with Airborne LiDAR Point Cloud in Urban Scenes Based on Line-Point Similarity Invariant and Extended Collinearity Equations
Figure 1. Flow chart of the automatic registration of urban aerial images with LiDAR points based on the line-point similarity invariant and extended collinearity equations.
Figure 2. In the iterative process, a point β approaches its true position: L′ and L constitute a matched line pair lying in image space and LiDAR space, respectively. The true corresponding point is determined as soon as the parameter λ is calculated in the iterative process.
Figure 3. Line feature extraction from LiDAR point clouds: (a) LiDAR points echoed from the wall; (b) the small cuboid formed by moving the wall or roof plane along its normal direction.
Figure 4. The corresponding relationship between points and lines in a local area.
Figure 5. Search region determination.
Figure 6. Distance between two straight line segments.
Figure 7. Experimental data: (a) location map of the experimental area; (b) LiDAR point clouds; (c) aerial optical image.
Figure 8. Extracted lines and points: (a) and (b) show the features extracted from LiDAR data; (c) and (d) show the features extracted from the optical image.
Figure 9. Visual comparison of the coarse registration results (top row) and the precise registration results (bottom row).
Figure 10. 3D visualization of a rural scene in Changchun using the colorized 3D LiDAR points.
Abstract
1. Introduction
2. Transformation Function Based on the Extended Collinearity Equations
3. The Extraction of Registration Primitives and Automatic Matching Based on Line-Point Similarity Invariant
3.1. Line Feature Extraction from LiDAR Data and the Optical Image
3.2. Line Matching Based on Line-Point Similarity Invariant
1. Define a rectangular search region surrounding line q in the LiDAR intensity image (Figure 5). The length and breadth of the rectangle are controlled by two parameters that can be determined empirically; values of 1.5 to 2 times the length of line q are optimal in many cases. One side of the rectangle is parallel to line q. Since the optical image has been coarsely registered to the LiDAR intensity image, an approximately identical search region can be formed in the optical image. Find all matched points from set S that fall within these two search regions, and carry out the same process for each line extracted from the intensity image.
2. Because not all matched points in a search region are correct, a similarity measure is defined by Equation (10) after the distance ratios have been calculated with Equation (9). If two lines are a true correspondence and a point pair within their search regions is correctly matched, the similarity measure tends toward 1. The measure is therefore computed for every matched point in both search regions, and the two lines are labeled as a matched pair when it approaches 1.
3. Line pairs matched by step (2) may lack robustness. The right part of Figure 4 demonstrates this: two candidate line pairs can both satisfy the line-point similarity invariant even though neither candidate is the true correspondence. To overcome this problem, the distance between the two lines is introduced as an auxiliary similarity measure, illustrated in Figure 6: A and B are the end points of one line, the distances from A and B to the other line are measured, and the line-to-line distance is then defined by Equation (11). If this distance is greater than twice the average spacing of the laser points, the pair is labeled as a mismatch.
4. To speed up matching, the following strategy is adopted: since coarse registration has already eliminated most distortions, a correctly matched line pair should be nearly parallel. An angle tolerance threshold is therefore set and compared with the acute angle spanned by each candidate pair; pairs whose spanned angle exceeds the threshold are labeled as mismatched and deleted from the candidate set. Matching speed is greatly accelerated in this way.
5. Repeat steps (1) to (4) until all lines have been traversed.
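The geometric filters in steps (3) and (4) can be sketched as follows. The exact forms of Equations (9)-(11) are not reproduced in this section, so the sketch assumes Equation (11) takes the line-to-line distance as the mean of the perpendicular distances from the two end points to the other line's supporting line; the angle tolerance and average laser-point spacing are illustrative placeholders, not the authors' settings.

```python
import math

def line_angle_deg(p1, p2):
    """Acute angle (degrees) between a 2D segment and the x-axis."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    ang = abs(math.degrees(math.atan2(dy, dx))) % 180.0
    return min(ang, 180.0 - ang)

def point_to_line_distance(pt, a, b):
    """Perpendicular distance from pt to the infinite line through a and b."""
    (x0, y0), (x1, y1), (x2, y2) = pt, a, b
    num = abs((y2 - y1) * x0 - (x2 - x1) * y0 + x2 * y1 - y2 * x1)
    den = math.hypot(x2 - x1, y2 - y1)
    return num / den

def segment_distance(seg_p, seg_q):
    """Auxiliary measure of step (3): mean of the distances from the
    end points A, B of seg_p to the line supporting seg_q
    (assumed form of Equation (11))."""
    a, b = seg_p
    return 0.5 * (point_to_line_distance(a, *seg_q)
                  + point_to_line_distance(b, *seg_q))

def is_candidate_pair(seg_p, seg_q, angle_tol_deg=5.0, avg_point_spacing=0.5):
    """Steps (3)-(4): reject a pair whose spanned acute angle exceeds the
    tolerance, or whose line-to-line distance exceeds twice the average
    laser-point spacing. Both thresholds are placeholder values."""
    d_ang = abs(line_angle_deg(*seg_p) - line_angle_deg(*seg_q))
    d_ang = min(d_ang, 180.0 - d_ang)
    if d_ang > angle_tol_deg:
        return False  # step (4): not nearly parallel after coarse registration
    return segment_distance(seg_p, seg_q) <= 2.0 * avg_point_spacing
```

For example, a horizontal segment paired with a near-coincident parallel one passes both filters, while pairing it with a clearly tilted segment fails the angle test of step (4).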
4. Experimental Results and Discussions
5. Conclusions
Author Contributions
Funding
Conflicts of Interest
Error statistics of the registrations (unit: meters):

| Registration | MAX | MEAN | RMSE |
|---|---|---|---|
| Coarse registration by initial exterior orientation elements | 2.76 | 1.36 | 0.84 |
| Coarse registration by Förstner operator and DLT transformation model | 1.52 | 0.61 | 0.45 |
| Precise registration by extended collinearity equations | 0.52 | 0.24 | 0.13 |
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Peng, S.; Ma, H.; Zhang, L. Automatic Registration of Optical Images with Airborne LiDAR Point Cloud in Urban Scenes Based on Line-Point Similarity Invariant and Extended Collinearity Equations. Sensors 2019, 19, 1086. https://doi.org/10.3390/s19051086