Abstract
High-precision point cloud maps have drawn increasing attention due to their wide range of applications. In recent decades, point cloud maps have typically been generated by simultaneous localization and mapping (SLAM) methods, which favor real-time performance over high precision. These methods generally focus on trajectory accuracy, leaving the accuracy of the map itself unclear. Therefore, to build a high-precision point cloud map and evaluate the mapping performance directly, this study proposes a tight coupling mapping method that integrates the error-state Kalman filter (ESKF), the general framework for graph optimization (g2o), and point cloud alignment. Both the ESKF and g2o are used to improve the precision of the mapping process. Experiments based on a mobile mapping backpack prototype are conducted to verify the proposed method, and targets in the environment together with a high-precision reference point cloud map are used to evaluate the map directly. The results indicate that the generated point cloud map is sufficiently precise, reaching centimeter-level accuracy.
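At a high level, the proposed pipeline couples IMU-driven state prediction, corrections from point cloud alignment, and pose-graph optimization. The following minimal Python sketch is purely illustrative and is not the authors' implementation: the ESKF is reduced to a position-only toy filter, the g2o step is a placeholder, and all class and function names (SimpleESKF, run_pipeline) are hypothetical.

```python
# Illustrative sketch of the tightly coupled mapping loop described in the abstract.
# All names are hypothetical stand-ins, not the paper's code: a filter propagates the
# IMU-driven state, a scan-alignment pose corrects it, and the corrected poses are
# collected as vertices for a g2o-style batch optimization (here only a placeholder).
import numpy as np

class SimpleESKF:
    """Toy Kalman filter over position only (stands in for the full error-state filter)."""
    def __init__(self):
        self.p = np.zeros(3)          # nominal position
        self.P = np.eye(3) * 1e-2     # covariance

    def predict(self, velocity, dt, accel_noise=1e-3):
        # Propagate the nominal state with the IMU-derived velocity; grow the covariance.
        self.p = self.p + velocity * dt
        self.P = self.P + np.eye(3) * accel_noise * dt

    def update(self, p_meas, meas_noise=1e-2):
        # Fuse a position observation from point cloud alignment via the Kalman gain.
        R = np.eye(3) * meas_noise
        K = self.P @ np.linalg.inv(self.P + R)
        self.p = self.p + K @ (p_meas - self.p)
        self.P = (np.eye(3) - K) @ self.P

def run_pipeline(imu_velocities, scan_positions, dt=0.01):
    """Filter poses, then hand them to a (placeholder) graph optimization step."""
    eskf = SimpleESKF()
    graph_vertices = []
    for v, z in zip(imu_velocities, scan_positions):
        eskf.predict(v, dt)
        eskf.update(z)
        graph_vertices.append(eskf.p.copy())   # would become g2o vertices in the real system
    # A real implementation would add relative-pose edges and run g2o here;
    # this placeholder just returns the filtered trajectory.
    return np.array(graph_vertices)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    vels = [np.array([1.0, 0.0, 0.0])] * 100
    scans = [np.array([0.01 * (k + 1), 0.0, 0.0]) + rng.normal(0, 0.005, 3) for k in range(100)]
    traj = run_pipeline(vels, scans)
    print("final position estimate:", traj[-1])
```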
Abbreviations
- EKF: Extended Kalman filter
- ESKF: Error-state Kalman filter
- ELS: Enhanced line simplification
- FG: Factor graph
- g2o: General framework for graph optimization
- GICP: Generalized iterative closest point
- ICP: Iterative closest point
- IMU: Inertial measurement unit
- LiDAR: Light detection and ranging
- NDT: Normal distributions transform
- SLAM: Simultaneous localization and mapping
- \(0_{3 \times 3}\): The 3 × 3 zero matrix
- \(a_{m}\): The output of the three-axis accelerometer
- \(a_{n}\): The white Gaussian noise of the accelerometer
- \(C_{L}^{I}\): The rotation matrix between the LiDAR and the IMU
- \(C_{i}^{X}\): The rotation matrix of Frame i in the coordinate system \(F_{X}\)
- \(F_{k}\): The state transition matrix at time-step k
- \(F_{b}\): The body-fixed front-right-down coordinate system
- \(F_{g}\): The g2o optimization coordinate system
- \(F_{n}\): The local north-east-down coordinate system
- \(H_{k+1}\): The observation matrix at time-step k + 1
- \(I_{3 \times 3}\): The 3 × 3 identity matrix
- k: Subscript k indicates a value at time-step k
- M: Subscript M denotes a sensor measurement
- \(n_{a}\): The Gaussian random walk noise of the accelerometer
- \(n_{g}\): The Gaussian random walk noise of the gyroscope
- \(p^{+}\): The estimated position from the state equation
- \(p_{L}\): The position from the vertices in g2o
- \(q^{+}\): The estimated quaternion from the state equation
- \(q_{L}\): The quaternion from the vertices in g2o
- R: The rotation matrix
- T: The superscript T represents the transpose operation
- \(t_{L}^{I}\): The translation vector between the LiDAR and the IMU
- \(t_{i}^{X}\): The position of Frame i in the coordinate system \(F_{X}\)
- \(v^{+}\): The estimated velocity from the state equation
- \(v_{L}\): The velocity from the vertices in g2o
- \(x_{k}\): The state vector at time-step k
- \(z_{k}\): The measurement vector at time-step k
- \(\delta p\): The position error
- \(\delta q\): The quaternion error
- \(\delta v\): The velocity error
- \(\delta \beta_{a}\): The accelerometer bias error
- \(\delta \beta_{g}\): The gyroscope bias error
- \(\delta \theta\): The attitude angle error corresponding to the quaternion error
- \(\Delta t\): The discrete-time interval
- \(\omega_{m}\): The output of the three-axis gyroscope
- \(\omega_{n}\): The white Gaussian noise of the gyroscope
- \(\cdot\): An overdot indicates a time derivative
- \(\otimes\): Quaternion multiplication
- \(*\): The superscript \(*\) represents the conjugate operation
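As a quick orientation, the error terms above combine in the conventional ESKF structure. The following display is a sketch of that standard formulation using the symbols defined here, not necessarily the exact block structure of the paper; the noise terms \(w_{k}\) and \(n_{k+1}\) are introduced only for illustration:

\[
\delta x_{k} = \begin{bmatrix} \delta p^{T} & \delta v^{T} & \delta \theta^{T} & \delta \beta_{a}^{T} & \delta \beta_{g}^{T} \end{bmatrix}^{T}, \qquad
\delta x_{k+1} = F_{k}\,\delta x_{k} + w_{k}, \qquad
z_{k+1} = H_{k+1}\,\delta x_{k+1} + n_{k+1},
\]

where the small-angle relation \(\delta q \approx \begin{bmatrix} 1 & \tfrac{1}{2}\delta \theta^{T} \end{bmatrix}^{T}\) links the quaternion error to the attitude angle error.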
Acknowledgements
This work was funded by The Hong Kong Polytechnic University (1-ZVN6, 4-BCF7), The State Bureau of Surveying and Mapping, P.R. China (1-ZVE8), and Hong Kong Research Grants Council (T22-505/19-N).