Initial Pose Estimation Method for Robust LiDAR-Inertial Calibration and Mapping
Figure 1. LiDAR-based mapping (a) with the LiDAR-IMU calibration method: error-free mapping; and (b) without the LiDAR-IMU calibration method: mapping error due to drift, highlighted by the yellow circle. The colors in each map represent the intensity of the LiDAR point cloud.
Figure 2. Overall framework of the proposed initial pose estimation method for robust LiDAR-IMU calibration. The different colors in the voxelization show the intensity of the LiDAR points in each voxel. The extracted planes are shown in yellow and green, while red points indicate noise.
Figure 3. Robust plane detection method.
Figure 4. Robust plane extraction through refinement. (a) Voxels containing edges and noise have low plane scores due to large distances and high variance (red normal vectors), while voxels with high plane scores are shown in blue. (b) The refinement process enables the effective separation and removal of areas containing edges and noise.
Figure 5. LiDAR calibration method.
Figure 6. IMU downsampling.
Figure 7. Qualitative comparison of the proposed method with the benchmark plane detection algorithms.
Figure 8. Top view of LiDAR data: (a) raw LiDAR data before calibration; (b) LiDAR data after calibration using the proposed method.
Figure 9. Performance comparison in terms of (a) roll and (b) pitch errors on the VECtor dataset.
Figure 10. Mapping results using (a) LI-init and (b) LI-init + Proposed.
Abstract
1. Introduction
- We propose a plane detection method based on plane scores, which robustly detects planes even in the presence of noise and edges.
- We present an initial pose estimation method for LiDAR-IMU calibration. Using the relationship between the detected planes and the actual planes, stable calibration is achieved even under various motions.
- The proposed method achieves higher calibration performance than existing methods on benchmark datasets, and the plane detection approach offers higher accuracy and faster computation than conventional algorithms.
2. Related Works
2.1. Methods Without Initial Pose Estimation
2.2. Methods with Initial Pose Estimation
3. Method Overview
3.1. System Architecture
3.2. Notations
4. Proposed Method
4.1. Plane Feature Extraction
4.2. Robust Plane Detection
Algorithm 1 Robust Plane Detection Algorithm
Input: input voxels; score threshold; refine threshold.
Output: U: detected planes.
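The plane score in Algorithm 1 rates how well the points inside a voxel fit a single plane, so that voxels containing edges or noise (large point-to-plane distances, high variance; see Figure 4) can be rejected or refined. The sketch below is a rough illustration only: it assumes a PCA-based planarity score and a distance-based refinement step, and the score formula, the scale of the thresholds, and the function names are assumptions rather than the paper's exact definitions.

```python
import numpy as np

def voxelize(points, voxel_size=1.0):
    """Group an (N, 3) point cloud into cubic voxels keyed by grid index."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    voxels = {}
    for key, p in zip(map(tuple, keys), points):
        voxels.setdefault(key, []).append(p)
    return {k: np.asarray(v) for k, v in voxels.items()}

def plane_score(pts):
    """Planarity score in [0, 1] from the PCA eigenvalues of a voxel.

    Edges and noise spread points in more than two directions, so the
    smallest eigenvalue stays large relative to the total and the score
    drops. This is an assumed surrogate for the paper's score.
    """
    if len(pts) < 5:
        return 0.0, None
    cov = np.cov((pts - pts.mean(axis=0)).T)
    evals, evecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
    score = 1.0 - evals[0] / (evals.sum() + 1e-12)
    normal = evecs[:, 0]                    # direction of least variance
    return score, normal

def detect_planes(points, voxel_size=1.0, score_thresh=0.2, refine_thresh=0.2):
    """Keep high-score voxels, then refine by dropping far-off points."""
    planes = []
    for pts in voxelize(points, voxel_size).values():
        score, normal = plane_score(pts)
        if score < score_thresh:
            continue
        centroid = pts.mean(axis=0)
        dist = np.abs((pts - centroid) @ normal)   # point-to-plane distance
        score, normal = plane_score(pts[dist < refine_thresh])
        if normal is not None and score >= score_thresh:
            planes.append((centroid, normal))
    return planes
```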
4.3. LiDAR Calibration
- Indoor spaces have a cuboidal structure and are composed of at least two planes, including walls, floors, and ceilings;
- Each plane is flat, and adjacent planes meet at right angles. Unlike in outdoor environments, the walls, floors, and ceilings that make up indoor spaces are flat, and each plane is oriented perpendicular to its neighbors.
Algorithm 2 LiDAR Calibration Algorithm
Input: detected planes.
Output: optimized rotation matrix.
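Under these assumptions, the LiDAR calibration reduces to finding the rotation that aligns the detected plane normals with the normals of the actual, axis-aligned planes (walls, floor, ceiling). Below is a minimal sketch of one standard solution to this alignment problem (the SVD/Kabsch solution to Wahba's problem); the normal-to-axis pairing in the example is assumed, and the paper's optimizer may differ.

```python
import numpy as np

def calibrate_rotation(detected_normals, reference_normals):
    """Find the rotation R minimizing sum ||r_i - R d_i||^2 (Wahba's problem).

    detected_normals: unit plane normals measured in the LiDAR frame.
    reference_normals: corresponding ideal normals in the global frame,
    axis-aligned under the cuboid-room assumption.
    """
    D = np.asarray(detected_normals, dtype=float)   # (N, 3)
    G = np.asarray(reference_normals, dtype=float)  # (N, 3)
    U, _, Vt = np.linalg.svd(D.T @ G)               # 3x3 cross-covariance
    # Reflection guard so the result is a proper rotation (det = +1).
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    return Vt.T @ S @ U.T

# Example: floor and two wall normals seen from a slightly tilted LiDAR,
# paired with the global +Z, +X, +Y axes (the pairing is an assumption).
detected = [[0.05, -0.02, 0.998], [0.999, 0.0, -0.05], [0.02, 0.999, 0.0]]
reference = [[0, 0, 1], [1, 0, 0], [0, 1, 0]]
R = calibrate_rotation(detected, reference)
print(np.round(R @ np.array(detected[0]), 3))       # close to [0, 0, 1]
```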
4.4. IMU Noise Removal, Downsampling, and Calibration
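This step smooths the raw IMU stream and reduces its rate to match the LiDAR frame rate before calibration (Figure 6); the cited forward-backward filtering [14] removes noise without introducing phase lag. A minimal sketch under those assumptions, using a zero-phase Butterworth low-pass and block averaging from the VECtor IMU rate (200 Hz) down to its LiDAR rate (20 Hz); the cutoff frequency and the averaging scheme are assumptions, not the paper's exact parameters.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def denoise_and_downsample(imu, imu_rate=200.0, lidar_rate=20.0, cutoff_hz=15.0):
    """Zero-phase low-pass filtering followed by block averaging.

    imu: (N, 6) array of [accel_xyz, gyro_xyz] samples.
    filtfilt applies the filter forward and then backward (the
    forward-backward scheme analyzed in [14]), so the smoothed
    signal has no phase delay relative to the input.
    """
    b, a = butter(2, cutoff_hz / (imu_rate / 2.0))  # 2nd-order Butterworth
    smoothed = filtfilt(b, a, imu, axis=0)

    # Average each block of IMU samples that falls within one LiDAR
    # frame: 200 Hz / 20 Hz -> blocks of 10 samples.
    block = int(round(imu_rate / lidar_rate))
    n_blocks = len(smoothed) // block
    return smoothed[: n_blocks * block].reshape(n_blocks, block, -1).mean(axis=1)

# Example: 2 s of noisy synthetic IMU data at 200 Hz -> 40 samples at 20 Hz.
rng = np.random.default_rng(0)
imu = rng.normal(0.0, 0.1, size=(400, 6)) + np.array([0, 0, 9.81, 0, 0, 0])
print(denoise_and_downsample(imu).shape)  # (40, 6)
```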
5. Results
5.1. Experimental Setup
5.2. Datasets
5.3. Evaluation of Robust Plane Detection Method
5.3.1. Impact of Robust Plane Selection
5.3.2. Plane Refinement Performance Analysis
5.3.3. Comparison with Other Plane Detection Methods
5.4. Evaluation of Initial Pose Estimation Methods
5.5. Evaluation of Mapping Result
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Zeybek, M. Indoor mapping and positioning applications of hand-held lidar simultaneous localization and mapping (slam) systems. Türk. Lidar Derg. 2021, 3, 7–16.
- Bi, S.; Yuan, C.; Liu, C.; Cheng, J.; Wang, W.; Cai, Y. A survey of low-cost 3D laser scanning technology. Appl. Sci. 2021, 11, 3938.
- Zhang, J.; Singh, S. LOAM: Lidar odometry and mapping in real-time. Robot. Sci. Syst. 2014, 9, 1–9.
- Shan, T.; Englot, B.; Meyers, D.; Wang, W.; Ratti, C.; Rus, D. LIO-SAM: Tightly-coupled Lidar Inertial Odometry via Smoothing and Mapping. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 24 October 2020–24 January 2021; pp. 5135–5142.
- Xu, W.; Cai, Y.; He, D.; Lin, J.; Zhang, F. FAST-LIO2: Fast direct lidar-inertial odometry. IEEE Trans. Robot. 2022, 38, 2053–2073.
- Le Gentil, C.; Vidal-Calleja, T.; Huang, S. 3D lidar-imu calibration based on upsampled preintegrated measurements for motion distortion correction. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 21–25 May 2018; pp. 2149–2155.
- Li, S.; Wang, L.; Li, J.; Tian, B.; Chen, L.; Li, G. 3D LiDAR/IMU calibration based on continuous-time trajectory estimation in structured environments. IEEE Access 2021, 9, 138803–138816.
- Zhu, F.; Ren, Y.; Zhang, F. Robust real-time lidar-inertial initialization. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Kyoto, Japan, 23–27 October 2022; pp. 3948–3955.
- Li, S.; Li, X.; Chen, S.; Zhou, Y.; Wang, S. Two-Step LiDAR/Camera/IMU Spatial and Temporal Calibration Based on Continuous-Time Trajectory Estimation. IEEE Trans. Ind. Electron. 2024, 71, 3182–3191.
- Kim, T.; Pak, G.; Kim, E. GRIL-Calib: Targetless Ground Robot IMU-LiDAR Extrinsic Calibration Method using Ground Plane Motion Constraints. IEEE Robot. Autom. Lett. 2024, 9, 5409–5416.
- Das, S.; Boberg, B.; Fallon, M.; Chatterjee, S. IMU-based Online Multi-lidar Calibration. In Proceedings of the IEEE Intelligent Vehicles Symposium (IV), Jeju Island, Republic of Korea, 2–5 June 2024; pp. 3227–3234.
- Lv, J.; Xu, J.; Hu, K. Targetless calibration of lidar-imu system based on continuous-time batch estimation. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 25–29 October 2020; pp. 9968–9975.
- Yin, S.; Xie, D.; Fu, Y.; Wang, Z.; Zhong, R. Uncontrolled Two-Step Iterative Calibration Algorithm for Lidar–IMU System. Sensors 2023, 23, 3119.
- Gustafsson, F. Determining the initial states in forward-backward filtering. IEEE Trans. Signal Process. 1996, 44, 988–992.
- Madgwick, S. An efficient orientation filter for inertial and inertial/magnetic sensor arrays. Technical Report; x-io and University of Bristol: Bristol, UK, 2010; Volume 25, pp. 113–118.
- Guo, Y.; Li, Y.; Ren, D.; Zhang, X.; Li, J.; Pu, L.; Ma, C.; Zhan, X.; Guo, J.; Wei, M.; et al. LiDAR-Net: A Real-scanned 3D Point Cloud Dataset for Indoor Scenes. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, WA, USA, 16–22 June 2024; pp. 21989–21999.
- Helmberger, M.; Morin, K.; Scaramuzza, D. The hilti slam challenge dataset. IEEE Robot. Autom. Lett. 2022, 7, 7518–7525.
- Gao, L.; Liang, Y.; Kneip, L. VECtor: A versatile event-centric benchmark for multi-sensor slam. IEEE Robot. Autom. Lett. 2022, 7, 8217–8224.
- Li, L.; Yang, F.; Zhu, H.; Li, D.; Li, Y.; Tang, L. An improved RANSAC for 3D point cloud plane segmentation based on normal distribution transformation cells. Remote Sens. 2017, 9, 433.
- Vo, A.V.; Truong-Hong, L.; Laefer, D.F.; Bertolotto, M. Octree-based region growing for point cloud segmentation. ISPRS J. Photogramm. Remote Sens. 2015, 104, 88–100.
- Tian, Y.; Song, W.; Chen, L.; Sung, Y.; Kwak, J.; Sun, S. Fast planar detection system using a GPU-based 3D Hough transform for LiDAR point clouds. Appl. Sci. 2020, 10, 1744.
Ref. | Year | Modality | Platform | Environment | Approach | Limitations |
---|---|---|---|---|---|---|
[6] | 2018 | LiDAR+IMU | Handheld | Indoor | Scan-based | Precise calibration is not achievable due to the use of low-accuracy upsampled LiDAR points |
[7] | 2021 | LiDAR+IMU | Handheld | Indoor | Pre-defined map | Requires a pre-defined map |
[8] | 2022 | LiDAR+IMU | Handheld | Indoor | Scan-based | Since the first LiDAR scan is defined in the global frame, the pose of the first LiDAR scan affects the calibration accuracy |
[9] | 2024 | LiDAR+IMU+Camera | Handheld | Indoor/Outdoor | Scan-based | The two-step calibration process involving both camera and LiDAR requires a complex workflow, and the calibration accuracy of each sensor affects that of the other |
[10] | 2024 | LiDAR+IMU | Vehicle | Outdoor | Ground information | Not suitable for handheld LiDAR |
[11] | 2024 | Multi-LiDAR+IMU | Vehicle | Indoor | IMU-based | Since the calibration is based on IMU data, it is sensitive to IMU drift and noise |
[12] | 2020 | LiDAR+IMU | Handheld | Indoor/Outdoor | No initial pose estimation | High computational cost |
[13] | 2023 | LiDAR+IMU | Handheld, Vehicle | Indoor/Outdoor | No initial pose estimation | High computational cost |
Notations | Explanations |
---|---|
 | The set of LiDAR data and the l-th point |
 | A voxel containing elements |
 | The voxel located at |
 | The total LiDAR points at , and the i-th point inside the voxel |
 | The j-th normal vector and the mean vector within the voxel |
 | The final Q detected planes |
 | The q-th plane detected through the proposed method |
 | The initial rotation matrix, the normal vector in the global frame, and the normal vector in the LiDAR frame |
 | The rotation matrix between the global and LiDAR frames |
 | The rotation matrix between the global and IMU frames |
 | The normal vector of the actual plane |
Dataset | Sensor | Frame Rate | Horizontal FOV | Vertical FOV | Maximum Detection Range | Detection Distance Accuracy |
---|---|---|---|---|---|---|
LiDAR-Net [16] | Leica BLK2GO | 5 Hz | 210° | 85° | 10 m | ≤5 mm |
Hilti [17] | Ouster OS0-64 | 10 Hz | 360° | 90° | 50 m | ≤5 cm |
VECtor [18] | Ouster OS0-128 | 20 Hz | 360° | 90° | 50 m | ≤5 cm |
Dataset | Sensor | Frame Rate | Accelerometer | Gyroscope | Magnetometer |
---|---|---|---|---|---|
Hilti [17] | InvenSense ICM-20948 | 100 Hz | √ | √ | √ |
VECtor [18] | XSens MTi-30 AHRS | 200 Hz | √ | √ | √ |
Score Threshold | F1 Score (%) | F1 Score (%) | F1 Score (%) | Time (s) | Time (s) | Time (s) |
---|---|---|---|---|---|---|
0.0 | 83.3 | 79.5 | 78.5 | 0.83 | 1.00 | 1.07 |
0.1 | 90.6 | 81.2 | 80.3 | 0.85 | 1.00 | 1.07 |
0.2 | 91.4 | 83.2 | 81.8 | 0.85 | 1.02 | 1.09 |
0.4 | 89.7 | 79.8 | 79.2 | 0.85 | 1.02 | 1.09 |
0.6 | 87.0 | 79.6 | 78.5 | 0.87 | 1.02 | 1.09 |
0.8 | 85.5 | 77.3 | 75.5 | 0.87 | 1.03 | 1.10 |
1.0 | 79.8 | 76.3 | 74.2 | 0.87 | 1.03 | 1.10 |
Refine Threshold | F1 Score (%) | F1 Score (%) | F1 Score (%) | Time (s) | Time (s) | Time (s) |
---|---|---|---|---|---|---|
0.0 | 87.5 | 79.2 | 78.6 | 0.85 | 1.00 | 1.07 |
0.1 | 91.1 | 81.2 | 79.2 | 0.86 | 1.00 | 1.07 |
0.2 | 91.4 | 83.2 | 81.8 | 0.85 | 1.02 | 1.08 |
0.3 | 91.2 | 80.8 | 79.8 | 0.85 | 1.02 | 1.08 |
0.4 | 90.1 | 78.5 | 78.7 | 0.85 | 1.02 | 1.08 |
0.5 | 89.3 | 75.8 | 72.4 | 0.86 | 1.03 | 1.09 |
0.6 | 86.9 | 72.3 | 69.2 | 0.86 | 1.03 | 1.09 |
0.7 | 79.9 | 71.2 | 60.8 | 0.86 | 1.04 | 1.09 |
0.8 | 78.4 | 69.5 | 45.9 | 0.86 | 1.04 | 1.10 |
0.9 | 56.2 | 52.2 | 30.3 | 0.87 | 1.05 | 1.11 |
1.0 | 43.7 | 28.9 | 24.2 | 0.87 | 1.06 | 1.13 |
Method | Corridor Precision (%) | Corridor Recall (%) | Corridor F1-Score (%) | Corridor Time (s) | Room Precision (%) | Room Recall (%) | Room F1-Score (%) | Room Time (s) |
---|---|---|---|---|---|---|---|---|
RANSAC [19] | 89.9 | 72.4 | 80.2 | 2.6 | 96.9 | 57.8 | 72.4 | 4.9 |
3D-Hough [21] | 99.0 | 48.9 | 65.4 | 13.5 | 90.9 | 40.9 | 56.4 | 23.9 |
Region Growing [20] | 95.1 | 79.6 | 86.7 | 183.7 | 87.4 | 60.8 | 71.7 | 339.6 |
Ours | 90.5 | 92.3 | 91.4 | 0.85 | 93.9 | 78.8 | 85.7 | 2.34 |
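As a sanity check, the F1 scores in the table follow directly from the reported precision and recall (their harmonic mean):

```python
def f1(precision, recall):
    """Harmonic mean of precision and recall, in percent."""
    return 2 * precision * recall / (precision + recall)

print(round(f1(90.5, 92.3), 1))  # 91.4 -> "Ours", Corridor
print(round(f1(93.9, 78.8), 1))  # 85.7 -> "Ours", Room
```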
Park, E.-S.; Arshad, S.; Park, T.-H. Initial Pose Estimation Method for Robust LiDAR-Inertial Calibration and Mapping. Sensors 2024, 24, 8199. https://doi.org/10.3390/s24248199