
InLIOM: Tightly-Coupled Intensity LiDAR Inertial Odometry and Mapping

Published: 01 September 2024

Abstract

State estimation and mapping are vital prerequisites for intelligent navigation of autonomous vehicles. However, maintaining high accuracy in urban environments remains challenging, especially when satellite signals are unavailable. This paper proposes a novel framework, InLIOM, which tightly couples LiDAR intensity measurements into the system to improve mapping performance in a variety of challenging environments. The framework introduces a stable intensity LiDAR odometry based on scan-to-scan optimization. By extracting features pairwise from the intensity information of consecutive frames, this method tackles the instability of LiDAR intensity. To ensure the odometry's robustness, a training-free, residual-based dynamic-object filter is further integrated into the scan-to-scan registration process. The resulting intensity LiDAR odometry solution is incorporated into a factor graph together with relative and absolute measurements from other sensors to obtain a globally optimized state estimate. Experiments in indoor and outdoor urban environments show that the proposed framework achieves higher accuracy than state-of-the-art methods and robustly adapts to highly dynamic roads, tunnels, underground parking, and large-scale urban scenarios.
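The training-free, residual-based dynamic-object filtering described above can be illustrated with a minimal numpy sketch. The function name, the median-plus-MAD threshold rule, and the toy data here are illustrative assumptions rather than the paper's exact formulation; the sketch only conveys the core idea that, after a scan-to-scan alignment step, points on moving objects produce registration residuals far outside the static consensus and can be rejected without any learned model:

```python
import numpy as np

def filter_dynamic_points(scan, matched, k=2.0):
    """Training-free residual-based dynamic-point filter (illustrative).

    After scan-to-scan alignment, each point in the current scan has a
    residual to its matched point in the previous scan.  Points on moving
    objects yield residuals far above the static consensus, so a robust
    threshold (median + k * scaled MAD) removes them without training.
    """
    residuals = np.linalg.norm(scan - matched, axis=1)
    med = np.median(residuals)
    mad = np.median(np.abs(residuals - med))
    # 1.4826 rescales the MAD to an equivalent Gaussian sigma.
    keep = residuals <= med + k * 1.4826 * mad
    return scan[keep], keep

# Toy correspondences: 100 static points with small alignment noise,
# plus 5 points on a moving object with metre-level residuals.
rng = np.random.default_rng(0)
matched = np.zeros((105, 3))
scan = matched + np.vstack([
    rng.normal(0.0, 0.02, size=(100, 3)),  # static: cm-level noise
    rng.normal(1.5, 0.10, size=(5, 3)),    # dynamic: large motion
])
filtered, keep = filter_dynamic_points(scan, matched)
```

Because the threshold is derived from robust statistics of the current residual set, the filter adapts per scan pair and needs no training data, which matches the "training-free" property claimed in the abstract.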




Published In

IEEE Transactions on Intelligent Transportation Systems, Volume 25, Issue 9
Sept. 2024
2384 pages

Publisher

IEEE Press


Qualifiers

  • Research-article
