
Evaluation of 3D LiDAR SLAM algorithms based on the KITTI dataset

The Journal of Supercomputing

Abstract

Safe autonomous driving is the future trend, and achieving it requires precise, real-time simultaneous localization and mapping (SLAM). Many practitioners are concerned about the performance of LiDAR SLAM algorithms, yet little published work evaluates them specifically. This paper evaluates LeGO-LOAM, SC-LeGO-LOAM, LIO-SAM, SC-LIO-SAM, and FAST-LIO2 on sequences 05-10 of the KITTI dataset. The experimental results show that, first, there is no significant difference in absolute trajectory error among the five SLAM algorithms; second, LeGO-LOAM has the smallest relative pose error across the six sequences; and third, FAST-LIO2 has the best real-time performance. Our experiments are intended to serve as a reference for practitioners selecting a SLAM algorithm.
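
For concreteness, the two trajectory metrics named above can be computed from time-aligned pose sequences as in the following minimal sketch. This is an illustrative outline, not the authors' evaluation code: it assumes both trajectories are already associated frame by frame as 4x4 homogeneous pose matrices, and it omits the trajectory alignment (e.g., Umeyama/SE(3) alignment) that a full ATE evaluation would normally perform first.

import numpy as np

def ate_rmse(gt, est):
    # Absolute trajectory error (ATE): RMSE of the translational difference
    # between corresponding ground-truth and estimated poses.
    diffs = np.array([g[:3, 3] - e[:3, 3] for g, e in zip(gt, est)])
    return float(np.sqrt(np.mean(np.sum(diffs ** 2, axis=1))))

def rpe_rmse(gt, est, delta=1):
    # Relative pose error (RPE): RMSE of the translational error of the
    # relative motion over a frame gap `delta`.
    errs = []
    for i in range(len(gt) - delta):
        rel_gt = np.linalg.inv(gt[i]) @ gt[i + delta]
        rel_est = np.linalg.inv(est[i]) @ est[i + delta]
        err = np.linalg.inv(rel_gt) @ rel_est
        errs.append(np.linalg.norm(err[:3, 3]))
    return float(np.sqrt(np.mean(np.square(errs))))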


Data availability

The KITTI dataset used in this paper is available at https://www.cvlibs.net/datasets/kitti/.
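
For readers reproducing the evaluation: the KITTI odometry benchmark provides ground-truth poses for sequences 00-10 as text files in which each line holds a row-major 3x4 transform. A minimal loader that turns such a file into the 4x4 matrices expected by the metric sketch above might look like the following; the file paths in the usage comment are hypothetical examples.

import numpy as np

def load_kitti_poses(path):
    # Each line of a KITTI odometry pose file stores 12 values forming a
    # row-major 3x4 transform; pad it to a 4x4 homogeneous matrix.
    poses = []
    with open(path) as f:
        for line in f:
            vals = np.array(line.split(), dtype=float)
            T = np.eye(4)
            T[:3, :4] = vals.reshape(3, 4)
            poses.append(T)
    return poses

# Example usage with hypothetical paths:
# gt  = load_kitti_poses("poses/05.txt")
# est = load_kitti_poses("results/05_lego_loam.txt")
# print(ate_rmse(gt, est), rpe_rmse(gt, est, delta=1))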


Funding

This work was funded by the Scientific Research Program of Guangzhou University (Grant No. RP2020115), the Guangdong Basic and Applied Basic Research Foundation (Grant No. 2022A1515110203), and the Science and Technology Program of Guangzhou, China (Grant No. 202201010523).

Author information


Contributions

SH wrote the main manuscript text, JW ran and tested the SLAM programs, YY prepared all figures, and BZ revised the manuscript. All authors reviewed the manuscript.

Corresponding authors

Correspondence to Shihong Huang or Bingzhi Zhang.

Ethics declarations

Competing interests

All the authors declare that they have no conflict of interest.

Ethical approval

Not applicable.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Jiayang Wu and Shihong Huang contributed equally to this work.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Wu, J., Huang, S., Yang, Y. et al. Evaluation of 3D LiDAR SLAM algorithms based on the KITTI dataset. J Supercomput 79, 15760–15772 (2023). https://doi.org/10.1007/s11227-023-05267-3


Keywords

Navigation