SuperOdometry: Lightweight LiDAR-inertial Odometry and Mapping


Super Odometry Pipeline

🔥 This is a slim version of Super Odometry, containing the LiDAR odometry and IMU odometry components. The LiDAR odometry only provides pose constraints to the IMU odometry module, which uses them to estimate IMU biases; in return, the IMU odometry module offers pose predictions to the LiDAR odometry module as an initial guess for ICP optimization.

Super Odometry Pipeline

🔥 The system has been widely tested on the platforms shown above, equipped with Livox, Velodyne, and Ouster LiDARs.

🔥 1. Key Features

  • Multi-LiDAR Support
    • Compatible with Livox, Velodyne, and Ouster sensors
  • LiDAR-inertial Fusion
    • Fuses LiDAR and IMU measurements for robust state estimation
  • Dual-Mode Operation
    • Supports both localization and mapping modes
  • Alignment Risk Prediction
    • Provides alignment risk prediction for ICP algorithms
  • Degeneracy Awareness
    • Robust detection of environmental degeneracy
  • ROS 2 Integration
    • Built on ROS 2 Humble for modern robotics development

Super Odometry Pipeline

Alignment Risk Prediction

🔥 6-DOF degeneracy uncertainty detection. We support visualization in both RVIZ and Rerun.

📦 3. Installation

We highly recommend using our Docker files (steps 4 and 5) to run the code.

System Requirements

Dependencies Installation

Install Sophus

git clone https://github.com/strasdat/Sophus.git
cd Sophus && git checkout 97e7161
mkdir build && cd build
cmake .. -DBUILD_TESTS=OFF
make -j8 && sudo make install

Install GTSAM

git clone https://github.com/borglab/gtsam.git
cd gtsam && git checkout 4abef92
mkdir build && cd build
cmake \
  -DGTSAM_USE_SYSTEM_EIGEN=ON \
  -DGTSAM_BUILD_WITH_MARCH_NATIVE=OFF \
  ..
make -j6 && sudo make install

Install Ceres

git clone https://github.com/ceres-solver/ceres-solver.git
cd ceres-solver
git checkout f68321e7de8929fbcdb95dd42877531e64f72f66
mkdir build
cd build
cmake ..
make -j8  # Use number of cores you have, e.g., -j8 for 8 cores
sudo make install
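
If the freshly installed libraries are not found when building or running later steps, refreshing the dynamic linker cache often helps; this is a general Linux tip, not specific to this package:

# Refresh the shared-library cache after installing to /usr/local
sudo ldconfig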

Install Rerun

pip install rerun-sdk
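
To confirm the installation, you can query the Rerun viewer CLI that ships with the pip package (a quick sanity check):

# Print the installed Rerun version
rerun --version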

🐳 4. Docker Setup

Prerequisites

Building Docker Image

cd ros2_humble_docker
docker build -t superodom-ros2:latest .
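
As a quick sanity check, confirm the image exists before moving on:

# Should list superodom-ros2:latest
docker images superodom-ros2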

Workspace Structure

First create your own local ROS2 workspace and clone SuperOdom:

mkdir -p ~/ros2_ws/src
cd ~/ros2_ws/src
git clone https://github.com/superxslam/SuperOdom

Clone the respective repos and ensure they follow this exact structure under ros2_ws/src:

ros2_ws/src
β”œβ”€β”€ SuperOdom
β”œβ”€β”€ livox_ros_driver2
└── rviz_2d_overlay_plugins

You can clone livox_ros_driver2 and rviz_2d_overlay_plugins as shown below.
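
A minimal sketch of the clone commands, assuming the commonly used upstream repositories Livox-SDK/livox_ros_driver2 and teamspatzenhirn/rviz_2d_overlay_plugins; verify these match the versions the authors intend:

cd ~/ros2_ws/src
# Assumed upstream sources for the two dependencies
git clone https://github.com/Livox-SDK/livox_ros_driver2.git
git clone https://github.com/teamspatzenhirn/rviz_2d_overlay_plugins.git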

Important: Maintain this exact structure within ros2_ws/src

Docker Container Setup

# Allow Docker GUI access
xhost +local:docker

Open ros2_humble_docker/container_run.sh and make sure you set the exact directory paths for PROJECT_DIR and DATASET_DIR:

PROJECT_DIR="/path/to/your/superodom"
DATASET_DIR="/path/to/your/dataset"

Important: PROJECT_DIR must point to the exact path of your ros2_ws/src
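
For example, assuming the workspace layout above and datasets stored under ~/data (illustrative values; adjust to your machine):

# Example values; adjust to your setup
PROJECT_DIR="$HOME/ros2_ws/src"
DATASET_DIR="$HOME/data"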

Then launch the Docker container:

# Make the launch script executable
cd ros2_humble_docker
chmod +x container_run.sh

# Start container
./container_run.sh superodom-ros2 superodom-ros2:latest

# Source ROS2
source /opt/ros/humble/setup.bash

Important: To open an additional shell inside the container, run docker exec --privileged -it superodom-ros2 /bin/bash in a new bash window

Build the workspace within the container:

cd ~/ros2_ws/src/livox_ros_driver2
./build.sh humble 
cd ~/ros2_ws
colcon build

Important: make sure you build livox_ros_driver2 first
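
When iterating on the odometry code later, you can rebuild just the super_odometry package instead of the whole workspace; this is a standard colcon workflow, offered here as a convenience:

cd ~/ros2_ws
# Rebuild only the super_odometry package, then re-source the overlay
colcon build --packages-select super_odometry
source install/setup.bash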

🚀 5. Launch SuperOdometry

To launch SuperOdometry, we provide demo datasets for the Livox Mid-360, VLP-16, and OS1-128 sensors (Download Link).

For more challenging datasets, feel free to download slam_mode and localization_mode from our website. You might want to convert ROS1 bags into ROS2 format using this link.

For user-defined topic names, modify super_odometry/config/$(YOUR_LiDAR_SENSOR).yaml:

imu_topic: "/your/imu/topic"
laser_topic: "/your/laser/topic"
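
If you are unsure which topic names your recording uses, you can inspect the bag with standard rosbag2 tooling:

# List topic names and message counts in your recording
ros2 bag info $(YOUR_ROS2_DATASET)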

For user-defined laser-imu extrinsics, modify super_odometry/config/$(YOUR_LiDAR_SENSOR)/$(YOUR_LiDAR_SENSOR)_calibration.yaml:

#Rotation from laser frame to imu frame, imu^R_laser
extrinsicRotation_imu_laser: !!opencv-matrix
  rows: 3
  cols: 3
  dt: d  
  data: [1., 0., 0.,
        0., 1., 0.,
        0., 0., 1.]

#Translation from laser frame to imu frame, imu^T_laser
extrinsicTranslation_imu_laser: !!opencv-matrix
  rows: 3
  cols: 1
  dt: d
  data: [-0.011, -0.02329, 0.04412]
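
For reference, these extrinsics map laser-frame points into the IMU frame; with the identity rotation above, the transform reduces to the translation offset:

p_imu = imu^R_laser * p_laser + imu^T_laser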

Run SuperOdometry using the launch file that matches your sensor:

source install/setup.bash
ros2 launch super_odometry livox_mid360.launch.py
ros2 launch super_odometry os1_128.launch.py
ros2 launch super_odometry vlp_16.launch.py

Play your ROS2 dataset:

# launch this in a new bash window
docker exec --privileged -it superodom-ros2 /bin/bash
source install/setup.bash
cd ~/data
ros2 bag play $(YOUR_ROS2_DATASET)
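
Optionally, standard rosbag2 flags let you loop or slow down playback while testing:

# Loop playback at half speed (optional)
ros2 bag play $(YOUR_ROS2_DATASET) --loop --rate 0.5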

Visualize in RVIZ2:

# launch this in a new bash window
docker exec --privileged -it superodom-ros2 /bin/bash
source install/setup.bash
cd ~/ros2_ws/src/SuperOdom/super_odometry
rviz2 -d ros2.rviz

(⭐ Alternative) Visualize in Rerun:

# launch this in a new bash window
docker exec --privileged -it superodom-ros2 /bin/bash
source install/setup.bash
cd ~/ros2_ws/src/SuperOdom/script/visualizers
python3 rerun_visualizer.py
# Open a new bash window on your local device
rerun

We also provide a tmux script for easy launch with a dataset (this script only works after you have built the workspace in Docker):

cd script
tmuxp load run.yaml
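
The script drives tmux through tmuxp; if tmuxp is not already available, it can be installed via pip:

# Install the tmux session manager used by run.yaml
pip install tmuxp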

πŸ“ Localization Mode Configuration


🔥 The localization mode allows you to localize your robot by providing an initial pose and a ground-truth map.

Update your super_odometry/config/$(YOUR_LiDAR_SENSOR).yaml configuration file with:

localization_mode: true         # If true, localization mode is enabled; otherwise, SLAM mode is used
read_pose_file: false           # Set to true to read initial pose from a txt file
init_x: 0.0                     # Initial X position for localization
init_y: 0.0                     # Initial Y position for localization
init_z: 0.0                     # Initial Z position for localization
init_roll: 0.0                  # Initial roll angle
init_pitch: 0.0                 # Initial pitch angle
init_yaw: 0.0                   # Initial yaw angle

Add the ground-truth map in the launch file:

parameters=[
    LaunchConfiguration("config_file"),
    {
        "calibration_file": LaunchConfiguration("calibration_file"),
        # Relative path: os.path.join ignores home_directory if this starts with "/"
        "map_dir": os.path.join(home_directory, "path/to/your/pcd"),
    },
]

To quickly launch our localization module, feel free to try out this demo dataset using the default initial pose configuration.

📚 8. Citations

@inproceedings{zhao2021super,
  title={Super odometry: IMU-centric LiDAR-visual-inertial estimator for challenging environments},
  author={Zhao, Shibo and Zhang, Hengrui and Wang, Peng and Nogueira, Lucas and Scherer, Sebastian},
  booktitle={2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  pages={8729--8736},
  year={2021},
  organization={IEEE}
}

@inproceedings{zhao2025superloc,
  title={SuperLoc: The Key to Robust LiDAR-Inertial Localization Lies in Predicting Alignment Risks},
  author={Zhao, Shibo and Zhu, Honghao and Gao, Yuanjun and Kim, Beomsoo and Qiu, Yuheng and Johnson, Aaron M. and Scherer, Sebastian},
  booktitle={2025 IEEE International Conference on Robotics and Automation (ICRA)},
  year={2025},
  url={https://arxiv.org/abs/2412.02901}
}

9. Next Plan

🔵 Colorized Point Cloud Visualization (Video Demo)

🟢 Visual Odometry Module (Initial Release): a lightweight and robust visual odometry module integrated into SuperOdometry.

πŸ“ 10. License

This package is released under the GPLv3 license. For commercial use, please contact shiboz@andrew.cmu.edu and Prof. Sebastian Scherer.

πŸ™ 11. Acknowledgements

Special thanks to Professor Ji Zhang, Professor Michael Kaess, Parv Maheshwari, Yuanjun Gao, and Yaoyu Hu for their valuable advice. Thanks to Omar Alama for providing Rerun support. We also acknowledge these foundational works:

  • LOAM: Lidar Odometry and Mapping in Real-time (RSS 2014)
  • GTSAM: Georgia Tech Smoothing and Mapping Library
  • FAST-LIO and LIO-SAM
