A Lightweight and Robust Point-Line Monocular Visual Inertial Wheel Odometry
Authors: Zhixin Zhang, Wenzhi Bai, Liang Zhao and Pawel Ladosz
We use the KAIST Complex Urban Dataset to test our algorithm.
Examples on KAIST Urban27:

Dependencies:
- OpenCV 4.2
- Eigen 3
- ROS Noetic
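If these are not installed yet, one possible way to set them up on Ubuntu 20.04 is via apt; the package choices below are assumptions, and any OpenCV 4.x / Eigen 3 installation that ROS Noetic can find should also work.
sudo apt install ros-noetic-desktop-full libopencv-dev libeigen3-dev python3-catkin-tools  # assumes the ROS apt repository is configured; libopencv-dev on Ubuntu 20.04 provides OpenCV 4.2, python3-catkin-tools provides catkin build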
First, clone and build MINS.
mkdir -p $MINS_WORKSPACE/catkin_ws/src/ && cd $MINS_WORKSPACE/catkin_ws/src/
git clone https://github.com/rpng/MINS
cd .. && catkin build
source devel/setup.bash
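Optionally, check that the workspace built and is on your ROS package path; the package name mins is an assumption based on the MINS repository, so adjust it if your build names the package differently.
rospack find mins  # "mins" package name is assumed; should print the package path if the build and source succeeded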
Then clone PL-VIWO into the src folder and replace the open-vins folder in MINS/thirdparty with the one from PL-VIWO.
cd $MINS_WORKSPACE/catkin_ws/src/
git clone https://github.com/Happy-ZZX/PL-VIWO.git
rm -rf $MINS_WORKSPACE/catkin_ws/src/MINS/thirdparty/open-vins
cp -r $MINS_WORKSPACE/catkin_ws/src/PL-VIWO/open-vins $MINS_WORKSPACE/catkin_ws/src/MINS/thirdparty/
rm -rf $MINS_WORKSPACE/catkin_ws/src/PL-VIWO/open-vins
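Before recompiling, you can verify that the replacement is in place; the path follows the commands above.
ls $MINS_WORKSPACE/catkin_ws/src/MINS/thirdparty/open-vins  # should list the open-vins sources copied from PL-VIWO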
Compile the project again
cd .. && catkin build
source devel/setup.bash
Note❗: This system is based on a monocular setup; please set the camera number to 1 in the config file.
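A quick way to check this is to grep the config files; the config subfolder and the key names below (max_cameras / max_n, following the OpenVINS/MINS convention) are assumptions, so adapt them to your config layout.
grep -rnE "max_cameras|max_n" $(rospack find viwo)/config/  # key names and config path are assumptions; the camera-count value should be 1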
roslaunch viwo rosbag.launch config:=kaist/kaist_C path_gt:=urban26.txt path_bag:=urban26.bag
rviz -d PL-VIWO/launch/display.rviz
For the rosbag files and ground truths used for testing, please refer to MINS.
For the benchmarks used in our paper, we also open-source the modified versions for the convenience of the community. The differences are the added KAIST dataset config files and the disabling of some functions, such as loop closure and re-localization. The code is coming soon.
This project was built on top of the following works:
- OpenVINS: Open-source filter-based visual-inertial estimator.
- MINS: An efficient, robust, and tightly-coupled Multisensor-aided Inertial Navigation System (MINS).
Thanks to the Robot Perception & Navigation Group (RPNG) for their wonderful work and open-source code.