Online IMU Self-Calibration for Visual-Inertial Systems
Figure 1. Illustration of the visual-inertial system. Several keyframes and inertial measurement unit (IMU) measurements are maintained in a sliding window. When a new keyframe is inserted into the sliding window, a local bundle adjustment (BA) is performed to jointly optimize the camera and IMU states as well as the feature locations.

Figure 2. Illustration of the concept of IMU preintegration. The IMU measures acceleration and angular velocity at discrete times, generally at a much higher frequency than the camera frame rate. Given two consecutive frames b_k and b_{k+1}, there exist several IMU measurements in the time interval [t_k, t_{k+1}]. The IMU measurements between image frames k and k+1 are preintegrated into a single compound measurement. From an optimization perspective, this compound measurement can be seen as an observation from which we construct error terms and adjust the corresponding state values to minimize the errors.

Figure 3. Block diagram of the monocular visual-inertial system (VINS-Mono) [8] on which we implement the proposed online IMU self-calibration method. We add the IMU intrinsic parameters to the state vector and replace the IMU preintegration block with our method, as detailed in Section 3.3 and Section 3.4. The proposed IMU measurement factor (Section 3.5) and IMU intrinsic factor (Section 3.6) are used in the nonlinear optimization block.

Figure 4. IMU intrinsics estimation in corridor 1 (a) and magistrale 1 (b). Solid lines are the estimation results; dotted lines of the same color are the corresponding reference values.

Figure 5. IMU intrinsics estimation in room 1 (a) and slides 1 (b). Solid lines are the estimation results; dotted lines of the same color are the corresponding reference values.

Figure 6. IMU intrinsics estimation with g-sensitivity in corridor 1 (a) and magistrale 1 (b). Solid lines are the estimation results; dotted lines of the same color are the corresponding reference values.

Figure 7. Trajectories of (a) room 3 and (b) room 5.

Figure 8. Overall relative pose error in rooms 1–6. (a) Relative translation error; (b) relative orientation error.

Figure 9. Average processing time per frame.
Abstract
1. Introduction
- An online IMU self-calibration method for visual-inertial systems that runs in real time in unknown environments without any external equipment.
- A formulation of the IMU calibration problem as general factors, so that the proposed method can be easily integrated into other graph-based visual or visual-inertial frameworks.
- An extensive evaluation on a publicly available dataset, showing that online calibration obtains accurate IMU intrinsics estimates and significantly improves the estimation precision of the VINS method compared with estimation using offline precalibrated IMU measurements.
2. Related Work
3. Methodology
3.1. Visual-Inertial System
3.2. IMU Model
3.3. IMU Preintegration
- Compute the Jacobian matrices of the preintegrated measurements with respect to the IMU intrinsic parameters. These Jacobians are used in two ways: (1) during optimization, to evaluate the increments of the optimized states; (2) at the end of each iteration, together with the newly adjusted states, to compute a first-order approximation of the preintegrated measurements so that re-integrating the raw IMU measurements can be avoided.
- Estimate the uncertainty of the preintegrated measurements. All IMU measurements contain random noise. This noise, together with the inaccuracy of the IMU intrinsic parameters, introduces uncertainty into the preintegration results. To obtain more precise optimization results, we estimate this uncertainty in the form of a covariance or information matrix and use it to weight the error terms during optimization. A sketch of the integration step follows this list.
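To make the bookkeeping concrete, the following is a minimal Python sketch of mid-point IMU preintegration between two image frames. It illustrates the concept only, not the implementation described in Section 3.3 and Appendix A.2: the intrinsic correction matrices `Ta`, `Tg` and bias vectors `ba`, `bg` are hypothetical names standing in for the IMU model of Section 3.2, and the noise propagation and Jacobian bookkeeping of Section 3.4 are omitted.

```python
import numpy as np

def quat_mul(q, r):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    w0, x0, y0, z0 = q
    w1, x1, y1, z1 = r
    return np.array([
        w0*w1 - x0*x1 - y0*y1 - z0*z1,
        w0*x1 + x0*w1 + y0*z1 - z0*y1,
        w0*y1 - x0*z1 + y0*w1 + z0*x1,
        w0*z1 + x0*y1 - y0*x1 + z0*w1,
    ])

def quat_rot(q, v):
    """Rotate vector v by unit quaternion q: q (0, v) q*."""
    qv = np.concatenate(([0.0], v))
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
    return quat_mul(quat_mul(q, qv), q_conj)[1:]

def preintegrate(acc, gyr, dt, Ta, Tg, ba, bg):
    """
    Mid-point preintegration of raw IMU samples between frames k and k+1.
    acc, gyr: (N, 3) raw measurements; dt: sample period.
    Ta, Tg: 3x3 intrinsic (scale/misalignment) matrices; ba, bg: biases.
    Returns the compound position, velocity, and rotation measurements,
    expressed in the body frame of frame k (gravity is handled in the
    residual, not here).
    """
    alpha = np.zeros(3)                # preintegrated position term
    beta = np.zeros(3)                 # preintegrated velocity term
    gamma = np.array([1.0, 0.0, 0.0, 0.0])  # preintegrated rotation
    for i in range(len(acc) - 1):
        # correct consecutive raw samples with the intrinsic model
        a0 = Ta @ acc[i] - ba
        a1 = Ta @ acc[i + 1] - ba
        w0 = Tg @ gyr[i] - bg
        w1 = Tg @ gyr[i + 1] - bg
        # rotation update with the mid-point angular rate
        w_mid = 0.5 * (w0 + w1)
        dq = np.concatenate(([1.0], 0.5 * w_mid * dt))
        gamma_next = quat_mul(gamma, dq)
        gamma_next /= np.linalg.norm(gamma_next)
        # mid-point acceleration rotated into the frame-k body frame
        a_mid = 0.5 * (quat_rot(gamma, a0) + quat_rot(gamma_next, a1))
        alpha += beta * dt + 0.5 * a_mid * dt**2
        beta += a_mid * dt
        gamma = gamma_next
    return alpha, beta, gamma
```

The first-order approximation mentioned in the list above then corrects `alpha`, `beta`, and `gamma` via the stored Jacobians whenever the bias and intrinsics estimates change, instead of calling `preintegrate` again on the raw samples.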
3.4. Jacobian and Noise Propagation
3.5. IMU Measurement Factor
3.6. IMU Intrinsic Factor
- One group of parameters is modeled as random-walk processes that are constant between two frames in the sliding window; these parameters are handled with the same strategy as the IMU biases.
- A second group of parameters is assumed to be uncertain but constant between two frames in the sliding window. With this strategy, we set the covariance values in Equation (30) corresponding to these parameters to a constant and optimize them at every frame. See the sketch after this list.
- A third group of parameters is assumed to be uncertain but constant over the timespan of the sliding window.
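As a rough illustration, the sketch below expresses the second strategy, and the bias-like random-walk strategy for comparison, as weighted residuals in a least-squares problem. All names are illustrative rather than taken from the paper's implementation, and `sigma` stands in for the corresponding covariance entries of Equation (30).

```python
import numpy as np

def constant_param_residual(x_k, x_k1, sigma):
    """
    Residual of an intrinsic factor linking the parameter vector at
    frame k (x_k) and frame k+1 (x_k1). The parameters are assumed
    constant between frames, so any change is penalized, weighted by a
    constant standard deviation sigma (per parameter).
    """
    return (x_k1 - x_k) / sigma

def random_walk_residual(x_k, x_k1, sigma_rw, dt):
    """
    The same idea for the random-walk strategy used for the bias-like
    parameters: the allowed drift grows with the elapsed time dt
    between the two frames, as in the IMU bias random-walk factor.
    """
    return (x_k1 - x_k) / (sigma_rw * np.sqrt(dt))
```

In a solver such as Ceres, each such residual would be added as a cost term linking the two parameter blocks of consecutive frames in the sliding window.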
3.7. Discussion and Implementation Details
4. Experimental Results
4.1. IMU Intrinsics Estimation Results
4.1.1. IMU Intrinsics Estimation without G-Sensitivity
4.1.2. IMU Intrinsics Estimation with G-Sensitivity
4.2. Quantitative Evaluation
4.3. Runtime Evaluation
5. Conclusions
Author Contributions
Funding
Conflicts of Interest
Abbreviations
IMU | Inertial measurement unit
MEMS | Microelectromechanical systems
AHRS | Attitude and heading reference system
Appendix A. Quaternion-Based IMU Preintegration
Appendix A.1. The Error-State Kinematics
Appendix A.2. Mid-Point Integration for the IMU Preintegration
References
- Forster, C.; Pizzoli, M.; Scaramuzza, D. SVO: Fast semi-direct monocular visual odometry. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–7 June 2014; pp. 15–22.
- Engel, J.; Koltun, V.; Cremers, D. Direct Sparse Odometry. IEEE Trans. Pattern Anal. Mach. Intell. 2018, 40, 611–625.
- Howard, A. Real-time stereo visual odometry for autonomous ground vehicles. In Proceedings of the 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, Nice, France, 22–26 September 2008; pp. 3946–3952.
- Wang, R.; Schworer, M.; Cremers, D. Stereo DSO: Large-Scale Direct Sparse Visual Odometry with Stereo Cameras. In Proceedings of the 2017 IEEE International Conference on Computer Vision (ICCV), Venice, Italy, 22–29 October 2017; pp. 3923–3931.
- Kerl, C.; Sturm, J.; Cremers, D. Dense visual SLAM for RGB-D cameras. In Proceedings of the IEEE International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013; pp. 2100–2106.
- Kerl, C.; Sturm, J.; Cremers, D. Robust odometry estimation for RGB-D cameras. In Proceedings of the 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013; pp. 3748–3754.
- Leutenegger, S.; Lynen, S.; Bosse, M.; Siegwart, R.; Furgale, P. Keyframe-based visual-inertial odometry using nonlinear optimization. Int. J. Robot. Res. 2015, 34, 314–334.
- Qin, T.; Li, P.; Shen, S. VINS-Mono: A Robust and Versatile Monocular Visual-Inertial State Estimator. IEEE Trans. Robot. 2018, 34, 1004–1020.
- Mourikis, A.I.; Roumeliotis, S.I. A Multi-State Constraint Kalman Filter for Vision-aided Inertial Navigation. In Proceedings of the 2007 IEEE International Conference on Robotics and Automation, Roma, Italy, 10–14 April 2007; pp. 3565–3572.
- Li, M.; Mourikis, A.I. High-precision, consistent EKF-based visual-inertial odometry. Int. J. Robot. Res. 2013, 32, 690–711.
- Weiss, S.; Siegwart, R. Real-time metric state estimation for modular vision-inertial systems. In Proceedings of the 2011 IEEE International Conference on Robotics and Automation, Shanghai, China, 9–13 May 2011; pp. 4531–4537.
- Lynen, S.; Achtelik, M.W.; Weiss, S.; Chli, M.; Siegwart, R. A robust and modular multi-sensor fusion approach applied to MAV navigation. In Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013; pp. 3923–3929.
- Li, M.; Yu, H.; Zheng, X.; Mourikis, A.I. High-fidelity sensor modeling and self-calibration in vision-aided inertial navigation. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–7 June 2014; pp. 409–416.
- Wu, K.; Ahmed, A.; Georgiou, G.; Roumeliotis, S. A Square Root Inverse Filter for Efficient Vision-aided Inertial Navigation on Mobile Devices. In Proceedings of the Robotics: Science and Systems XI, Rome, Italy, 13–17 July 2015.
- Paul, M.K.; Wu, K.; Hesch, J.A.; Nerurkar, E.D.; Roumeliotis, S.I. A comparative analysis of tightly-coupled monocular, binocular, and stereo VINS. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 165–172.
- Bloesch, M.; Omari, S.; Hutter, M.; Siegwart, R. Robust visual inertial odometry using a direct EKF-based approach. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–2 October 2015; pp. 298–304.
- Leutenegger, S.; Furgale, P.; Rabaud, V.; Chli, M.; Konolige, K.; Siegwart, R. Keyframe-Based Visual-Inertial SLAM using Nonlinear Optimization. In Proceedings of the Robotics: Science and Systems IX, Berlin, Germany, 24–28 June 2013.
- Mur-Artal, R.; Tardos, J.D. Visual-Inertial Monocular SLAM With Map Reuse. IEEE Robot. Autom. Lett. 2017, 2, 796–803.
- Usenko, V.; Engel, J.; Stuckler, J.; Cremers, D. Direct visual-inertial odometry with stereo cameras. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 1885–1892.
- Kasyanov, A.; Engelmann, F.; Stuckler, J.; Leibe, B. Keyframe-based visual-inertial online SLAM with relocalization. In Proceedings of the IEEE International Conference on Intelligent Robots and Systems, Vancouver, BC, Canada, 24–28 September 2017; pp. 6662–6669.
- Von Stumberg, L.; Usenko, V.; Cremers, D. Direct Sparse Visual-Inertial Odometry Using Dynamic Marginalization. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, QLD, Australia, 21–25 May 2018; pp. 2510–2517.
- Titterton, D.; Weston, J. Strapdown Inertial Navigation Technology, 2nd ed.; Institution of Engineering and Technology: Stevenage, UK, 2004.
- Tedaldi, D.; Pretto, A.; Menegatti, E. A robust and easy to implement method for IMU calibration without external equipments. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–7 June 2014; pp. 3042–3049.
- Rogers, R.M. Applied Mathematics in Integrated Navigation Systems, 2nd ed.; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2007.
- Chatfield, A.B. Fundamentals of High Accuracy Inertial Navigation; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 1997.
- Hall, J.J.; Williams, R.L. Case study: Inertial measurement unit calibration platform. J. Field Robot. 2000, 17, 623–632.
- Syed, Z.F.; Aggarwal, P.; Goodall, C.; Niu, X.; Elsheimy, N. A new multi-position calibration method for MEMS inertial navigation systems. Meas. Sci. Technol. 2007, 18, 1897–1907.
- Kim, A.; Golnaraghi, M.F. Initial calibration of an inertial measurement unit using an optical position tracking system. In Proceedings of the Position Location and Navigation Symposium, Monterey, CA, USA, 26–29 April 2004.
- Nebot, E.; Durrant-Whyte, H. Initial Calibration and Alignment of Low-Cost Inertial Navigation Units for Land Vehicle Applications. J. Robot. Syst. 1999, 16, 81–92.
- Lotters, J.C.; Schipper, J.; Veltink, P.H.; Olthuis, W.; Bergveld, P. Procedure for in-use calibration of triaxial accelerometers in medical applications. Sens. Actuators A Phys. 1998, 68, 221–228.
- Fong, W.T.; Ong, S.K.; Nee, A.Y.C. Methods for in-field user calibration of an inertial measurement unit without external equipment. Meas. Sci. Technol. 2008, 19, 085202.
- Hwangbo, M.; Kim, J.S.; Kanade, T. IMU Self-Calibration Using Factorization. IEEE Trans. Robot. 2013, 29, 493–507.
- Rehder, J.; Nikolic, J.; Schneider, T.; Hinzmann, T.; Siegwart, R. Extending kalibr: Calibrating the extrinsics of multiple IMUs and of individual axes. In Proceedings of the IEEE International Conference on Robotics and Automation, Stockholm, Sweden, 16–21 May 2016; pp. 4304–4311.
- Furgale, P.; Barfoot, T.D.; Sibley, G. Continuous-time batch estimation using temporal basis functions. In Proceedings of the 2012 IEEE International Conference on Robotics and Automation, Saint Paul, MN, USA, 14–18 May 2012; pp. 2088–2095.
- Lupton, T.; Sukkarieh, S. Visual-inertial-aided navigation for high-dynamic motion in built environments without initial conditions. IEEE Trans. Robot. 2012, 28, 61–76.
- Shen, S.; Michael, N.; Kumar, V. Tightly-coupled monocular visual-inertial fusion for autonomous flight of rotorcraft MAVs. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015; pp. 5303–5310.
- Forster, C.; Carlone, L.; Dellaert, F.; Scaramuzza, D. IMU Preintegration on Manifold for Efficient Visual-Inertial Maximum-a-Posteriori Estimation. In Proceedings of the Robotics: Science and Systems XI, Rome, Italy, 13–17 July 2015.
- Forster, C.; Carlone, L.; Dellaert, F.; Scaramuzza, D. On-Manifold Preintegration for Real-Time Visual–Inertial Odometry. IEEE Trans. Robot. 2017, 33, 1–21.
- Qin, T.; Shen, S. Robust initialization of monocular visual-inertial estimation on aerial robots. In Proceedings of the IEEE International Conference on Intelligent Robots and Systems, Vancouver, BC, Canada, 24–28 September 2017; pp. 4225–4232.
- Madyastha, V.; Ravindra, V.; Mallikarjunan, S.; Goyal, A. Extended Kalman Filter vs. Error State Kalman Filter for Aircraft Attitude Estimation. In Proceedings of the AIAA Guidance, Navigation, and Control Conference, Portland, OR, USA, 8–11 August 2011; pp. 6615–6638.
- Sola, J. Quaternion kinematics for the error-state Kalman filter. arXiv 2017, arXiv:1711.02508.
- Shi, J.; Tomasi, C. Good features to track. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR-94), Seattle, WA, USA, 21–23 June 1994; pp. 593–600.
- Lucas, B.D.; Kanade, T. An iterative image registration technique with an application to stereo vision. In Proceedings of the 7th International Joint Conference on Artificial Intelligence (IJCAI’81), Vancouver, BC, Canada, 24–28 August 1981; pp. 674–679.
- Agarwal, S.; Mierle, K. Ceres Solver. Available online: http://ceres-solver.org (accessed on 2 November 2018).
- Carlevaris-Bianco, N.; Ushani, A.K.; Eustice, R.M. University of Michigan North Campus long-term vision and lidar dataset. Int. J. Robot. Res. 2016, 35, 1023–1035.
- Burri, M.; Nikolic, J.; Gohl, P.; Schneider, T.; Rehder, J.; Omari, S.; Achtelik, M.W.; Siegwart, R. The EuRoC micro aerial vehicle datasets. Int. J. Robot. Res. 2016, 35, 1157–1163.
- Pfrommer, B.; Sanket, N.; Daniilidis, K.; Cleveland, J. PennCOSYVIO: A challenging Visual Inertial Odometry benchmark. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 3847–3854.
- Majdik, A.; Till, C.; Scaramuzza, D. The Zurich urban micro aerial vehicle dataset. Int. J. Robot. Res. 2017, 36, 269–273.
- Schubert, D.; Goll, T.; Demmel, N.; Usenko, V.; Stuckler, J.; Cremers, D. The TUM VI Benchmark for Evaluating Visual-Inertial Odometry. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 1680–1687.
- Mur-Artal, R.; Montiel, J.M.M.; Tardos, J.D. ORB-SLAM: A Versatile and Accurate Monocular SLAM System. IEEE Trans. Robot. 2015, 31, 1147–1163.
- Mur-Artal, R.; Tardos, J.D. ORB-SLAM2: An Open-Source SLAM System for Monocular, Stereo, and RGB-D Cameras. IEEE Trans. Robot. 2017, 33, 1255–1262.
- Mu, X.; Chen, J.; Zhou, Z.; Leng, Z.; Fan, L. Accurate Initial State Estimation in a Monocular Visual-Inertial SLAM System. Sensors 2018, 18, 506.
- Sturm, J.; Engelhard, N.; Endres, F.; Burgard, W.; Cremers, D. A benchmark for the evaluation of RGB-D SLAM systems. In Proceedings of the 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, Vilamoura, Algarve, Portugal, 7–12 October 2012; pp. 573–580.
- Zhang, Z.; Scaramuzza, D. A Tutorial on Quantitative Trajectory Evaluation for Visual(-Inertial) Odometry. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018.
| | | | | | | |
|---|---|---|---|---|---|---|
| Reference | 1.0042 | 1.0014 | 0.9705 | | | |
| Mean | 1.0060 | 1.0040 | 0.0006 | 0.9700 | | |
| Standard deviations | 0.0017 | 0.0024 | 0.0020 | 0.0010 | 0.0028 | 0.0021 |
| \|Mean-reference\| | 0.0018 | 0.0018 | 0.0026 | 0.0008 | 0.0016 | 0.0004 |
| | | | | | | | | | |
|---|---|---|---|---|---|---|---|---|---|
| Reference | 0.9436 | 0.0015 | 0.0008 | 0.0004 | 1.0941 | 0.0083 | 1.0159 | | |
| Mean | 0.9416 | 0.0023 | 0.0009 | 1.0919 | 0.0064 | 1.0160 | | | |
| Standard deviations | 0.0009 | 0.0009 | 0.0004 | 0.0006 | 0.0018 | 0.0022 | 0.0004 | 0.0030 | 0.0012 |
| \|Mean-reference\| | 0.0020 | 0.0008 | 0.0000 | 0.0009 | 0.0022 | 0.0015 | 0.0003 | 0.0019 | 0.0002 |
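The rows of the two tables above can be reproduced from a set of final per-run estimates as follows. This is a trivial sketch with made-up numbers; `estimates` and `reference` are hypothetical arrays standing in for the actual per-run values and the offline reference calibration.

```python
import numpy as np

# Hypothetical final estimates from several runs (rows) for two of the
# intrinsic parameters (columns), plus the offline reference values.
estimates = np.array([[1.0060, 1.0040],
                      [1.0043, 1.0056]])
reference = np.array([1.0042, 1.0014])

mean = estimates.mean(axis=0)        # "Mean" row
std = estimates.std(axis=0)          # "Standard deviations" row
abs_err = np.abs(mean - reference)   # "|Mean-reference|" row
```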
Sequence | VINS | Ours | Ours with g-Sensitivity | Length (m)
---|---|---|---|---
corridor1 | 0.618 | 0.453 ↓ | 0.851 | 305 |
corridor2 | 1.174 | 1.157 ↓ | 0.936 ↓ | 322 |
corridor3 | 1.309 | 0.797 ↓ | 0.447 ↓ | 300 |
corridor4 | 0.309 | 0.195 ↓ | 0.136 ↓ | 114 |
corridor5 | 0.674 | 0.547 ↓ | 0.647 ↓ | 270 |
magistrale1 | 2.385 | 2.043 ↓ | 1.072 ↓ | 918 |
magistrale2 | 3.205 | 3.105 ↓ | 3.132 ↓ | 561 |
magistrale3 | 0.358 | 0.521 | 1.035 | 566 |
magistrale4 | 4.443 | 3.919 ↓ | 4.223 ↓ | 688 |
magistrale5 | 0.542 | 0.603 | 0.738 | 458 |
magistrale6 | 2.078 | 2.825 | 1.210 ↓ | 771 |
room1 | 0.089 | 0.095 | 0.094 | 146 |
room2 | 0.048 | 0.051 | 0.049 | 142 |
room3 | 0.145 | 0.076 ↓ | 0.084 ↓ | 135 |
room4 | 0.042 | 0.048 | 0.052 | 68 |
room5 | 0.196 | 0.049 ↓ | 0.043 ↓ | 131 |
room6 | 0.059 | 0.057 ↓ | 0.048 ↓ | 67 |
slides1 | 0.517 | 0.273 ↓ | 0.193 ↓ | 289 |
slides2 | 1.007 | 1.047 | 1.209 | 299 |
slides3 | 1.005 | 0.781 ↓ | 0.764 ↓ | 383 |
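Assuming the per-sequence numbers above are root-mean-square trajectory errors in meters between time-associated, already-aligned estimated and ground-truth positions, a minimal sketch of such a computation is:

```python
import numpy as np

def rmse_ate(est_xyz, gt_xyz):
    """RMSE of the absolute trajectory error between time-associated,
    already-aligned positions, each of shape (N, 3)."""
    err = est_xyz - gt_xyz
    return float(np.sqrt(np.mean(np.sum(err**2, axis=1))))
```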
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).