Abstract
In this paper we present a direct semi-dense stereo Visual-Inertial Odometry (VIO) algorithm that enables autonomous flight for quadrotor systems with Size, Weight, and Power (SWaP) constraints. The proposed approach is validated through experiments on a 250 g, 22 cm diameter quadrotor equipped with a stereo camera and an IMU. Semi-dense methods offer superior performance in low-texture areas, which are often encountered in robotic tasks such as infrastructure inspection. However, because of the larger number of measurements and the iterative nonlinear optimization, these methods are computationally more expensive. As the scale of the platform shrinks, the computation available on the on-board CPU becomes limited, making autonomous navigation with optimization-based semi-dense tracking a hard problem. We show that our direct semi-dense VIO performs comparably to other state-of-the-art methods while using less CPU than other optimization-based approaches, making it suitable for computationally constrained small platforms. In particular, it uses less CPU than the state-of-the-art semi-dense method VI-Stereo-DSO, owing to a simpler algorithmic framework and a multi-threaded code structure that allow us to run real-time state estimation on an ARM board. On a low-texture dataset collected with our quadrotor platform, we show that the method performs significantly better than sparse methods under the low-texture conditions encountered in indoor navigation. Finally, we demonstrate autonomous flight on a small platform using our direct semi-dense VIO. Supplementary code, low-texture datasets, and videos can be found in our GitHub repository: https://github.com/KumarRobotics/sdd_vio.
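To make the source of the computational cost concrete, the C++ sketch below shows the kind of photometric objective that direct semi-dense trackers minimize: each selected high-gradient pixel, with inverse depth obtained from stereo, is warped into the current frame and its intensity difference accumulated. This is a minimal sketch under assumed types and names (Pinhole, SE3, Point, photometric_cost are all hypothetical), not the implementation from the sdd_vio repository; a real tracker minimizes this cost with iterated Gauss-Newton over an image pyramid and a robust norm. Evaluating and re-linearizing this residual over thousands of semi-dense points at every iteration is what makes such methods expensive on a small on-board CPU.

// Minimal illustrative sketch (not the sdd_vio implementation) of the
// photometric cost at the core of direct semi-dense tracking. All type
// and function names here are hypothetical.
#include <cmath>
#include <cstdio>
#include <vector>

struct Pinhole { double fx, fy, cx, cy; };    // camera intrinsics
struct SE3 { double R[3][3]; double t[3]; };  // rotation + translation

// One semi-dense point: its pixel location in the reference frame, its
// inverse depth (e.g. from stereo matching), and its reference intensity.
struct Point { double u, v, inv_depth, intensity; };

// Bilinear interpolation of a row-major grayscale image; returns -1 if
// the query falls outside the image.
double interp(const std::vector<double>& img, int w, int h, double u, double v) {
    int x = (int)std::floor(u), y = (int)std::floor(v);
    if (x < 0 || y < 0 || x + 1 >= w || y + 1 >= h) return -1.0;
    double a = u - x, b = v - y;
    return (1 - a) * (1 - b) * img[y * w + x] + a * (1 - b) * img[y * w + x + 1]
         + (1 - a) * b * img[(y + 1) * w + x] + a * b * img[(y + 1) * w + x + 1];
}

// Sum of squared photometric residuals for a candidate pose T (reference
// frame to current frame). A real tracker would minimize this with
// iterated Gauss-Newton over an image pyramid and a robust (e.g. Huber)
// norm rather than plain squared errors.
double photometric_cost(const std::vector<Point>& pts, const SE3& T,
                        const Pinhole& K, const std::vector<double>& cur,
                        int w, int h) {
    double cost = 0.0;
    for (const Point& p : pts) {
        // Back-project the reference pixel to a 3D point.
        double z = 1.0 / p.inv_depth;
        double X = (p.u - K.cx) / K.fx * z;
        double Y = (p.v - K.cy) / K.fy * z;
        // Transform into the current frame and project.
        double Xc = T.R[0][0] * X + T.R[0][1] * Y + T.R[0][2] * z + T.t[0];
        double Yc = T.R[1][0] * X + T.R[1][1] * Y + T.R[1][2] * z + T.t[1];
        double Zc = T.R[2][0] * X + T.R[2][1] * Y + T.R[2][2] * z + T.t[2];
        if (Zc <= 0.0) continue;                 // point behind the camera
        double uc = K.fx * Xc / Zc + K.cx;
        double vc = K.fy * Yc / Zc + K.cy;
        double I = interp(cur, w, h, uc, vc);
        if (I < 0.0) continue;                   // projected out of view
        double r = I - p.intensity;              // photometric residual
        cost += r * r;
    }
    return cost;
}

int main() {
    // Tiny synthetic check: under the identity pose each point re-projects
    // onto itself, so the cost should be zero.
    const int w = 8, h = 8;
    std::vector<double> img(w * h);
    for (int i = 0; i < w * h; ++i) img[i] = i;  // simple gradient image
    Pinhole K{4.0, 4.0, 4.0, 4.0};
    SE3 T{{{1, 0, 0}, {0, 1, 0}, {0, 0, 1}}, {0, 0, 0}};
    std::vector<Point> pts = {{3.0, 3.0, 0.5, img[3 * w + 3]}};
    std::printf("cost = %f\n", photometric_cost(pts, T, K, img, w, h));
    return 0;
}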
Notes
The runtime analysis uses a different CPU from the earlier dataset experiments, because the two sets of experiments were conducted at different times.
References
Baker, S., & Matthews, I. (2004). Lucas-Kanade 20 years on: A unifying framework. International Journal of Computer Vision, 56(3), 221–255.
Bloesch, M., Burri, M., Omari, S., Hutter, M., & Siegwart, R. (2017). Iterated extended Kalman filter based visual-inertial odometry using direct photometric feedback. The International Journal of Robotics Research, 36(10), 1053–1072.
Chirikjian, G. S. (2011). Stochastic models, information theory, and Lie groups, volume 2: Analytic methods and modern applications. Berlin: Springer Science & Business Media.
Engel, J., Sturm, J., & Cremers, D. (2013). Semi-dense visual odometry for a monocular camera. In: Proceedings of the IEEE International Conference on Computer Vision (ICCV), pp. 1449–1456.
Engel, J., Stückler, J., & Cremers, D. (2015). Large-scale direct SLAM with stereo cameras. In: 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), IEEE, pp. 1935–1942.
Engel, J., Usenko, V., & Cremers, D. (2016). A photometrically calibrated benchmark for monocular visual odometry. arXiv preprint arXiv:1607.02555
Engel, J., Koltun, V., & Cremers, D. (2018). Direct sparse odometry. IEEE Transactions on Pattern Analysis and Machine Intelligence, 40(3), 611–625.
Forster, C., Carlone, L., Dellaert, F., & Scaramuzza, D. (2017a). On-manifold preintegration for real-time visual-inertial odometry. IEEE Transactions on Robotics, 33(1), 1–21.
Forster, C., Zhang, Z., Gassner, M., Werlberger, M., & Scaramuzza, D. (2017b). SVO: Semidirect visual odometry for monocular and multicamera systems. IEEE Transactions on Robotics, 33(2), 249–265.
Fraundorfer, F., Heng, L., Honegger, D., Lee, G. H., Meier, L., Tanskanen, P., & Pollefeys, M. (2012). Vision-based autonomous mapping and exploration using a quadrotor MAV. In: 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), IEEE, pp. 4557–4564.
Geneva, P., Eckenhoff, K., Lee, W., Yang, Y., & Huang, G. (2020). OpenVINS: A research platform for visual-inertial estimation. In: 2020 IEEE International Conference on Robotics and Automation (ICRA), IEEE, pp. 4666–4672.
Gui, J., Gu, D., Wang, S., & Hu, H. (2015). A review of visual inertial odometry from filtering and optimisation perspectives. Advanced Robotics, 29(20), 1289–1301.
Hartley, R., & Zisserman, A. (2003). Multiple view geometry in computer vision. Cambridge: Cambridge University Press.
Hérissé, B., Hamel, T., Mahony, R., & Russotto, F. X. (2012). Landing a VTOL Unmanned Aerial Vehicle on a moving platform using optical flow. IEEE Transactions on Robotics, 28(1), 77–89.
Irani, M., & Anandan, P. (1999). About direct methods. In: International Workshop on Vision Algorithms, Springer, pp. 267–277.
Klein, G., & Murray, D. (2009). Parallel tracking and mapping on a camera phone. In: 2009 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Orlando, USA, pp. 83–86.
Lee, T., Leok, M., & McClamroch, N. H. (2010). Geometric tracking control of a quadrotor UAV on SE(3). In: 49th IEEE Conference on Decision and Control (CDC), pp. 5420–5425.
Liu, W., Loianno, G., Mohta, K., Daniilidis, K., & Kumar, V. (2018). Semi-dense visual-inertial odometry and mapping for quadrotors with SWaP constraints. In: 2018 IEEE International Conference on Robotics and Automation (ICRA), IEEE, pp. 1–6.
Ma, Y., Soatto, S., Kosecka, J., & Sastry, S. S. (2012). An invitation to 3-d vision: From images to geometric models. Berlin: Springer Science & Business Media.
Meier, L., Tanskanen, P., Fraundorfer, F., & Pollefeys, M. (2011). Pixhawk: A system for autonomous flight using onboard computer vision. In: 2011 IEEE International Conference on Robotics and Automation, IEEE, pp. 2992–2997.
Michael, N., Mellinger, D., Lindsey, Q., & Kumar, V. (2010). The GRASP multiple micro-UAV testbed. IEEE Robotics and Automation Magazine, 17(3), 56–65.
Mourikis, A. I., & Roumeliotis, S. I. (2007). A multi-state constraint Kalman filter for vision-aided inertial navigation. In: 2007 IEEE International Conference on Robotics and Automation (ICRA), IEEE, pp. 3565–3572.
Mur-Artal, R., & Tardós, J. D. (2017). ORB-SLAM2: An open-source SLAM system for monocular, stereo, and RGB-D cameras. IEEE Transactions on Robotics, 33(5), 1255–1262.
Newcombe, R. A., Lovegrove, S. J., & Davison, A. J. (2011). DTAM: Dense tracking and mapping in real-time. In: 2011 International Conference on Computer Vision (ICCV), IEEE, pp. 2320–2327.
Ozaslan, T., Loianno, G., Keller, J., Taylor, C. J., Kumar, V., Wozencraft, J. M., & Hood, T. (2017). Autonomous navigation and mapping for inspection of penstocks and tunnels with MAVs. IEEE Robotics and Automation Letters, 2(3), 1740–1747. https://doi.org/10.1109/LRA.2017.2699790.
Qin, T., Li, P., & Shen, S. (2018). VINS-Mono: A robust and versatile monocular visual-inertial state estimator. IEEE Transactions on Robotics, 34(4), 1004–1020.
Schöps, T., Engel, J., & Cremers, D. (2014). Semi-dense visual odometry for AR on a smartphone. In: 2014 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), IEEE, pp. 145–150.
Shen, S., Mulgaonkar, Y., Michael, N., & Kumar, V. (2014). Multi-sensor fusion for robust autonomous flight in indoor and outdoor environments with a rotorcraft MAV. In: 2014 IEEE International Conference on Robotics and Automation (ICRA), IEEE, pp. 4974–4981.
Sun, K., Mohta, K., Pfrommer, B., Watterson, M., Liu, S., Mulgaonkar, Y., et al. (2018). Robust stereo visual inertial odometry for fast autonomous flight. IEEE Robotics and Automation Letters, 3(2), 965–972.
Thakur, D., Loianno, G., Liu, W., & Kumar, V. (2018). Nuclear environments inspection with micro aerial vehicles: Algorithms and experiments. In: International Symposium on Experimental Robotics, Springer, pp. 191–200.
Tomic, T., Schmid, K., Lutz, P., Domel, A., Kassecker, M., Mair, E., et al. (2012). Toward a fully autonomous UAV: Research platform for indoor and outdoor urban search and rescue. IEEE Robotics and Automation Magazine, 19(3), 46–56. https://doi.org/10.1109/MRA.2012.2206473.
Usenko, V., Engel, J., Stückler, J., & Cremers, D. (2016). Direct visual-inertial odometry with stereo cameras. In: 2016 IEEE International Conference on Robotics and Automation (ICRA), IEEE, pp. 1885–1892.
Usenko, V., Demmel, N., Schubert, D., Stückler, J., & Cremers, D. (2019). Visual-inertial mapping with non-linear factor recovery. IEEE Robotics and Automation Letters, 5(2), 422–429.
Von Stumberg, L., Usenko, V., & Cremers, D. (2018). Direct sparse visual-inertial odometry using dynamic marginalization. In: 2018 IEEE International Conference on Robotics and Automation (ICRA), IEEE, pp. 2510–2517.
Weiss, S., Scaramuzza, D., & Siegwart, R. (2011). Monocular-SLAM-based navigation for autonomous micro helicopters in GPS denied environments. Journal of Field Robotics, 28(6), 854–874.
Weiss, S., Achtelik, M. W., Lynen, S., Chli, M., & Siegwart, R. (2012). Real-time onboard visual-inertial state estimation and self-calibration of MAVs in unknown environments. In: 2012 IEEE International Conference on Robotics and Automation (ICRA), IEEE, pp. 957–964.
Xu, W., Choi, D., & Wang, G. (2018). Direct visual-inertial odometry with semi-dense mapping. Computers & Electrical Engineering, 67, 761–775.
Zhang, Z., Sattler, T., & Scaramuzza, D. (2020). Reference pose generation for visual localization via learned features and view synthesis. arXiv preprint arXiv:2005.05179
Acknowledgements
We gratefully acknowledge the support of ONR grant N00014-20-1-2822, Qualcomm Research, United Technologies, the IoT4Ag Engineering Research Center funded by the National Science Foundation (NSF) under NSF Cooperative Agreement Number EEC-1941529, and C-BRIC, a Semiconductor Research Corporation Joint University Microelectronics Program cosponsored by DARPA.
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
Liu, W., Mohta, K., Loianno, G. et al. Semi-dense visual-inertial odometry and mapping for computationally constrained platforms. Auton Robot 45, 773–787 (2021). https://doi.org/10.1007/s10514-021-10002-z