Abstract
The combination of visual and inertial sensors for state estimation has recently received wide attention in the robotics community, and particularly in aerial robotics, owing to the light weight and complementary nature of the two sensing modalities. However, most state estimation systems based on visual-inertial sensing impose severe processing requirements, which in many cases make them impractical. In this paper, we propose a simple, low-cost and high-rate method for state estimation with a low computational burden that enables autonomous flight of micro aerial vehicles. The proposed state estimator fuses observations from an inertial measurement unit, an optical-flow smart camera and a time-of-flight range sensor. The smart camera provides optical-flow measurements at rates of up to 200 Hz, relieving the main processor of the computational bottleneck that image processing would otherwise impose. To the best of our knowledge, this is the first work to extend the use of these smart cameras from hovering-like motions to odometry estimation, producing estimates that remain usable over flight times of several minutes. In order to validate and defend the simplest algorithmic solution, we investigate the performance of two Kalman filters, in the extended and error-state flavors, along with a large number of algorithm modifications defended in earlier literature on visual-inertial odometry, and show that their impact on filter performance is minimal. To close the control loop, a non-linear controller operating on the special Euclidean group SE(3) drives the quadrotor platform in 3D space based on the estimated vehicle state, guaranteeing asymptotic stability of the 3D position and heading. All estimation and control tasks are solved on board and in real time on a limited computational unit. The proposed approach is validated through simulations and experimental results, including comparisons with ground-truth data provided by a motion capture system. For the benefit of the community, we make the source code public.
Notes
Note that in the EKF the orientation error is additive, so this distinction is irrelevant.
References
Bar-Shalom, Y., Li, X. R., & Kirubarajan, T. (2004). Estimation with applications to tracking and navigation: Theory algorithms and software. Hoboken: Wiley.
Blösch, M., Omari, S., Fankhauser, P., Sommer, H., Gehring, C., Hwangbo, J., Hoepflinger, M., Hutter, M., & Siegwart, R. (2014). Fusion of optical flow and inertial measurements for robust egomotion estimation. In Proceedings of the IEEE/RSJ international conference on intelligent robots and system (pp. 3102–3107). Chicago.
Blösch, M., Weiss, S., Scaramuzza, D., & Siegwart, R. (2010). Vision based MAV navigation in unknown and unstructured environments. In Proceedings of the IEEE international conference on robotics and automation (pp. 21–28). Anchorage.
Bullo, F., & Lewis, A. (2004). Geometric control of mechanical systems. Berlin: Springer.
Faessler, M., Fontana, F., Forster, C., Mueggler, E., Pizzoli, M., & Scaramuzza, D. (2016). Autonomous, vision-based flight and live dense 3D mapping with a quadrotor micro aerial vehicle. Journal of Field Robotics, 33(4), 431–450.
Fliess, M., Lévine, J., Martin, P., & Rouchon, P. (1995). Flatness and defect of non-linear systems: Introductory theory and examples. International Journal of Control, 61(6), 1327–1361.
Forster, C., Pizzoli, M., & Scaramuzza, D. (2014). SVO: Fast semi-direct monocular visual odometry. In Proceedings of the IEEE international conference on robotics and automation (pp. 15–22). Hong Kong.
Forte, F., Naldi, R., & Marconi, L. (2012). Impedance control of an aerial manipulator. In Proceedings of the American control conference (pp. 3839–3844). Montreal.
Fraundorfer, F., Heng, L., Honegger, D., Lee, G. H., Meier, L., Tanskanen, P., & Pollefeys, M. (2012). Vision-based autonomous mapping and exploration using a quadrotor MAV. In Proceedings of the IEEE/RSJ international conference on intelligent robots and system (pp. 4557–4564). Vilamoura.
Hérissé, B., Hamel, T., Mahony, R., & Russotto, F. X. (2012). Landing a VTOL unmanned aerial vehicle on a moving platform using optical flow. IEEE Transactions on Robotics, 28(1), 77–89.
Hesch, J. A., Kottas, D. G., Bowman, S. L., & Roumeliotis, S. I. (2014). Camera-IMU-based localization: Observability analysis and consistency improvement. The International Journal of Robotics Research, 33(1), 182–201.
Honegger, D., Lorenz, M., Tanskanen, P., & Pollefeys, M. (2013). An open source and open hardware embedded metric optical flow CMOS camera for indoor and outdoor applications. In Proceedings of the IEEE international conference on robotics and automation (pp. 1736–1741). Karlsruhe.
Jones, E. S., & Soatto, S. (2011). Visual-inertial navigation, mapping and localization: A scalable real-time causal approach. The International Journal of Robotics Research, 30(4), 407–430.
Kelly, J., & Sukhatme, G. S. (2011). Visual-inertial sensor fusion: Localization, mapping and sensor-to-sensor self-calibration. The International Journal of Robotics Research, 30(1), 56–79.
Lee, T., Leok, M., & McClamroch, N. H. (2013). Nonlinear robust tracking control of a quadrotor UAV on SE(3). Asian Journal of Control, 15(2), 391–408.
Li, M., & Mourikis, A. I. (2012). Improving the accuracy of EKF-based visual-inertial odometry. In Proceedings of the IEEE international conference on robotics and automation (pp. 828–835). Saint Paul.
Li, M., & Mourikis, A. I. (2013). High-precision, consistent EKF-based visual-inertial odometry. The International Journal of Robotics Research, 32(6), 690–711.
Liu, H., Darabi, H., Banerjee, P., & Liu, J. (2007). Survey of wireless indoor positioning techniques and systems. IEEE Transactions on Systems, Man, and Cybernetics, 37(6), 1067–1080.
Loianno, G., Mulgaonkar, Y., Brunner, C., Ahuja, D., Ramanandan, A., Chari, M., et al. (2015a). Smartphones power flying robots. In Proceedings of the IEEE/RSJ international conference on intelligent robots and system (pp. 1256–1263). Hamburg.
Loianno, G., Thomas, J., & Kumar, V. (2015b). Cooperative localization and mapping of MAVs using RGB-D sensors. In Proceedings of the IEEE international conference on robotics and automation (pp. 4021–4028). Seattle.
Madyastha, V. K., Ravindra, V. C., Mallikarjunan, S., & Goyal, A. (2011). Extended Kalman filter vs. error state Kalman filter for aircraft attitude estimation. In Proceedings of the AIAA guidance, navigation, and control conference (pp. 6615–6638). Portland.
Martinelli, A. (2012). Vision and IMU data fusion: Closed-form solutions for attitude, speed, absolute scale, and bias determination. IEEE Transactions on Robotics, 28(1), 44–60.
Martinelli, A. (2013). Visual-inertial structure from motion: Observability and resolvability. In Proceedings of the IEEE/RSJ international conference on intelligent robots and system (pp. 4235–4242). Tokyo.
Meier, L., Tanskanen, P., Heng, L., Lee, G., Fraundorfer, F., & Pollefeys, M. (2012). PIXHAWK: A micro aerial vehicle design for autonomous flight using onboard computer vision. Autonomous Robots, 33(1–2), 21–39.
Mellinger, D., & Kumar, V. (2011). Minimum snap trajectory generation and control for quadrotors. In Proceedings of the IEEE international conference on robotics and automation (pp. 2520–2525). Shanghai.
Michael, N., Mellinger, D., Lindsey, Q., & Kumar, V. (2010). The grasp multiple micro-UAV test bed. IEEE Robotics & Automation Magazine, 17(3), 56–65.
Michael, N., Shen, S., Mohta, K., Kumar, V., Nagatani, K., Okada, Y., et al. (2012). Collaborative mapping of an earthquake-damaged building via ground and aerial robots. Journal of Field Robotics, 29(5), 832–841.
Nikolic, J., Rehder, J., Burri, M., Gohl, P., Leutenegger, S., Furgale, P. T., & Siegwart, R. (2014). A synchronized visual-inertial sensor system with FPGA pre-processing for accurate real-time SLAM. In Proceedings of the IEEE international conference on robotics and automation (pp. 431–437). Hong Kong.
Omari, S., & Ducard, G. (2013). Metric visual-inertial navigation system using single optical flow feature. In Proceedings of the European control conference (pp. 1310–1316). Zurich.
Ozaslan, T., Shen, S., Mulgaonkar, Y., Michael, N., & Kumar, V. (2013). Inspection of penstocks and featureless tunnel-like environments using micro UAVs. In Proceedings of the conference on field and service robotics (pp. 123–136). Brisbane.
Ravindra, V., Madyastha, V., & Goyal, A. (2012). The equivalence between two well-known variants of the Kalman filter. In Proceedings of the advances in control and optimization of dynamical systems conference. Bangalore.
Rossi, R., Santamaria-Navarro, A., Andrade-Cetto, J., & Rocco, P. (2017). Trajectory generation for unmanned aerial manipulators through quadratic programming. IEEE Robotics and Automation Letters, 2(2), 389–396.
Roumeliotis, S. I., Johnson, A. E., & Montgomery, J. F. (2002). Augmenting inertial navigation with image-based motion estimation. In Proceedings of the IEEE international conference on robotics and automation (Vol. 4, pp. 4326–4333). Washington.
Roussillon, C., Gonzalez, A., Solà, J., Codol, J. M., Mansard, N., Lacroix, S., & Devy, M. (2011). RT-SLAM: A generic and real-time visual SLAM implementation. In J. L. Crowley, B. A. Draper, & M. Thonnat (Eds.), Computer vision systems, lecture notes in computer science (Vol. 6962, pp. 31–40). Berlin, Heidelberg: Springer.
Ruffo, M., Di Castro, M., Molinari, L., Losito, R., Masi, A., Kovermann, J., & Rodrigues, L. (2014). New infrared time-of-flight measurement sensor for robotic platforms. In Proceedings of the international symposium and workshop on ADC modelling and testing (pp. 13–18).
Santamaria-Navarro, A., Grosch, P., Lippiello, V., Solà, J., & Andrade-Cetto, J. (2017). Uncalibrated visual servo for unmanned aerial manipulation. IEEE/ASME Transactions on Mechatronics, 22(4), 1610–1621.
Santamaria-Navarro, A., Lippiello, V., & Andrade-Cetto, J. (2014). Task priority control for aerial manipulation. In Proceedings of the IEEE international symposium on safety security and rescue robotics (pp. 1–6). Toyako-cho.
Santamaria-Navarro, A., Solà, J., & Andrade-Cetto, J. (2015). High-frequency MAV state estimation using low-cost inertial and optical flow measurement units. In Proceedings of the IEEE/RSJ international conference on intelligent robots and system (pp. 1864–1871). Hamburg.
Shen, S., Michael, N., & Kumar, V. (2012). Autonomous indoor 3D exploration with a micro-aerial vehicle. In Proceedings of the IEEE international conference on robotics and automation (pp. 9–15). Saint Paul.
Shen, S., Mulgaonkar, Y., Michael, N., & Kumar, V. (2013). Vision-based state estimation and trajectory control towards high-speed flight with a quadrotor. In Proceedings of robotics: Science and systems. Berlin.
Solà, J. (2015). Quaternion kinematics for the error-state KF. https://hal.archives-ouvertes.fr/hal-01122406v5, hal-01122406, v5 (in preparation).
Solà, J., Vidal-Calleja, T., Civera, J., & Montiel, J. M. M. (2011). Impact of landmark parametrization on monocular EKF-SLAM with points and lines. International Journal of Computer Vision, 97(3), 339–368.
Thomas, J., Loianno, G., Sreenath, K., & Kumar, V. (2014). Toward image based visual servoing for aerial grasping and perching. In Proceedings of the IEEE international conference on robotics and automation (pp. 2113–2118). Hong Kong.
Tomic, T., Schmid, K., Lutz, P., Domel, A., Kassecker, M., Mair, E., et al. (2012). Toward a fully autonomous UAV: Research platform for indoor and outdoor urban search and rescue. IEEE Robotics & Automation Magazine, 19(3), 46–56.
Trawny, N., & Roumeliotis, S. I. (2005). Indirect Kalman filter for 3D attitude estimation. University of Minnesota, Department of Computer Science & Engineering, Tech. Rep. 2, rev. 57.
Weiss, S., Achtelik, M., Lynen, S., Chli, M., & Siegwart, R. (2012). Real-time onboard visual-inertial state estimation and self-calibration of MAVs in unknown environments. In Proceedings of the IEEE international conference on robotics and automation (pp. 957–964). Saint Paul.
Weiss, S., Scaramuzza, D., & Siegwart, R. (2011). Monocular-SLAM-based navigation for autonomous micro helicopters in GPS-denied environments. Journal of Field Robotics, 28(6), 854–874.
Additional information
This work was funded by the project ROBINSTRUCT (TIN2014-58178-R) and the Ramón y Cajal postdoctoral fellowship (RYC-2012-11604) from the Spanish Ministry of Economy and Competitiveness, by the Spanish State Research Agency through the María de Maeztu Seal of Excellence to IRI (MDM-2016-0656), and by the EU H2020 project AEROARMS (H2020-ICT-2014-1-644271).
Electronic supplementary material
Supplementary material 1 (mp4 9320 KB)
Appendices
Appendix A: Quaternion conventions and properties
We use, as in Solà (2015), the Hamilton convention for quaternions. If we denote a quaternion \({}^G\mathbf{q}_L\) representing the orientation of a local frame L with respect to a global frame G, then a generic composition of two quaternions is defined as
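(the following is the standard Hamilton-convention form, stated for reference; the frame L' denoting the composed local frame is introduced here only for illustration)
\[
{}^G\mathbf{q}_{L'} \;=\; {}^G\mathbf{q}_L \otimes {}^L\mathbf{q}_{L'} \;=\; \mathbf{Q}^{+}\!\big({}^G\mathbf{q}_L\big)\,{}^L\mathbf{q}_{L'} \;=\; \mathbf{Q}^{-}\!\big({}^L\mathbf{q}_{L'}\big)\,{}^G\mathbf{q}_L\;,
\]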
where, for a quaternion \(\mathbf{q}=[w,x,y,z]^\top \), we can define \(\mathbf{Q}^{+}\) and \(\mathbf{Q}^{-}\) respectively as the left- and right-quaternion product matrices,
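which, in their standard Hamilton-convention form, read
\[
\mathbf{Q}^{+} =
\begin{bmatrix}
 w & -x & -y & -z \\
 x & w & -z & y \\
 y & z & w & -x \\
 z & -y & x & w
\end{bmatrix},
\qquad
\mathbf{Q}^{-} =
\begin{bmatrix}
 w & -x & -y & -z \\
 x & w & z & -y \\
 y & -z & w & x \\
 z & y & -x & w
\end{bmatrix}.
\]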
In the quaternion product, we notice how the right-hand quaternion is defined locally in the frame L, which is specified by the left-hand quaternion. Vector transformation from a local frame L to the global G is performed by the double product
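In its standard form, with \(\mathbf{q}^{*}\) the quaternion conjugate, this double product reads
\[
{}^G\mathbf{v} \;=\; \mathbf{q}\otimes {}^L\mathbf{v}\otimes \mathbf{q}^{*}\;,
\]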
where we use the shortcut \(\mathbf{q}\otimes \mathbf{v} \equiv \mathbf{q}\otimes [0,\mathbf{v}]^\top \) for convenience of notation.
Throughout the paper, we denote by \(\mathbf{q}\{x\}\) and \(\mathbf{R}\{x\}\) the quaternion and rotation matrix equivalents of a generic orientation x. A rotation \(\varvec{\theta }= \theta \mathbf u\), of \(\theta \) radians around the unit axis \(\mathbf u\), can be expressed in quaternion and matrix forms using the exponential maps
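whose standard forms (the matrix expression being the Rodrigues rotation formula) are
\[
\mathbf{q}\{{\varvec{\theta }}\} = e^{{\varvec{\theta }}/2} =
\begin{bmatrix} \cos (\theta /2) \\ \mathbf{u}\,\sin (\theta /2) \end{bmatrix},
\qquad
\mathbf{R}\{{\varvec{\theta }}\} = e^{[{\varvec{\theta }}]_\times } = \mathbf{I} + \sin \theta \,[\mathbf{u}]_\times + (1-\cos \theta )\,[\mathbf{u}]_\times ^{2}\;.
\]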
We also write \(\mathbf{R}\,=\,\mathbf{R}\{\mathbf{q}\}\), according to
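For \(\mathbf{q}=[w,x,y,z]^\top \), the standard Hamilton-convention expression is
\[
\mathbf{R}\{\mathbf{q}\} =
\begin{bmatrix}
 w^2+x^2-y^2-z^2 & 2(xy-wz) & 2(xz+wy) \\
 2(xy+wz) & w^2-x^2+y^2-z^2 & 2(yz-wx) \\
 2(xz-wy) & 2(yz+wx) & w^2-x^2-y^2+z^2
\end{bmatrix}.
\]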
Finally, the time-derivative of the quaternion is
with \({\varvec{\omega }}\) the angular velocity in body frame, and \({\varvec{\Omega }}\) the skew-symmetric matrix defined as
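Both expressions take standard forms under the Hamilton convention; for reference,
\[
\dot{\mathbf{q}} = \tfrac{1}{2}\,\mathbf{q}\otimes {\varvec{\omega }} = \tfrac{1}{2}\,{\varvec{\Omega }}({\varvec{\omega }})\,\mathbf{q},
\qquad
{\varvec{\Omega }}({\varvec{\omega }}) =
\begin{bmatrix}
 0 & -{\varvec{\omega }}^\top \\
 {\varvec{\omega }} & -[{\varvec{\omega }}]_\times
\end{bmatrix}.
\]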
Appendix B: Filter transition matrices
We detail the construction of the filter transition matrix for the three integrations involved: the ESKF nominal (22), ESKF error (23), and EKF true (25) kinematics. For each case, we need to define the matrix \(\mathbf{A}\) as the Jacobian of the respective continuous-time system, and build the transition matrix \(\mathbf{F}_N\) as the truncated Taylor series (20), i.e.,
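Written out with integration step \(\Delta t\) (the step symbol is assumed here for reference; the paper's notation may differ), this truncated series is
\[
\mathbf{F}_N = \sum _{n=0}^{N}\frac{\Delta t^{\,n}}{n!}\,\mathbf{A}^{n}
= \mathbf{I} + \mathbf{A}\,\Delta t + \tfrac{1}{2}\mathbf{A}^2\Delta t^2 + \cdots + \tfrac{1}{N!}\mathbf{A}^N\Delta t^N\;.
\]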
In the following paragraphs, we detail the matrices \(\mathbf{A}\) for each case, together with some of their first powers up to \(n=3\). The reader should have no difficulty building the remaining powers of \(\mathbf{A}\), and the transition matrices \(\mathbf{F}_N\), using the Taylor series above.
The Jacobian \(\mathbf{A}=\partial f(\mathbf{x},\delta \mathbf{x},\cdot )/\partial \delta \mathbf{x}\) of the ESKF’s continuous-time error-state system f() (18) using GE is,
with \(\mathbf{V} = -[\mathbf{R}(\mathbf{a}_S-\mathbf{a}_b)]_\times \). Its powers are,
and \(\mathbf{A}^{n} = \mathbf{0}\) for \(n>3\). For LE we have
with \(\mathbf{V} = -\mathbf{R}[\mathbf{a}_S-\mathbf{a}_b]_\times \), and \({\varvec{\Theta }} = -[{\varvec{\omega }}_S-{\varvec{\omega }}_b]_\times \).
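As a reference sketch only, assuming the error state is ordered \(\delta \mathbf{x}=[\delta \mathbf{p},\,\delta \mathbf{v},\,\delta {\varvec{\theta }},\,\delta \mathbf{a}_b,\,\delta {\varvec{\omega }}_b]^\top \) and that the IMU biases follow random walks (the ordering is an assumption made here, not taken from the paper), the standard ESKF error dynamics give block Jacobians of the form
\[
\mathbf{A}_{\mathrm{GE}} =
\begin{bmatrix}
\mathbf{0} & \mathbf{I}_3 & \mathbf{0} & \mathbf{0} & \mathbf{0}\\
\mathbf{0} & \mathbf{0} & \mathbf{V} & -\mathbf{R} & \mathbf{0}\\
\mathbf{0} & \mathbf{0} & \mathbf{0} & \mathbf{0} & -\mathbf{R}\\
\mathbf{0} & \mathbf{0} & \mathbf{0} & \mathbf{0} & \mathbf{0}\\
\mathbf{0} & \mathbf{0} & \mathbf{0} & \mathbf{0} & \mathbf{0}
\end{bmatrix},
\qquad
\mathbf{A}_{\mathrm{LE}} =
\begin{bmatrix}
\mathbf{0} & \mathbf{I}_3 & \mathbf{0} & \mathbf{0} & \mathbf{0}\\
\mathbf{0} & \mathbf{0} & \mathbf{V} & -\mathbf{R} & \mathbf{0}\\
\mathbf{0} & \mathbf{0} & {\varvec{\Theta }} & \mathbf{0} & -\mathbf{I}_3\\
\mathbf{0} & \mathbf{0} & \mathbf{0} & \mathbf{0} & \mathbf{0}\\
\mathbf{0} & \mathbf{0} & \mathbf{0} & \mathbf{0} & \mathbf{0}
\end{bmatrix},
\]
with the respective \(\mathbf{V}\) (and \({\varvec{\Theta }}\)) as given above. Note that the GE matrix is nilpotent, consistent with \(\mathbf{A}^{n}=\mathbf{0}\) for \(n>3\).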
The Jacobians \(\mathbf{A}=\partial f(\mathbf{x},\cdot )/\partial \mathbf{x}\) of the continuous-time EKF true—(16) and ESKF nominal—(17) systems are equal to each other, having
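As a sketch, assuming the state is ordered \(\mathbf{x}=[\mathbf{p},\,\mathbf{v},\,\mathbf{q},\,\mathbf{a}_b,\,{\varvec{\omega }}_b]^\top \) (an ordering assumed here for illustration), this Jacobian has the block structure
\[
\mathbf{A} =
\begin{bmatrix}
\mathbf{0} & \mathbf{I}_3 & \mathbf{0} & \mathbf{0} & \mathbf{0}\\
\mathbf{0} & \mathbf{0} & \mathbf{V} & -\mathbf{R} & \mathbf{0}\\
\mathbf{0} & \mathbf{0} & \mathbf{W} & \mathbf{0} & \mathbf{Q}\\
\mathbf{0} & \mathbf{0} & \mathbf{0} & \mathbf{0} & \mathbf{0}\\
\mathbf{0} & \mathbf{0} & \mathbf{0} & \mathbf{0} & \mathbf{0}
\end{bmatrix},
\]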
where \(\mathbf{V}\), \(\mathbf{W}\) and \(\mathbf{Q}\) are defined by
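(assumed here, for illustration, to be the partial derivatives of the velocity and quaternion kinematics)
\[
\mathbf{V} = \frac{\partial \dot{\mathbf{v}}}{\partial \mathbf{q}},
\qquad
\mathbf{W} = \frac{\partial \dot{\mathbf{q}}}{\partial \mathbf{q}},
\qquad
\mathbf{Q} = \frac{\partial \dot{\mathbf{q}}}{\partial {\varvec{\omega }}_b},
\]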
and are developed hereafter. For the first Jacobian \(\mathbf{V}\) it is convenient to recall the derivative of a rotation of a vector \(\mathbf a\) by a quaternion \(\mathbf{q}=[w,x,y,z]^\top =[w,\mathbf{v}]^\top \) with respect to the quaternion,
having therefore
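In standard form, and assuming \(\mathbf{V}\) is evaluated at the bias-compensated specific force \(\mathbf{a}=\mathbf{a}_S-\mathbf{a}_b\) (an assumption consistent with the system kinematics), these read
\[
\frac{\partial (\mathbf{R}\{\mathbf{q}\}\,\mathbf{a})}{\partial \mathbf{q}}
= 2\begin{bmatrix}
 w\,\mathbf{a}+\mathbf{v}\times \mathbf{a} &
 \mathbf{v}^\top \mathbf{a}\,\mathbf{I}_3+\mathbf{v}\mathbf{a}^\top -\mathbf{a}\mathbf{v}^\top -w\,[\mathbf{a}]_\times
\end{bmatrix},
\qquad
\mathbf{V} = \frac{\partial (\mathbf{R}\{\mathbf{q}\}\,\mathbf{a})}{\partial \mathbf{q}}\bigg |_{\mathbf{a}=\mathbf{a}_S-\mathbf{a}_b}.
\]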
For the Jacobian \(\mathbf{W}\) we have from (52)
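presumably (assuming the angular rate appearing in (52) is the bias-compensated gyroscope measurement)
\[
\mathbf{W} = \frac{\partial \dot{\mathbf{q}}}{\partial \mathbf{q}} = \tfrac{1}{2}\,{\varvec{\Omega }}({\varvec{\omega }}_S-{\varvec{\omega }}_b),
\]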
with \(\varvec{\Omega }({\varvec{\omega }})\) the skew-symmetric matrix defined in (53).
Finally, for the Jacobian \(\mathbf{Q}\) we use (46), (47) and (49) to obtain
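A consistent expression, under the assumption that \(\mathbf{Q}=\partial \dot{\mathbf{q}}/\partial {\varvec{\omega }}_b\) and writing \(\mathbf{q}=[w,\mathbf{v}]^\top \), would be
\[
\mathbf{Q} = \frac{\partial \dot{\mathbf{q}}}{\partial {\varvec{\omega }}_b}
= -\tfrac{1}{2}
\begin{bmatrix}
 -\mathbf{v}^\top \\
 w\,\mathbf{I}_3+[\mathbf{v}]_\times
\end{bmatrix}.
\]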
Cite this article
Santamaria-Navarro, A., Loianno, G., Solà, J. et al. Autonomous navigation of micro aerial vehicles using high-rate and low-cost sensors. Auton Robot 42, 1263–1280 (2018). https://doi.org/10.1007/s10514-017-9690-5