Event-Based Visual/Inertial Odometry for UAV Indoor Navigation
Figure 1. Workflow of the proposed event-based VIO.
Figure 2. DAVIS346 event camera.
Figure 3. Study area camera calibration: (a) a 6 × 9 chessboard with a square size of 30 mm; and (b) an example of the detected pattern image.
Figure 4. An office environment simulation layout.
Figure 5. Simulated dataset: comparison of trajectories.
Figure 6. Ground-based dataset: an example of feature detection and tracking on an event-accumulated frame.
Figure 7. Ground-based dataset: an example of feature detection and tracking on a standard frame.
Figure 8. Ground-based dataset: an example of feature detection and tracking on combined event and standard frames.
Figure 9. Ground-based dataset: comparison of trajectories.
Figure 10. UAV used for the experiments: (1) DAVIS346 event camera; (2) NVIDIA Jetson Xavier computer; (3) Pixhawk 4 flight controller.
Figure 11. UAV-based dataset: an example of feature detection and tracking on an event-accumulated frame.
Figure 12. UAV-based dataset: an example of feature detection and tracking on a standard frame.
Figure 13. UAV-based dataset: an example of feature detection and tracking on combined event and standard frames.
Figure 14. UAV-based dataset: comparison of trajectories.
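The calibration in Figure 3 is a standard chessboard procedure; the paper itself uses iniVation's DV calibration tooling (see the DV Software reference below). As an illustration only, the OpenCV sketch below reproduces an equivalent monocular calibration. The interpretation of "6 × 9" as inner corners, the image folder, and all names here are assumptions, not the authors' code.

```python
# Hedged sketch of a chessboard calibration like Figure 3, using OpenCV.
# Assumes "6 x 9" counts inner corners and a 30 mm square size.
import glob
import numpy as np
import cv2

PATTERN = (9, 6)   # inner corners per row/column (assumption)
SQUARE = 0.030     # square size in metres (30 mm, from the caption)

# Corner coordinates in the board frame (Z = 0), scaled by square size
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE

obj_pts, img_pts, size = [], [], None
for path in sorted(glob.glob("calib/*.png")):   # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        continue
    size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, PATTERN, None)
    if not found:
        continue
    # Refine corner locations to sub-pixel accuracy
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_pts.append(objp)
    img_pts.append(corners)

rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
print(f"reprojection RMS: {rms:.3f} px")
print("camera matrix K:\n", K)
```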
Abstract
1. Introduction
2. Event-Based VIO Approach
3. Experimental Section
3.1. Simulated Dataset
3.2. Ground-Based Dataset
3.3. UAV-Based Dataset
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Mazhar, F.; Khan, M.G.; Sällberg, B. Precise Indoor Positioning Using UWB: A Review of Methods, Algorithms and Implementations. Wirel. Pers. Commun. 2017, 97, 4467–4491. [Google Scholar] [CrossRef]
- Tiemann, J.; Ramsey, A.; Wietfeld, C. Enhanced UAV Indoor Navigation through SLAM-Augmented UWB Localization. In Proceedings of the 2018 IEEE International Conference on Communications Workshops (ICC Workshops), Kansas City, MO, USA, 20–24 May 2018; pp. 1–6. [Google Scholar]
- Tiemann, J.; Wietfeld, C. Scalable and precise multi-UAV indoor navigation using TDOA-based UWB localization. In Proceedings of the 2017 International Conference on Indoor Positioning and Indoor Navigation (IPIN), Sapporo, Japan, 18–21 September 2017; pp. 1–7. [Google Scholar]
- Raja, G.; Suresh, S.; Anbalagan, S.; Ganapathisubramaniyan, A.; Kumar, N. PFIN: An Efficient Particle Filter-Based Indoor Navigation Framework for UAVs. IEEE Trans. Veh. Technol. 2021, 70, 4984–4992. [Google Scholar] [CrossRef]
- Bachrach, A.; He, R.; Roy, N. Autonomous Flight in Unknown Indoor Environments. Int. J. Micro Air Veh. 2009, 1, 217–228. [Google Scholar] [CrossRef]
- Zhang, J.; Singh, S. LOAM: Lidar odometry and mapping in real-time. In Proceedings of the Robotics: Science and Systems, Berkeley, CA, USA, 12–16 July 2014; pp. 1–9. [Google Scholar]
- Yi, S.; Lyu, Y.; Hua, L.; Pan, Q.; Zhao, C. Light-LOAM: A Lightweight LiDAR Odometry and Mapping Based on Graph-Matching. IEEE Robot. Autom. Lett. 2024, 9, 3219–3226. [Google Scholar] [CrossRef]
- Bry, A.; Bachrach, A.; Roy, N. State estimation for aggressive flight in GPS-denied environments using onboard sensing. In Proceedings of the 2012 IEEE International Conference on Robotics and Automation, Saint Paul, MN, USA, 14–18 May 2012; pp. 1–8. [Google Scholar]
- Kumar, G.A.; Patil, A.K.; Patil, R.; Park, S.S.; Chai, Y.H. A LiDAR and IMU Integrated Indoor Navigation System for UAVs and Its Application in Real-Time Pipeline Classification. Sensors 2017, 17, 1268. [Google Scholar] [CrossRef]
- Hui, C.; Yousheng, C.; Shing, W.W. Trajectory tracking and formation flight of autonomous UAVs in GPS-denied environments using onboard sensing. In Proceedings of the 2014 IEEE Chinese Guidance, Navigation and Control Conference, Yantai, China, 8–10 August 2014; pp. 2639–2645. [Google Scholar]
- Qu, Y.; Yang, M.; Zhang, J.; Xie, W.; Qiang, B.; Chen, J. An Outline of Multi-Sensor Fusion Methods for Mobile Agents Indoor Navigation. Sensors 2021, 21, 1605. [Google Scholar] [CrossRef] [PubMed]
- Engel, J.; Schöps, T.; Cremers, D. LSD-SLAM: Large-Scale Direct Monocular SLAM. In Computer Vision—ECCV 2014; Springer International Publishing: Cham, Switzerland, 2014; pp. 834–849. [Google Scholar]
- Mur-Artal, R.; Montiel, J.M.M.; Tardos, J.D. ORB-SLAM: A Versatile and Accurate Monocular SLAM System. IEEE Trans. Robot. 2015, 31, 1147–1163. [Google Scholar] [CrossRef]
- Klein, G.; Murray, D. Parallel Tracking and Mapping for Small AR Workspaces. In Proceedings of the 2007 6th IEEE and ACM International Symposium on Mixed and Augmented Reality, Nara, Japan, 13–16 November 2007; pp. 225–234. [Google Scholar]
- Campos, C.; Elvira, R.; Rodríguez, J.J.G.; Montiel, J.M.M.; Tardós, J.D. ORB-SLAM3: An Accurate Open-Source Library for Visual, Visual-Inertial and Multi-Map SLAM. IEEE Trans. Robot. 2021, 37, 1874–1890. [Google Scholar] [CrossRef]
- Forster, C.; Zhang, Z.; Gassner, M.; Werlberger, M.; Scaramuzza, D. SVO: Semidirect Visual Odometry for Monocular and Multicamera Systems. IEEE Trans. Robot. 2017, 33, 249–265. [Google Scholar] [CrossRef]
- Taketomi, T.; Uchiyama, H.; Ikeda, S. Visual SLAM algorithms: A survey from 2010 to 2016. IPSJ Trans. Comput. Vis. Appl. 2017, 9, 16. [Google Scholar] [CrossRef]
- Endres, F.; Hess, J.; Sturm, J.; Cremers, D.; Burgard, W. 3-D Mapping With an RGB-D Camera. IEEE Trans. Robot. 2014, 30, 177–187. [Google Scholar] [CrossRef]
- Labbé, M.; Michaud, F. RTAB-Map as an open-source lidar and visual simultaneous localization and mapping library for large-scale and long-term online operation. J. Field Robot. 2019, 36, 416–446. [Google Scholar] [CrossRef]
- Kerl, C.; Sturm, J.; Cremers, D. Dense visual SLAM for RGB-D cameras. In Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013; pp. 2100–2106. [Google Scholar]
- Gallego, G.; Delbruck, T.; Orchard, G.; Bartolozzi, C.; Taba, B.; Censi, A.; Leutenegger, S.; Davison, A.J.; Conradt, J.; Daniilidis, K.; et al. Event-Based Vision: A Survey. IEEE Trans. Pattern Anal. Mach. Intell. 2022, 44, 154–180. [Google Scholar] [CrossRef]
- Kim, H.; Leutenegger, S.; Davison, A. Real-time 3D reconstruction and 6-DoF tracking with an event camera. In Proceedings of the 14th European Conference on Computer Vision, Amsterdam, The Netherlands, 11–14 October 2016; pp. 349–364. [Google Scholar] [CrossRef]
- Rebecq, H.; Horstschaefer, T.; Gallego, G.; Scaramuzza, D. EVO: A Geometric Approach to Event-Based 6-DOF Parallel Tracking and Mapping in Real Time. IEEE Robot. Autom. Lett. 2017, 2, 593–600. [Google Scholar] [CrossRef]
- Delaune, J.; Bayard, D.S.; Brockers, R. Range-Visual-Inertial Odometry: Scale Observability Without Excitation. IEEE Robot. Autom. Lett. 2021, 6, 2421–2428. [Google Scholar] [CrossRef]
- Lee, M.S.; Jung, J.H.; Kim, Y.J.; Park, C.G. Event- and Frame-Based Visual-Inertial Odometry With Adaptive Filtering Based on 8-DOF Warping Uncertainty. IEEE Robot. Autom. Lett. 2024, 9, 1003–1010. [Google Scholar] [CrossRef]
- Mueggler, E.; Gallego, G.; Rebecq, H.; Scaramuzza, D. Continuous-Time Visual-Inertial Odometry for Event Cameras. IEEE Trans. Robot. 2018, 34, 1425–1440. [Google Scholar] [CrossRef]
- Zhu, A.Z.; Atanasov, N.; Daniilidis, K. Event-Based Visual Inertial Odometry. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; pp. 5816–5824. [Google Scholar]
- Rebecq, H.; Horstschaefer, T.; Scaramuzza, D. Real-time Visual-Inertial Odometry for Event Cameras using Keyframe-based Nonlinear Optimization. In Proceedings of the British Machine Vision Conference, London, UK, 4–7 September 2017. [Google Scholar]
- Zhou, Y.; Gallego, G.; Shen, S. Event-Based Stereo Visual Odometry. IEEE Trans. Robot. 2021, 37, 1433–1450. [Google Scholar] [CrossRef]
- Liu, Z.; Shi, D.; Li, R.; Yang, S. ESVIO: Event-Based Stereo Visual-Inertial Odometry. Sensors 2023, 23, 1998. [Google Scholar] [CrossRef] [PubMed]
- Rosten, E.; Drummond, T. Machine Learning for High-Speed Corner Detection. In Computer Vision—ECCV 2006; Springer: Berlin/Heidelberg, Germany, 2006; Volume 3951, pp. 430–443. [Google Scholar]
- Harris, C.; Stephens, M. A combined corner and edge detector. In Proceedings of the Alvey Vision Conference, Manchester, UK, 31 August–2 September 1988. [Google Scholar]
- Vasco, V.; Glover, A.; Bartolozzi, C. Fast event-based Harris corner detection exploiting the advantages of event-driven cameras. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Republic of Korea, 9–14 October 2016. [Google Scholar]
- Mueggler, E.; Bartolozzi, C.; Scaramuzza, D. Fast Event-based Corner Detection. In Proceedings of the British Machine Vision Conference, London, UK, 4–7 September 2017. [Google Scholar]
- Chaudhry, R.; Ravichandran, A.; Hager, G.; Vidal, R. Histograms of oriented optical flow and Binet-Cauchy kernels on nonlinear dynamical systems for the recognition of human actions. In Proceedings of the 2009 IEEE Conference on Computer Vision and Pattern Recognition, Miami, FL, USA, 20–25 June 2009. [Google Scholar]
- Clady, X.; Maro, J.-M.; Barré, S.; Benosman, R.B. A Motion-Based Feature for Event-Based Pattern Recognition. Front. Neurosci. 2016, 10, 594. [Google Scholar] [CrossRef]
- Lagorce, X.; Meyer, C.; Ieng, S.-H.; Filliat, D.; Benosman, R. Asynchronous Event-Based Multikernel Algorithm for High-Speed Visual Features Tracking. IEEE Trans. Neural Netw. Learn. Syst. 2015, 26, 1710–1720. [Google Scholar] [CrossRef]
- Zhu, A.Z.; Atanasov, N.; Daniilidis, K. Event-based feature tracking with probabilistic data association. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017. [Google Scholar]
- Besl, P.J.; McKay, N.D. A method for registration of 3-D shapes. IEEE Trans. Pattern Anal. Mach. Intell. 1992, 14, 239–256. [Google Scholar] [CrossRef]
- Gehrig, D.; Rebecq, H.; Gallego, G.; Scaramuzza, D. EKLT: Asynchronous Photometric Feature Tracking Using Events and Frames. Int. J. Comput. Vis. 2020, 128, 601–618. [Google Scholar] [CrossRef]
- Brandli, C.; Berner, R.; Yang, M.; Liu, S.-C.; Delbruck, T. A 240 × 180 130 dB 3 µs Latency Global Shutter Spatiotemporal Vision Sensor. IEEE J. Solid-State Circuits 2014, 49, 2333–2341. [Google Scholar] [CrossRef]
- Kueng, B.; Mueggler, E.; Gallego, G.; Scaramuzza, D. Low-latency visual odometry using event-based feature tracks. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Republic of Korea, 9–14 October 2016. [Google Scholar]
- Tedaldi, D.; Gallego, G.; Mueggler, E.; Scaramuzza, D. Feature detection and tracking with the dynamic and active-pixel vision sensor (DAVIS). In Proceedings of the 2016 Second International Conference on Event-based Control, Communication, and Signal Processing (EBCCSP), Krakow, Poland, 13–15 June 2016. [Google Scholar]
- Vidal, A.R.; Rebecq, H.; Horstschaefer, T.; Scaramuzza, D. Ultimate SLAM? Combining Events, Images, and IMU for Robust Visual SLAM in HDR and High-Speed Scenarios. IEEE Robot. Autom. Lett. 2018, 3, 994–1001. [Google Scholar] [CrossRef]
- iniVation. Dv-Mono-Vio-Sample. 2022. Available online: https://gitlab.com/inivation/dv/dv-mono-vio-sample (accessed on 21 March 2022).
- Lucas, B.D.; Kanade, T. An iterative image registration technique with an application to stereo vision. In Proceedings of the International Joint Conference on Artificial Intelligence, Vancouver, BC, Canada, 24–28 August 1981. [Google Scholar]
- Troiani, C.; Martinelli, A.; Laugier, C.; Scaramuzza, D. 2-Point-based Outlier Rejection for Camera-IMU Systems with Applications to Micro Aerial Vehicles. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–7 June 2014. [Google Scholar] [CrossRef]
- Leutenegger, S.; Furgale, P.; Rabaud, V.; Chli, M.; Konolige, K.; Siegwart, R. Keyframe-based visual-inertial SLAM using nonlinear optimization. In Proceedings of Robotics: Science and Systems (RSS), Berlin, Germany, 24–28 June 2013. [Google Scholar] [CrossRef]
- Strasdat, H.; Montiel, J.M.M.; Davison, A.J. Real-time monocular SLAM: Why filter? In Proceedings of the 2010 IEEE International Conference on Robotics and Automation, Anchorage, AK, USA, 3–7 May 2010; pp. 2657–2664. [Google Scholar]
- Agarwal, S.; Mierle, K. Ceres Solver. 2022. Available online: http://ceres-solver.org (accessed on 13 May 2022).
- Forster, C.; Carlone, L.; Dellaert, F.; Scaramuzza, D. On-Manifold Preintegration for Real-Time Visual-Inertial Odometry. IEEE Trans. Robot. 2017, 33, 1–21. [Google Scholar] [CrossRef]
- iniVation. DAVIS 346. 2022. Available online: https://inivation.com/wp-content/uploads/2019/08/DAVIS346.pdf (accessed on 7 April 2022).
- iniVation. DV Software. 2022. Available online: https://inivation.gitlab.io/dv/dv-docs/docs/tutorial-calibration/ (accessed on 17 March 2022).
- Meier, L.; Tanskanen, P.; Heng, L.; Lee, G.H.; Fraundorfer, F.; Pollefeys, M. PIXHAWK: A micro aerial vehicle design for autonomous flight using onboard computer vision. Auton. Robot. 2012, 33, 21–39. [Google Scholar] [CrossRef]
Table 1. Simulated dataset: per-axis position error statistics for U-SLAM and the proposed approach.

|  | U-SLAM |  |  | Proposed Approach |  |  |
| --- | --- | --- | --- | --- | --- | --- |
|  | Mean | RMSE | Max | Mean | RMSE | Max |
| X | 0.18 | 0.25 | 0.33 | 0.09 | 0.18 | 0.28 |
| Y | 0.19 | 0.24 | 0.42 | 0.10 | 0.19 | 0.38 |
| Z | 0.35 | 0.40 | 0.64 | 0.10 | 0.26 | 0.27 |
| Total |  | 0.31 |  |  | 0.21 |  |
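The Total row is not defined explicitly on this page; numerically it is consistent with the root mean square of the three per-axis RMSE values:

$$\sqrt{\tfrac{1}{3}\left(0.18^2 + 0.19^2 + 0.26^2\right)} \approx 0.21,\qquad \sqrt{\tfrac{1}{3}\left(0.25^2 + 0.24^2 + 0.40^2\right)} \approx 0.31.$$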
Table 2. Ground-based dataset: per-axis position error statistics for the event-only, frame-only, and combined cases.

|  | Event-Only Case |  |  | Frame-Only Case |  |  | Combined Case |  |  |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
|  | Mean | RMSE | Max | Mean | RMSE | Max | Mean | RMSE | Max |
| X | 1.40 | 1.60 | 2.52 | 0.77 | 1.33 | 2.44 | −0.17 | 0.35 | 0.92 |
| Y | −0.71 | 1.70 | 4.12 | −0.10 | 0.59 | 1.62 | −0.05 | 0.13 | 0.27 |
| Total |  | 2.33 |  |  | 1.45 |  |  | 0.37 |  |
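Here the Total row matches the Euclidean combination of the two per-axis RMSE values, e.g. for the combined and event-only cases:

$$\sqrt{0.35^2 + 0.13^2} \approx 0.37,\qquad \sqrt{1.60^2 + 1.70^2} \approx 2.33.$$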
Table 3. Ground-based dataset: final position errors for the three cases.

|  | Event-Only Case | Frame-Only Case | Combined Case |
| --- | --- | --- | --- |
| X | 1.58 | 1.95 | 0.12 |
| Y | −4.13 | −1.62 | −0.06 |
| Total | 4.42 | 2.54 | 0.13 |
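The Total row here is consistent with the Euclidean norm of the per-axis final errors; for the combined case:

$$\sqrt{0.12^2 + (-0.06)^2} \approx 0.13.$$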
Table 4. UAV-based dataset: per-axis position error statistics for the event-only, frame-only, and combined cases.

|  | Event-Only Case |  |  | Frame-Only Case |  |  | Combined Case |  |  |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
|  | Mean | RMSE | Max | Mean | RMSE | Max | Mean | RMSE | Max |
| X | 0.14 | 0.46 | 0.65 | 0.14 | 0.31 | 0.59 | −0.01 | 0.04 | 0.14 |
| Y | −0.10 | 0.30 | 0.33 | 0.01 | 0.11 | 0.24 | 0.04 | 0.05 | 0.09 |
| Total |  | 0.55 |  |  | 0.33 |  |  | 0.06 |  |
Table 5. UAV-based dataset: final position errors for the three cases.

|  | Event-Only Case | Frame-Only Case | Combined Case |
| --- | --- | --- | --- |
| X | 0.30 | 0.19 | −0.04 |
| Y | −0.48 | −0.36 | 0.01 |
| Total | 0.57 | 0.41 | 0.04 |
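As a reading aid for Tables 2–5, the sketch below shows one way such per-axis statistics could be computed from time-associated trajectory errors. The function and variable names are hypothetical; this is not the authors' evaluation code.

```python
# Per-axis error statistics of an estimated trajectory against ground
# truth, matching the layout of the tables above. Time association and
# frame alignment are assumed to be done already.
import numpy as np

def error_stats(est_xy: np.ndarray, gt_xy: np.ndarray) -> dict:
    """est_xy, gt_xy: (N, 2) arrays of associated X/Y positions."""
    err = est_xy - gt_xy                      # signed per-axis errors
    mean = err.mean(axis=0)                   # per-axis mean error
    rmse = np.sqrt((err ** 2).mean(axis=0))   # per-axis RMSE
    amax = np.abs(err).max(axis=0)            # per-axis max |error|
    return {
        "mean": mean,
        "rmse": rmse,
        "max": amax,
        "total_rmse": float(np.linalg.norm(rmse)),      # sqrt(RMSE_x^2 + RMSE_y^2)
        "final_error": err[-1],                         # end-of-trajectory drift
        "total_final": float(np.linalg.norm(err[-1])),  # norm of final drift
    }
```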
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite

Elamin, A.; El-Rabbany, A.; Jacob, S. Event-Based Visual/Inertial Odometry for UAV Indoor Navigation. Sensors 2025, 25, 61. https://doi.org/10.3390/s25010061