Sensors and Measurements for Unmanned Systems: An Overview
Abstract
1. Introduction
2. Unmanned Systems
2.1. Unmanned Aerial Vehicles (UAVs)
2.2. Unmanned Ground Vehicles (UGVs)
2.3. Unmanned Surface Vehicles (USVs)
2.4. Unmanned Underwater Vehicles (UUVs)
3. Measurements for Unmanned Systems
3.1. Measurements for UAVs
3.2. Measurements for UGVs
3.3. Measurements for USVs
3.4. Measurements for UUVs
3.5. Measurements for Unmanned System Cooperation
4. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- National Institute of Standards and Technology Special Publication 1011. Autonomy Levels for Unmanned Systems (ALFUS) Framework, Volume I: Terminology, Version 1.1. September 2004. Available online: https://www.nist.gov/system/files/documents/el/isd/ks/NISTSP_1011_ver_1-1.pdf (accessed on 1 December 2020).
- Electronics Maker. Unmanned Vehicles—No Pilot/Driver on Board. 5 August 2014. Available online: https://electronicsmaker.com/unmanned-vehicles-no-pilotdriver-on-board (accessed on 1 December 2020).
- Daponte, P.; De Vito, L.; Mazzilli, G.; Picariello, F.; Rapuano, S.; Riccio, M. Metrology for drone and drone for metrology: Measurement systems on small civilian drones. In Proceedings of the 2015 IEEE Metrology for Aerospace (MetroAeroSpace), Benevento, Italy, 4–5 June 2015; pp. 306–311. [Google Scholar]
- Narayanan, A.; Rajeshirke, P.; Sharma, A.; Pestonjamasp, K. Survey of the emerging bio-inspired Unmanned Aerial Underwater Vehicles. In Proceedings of the 2nd International Conference on Emerging Trends in Manufacturing, Engines and Modelling (ICEMEM-2019), Mumbai, India, 23–24 December 2019; Volume 810. [Google Scholar]
- Sándor, Z. Challenges Caused by the Unmanned Aerial Vehicle in the Air Traffic Management. Period. Polytech. Transp. Eng. 2017, 47, 96–105. [Google Scholar] [CrossRef] [Green Version]
- Shakhatreh, H.; Sawalmeh, A.H.; Al-Fuqaha, A.; Dou, Z.; Almaita, E.; Khalil, I.; Othman, N.S.; Khreishah, A.; Guizani, M. Unmanned Aerial Vehicles (UAVs): A Survey on Civil Applications and Key Research Challenges. IEEE Access 2019, 7, 48572–48634. [Google Scholar] [CrossRef]
- Dilipraj, E. Technologies: Pragmatic Role of Robotics, 3D Printing and Supercomputers in Future Wars. In Asian Defence Review 2016; Knowledge World Publishers: New Delhi, India, 2016; ISBN 978-9383-649-90-7. [Google Scholar]
- Odedra, S. Using unmanned ground vehicle performance measurements as a unique method of terrain classification. In Proceedings of the 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems, San Francisco, CA, USA, 25–30 September 2011; pp. 286–291. [Google Scholar]
- Philips, A. Air- and Ground- Based Autonomous Vehicles Working Together, Dronelife.com. October 2014. Available online: https://dronelife.com/2014/10/02/air-ground-based-autonomous-vehicles-working-together/ (accessed on 1 December 2020).
- Ilas, C. Electronic sensing technologies for autonomous ground vehicles: A review. In Proceedings of the 8th International Symposium on Advanced Topics in Electrical Engineering (ATEE), Bucharest, Romania, 23–25 May 2013; pp. 1–6. [Google Scholar]
- Autonomous and Unmanned Surface Vessels for Marine Monitoring, Surveying and Rescue. Available online: https://www.unmannedsystemstechnology.com/company/oceanalpha/ (accessed on 1 December 2020).
- Vasudev, K.L. Review of Autonomous Underwater Vehicles. In Autonomous Vehicles; Dekoulis, G., Ed.; IntechOpen: London, UK, 2020; ISBN 978-1-83968-191-2. [Google Scholar]
- Heo, J.; Kim, J.; Kwon, Y. Technology Development of Unmanned Underwater Vehicles (UUVs). J. Comput. Commun. 2017, 5, 28–35. [Google Scholar] [CrossRef] [Green Version]
- Ho, G.; Pavlovic, N.; Arrabito, R. Human Factors Issues with Operating Unmanned Underwater Vehicles. Proc. Hum. Factors Ergon. Soc. Ann. Meet. 2011, 55, 429–433. [Google Scholar] [CrossRef]
- Andžāns, M.; Bērziņš, J.; Durst, J.; Maskaliunaite, A.; Nikitenko, A.; Ķiploks, J.; Rogers, J.; Romanovs, U.; Sliwa, Z.; Väärsi, K.; et al. Digital Infantry Battlefield Solution. Introduction to Ground Robotics, DIBS Project, Part I; Romanovs, U., Ed.; Milrem: Helsinki, Finland, 2016; ISBN 978-9984-583-92-1. [Google Scholar]
- Lagkas, T.; Argyriou, V.; Bibi, S.; Sarigiannidis, P. UAV IoT Framework Views and Challenges: Towards Protecting Drones as “Things”. Sensors 2018, 18, 4015. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Zhu, K.; Liu, X.; Pong, P.W.T. Performance Study on Commercial Magnetic Sensors for Measuring Current of Unmanned Aerial Vehicles. IEEE Trans. Instrum. Meas. 2019, 69, 1397–1407. [Google Scholar] [CrossRef]
- De Wilde, W.; Cuypers, G.; Sleewaegen, J.-M.; Deurloo, R.; Bougard, B. GNSS Interference in Unmanned Aerial Systems. In Proceedings of the 29th International Technical Meeting of The Satellite Division of the Institute of Navigation (ION GNSS+ 2016), Portland, OR, USA, 12–16 September 2016; pp. 1465–1476. [Google Scholar]
- Galtarossa, L.; Navilli, L.F.; Chiaberge, M. Visual-Inertial Indoor Navigation Systems and Algorithms for UAV Inspection Vehicles. In Industrial Robotics—New Paradigms; IntechOpen: London, UK, 2020; pp. 1–16. [Google Scholar]
- Pérez, M.C.; Gualda, D.; Vicente, J.; Villadangos, J.M.; Ureña, J. Review of UAV positioning in indoor environments and new proposal based on US measurements. In Proceedings of the 10th International Conference on Indoor Positioning and Indoor Navigation—Work-in-Progress Papers (IPIN-WiP 2019), Pisa, Italy, 30 September–3 October 2019; pp. 267–274. [Google Scholar]
- Paredes, J.A.; Álvarez, F.J.; Aguilera, T.; Villadangos, J.M. 3D Indoor Positioning of UAVs with Spread Spectrum Ultrasound and Time-of-Flight Cameras. Sensors 2017, 18, 89. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Tiemann, J.; Wietfeld, C. Scalable and precise multi-UAV indoor navigation using TDOA-based UWB localization. In Proceedings of the 2017 International Conference on Indoor Positioning and Indoor Navigation (IPIN), Sapporo, Japan, 18–21 September 2017; pp. 1–7. [Google Scholar]
- Stojkoska, B.R.; Palikrushev, J.; Trivodaliev, K.; Kalajdziski, S. Indoor localization of unmanned aerial vehicles based on RSSI. In Proceedings of the IEEE EUROCON 2017—17th International Conference on Smart Technologies, Ohrid, Macedonia, 6–8 July 2017; pp. 120–125. [Google Scholar]
- Fu, C.; Carrio, A.; Campoy, P. Efficient visual odometry and mapping for Unmanned Aerial Vehicle using ARM-based stereo vision pre-processing system. In Proceedings of the 2015 International Conference on Unmanned Aircraft Systems (ICUAS), Denver, CO, USA, 9–12 June 2015; pp. 957–962. [Google Scholar]
- Perez-Grau, F.J.; Caballero, F.; Merino, L.; Viguria, A. Multi-modal mapping and localization of unmanned aerial robots based on ultra-wideband and RGB-D sensing. In Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, 24–28 September 2017; pp. 3495–3502. [Google Scholar]
- Khithov, V.; Petrov, A.; Tischenko, I.; Yakolev, K. Towards Autonomous UAV Landing Based on Infrared Beacons and Particle Filtering. In Advances in Intelligent Systems and Computing. Robot Intelligence Technology and Applications 4: Results from the 4th International Conference on Robot Intelligence Technology and Applications; Springer: Berlin, Germany, 2015; ISBN 978-3-319-31291-0. [Google Scholar]
- Guo, D.; Zhong, M.; Zhou, D. Multisensor Data-Fusion-Based Approach to Airspeed Measurement Fault Detection for Unmanned Aerial Vehicles. IEEE Trans. Instrum. Meas. 2018, 67, 317–327. [Google Scholar] [CrossRef]
- Waliszkiewicz, M.; Wojtowicz, K.; Rochala, Z.; Balestrieri, E. The Design and Implementation of a Custom Platform for the Experimental Tuning of a Quadcopter Controller. Sensors 2020, 20, 1940. [Google Scholar] [CrossRef]
- Brzozowski, B.; Daponte, P.; De Vito, L.; Lamonaca, F.; Picariello, F.; Pompetti, M.; Tudosa, I.; Wojtowicz, K. A remote-controlled platform for UAS testing. IEEE Aerosp. Electron. Syst. Mag. 2018, 33, 48–56. [Google Scholar] [CrossRef]
- Smith, B.; Stark, B.; Zhao, T.; Chen, Y.Q. An outdoor scientific data UAS ground truthing test site. In Proceedings of the International Conference on Unmanned Aircraft Systems (ICUAS), Denver, CO, USA, 9–12 June 2015; pp. 436–443. [Google Scholar]
- Daponte, P.; Lamonaca, F.; Picariello, F.; Riccio, M.; Pompetti, L.; Pompetti, M. A measurement system for testing light remotely piloted aircraft. In Proceedings of the IEEE International Workshop on Metrology for AeroSpace (MetroAeroSpace), Padua, Italy, 21–23 June 2017; pp. 21–23. [Google Scholar]
- Faundes, N.; Wunsch, V.; Hohnstein, S.; Glass, B.; Vetter, M. Research paper on the topic of different UAV drive train qualification and parameter sets. In Proceedings of the 32nd Digital Avionics Systems Conference (DASC), East Syracuse, NY, USA, 5–10 October 2013. [Google Scholar]
- Daponte, P.; De Vito, L.; Lamonaca, F.; Picariello, F.; Riccio, M.; Rapuano, S.; Pompetti, L.; Pompetti, M. DronesBench: An innovative bench to test drones. IEEE Instrum. Meas. Mag. 2017, 20, 8–15. [Google Scholar] [CrossRef]
- Ren, Z.; Li, H.; Zhao, Y.; Zhang, D. The study of wind resistance testing equipment for unmanned helicopter. In Proceedings of the 2nd IEEE Advanced Information Technology, Electronic and Automation Control Conference (IAEAC), Chongqing, China, 25–26 March 2017; pp. 1593–1597. [Google Scholar] [CrossRef]
- Jimenez-Gonzalez, A.; Dios, J.R.M.-D.; Ollero, A. Testbeds for ubiquitous robotics: A survey. Robot. Auton. Syst. 2013, 61, 1487–1501. [Google Scholar] [CrossRef]
- Michael, N.; Mellinger, D.; Lindsey, Q.; Kumar, V. The GRASP Multiple Micro-UAV Testbed. IEEE Robot. Autom. Mag. 2010, 17, 56–65. [Google Scholar] [CrossRef]
- Oh, H.; Won, D.-Y.; Huh, S.-S.; Shim, D.H.; Tahk, M.-J.; Tsourdos, A. Indoor UAV control using multi-camera visual feedback. J. Intell. Robot. Syst. 2011, 61, 57–84. [Google Scholar] [CrossRef] [Green Version]
- Tomer, S.; Kitts, C.; Neumann, M.; McDonald, R.; Bertram, S.; Cooper, R.; Demeter, M.; Guthrie, E.; Head, E.; Kashikar, A.; et al. A low-cost indoor testbed for multirobot adaptive navigation research. In Proceedings of the IEEE Aerospace Conference, Big Sky, MT, USA, 3–10 March 2018. [Google Scholar]
- Deng, H.; Fu, Q.; Quan, Q.; Yang, K.; Cai, K.-Y. Indoor Multi-Camera-Based Testbed for 3-D Tracking and Control of UAVs. IEEE Trans. Instrum. Meas. 2020, 69, 3139–3156. [Google Scholar] [CrossRef]
- Balestrieri, E.; Daponte, P.; De Vito, L.; Picariello, F.; Tudosa, I. Guidelines for an Unmanned Aerial Vehicle-based measurement instrument design. IEEE Instrum. Meas. Mag. 2021. [Google Scholar]
- Daponte, P.; De Vito, L.; Mazzilli, G.; Picariello, F.; Rapuano, S. A height measurement uncertainty model for archaeological surveys by aerial photogrammetry. Measurement 2017, 98, 192–198. [Google Scholar] [CrossRef]
- Jang, G.; Kim, J.; Yu, J.-K.; Kim, H.-J.; Kim, Y.; Kim, D.-W.; Kim, K.-H.; Lee, C.W.; Chung, Y.S. Review: Cost-Effective Unmanned Aerial Vehicle (UAV) Platform for Field Plant Breeding Application. Remote Sens. 2020, 12, 998. [Google Scholar] [CrossRef] [Green Version]
- Adamopoulos, E.; Rinaudo, F. UAS-Based Archaeological Remote Sensing: Review, Meta-Analysis and State-of-the-Art. Drones 2020, 4, 46. [Google Scholar] [CrossRef]
- Chand, B.N.; Mahalakshmi, P.; Naidu, V.P.S. Sense and avoid technology in unmanned aerial vehicles: A review. In Proceedings of the 2017 International Conference on Electrical, Electronics, Communication, Computer, and Optimization Techniques (ICEECCOT), Mysuru, India, 15–16 December 2017; pp. 512–517. [Google Scholar]
- Liu, O.; Yuan, S.; Li, Z. A Survey on Sensor Technologies for Unmanned Ground Vehicles. In Proceedings of the 2020 3rd International Conference on Unmanned Systems (ICUS), Harbin, China, 24–25 November 2020; pp. 638–645. [Google Scholar]
- Demetriou, G.A. A Survey of Sensors for Localization of Unmanned Ground Vehicles (UGVs). In Proceedings of the International Conference on Artificial Intelligence (ICAI 2006), Las Vegas, NV, USA, 26–29 June 2006. [Google Scholar]
- National Research Council. Technology Development for Army Unmanned Ground Vehicles; National Academies Press: Cambridge, MA, USA, 2002. [Google Scholar]
- Trentini, M.; Beckman, B.; Collier, J.; Digney, B.; Vincent, I. Intelligent Mobility Research at Defence R&D Canada for Autonomous UGV Mobility in Complex Terrain. In Proceedings of the Platform Innovations and System Integration for Unmanned Air, Land and Sea Vehicles (AVT-SCI Joint Symposium) Meeting Proceedings RTO-MP-AVT-146, Florence, Italy, 14–18 May 2007; Available online: http://www.rto.nato.int/abstracts.asp (accessed on 1 December 2020).
- Papadakis, P. Terrain traversability analysis methods for unmanned ground vehicles: A survey. Eng. Appl. Artif. Intell. 2013, 26, 1373–1385. [Google Scholar] [CrossRef] [Green Version]
- Brown, M.; Fieldhouse, K.; Swears, E.; Tunison, P.; Romlein, A.; Hoogs, A. Multi-Modal Detection Fusion on a Mobile UGV for Wide-Area, Long-Range Surveillance. In Proceedings of the 2019 IEEE Winter Conference on Applications of Computer Vision (WACV), Waikoloa Village, HI, USA, 7–11 January 2019; pp. 1905–1913. [Google Scholar]
- Yang, S.; Lho, H.; Song, B. Sensor fusion for obstacle detection and its application to an unmanned ground vehicle. In Proceedings of the ICCAS-SICE, Fukuoka, Japan, 18–21 August 2009; pp. 1365–1369. [Google Scholar]
- Yoon, S.; Bostelman, R. Analysis of Automatic through Autonomous—Unmanned Ground Vehicles (A-UGVs) Towards Performance Standards. In Proceedings of the IEEE International Symposium on Robotic and Sensors Environments (ROSE), Ottawa, ON, Canada, 17–18 June 2019. [Google Scholar]
- Peynot, T.; Scheding, S.; Terho, S. The Marulan Data Sets: Multi-sensor Perception in a Natural Environment with Challenging Conditions. Int. J. Robot. Res. 2010, 29, 1602–1607. [Google Scholar] [CrossRef] [Green Version]
- Zhang, K.; Yang, Y.; Fu, M.; Wang, M. Traversability Assessment and Trajectory Planning of Unmanned Ground Vehicles with Suspension Systems on Rough Terrain. Sensors 2019, 19, 4372. [Google Scholar] [CrossRef] [Green Version]
- Mengoli, D.; Tazzari, R.; Marconi, L. Autonomous Robotic Platform for Precision Orchard Management: Architecture and Software Perspective. In Proceedings of the 2020 IEEE International Workshop on Metrology for Agriculture and Forestry (MetroAgriFor), Trento, Italy, 4–6 November 2020; pp. 303–308. [Google Scholar]
- Ringdahl, O.; Kurtser, P.; Edan, Y. Evaluation of approach strategies for harvesting robots: Case study of sweet pepper harvesting. J. Intell. Robot. Syst. 2018, 95, 149–164. [Google Scholar] [CrossRef] [Green Version]
- Bulanon, D.; Burks, T.; Alchanatis, V. Study on temporal variation in citrus canopy using thermal imaging for citrus fruit detection. Biosyst. Eng. 2008, 101, 161–171. [Google Scholar] [CrossRef]
- Liu, Z.; Zhang, Y.; Yu, X.; Yuan, C. Unmanned surface vehicles: An overview of developments and challenges. Annu. Rev. Control. 2016, 41, 71–93. [Google Scholar] [CrossRef]
- Bovcon, B.; Mandeljc, R.; Pers, J.; Kristan, M. Improving vision-based obstacle detection on USV using inertial sensor. In Proceedings of the 10th International Symposium on Image and Signal Processing and Analysis, Ljubljana, Slovenia, 18–20 September 2017; pp. 1–6. [Google Scholar]
- Xinchi, T.; Huajun, Z.; Wenwen, C.; Peimin, Z.; Zhiwen, L.; Kai, C. A Research on Intelligent Obstacle Avoidance for Unmanned Surface Vehicles. In Proceedings of the 2018 Chinese Automation Congress (CAC), Xi'an, China, 30 November–2 December 2018; pp. 1431–1435. [Google Scholar]
- Yang, Y.; Shuai, C.; Fan, G. The Technological Development and Prospect of Unmanned Surface Vessel. In Proceedings of the 2019 IEEE 3rd Advanced Information Management, Communicates, Electronic and Automation Control Conference (IMCEC), Chongqing, China, 11–13 October 2019; pp. 717–722. [Google Scholar]
- Stateczny, A.; Gierski, W. The concept of anti-collision system of autonomous surface vehicle. E3S Web Conf. 2018, 63, 00012. [Google Scholar] [CrossRef]
- Kolev, G.; Solomevich, E.; Rodionova, E.; Kopets, E.; Rybin, V. Sensor subsystem design for small unmanned surface vehicle. In Proceedings of the 3rd International Conference on Information Processing and Control Engineering, Moscow, Russia, 4–7 August 2019; Volume 630, p. 012022. [Google Scholar]
- Giordano, F.; Mattei, G.; Parente, C.; Peluso, F.; Santamaria, R. Integrating Sensors into a Marine Drone for Bathymetric 3D Surveys in Shallow Waters. Sensors 2015, 16, 41. [Google Scholar] [CrossRef] [Green Version]
- Battelle Report. Capabilities and Uses of Sensor-Equipped Ocean Vehicles for Subsea and Surface Detection and Tracking of Oil Spills, 2014. Available online: https://www.ipieca.org/media/3693/battelle-capabilities-and-uses-of-sensor-equipped-ocean-vehicles-for-subsea-and-surface.pdf (accessed on 20 January 2021).
- Martin, B.; Tarraf, D.; Whitmore, T.; Deweese, J.; Kenney, C.; Schmid, J.; DeLuca, P. Advancing Autonomous Systems: An Analysis of Current and Future Technology for Unmanned Maritime Vehicles; RAND Corporation: Santa Monica, CA, USA, 2019. [Google Scholar]
- Bernalte, P.; Papaelias, M.; Marquez, F.P.G. Autonomous underwater vehicles: Instrumentation and measurements. IEEE Instrum. Meas. Mag. 2020, 23, 105–114. [Google Scholar] [CrossRef]
- Wang, X. Active Fault Tolerant Control for Unmanned Underwater Vehicle with Sensor Faults. IEEE Trans. Instrum. Meas. 2020, 69, 9485–9495. [Google Scholar] [CrossRef]
- Lin, C.; Wang, H.; Fu, M.; Yuan, J.; Gu, J. A Gated Recurrent Unit-Based Particle Filter for Unmanned Underwater Vehicle State Estimation. IEEE Trans. Instrum. Meas. 2021, 70, 1–12. [Google Scholar] [CrossRef]
- Petillot, Y.R.; Antonelli, G.; Casalino, G.; Ferreira, F. Underwater Robots: From Remotely Operated Vehicles to Intervention-Autonomous Underwater Vehicles. IEEE Robot. Autom. Mag. 2019, 26, 94–101. [Google Scholar] [CrossRef]
- Yazdani, A.; Sammut, K.; Yakimenko, O.; Lammas, A. A survey of underwater docking guidance systems. Robot. Auton. Syst. 2020, 124, 103382. [Google Scholar] [CrossRef]
- González-García, J.; Gómez-Espinosa, A.; Cuan-Urquizo, E.; García-Valdovinos, L.G.; Salgado-Jiménez, T.; Cabello, J.A.E. Autonomous Underwater Vehicles: Localization, Navigation, and Communication for Collaborative Missions. Appl. Sci. 2020, 10, 1256. [Google Scholar] [CrossRef] [Green Version]
- Zhang, T.; Liu, B.; Liu, X. Advanced Mapping of the Seafloor Using Sea Vehicle Mounted Sounding Technologies. In Earth Crust; Nawaz, M., Sattar, F., Kundu, S.N., Eds.; IntechOpen: London, UK, 2019; ISBN 978-1-78984-060-5. [Google Scholar]
- Marini, S.; Gjeci, N.; Govindaraj, S.; But, A.; Sportich, B.; Ottaviani, E.; Márquez, F.P.G.; Sanchez, P.J.B.; Pedersen, J.; Clausen, C.V.; et al. ENDURUNS: An Integrated and Flexible Approach for Seabed Survey Through Autonomous Mobile Vehicles. J. Mar. Sci. Eng. 2020, 8, 633. [Google Scholar] [CrossRef]
- Liang, X.; Chen, G.; Zhao, S.; Xiu, Y. Moving target tracking method for unmanned aerial vehicle/unmanned ground vehicle heterogeneous system based on AprilTags. Meas. Control. 2020, 53, 427–440. [Google Scholar] [CrossRef] [Green Version]
- Lazna, T.; Gabrlik, P.; Jilek, T.; Zalud, L. Cooperation between an unmanned aerial vehicle and an unmanned ground vehicle in highly accurate localization of gamma radiation hotspots. Int. J. Adv. Robot. Syst. 2018, 15, 1–16. [Google Scholar] [CrossRef] [Green Version]
- Chen, M.; Xiong, Z.; Liu, J.; Wang, R.; Xiong, J. Cooperative navigation of unmanned aerial vehicle swarm based on cooperative dilution of precision. Int. J. Adv. Robot. Syst. 2020, 17, 1729881420932717. [Google Scholar] [CrossRef]
- Coppola, M.; McGuire, K.N.; De Wagter, C.; De Croon, G.C.H.E. A Survey on Swarming with Micro Air Vehicles: Fundamental Challenges and Constraints. Front. Robot. AI 2020, 7, 18. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Achtelik, M.; Achtelik, M.; Brunet, Y.; Chli, M.; Chatzichristofis, S.; Decotignie, J.-D.; Doth, K.-M.; Fraundorfer, F.; Kneip, L.; Gurdan, D.; et al. SFly: Swarm of micro flying robots. In Proceedings of the 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, Vilamoura-Algarve, Portugal, 7–12 October 2012; pp. 2649–2650. [Google Scholar]
- Saska, M.; Vakula, J.; Preucil, L. Swarms of micro aerial vehicles stabilized under a visual relative localization. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–7 June 2014; pp. 3570–3575. [Google Scholar]
- Saska, M. MAV-swarms: Unmanned aerial vehicles stabilized along a given path using onboard relative localization. In Proceedings of the 2015 International Conference on Unmanned Aircraft Systems (ICUAS), Denver, CO, USA, 9–12 June 2015; pp. 894–903. [Google Scholar]
- Preiss, J.A.; Honig, W.; Sukhatme, G.S.; Ayanian, N. Crazyswarm: A large nano-quadcopter swarm. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 3299–3304. [Google Scholar]
- Faigl, J.; Krajnik, T.; Chudoba, J.; Preucil, L.; Saska, M. Low-cost embedded system for relative localization in robotic swarms. In Proceedings of the 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013; pp. 993–998. [Google Scholar]
- Krajník, T.; Nitsche, M.; Faigl, J.; Vaněk, P.; Saska, M.; Přeučil, L.; Duckett, T.; Mejail, M. A Practical Multirobot Localization System. J. Intell. Robot. Syst. 2014, 76, 539–562. [Google Scholar] [CrossRef] [Green Version]
- Roelofsen, S.; Gillet, D.; Martinoli, A. Reciprocal collision avoidance for quadrotors using on-board visual detection. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–2 October 2015; pp. 4810–4817. [Google Scholar]
- Teixeira, L.; Maffra, F.; Moos, M.; Chli, M. VI-RPE: Visual-Inertial Relative Pose Estimation for Aerial Vehicles. IEEE Robot. Autom. Lett. 2018, 3, 2770–2777. [Google Scholar] [CrossRef] [Green Version]
- Walter, V.; Staub, N.; Saska, M.; Franchi, A. Mutual Localization of UAVs based on Blinking Ultraviolet Markers and 3D Time-Position Hough Transform. In Proceedings of the 2018 IEEE 14th International Conference on Automation Science and Engineering (CASE), Munich, Germany, 20–24 August 2018; pp. 298–303. [Google Scholar]
- Basiri, M.; Schill, F.; Lima, P.; Floreano, D. On-Board Relative Bearing Estimation for Teams of Drones Using Sound. IEEE Robot. Autom. Lett. 2016, 1, 820–827. [Google Scholar] [CrossRef] [Green Version]
- Roberts, J.F.; Stirling, T.; Zufferey, J.-C.; Floreano, D. 3-D relative positioning sensor for indoor flying robots. Auton. Robot. 2012, 33, 5–20. [Google Scholar] [CrossRef] [Green Version]
- Opromolla, R.; Esposito, G.; Fasano, G. In-flight estimation of magnetic biases on board of small UAVs exploiting cooperation. In Proceedings of the 2019 IEEE 5th International Workshop on Metrology for AeroSpace (MetroAeroSpace), Torino, Italy, 19–21 June 2019; pp. 655–660. [Google Scholar]
- Rambabu, R.; Bahiki, M.R.; Ali, S.M.A. Relative Position-Based Collision Avoidance System for Swarming UAVs Using Multi-Sensor Fusion. ARPN J. Eng. Appl. Sci. 2015, 10, 10012–10017. [Google Scholar]
- Walter, V.; Saska, M.; Franchi, A. Fast Mutual Relative Localization of UAVs using Ultraviolet LED Markers. In Proceedings of the 2018 International Conference on Unmanned Aircraft Systems (ICUAS), Dallas, TX, USA, 12–15 June 2018; pp. 1217–1226. [Google Scholar]
- Daponte, P.; De Vito, L.; Lamonaca, F.; Picariello, F.; Rapuano, S.; Riccio, M. Measurement science and education in the drone times. In Proceedings of the 2017 IEEE International Instrumentation and Measurement Technology Conference (I2MTC), Torino, Italy, 22–25 May 2017; pp. 1–6. [Google Scholar]
- This Anchorage High School is Bringing Drones into the Classroom. Available online: http://www.adn.com/alaska-news/education/2016/05/22/this-anchorage-high-school-is-bringing-drones-into-the-classroom/ (accessed on 30 December 2020).
- 20 Best Drone Training Colleges. Available online: http://successfulstudent.org/15-best-drone-training-colleges/?nabw=1&utm_referrer=https://www.google.it/ (accessed on 30 December 2020).
- University of Southampton. Available online: https://www.southampton.ac.uk/courses/postgraduate-taught (accessed on 30 December 2020).
- UAS-OPS—Professional Operations of Unmanned Aircraft Systems (UAS). Available online: https://jaato.com/courses/552/uas-ops-professional-operations-of-unmanned-aircraft-systems-uas/ (accessed on 30 December 2020).
- UTC dji Academy. Available online: https://www.uastc.com/nl/ (accessed on 30 December 2020).
- Unmanned Vehicle University. Available online: http://www.uxvuniversity.com/doctorate-degree/ (accessed on 30 December 2020).
- Best Master’s Degrees in UAV Engineering 2021. Available online: https://www.masterstudies.com/Masters-Degree/UAV-Engineering/ (accessed on 30 December 2020).
- Unmanned Underwater Vehicles Short Course. Available online: https://www.arl.psu.edu/uuvsc (accessed on 30 December 2020).
- Unmanned Maritime Systems (UMS) Certificate Programs. Available online: https://www.usm.edu/ocean-science-engineering/unmanned-maritime-systems-ums-certification.php (accessed on 30 December 2020).
- Embry-Riddle Aeronautical University. Available online: https://erau.edu/degrees/master/u (accessed on 30 December 2020).
Used Technology | Difference between the Estimated Position and the Actual One | Advantages and Drawbacks
---|---|---
Camera, US | cm-level (in static test positions) | Good precision / Doppler effect, acoustic noise
IMU, optical, US, GNSS, UWB | 10 cm | Good accuracy / high cost
WiFi | 1 m | Reuses existing infrastructure / tradeoff between accuracy and complexity
Stereo vision | 10 cm | Small size and weight / influenced by lighting conditions
UWB, RGB-D sensor | 20 cm | Good accuracy / high cost
Infrared, IMU, computer vision | 10 cm–4 m | Easy to deploy / sunlight interference
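Several of the RSSI-based indoor localization approaches summarized above first convert received signal strength into a distance estimate (before trilateration), usually with a log-distance path-loss model. A minimal sketch of that conversion; the calibration constants (`rssi_at_1m`, `path_loss_exp`) are illustrative assumptions, not values from the cited papers, and must be recalibrated per environment:

```python
def rssi_to_distance(rssi_dbm: float,
                     rssi_at_1m: float = -40.0,
                     path_loss_exp: float = 2.0) -> float:
    """Estimate transmitter distance (m) from received signal strength.

    Log-distance path-loss model: RSSI(d) = RSSI(1 m) - 10*n*log10(d),
    solved for d. Both rssi_at_1m (dBm) and the path-loss exponent n
    are environment-dependent; the defaults here are illustrative.
    """
    return 10.0 ** ((rssi_at_1m - rssi_dbm) / (10.0 * path_loss_exp))
```

The exponential dependence on the calibration constants is one reason WiFi/RSSI methods sit at meter-level accuracy in the table while UWB and ultrasound reach decimeter or centimeter level.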
UAV Application: Agriculture Monitoring and Management of Crops

Functions | Sensors | Specifications
---|---|---
Plant coverage, plant height, and color indices | RGB camera | Spatial resolution from 1280 × 720 to 5472 × 3648 (1920 × 1080; 2048 × 1152; 3840 × 2160; 4000 × 3000; 4056 × 2282; 4160 × 2340; 4608 × 3456; 5344 × 4016)
Vegetation indices; physiological status of the plant | Multispectral camera | Spatial resolution 1080 × 720; 1248 × 950; 1280 × 960; 2064 × 1544. Weight from 30 g to 420 g. Frame rate from 1 fps to 30 fps
Plant surface temperature; Crop Water Stress Index | Thermal camera | Spatial resolution 336 × 256; 640 × 512; 1920 × 1080. Weight from 92 g to 370 g. Spectral range from 7.5 μm to 14 μm. Operating temperature range from −40 °C to 550 °C

UAV Application: Archaeology Exploratory Survey and Aerial Reconnaissance

Functions | Sensors | Specifications
---|---|---
Detailed digital terrain and surface models; penetrating vegetated landscapes | LiDAR | Range from 100 m to 340 m. FOV (vert.) 20°, 30°, 40°; (horiz.) 360°. Accuracy from 1 cm to 3 cm. Weight from 0.59 kg to 3.5 kg
Landscape matrix contrast detection | Multispectral camera | Resolution 1280 × 960; 1280 × 1024; 2048 × 1536; 2064 × 1544. Spectral bands: blue, green, red, red-edge, near-infrared, long-wave infrared
Landscape matrix contrast detection | Hyperspectral camera | Resolution 640 × 640; 640 × 512; 1024 × 1024; 2048 × 1088. Spectral range from 380 nm to 13,400 nm. Weight from 0.45 kg to 2 kg
Detection of measurably distinct variations between the features and their soil matrix | Thermal camera | Resolution 160 × 120; 320 × 240; 320 × 256; 336 × 256; 382 × 288; 640 × 480; 640 × 512. Accuracy from 1 °C to 5 °C. Spectral range from 7 μm to 14 μm. Weight from 39 g to 588 g

UAV Application: General

Functions | Sensors | Specifications
---|---|---
Sensing and avoiding capabilities | Radar | Detection range 35 km
 | LiDAR | Detection range 15 km
 | Electro-optic sensor | Detection range 20 km
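The sense-and-avoid detection ranges above translate directly into the time a UAV has to detect, decide, and maneuver: range divided by closing speed. A minimal sketch; the closing speed in the example is an illustrative assumption, not a figure from the cited survey:

```python
def reaction_time_s(detection_range_m: float, closing_speed_mps: float) -> float:
    """Upper bound on time available to react to a head-on intruder:
    detection range divided by the closing (relative) speed."""
    if closing_speed_mps <= 0.0:
        raise ValueError("closing speed must be positive")
    return detection_range_m / closing_speed_mps

# Hypothetical head-on encounter at 100 m/s closing speed:
radar_budget = reaction_time_s(35_000.0, 100.0)   # radar, 35 km range
lidar_budget = reaction_time_s(15_000.0, 100.0)   # LiDAR, 15 km range
```

This is why long-range radar dominates cooperative airspace sense-and-avoid despite its coarser resolution: the same closing speed leaves more than twice the reaction budget of LiDAR.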
LiDAR | Radar | Ultrasonic | Monocular Camera | Stereo Camera | Omnidirectional Camera | Infrared Camera | Event Camera
---|---|---|---|---|---|---|---
High accuracy | Medium accuracy | Low accuracy | High accuracy | High accuracy | High accuracy | Low accuracy | Low accuracy
Range <200 m | Range <250 m | Range <5 m | Range depends on operational environment | Range <100 m | Range depends on operational environment | Range depends on operational environment | Range depends on operational environment
Affected by weather | — | — | Affected by weather and illumination | Affected by weather and illumination | Affected by weather and illumination | Affected by weather | Affected by weather and illumination
Large size | Small size | Small size | Small size | Medium size | Small size | Small size | Small size
High cost | Medium cost | Low cost | Low cost | Low cost | Low cost | Low cost | Low cost
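No single sensor in the comparison dominates: LiDAR is accurate but weather-sensitive and expensive, radar coarser but nearly all-weather. Perception stacks therefore fuse overlapping measurements, and the simplest fusion of independent range readings to the same obstacle is inverse-variance weighting. A minimal sketch; the sensor standard deviations in the example are illustrative assumptions:

```python
def fuse_ranges(measurements):
    """Inverse-variance (maximum-likelihood) fusion of independent
    range measurements of the same target.

    measurements: iterable of (range_m, std_dev_m) pairs.
    Returns (fused_range_m, fused_std_dev_m).
    """
    measurements = list(measurements)
    weights = [1.0 / (sd * sd) for _, sd in measurements]
    total = sum(weights)
    fused = sum(w * r for (r, _), w in zip(measurements, weights)) / total
    return fused, (1.0 / total) ** 0.5

# Hypothetical example: LiDAR (2 cm sigma) and radar (20 cm sigma)
# ranging the same obstacle; the fusion leans toward the LiDAR reading
# and its uncertainty is never worse than the best single sensor.
estimate, sigma = fuse_ranges([(50.0, 0.02), (50.5, 0.20)])
```

When weather degrades the LiDAR, inflating its assumed standard deviation automatically shifts the weight toward the radar, which is the practical payoff of carrying complementary sensors from this table.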
UGV Application: Challenging Outdoor Environments

Functions | Sensors | Specifications
---|---|---
Reliable perception, obstacle detection | Radar | Maximum range 40 m. Range resolution 0.2 m. Horizontal FOV 360°. Angular resolution ≈1.9°
 | Visual camera | Image size 1340 × 1024. FOV (horiz.) 68.2°; (vert.) 53.8°
 | Infrared camera | Image size 640 × 480. FOV (horiz.) 35.8°; (vert.) 27.1°

UGV Application: Operation on Complex Terrain

Functions | Sensors | Specifications
---|---|---
Traversability assessment | LiDAR | Distance accuracy <2 cm. Measurement range from 50 m to 120 m. FOV (vert.) +2.0° to −24.8°; +10.7° to −30.7°. Vert. angular resolution 0.4°; 1.33°. Horiz. angular resolution 0.09°; 0.16°

UGV Application: Agriculture

Functions | Sensors | Specifications
---|---|---
Detection of tree rows, obstacles, branches, other orchard features, ditches, trenches, and other non-traversable areas | LiDAR | Range up to 100 m. Accuracy 3 cm. FOV (horiz.) 360°; (vert.) 30°. Vert. angular resolution 2°. Horiz. angular resolution 0.1°–0.4°
Automatic registration of fruit locations | RGB camera | Resolution 1600 × 1200. Frame rate 35.6 fps. Color depth 12 bit
Fruit detection | Thermal camera | Resolution 320 × 240. Spectral range 7.5–13 μm. Thermal sensitivity 0.05 °C
Sensors | Advantages | Limitations
---|---|---
Radar | Long detection range, nearly all-weather and broad-area imagery, high depth resolution and accuracy | Skewed data during fast turning maneuvers, limited capability to detect small and dynamic targets
LiDAR | Good near-range obstacle detection, high depth resolution and accuracy | Sensor noise and calibration errors, sensitive to the environment and to USV motion
Sonar | No visual restrictions, high depth resolution and accuracy | Limited detection range in each scan, susceptible to noise near the surface
Visual sensor | High lateral and temporal resolution, simplicity, low weight | Low depth resolution and accuracy, challenging real-time implementation, affected by light and weather conditions
Infrared sensor | Usable in dark conditions, low power consumption | Indoor or nighttime use only, affected by interference and distance
IMU | Small size, low cost and power consumption | Affected by accumulated errors and the magnetic environment
GPS | Small size, low cost and power consumption | Affected by signal loss or jamming
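The IMU and GPS rows highlight complementary failure modes: inertial sensors drift but are always available, while absolute references (GPS, magnetometer) are noisy or intermittent but drift-free. A 1-D complementary filter is the simplest way to combine them. This is a minimal sketch, not a method from the cited papers; the blending constant `alpha` and the 1 deg/s gyro bias in the usage note are illustrative assumptions:

```python
def fuse_heading(gyro_rates, ref_headings, dt, alpha=0.98):
    """1-D complementary filter for heading (degrees).

    Integrates the gyro rate (deg/s) for smooth short-term tracking and
    blends in an absolute heading reference with weight (1 - alpha) to
    keep long-term drift bounded. Returns the filtered heading trace.
    """
    heading = ref_headings[0]
    trace = []
    for rate, ref in zip(gyro_rates, ref_headings):
        heading = alpha * (heading + rate * dt) + (1.0 - alpha) * ref
        trace.append(heading)
    return trace
```

With a constant 1 deg/s gyro bias and a fixed true heading, pure integration diverges linearly (20° after 20 s at 100 Hz), while the filtered heading settles near alpha·bias·dt/(1 − alpha) ≈ 0.49°: accumulated error becomes bounded error.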
USV Application: General

| Functions | Sensors | Specifications |
|---|---|---|
| Anti-collision | Radar | Range from 1.5 to 340 m; distance measurement accuracy 0.25 m; FOV (horiz.) 100°, (vert.) 16°; weight 1290 g; size 230 × 160 mm |
| | LiDAR | Range 100 m; FOV (horiz.) 360°, (vert.) 30°; weight 830 g; size diam. 103 mm × height 72 mm |
| | Visual camera | Resolution 1920 × 1080 / 3840 × 2160; frame rate 60 fps / 30 fps; size 140 × 98 × 132 mm; weight 461 g |
USV Application: Requiring small, low-cost USVs

| Functions | Sensors | Specifications |
|---|---|---|
| Exploration | LiDAR | Range 5 cm to 40 m; accuracy ±2.5 cm at distances > 1 m; size 40 × 48 × 20 mm; weight 22 g |
| | Visual camera | Maximum resolution 3280 × 2464; frame rate from 30 fps to 90 fps; size 25 × 20 × 9 mm |
USV Application: Archaeology

| Functions | Sensors | Specifications |
|---|---|---|
| Bathymetric survey in shallow waters | Sonar | Depth range 0.30 m to 75.00 m; accuracy ±0.025 m (RMS); operating temperature 0 to 45 °C; size 100 × 220 × 45 mm; weight 0.75 kg |
| Monitoring conditions and the presence of obstacles in real time | Visual camera | Underwater depth up to ≈40 m; camcorder sensor res. 5 megapixels; video capture 1920 × 1080 at 30 fps, 1280 × 960 at 30 fps, 1280 × 720 at 60/30 fps, 848 × 480 at 60 fps; weight ≈0.74 kg |
| Obstacle detection | Ultrasonic | Detection range 2–450 cm; accuracy 0.2 cm |
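Ultrasonic obstacle detectors like the one above work by timing a pulse echo: distance is half the round-trip path at the speed of sound. A minimal conversion sketch, assuming a nominal speed of sound in air (in practice it varies with temperature, which is one source of the quoted 0.2 cm accuracy limit):

```python
# Round-trip time-of-flight conversion for an ultrasonic rangefinder.
# The sensor emits a pulse and times the echo; distance is half the
# round-trip path. 343 m/s is a nominal speed of sound in air at 20 degC.

SPEED_OF_SOUND_AIR = 343.0  # m/s, temperature dependent in practice

def echo_to_distance_cm(echo_time_s):
    """Convert a measured round-trip echo time to distance in cm."""
    return SPEED_OF_SOUND_AIR * echo_time_s / 2.0 * 100.0

# An obstacle 1 m away returns an echo after about 5.83 ms:
t = 2 * 1.0 / SPEED_OF_SOUND_AIR
d = echo_to_distance_cm(t)
```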
USV Application: Detection and tracking of oil spills

| Functions | Sensors | Specifications |
|---|---|---|
| Locating oil spills on water | Thermal IR camera | FOV 18° × 13°, 30° × 23°, 32.4° × 25.6°; temperature resolution 0.05–0.1 °C |
| | Radar | Range 2–7 km; FOV 360° |
Sensors Types | Capabilities | Challenge and Limitations |
---|---|---|
Inertial | Position, orientation and velocity are estimated by integrating data from accelerometers and gyroscopes | Data processing and fusion of data from multiple sensors are required to correct for drift errors
Acoustic | Acoustic transponders are used to determine positioning relative to receivers or features (seafloor) | Fixed infrastructure can be required, constraints due to water environment, possible speed restrictions |
Depth | The ambient pressure in the water column is measured to calculate depth | Limitations are minimal; pressure sensors function at depths far greater than projected platforms are intended to reach
Orientation | Platform heading is calculated from one or several sensors | Degraded performance during acceleration |
Light and optical | Positioning is carried out using environmental features as a guide | Light attenuation in the water limits accuracy |
Light and optical: light detection and ranging, laser line scanning | Laser mapping and imaging, video feed generation | Limited propagation of optical wavelengths through water requires being physically close to targets
Sonar: single beam, multibeam, sidescan, synthetic aperture | Target detection and identification, buried object detection, imaging | Sound propagation in water depends on the temperature and salinity, calibration is needed |
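The depth row above describes the standard hydrostatic calculation: depth is recovered from the excess of ambient pressure over atmospheric pressure. A short sketch of that formula, using a nominal seawater density (real systems calibrate for salinity and temperature):

```python
# Hydrostatic depth from ambient pressure, as used by UUV depth sensors:
# depth = (P - P_atm) / (rho * g). The density below is a nominal
# seawater value, not a calibrated one.

RHO_SEAWATER = 1025.0   # kg/m^3, nominal; varies with salinity/temperature
G = 9.81                # m/s^2
P_ATM = 101325.0        # Pa, sea-level atmospheric pressure

def pressure_to_depth_m(pressure_pa):
    """Depth in metres below the surface from absolute pressure in Pa."""
    return (pressure_pa - P_ATM) / (RHO_SEAWATER * G)

# Roughly one extra atmosphere of pressure per ~10 m of seawater:
depth = pressure_to_depth_m(P_ATM + RHO_SEAWATER * G * 10.0)
```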
UUV Application: Long-range surveys

| Functions | Sensors | Specifications |
|---|---|---|
| Docking guidance systems | Acoustic navigation sensor | Acquisition range on the order of km (up to 2 km) |
| | Optical navigation sensor | Acquisition range 10–30 m; lateral and temporal accuracy 2–10 cm |
| | Electromagnetic navigation sensor | Acquisition range 25–30 m; accuracy ≈20 cm |
UUV Application: Collaborative missions, surveillance and intervention

| Functions | Sensors | Specifications |
|---|---|---|
| Localization and navigation for collaborative work | Sonar | Accuracy from 5–10 cm to 10–120 cm; range from 5 m up to hundreds of m from obstacles |
| | Acoustic ranging (LBL, SBL, USBL) | Accuracy from a few cm up to tens of m; range up to tens of m from the array |
| | Optical light sensors | Accuracy up to 20 cm for position and 10° for orientation; range 1–20 m from markers |
| | Optical cameras | Accuracy up to 1 cm for position and 3° for orientation; range 1–20 m from markers |
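In long-baseline (LBL) positioning, the vehicle measures acoustic ranges to fixed seafloor transponders and solves for its own position. A minimal 2-D trilateration sketch: subtracting the first range equation from the others linearizes the problem into a small linear system. The beacon layout and positions below are illustrative only.

```python
import math

# 2-D long-baseline (LBL) position fix from ranges to three seafloor
# transponders. Subtracting the first range equation from the others
# turns the quadratic system into a linear 2x2 solve.

def lbl_fix(beacons, ranges):
    """Solve for (x, y) given three beacon positions and measured ranges.
    Beacons must not be collinear."""
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = ranges
    # Linearized system: a11*x + a12*y = b1, a21*x + a22*y = b2
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y

# Vehicle truly at (20, 30); ranges computed from a 100 m beacon triangle.
beacons = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0)]
truth = (20.0, 30.0)
ranges = [math.dist(b, truth) for b in beacons]
fix = lbl_fix(beacons, ranges)
```

Real LBL systems use more beacons and a least-squares solve, and must also correct the measured travel times for the local sound-speed profile.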
UUV Application: Seafloor mapping

| Functions | Sensors | Specifications |
|---|---|---|
| Seafloor topographical data; measure the size, shape and height variation of underwater targets | Sonar (multibeam echosounder, MBE) | Range 175–635 m; frequency 200–450 kHz; transmit beamwidth 0.4°–2.0°; receive beamwidth 0.5°–2.0°; system depth rating 6000 m |
| | Sonar (sidescan, SSS) | Range 150–600 m; frequency 75–600 kHz; horizontal beams 0.2°–1.0°; depth rating 2000–6000 m |
Sensor Technology | Advantages | Limitations |
---|---|---|
Vision (direct, passive, visual markers or colored balls) | Rich information; passive sensors | Dependent on light conditions; visual line of sight needed; computationally expensive |
Vision (direct, with active markers, infrared or ultraviolet) | Can operate in visual cluttered environments; accurate; low dependence on light conditions | Visual line of sight needed; can result in high energy expense |
Sound | Can operate in visual cluttered environments; omni-directional; passive sensors | Angular accuracy is limited due to limited baseline between microphones in the array; limited range |
Infrared sensor array | Can operate in visual cluttered environments; accurate and computationally simple | Visual line of sight needed; heavy; can result in high energy expense |
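The table's note that sound-based sensing has limited angular accuracy follows directly from the far-field bearing formula: the bearing is recovered from the arrival-time difference across the microphone baseline, so a short baseline makes the estimate very sensitive to timing error. A small sketch with illustrative numbers:

```python
import math

# Bearing of a far-field sound source from the time-difference-of-arrival
# (TDOA) at a two-microphone array: theta = asin(c * dt / baseline).
# Values are illustrative; the point is the sensitivity to timing error.

SPEED_OF_SOUND = 343.0   # m/s in air, nominal

def tdoa_bearing_deg(dt, baseline):
    """Source bearing in degrees from broadside, from the arrival-time
    difference dt (s) across a microphone baseline (m)."""
    return math.degrees(math.asin(SPEED_OF_SOUND * dt / baseline))

baseline = 0.10                        # 10 cm between microphones
dt_30deg = baseline * math.sin(math.radians(30.0)) / SPEED_OF_SOUND
theta = tdoa_bearing_deg(dt_30deg, baseline)      # recovers ~30 deg

# A 30 deg source implies dt of only ~146 us, so a 10 us timing error
# already shifts the bearing estimate by over 2 degrees:
theta_err = tdoa_bearing_deg(dt_30deg + 10e-6, baseline) - theta
```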
UAV Swarm Application: Aerial surveillance, remote sensing, aerial inspections

| Functions | Sensors | Specifications |
|---|---|---|
| Avoid collisions with other swarm members and environmental obstacles | Infrared | Range 20 to 150 m; operating temperature −10 to +60 °C; size 29.5 × 13 × 21.6 mm |
| | Ultrasonic | Range 0 to 6.45 m; resolution 2.5 cm; size 2.2 × 2.0 × 1.6 cm |
UAV Swarm Application: Indoor and outdoor operation

| Functions | Sensors | Specifications |
|---|---|---|
| Relative localization | Camera | Resolution 752 × 480; FOV (horiz.) 185°; maximum frame rate 93 Hz |
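Camera-based relative localization of swarm members typically relies on markers of known physical size: the apparent pixel size of a marker gives its range. A simplified pinhole-model sketch (the focal length and marker size are assumed values, and a 185° fisheye lens like the one above would additionally need distortion correction):

```python
# Pinhole-camera range estimate for vision-based relative localization:
# a marker of known physical size appears smaller with distance,
#   distance = focal_length_px * marker_size_m / marker_size_px.
# Focal length and marker size below are assumed, illustrative values.

def marker_distance_m(focal_px, marker_m, marker_px):
    """Distance to a marker of known size from its apparent pixel size."""
    return focal_px * marker_m / marker_px

# A 0.20 m marker imaged at 40 px by a camera with 400 px focal length:
d = marker_distance_m(400.0, 0.20, 40.0)
```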
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Balestrieri, E.; Daponte, P.; De Vito, L.; Lamonaca, F. Sensors and Measurements for Unmanned Systems: An Overview. Sensors 2021, 21, 1518. https://doi.org/10.3390/s21041518