Next Issue: Volume 6, March
Previous Issue: Volume 6, January
Drones, Volume 6, Issue 2 (February 2022) – 27 articles

Cover Story: New propulsive architectures that interact strongly with the aerodynamic performance of the platform are an attractive option for reducing the power consumption and improving the handling of fixed-wing UAVs. Distributed electric propulsion with boundary layer ingestion over the wing introduces extra complexity to the design of these systems. This work studies the effect of different combinations of angle of attack and propeller position on the wing's pressure coefficient and skin friction coefficient distributions through extensive simulation. A proper orthogonal decomposition of the coefficient distributions is performed, which may be used to interpolate the results to non-simulated combinations, giving more information than the interpolation of the main aerodynamic coefficients, such as lift and drag. View this paper
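For readers unfamiliar with the technique named in the cover story, proper orthogonal decomposition of sampled coefficient distributions reduces, at its core, to a singular value decomposition of a snapshot matrix. The sketch below illustrates that general idea only; the snapshot data, the 99% energy threshold, and all variable names are invented for the example and are not taken from the paper.

```python
import numpy as np

# Hypothetical snapshot matrix: each column is a pressure-coefficient
# distribution sampled at 200 chordwise points, one column per simulated
# combination of angle of attack and propeller position (30 cases).
rng = np.random.default_rng(0)
snapshots = rng.standard_normal((200, 30))

# Center the snapshots and extract POD modes via the thin SVD.
mean_field = snapshots.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(snapshots - mean_field, full_matrices=False)

# Keep the leading modes that capture ~99% of the variance (energy).
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.99)) + 1
modes = U[:, :r]                      # spatial POD modes
coeffs = np.diag(s[:r]) @ Vt[:r, :]   # modal coefficients per case

# A non-simulated case can then be approximated by interpolating the
# modal coefficients between neighbouring cases and reconstructing:
# field ~ mean_field + modes @ interpolated_coeffs.
print(f"kept {r} of {len(s)} modes")
```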
  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • Papers are published in both HTML and PDF forms, but PDF is the official format. To view a paper in PDF format, click on the "PDF Full-text" link and open it with the free Adobe Reader.
18 pages, 6608 KiB  
Article
Local Control of Unmanned Air Vehicles in the Mountain Area
by Pavol Kurdel, Marek Češkovič, Natália Gecejová, František Adamčík and Mária Gamcová
Drones 2022, 6(2), 54; https://doi.org/10.3390/drones6020054 - 21 Feb 2022
Cited by 23 | Viewed by 4059
Abstract
Increasing the accuracy and stability of unmanned aerial vehicle (UAV) flight in an alpine environment is a complex problem. It is related to the evaluation of UAV flight parameters and of the control conditions at the operator's workplace. The purpose of the UAV's autonomous flight control is to ensure stable control of the UAV's flight parameters. Flight control systems are affected by various disturbances caused by both internal and external conditions. In general, the number of autonomous control systems corresponds to the number of degrees of freedom that determine the flight of an autonomous vehicle. An important factor in assessing the quality of such a UAV is its readiness for autonomous flight, together with the level of its safe guidance along the route. The presented article focuses on the analysis of UAV flight control and on the quality of prediction and elimination of errors that arise during maneuvers toward the place of a successful UAV landing. The aim of the article is to show that the complexities of such a flight procedure can be solved by evaluating the readiness of the autonomous UAV for the descent phase. The problem is motivated by the social demand for a way of providing health care in the mountain area of the High Tatras in Slovakia. The existing database of data obtained from the flying vehicles used in Slovakia was compared with data obtained from simulated flights, with their subsequent evaluation in the MATLAB software (Version R2021b) environment. Full article
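The track-deviation analysis described here rests on a standard property of the normal distribution: the probability of staying inside a corridor of given half-width follows directly from the error function. A minimal illustration, with invented corridor and deviation values rather than the paper's data:

```python
from math import erf, sqrt

def p_within_corridor(half_width_m: float, sigma_m: float) -> float:
    """Probability that a zero-mean, normally distributed lateral track
    deviation with standard deviation sigma_m stays inside a corridor of
    +/- half_width_m around the route centreline."""
    return erf(half_width_m / (sigma_m * sqrt(2.0)))

# Hypothetical numbers: a 20 m corridor half-width and an 8 m standard
# deviation of lateral error during descent.
print(f"P(inside corridor) = {p_within_corridor(20.0, 8.0):.3f}")
```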
(This article belongs to the Special Issue Conceptual Design, Modeling, and Control Strategies of Drones)
Figures:
Figure 1: Description of the model situation in the selected terrain area, Poprad Airport and the High Tatras landing point (3D visualization of the environment). Legend: 1. UAV and navigation data settings; 2. UAV control and management workplace; 3. Controlled UAV flight path; 4. ASS (Area of Successful Solution).
Figure 2: Simulation and evaluation laboratory environment.
Figure 3: The 3D visualization of the UAV line in the area of line P (final approach). Legend: (1) is the point where the loss of GPS connection ends; (2) is the point where the loss of GPS connection and other influences on the UAV start to take effect; (3) is the starting point of descent to the ASS.
Figure 4: Kinematic elements of the remote control approach to the ASS as the final phase of UAV flight.
Figure 5: Illustration of the model situation: (a) The Little Cold Valley–High Tatras; (b) local 3D model of the selected research area.
Figure 6: Parameterization of UAV test flights with probability of readiness.
Figure 7: Normal distribution of track deviation probabilities.
Figure 8: Distribution of the probability of a linear deviation from the height of the descent line (0 indicates on track; negative values indicate deviation to the left of the central axis, positive values to the right).
Figure 9: Histogram of the distribution of probabilities of deviation from the height of approximation with the ASS (area of successful solution).
Figure 10: Linear deviation from the selected route line.
Figure 11: Representation of the implemented route line and its corresponding corridor boundaries.
Figure 12: Relative dispersions of three types of random error detection processes from navigation systems.
27 pages, 10659 KiB  
Article
EuroDRONE, a European Unmanned Traffic Management Testbed for U-Space
by Vaios Lappas, Giorgos Zoumponos, Vassilis Kostopoulos, Hae In Lee, Hyo-Sang Shin, Antonios Tsourdos, Marco Tantardini, Dennis Shomko, Jose Munoz, Epameinondas Amoratis, Aris Maragkakis, Thomas Machairas and Andra Trifas
Drones 2022, 6(2), 53; https://doi.org/10.3390/drones6020053 - 18 Feb 2022
Cited by 19 | Viewed by 5277
Abstract
EuroDRONE is an Unmanned Traffic Management (UTM) demonstration project, funded by the EU’s SESAR organization, and its aim is to test and validate key UTM technologies for Europe’s ‘U-Space’ UTM program. The EuroDRONE UTM architecture comprises cloud software (DroNav) and hardware (transponder) to be installed on drones. The proposed EuroDRONE system is a Highly Automated Air Traffic Management System for small UAVs operating at low altitudes. It is a sophisticated, self-learning system based on software and hardware elements, operating in a distributed computing environment, offering multiple levels of redundancy, fail-safe algorithms for conflict prevention/resolution and assets management. EuroDRONE focuses its work on functionalities which involve the use of new communication links, the use of vehicle-to-infrastructure (V2I) and vehicle-to-vehicle (V2V) technology to communicate information between drones and operators for safe and effective UTM functionality. Practical demonstrations that took place in Patras/Messolonghi in 2019 are presented and show the benefits and shortcomings of near-term UTM implementation in Europe. Full article
(This article belongs to the Section Drone Design and Development)
Figures:
Figure 1: U-space Levels of Service [1,2,3,4,5].
Figure 2: EuroDRONE Architecture and Functionalities (Levels U1–U3).
Figure 3: EuroDRONE Deployment.
Figure 4: EuroDRONE Deployment Processes.
Figure 5: DronAssistant prototype for U3 (with DAA, V2I, V2V).
Figure 6: DroNav submitted missions list and status.
Figure 7: DroNav submitted missions list and status—example of explanation of mission not approved.
Figure 8: DroNav example of mission not approved displayed on map (at Messolonghi airport in Greece).
Figure 9: Mission planner–PARTAKE timeline interactions for (A) denial and (B) take-off.
Figure 10: DAA Simulation Results. (a) CA trajectory (1 scenario). (b) Intruder trajectory (100 scenarios). (c) Minimum distance to intruder/obstacle (100 scenarios). (d) Total flight time to reach the waypoint (100 scenarios).
Figure 11: Three drones in VLOS operations and two drones in fully automated BVLOS operations utilizing the DronAssistant.
Figure 12: EuroDRONE fixed-wing VTOL 'BabyShark' (left) and hexacopter 'GAIA' (right).
Figure 13: DroNav mission planner overview interface with an approved mission for Demo 1.
Figure 14: VLOS Flight at Messolonghi Airport.
Figure 15: Demo 1 mission that the UAV had to execute.
Figure 16: UAV stopped its mission and moved to a safe area in order to avoid the fixed obstacle.
Figure 17: The mission that was executed by the UAV.
Figure 18: EuroDRONE Scenario 1 Testing for multiple mission scenarios: (a) Dynamically Geofenced UTM Testing Area with GAIA UAV starting point (geofenced); (b) Scenario 1 Test with Geofenced Area for a SAR/Precision Agriculture UAV Scenario (precision agriculture); and (c) Scenario 1 Testing with a Logistics (VLOS) Scenario over Messolonghi Airport, with ongoing aviation traffic near the airport.
Figure 19: Trial 1 Testing Scenario Summary with Multiple UAVs.
Figure 20: The flight path of the BVLOS mission (Demo 2).
Figure 21: View during the medium-range BVLOS flight of the GAIA UAV carrying cargo in Demonstration No. 2.
Figure 22: EuroDRONE Team after Demo 2 Flight Testing.
31 pages, 51975 KiB  
Article
Precise Quantification of Land Cover before and after Planned Disturbance Events with UAS-Derived Imagery
by Zachary Miller, Joseph Hupy, Sarah Hubbard and Guofan Shao
Drones 2022, 6(2), 52; https://doi.org/10.3390/drones6020052 - 18 Feb 2022
Cited by 4 | Viewed by 3608
Abstract
This paper introduces a detailed procedure that utilizes the high temporal and spatial resolution capabilities of an unmanned aerial system (UAS) to document vegetation at regular intervals both before and after a planned disturbance, a key component in natural disturbance-based management (NDBM), which uses treatments such as harvests and prescribed burns to remove vegetation fuel loads. We developed a protocol and applied it to timber harvest and prescribed burn events. Geographic object-based image analysis (GEOBIA) was used for the classification of UAS orthomosaics. The land cover classes included (1) bare ground, (2) litter, (3) green vegetation, and (4) burned vegetation for the prairie burn site, and (1) mature canopy, (2) understory vegetation, and (3) bare ground for the timber harvest site. Sample datasets for both kinds of disturbances were used to train a support vector machine (SVM) classifier algorithm, which produced four land cover classifications for each site. Statistical analysis (a two-tailed t-test) indicated there was no significant difference in image classification efficacies between the two disturbance types. This research provides a framework for using UASs to assess land cover, which is valuable for supporting effective land management practices and for ensuring the sustainability of land practices along with other planned disturbances, such as construction and mining. Full article
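As a rough illustration of the classification step described above, a support vector machine can be trained on per-segment features and scored on held-out samples. The sketch below uses synthetic stand-in data, not the paper's training samples, and scikit-learn defaults rather than the authors' settings.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for per-segment features (e.g., mean band values,
# texture statistics), with labels 0-3 for four land cover classes.
rng = np.random.default_rng(1)
X = rng.standard_normal((400, 6)) + rng.integers(0, 4, 400)[:, None]
y = (X.mean(axis=1) // 1).clip(0, 3).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)

# Scale the features, then fit an RBF-kernel support vector classifier.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```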
Figures:
Figure 1: Study area and site map for prescribed burn and selective timber harvest plots throughout the state of Indiana. Sites selected for final analysis are outlined in red. The "CHF Boundary" locator map was adapted from The Central Hardwood Forest: Its Boundaries and Physiographic Provinces [42].
Figure 2: Purdue FNR prairie plots for prescribed fire seasonality research (outlined in yellow). (A) Doak, (B) Hermann, (C) PWA. PWA was selected for the final land cover analysis of prescribed burns.
Figure 3: Pike Lumber Company forest plots for timber production (outlined in yellow). (A) Rough, (B) Deardorff, (C) Jackson, (D) Urton, (E) McAffee, (F) Whiteman, (G) Volz. Volz was selected for the final land cover analysis.
Figure 4: The UAS platform used in this study was a DJI M600 Pro [45] equipped with a Geosnap PPK [46], which triggered a Sony A6000 Mirrorless Interchangeable Lens Camera [47].
Figure 5: Workflow diagram of methods. Yellow squares = user actions/processes; blue squares = outputs (and inputs).
Figure 6: Geographic object-based image analysis (GEOBIA) process. (a) PWA original image; (b) PWA segmented image; (c) PWA classified image; (d) Volz original image; (e) Volz segmented image; (f) Volz classified image.
Figure 7: Segmented image parameters tested. (a) Original image; (b) "default" segmentation parameters: 15.5 spectral, 15 spatial, 20 minimum segment size (MSS); (c) 17 spectral, 10 spatial, 20 MSS; (d) 17 spectral, 10 spatial, 80 MSS.
Figure 8: (a–d) PWA prairie burn orthomosaics. UAS flight area shown in yellow; study area for land cover classification shown in green.
Figure 9: (a–d) Volz harvest orthomosaics. Study area for land cover classification shown in yellow.
Figure 10: (A) RB, (B) PB, (C) PB 2, (D) PB 3, where RB = pre-burn and PB = post-burn. PWA prairie burn land cover classification. Burn plots shown in red outline.
Figure 11: (A) pre-cut, (B) mid-cut, (C) post-cut, (D) post-cut 2. Volz timber harvest land cover classification. Harvest area shown in yellow outline.
18 pages, 1806 KiB  
Article
Homogeneous Agent Behaviours for the Multi-Agent Simultaneous Searching and Routing Problem
by Thomas Kent, Arthur Richards and Angus Johnson
Drones 2022, 6(2), 51; https://doi.org/10.3390/drones6020051 - 17 Feb 2022
Cited by 7 | Viewed by 3145
Abstract
Through the use of autonomy, Unmanned Aerial Vehicles (UAVs) can be used to solve a range of real-world multi-agent problems, for example search and rescue or surveillance. Within these scenarios, the global objective might often be better achieved if aspects of the problem can be optimally shared amongst the agents. However, in uncertain, dynamic, and often partially observable environments, centralised global optimisation techniques are not achievable. Instead, agents may have to act on their own belief of the world, making the best decisions independently and potentially myopically. With multiple agents acting in a decentralised manner, how can we discourage competitive behaviour and instead facilitate cooperation? This paper focuses on the specific problem of multiple UAVs simultaneously searching for tasks in an environment whilst efficiently routing between them and ultimately visiting them. It is motivated by the idea that collaboration can be simple and achieved without the need for a dialogue, but instead through the design of the individual agent's behaviour. By focusing on what is communicated, we expand the use of a single agent behaviour which, through minor modifications, can produce distinct agents demonstrating independent, collaborative, and competitive behaviour. In particular, by investigating the role of sensor and communication ranges, this paper shows that increased sensor ranges can be detrimental to system performance, and that the simple modelling of nearby agents' intent is a far better approach. Full article
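The kind of dialogue-free cooperation described above can be as simple as skipping tasks that a nearby agent has already announced as its target. Below is a minimal sketch of such intent-aware task selection; the positions are invented and the pick_task helper is a hypothetical illustration, not code from the paper.

```python
import math

def pick_task(my_pos, tasks, intents):
    """Choose the nearest known task, skipping any task that a nearby
    agent has already broadcast as its intended target. `tasks` is a
    list of (x, y) positions; `intents` is the set of task indices
    claimed by other agents within communication range. Returns the
    chosen task index, or None if every known task is claimed."""
    best, best_d = None, math.inf
    for i, (tx, ty) in enumerate(tasks):
        if i in intents:
            continue  # cooperative: defer to the other agent's intent
        d = math.hypot(tx - my_pos[0], ty - my_pos[1])
        if d < best_d:
            best, best_d = i, d
    return best

tasks = [(10, 0), (3, 4), (5, 5)]
# Task 1 is claimed by another agent, so task 2 (nearest of the rest)
# is chosen: prints 2.
print(pick_task((0, 0), tasks, intents={1}))
```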
(This article belongs to the Special Issue Conceptual Design, Modeling, and Control Strategies of Drones)
Figures:
Figure 1: Diagram of a UAV's (UAV_1) communication radius R_comms and vision radius R_vision for finding tasks and communicating with other agents (UAV_2 and UAV_3).
Figure 2: Two agents competing for three tasks. The Blue agent (left) might change the order in which it completes its tasks based on the assumption that the Red agent (right) might reach a task first.
Figure 3: Three examples of procedurally generated maps of hexagonal terrain tiles (blue, yellow, green, dark green), a base spawn location (red), and a quad-copter UAV at the base.
Figure 4: Time steps 50, 100, and 300 of two example search and routing trials with 5 UAVs, with R_vision of 250 m and R_comms of 50 m, with the left column using all Greedy behaviours and the right column using all Solo+ behaviour. Black dots indicate task locations, dashed lines are each UAV's intended path (many overlap), and the green circle indicates R_vision.
Figure 5: A sweep over R_vision values for increasing (top to bottom) numbers of UAVs for two fixed R_comms values of 50 and 250 m.
Figure 6: A sweep over R_comms values for increasing (top to bottom) numbers of UAVs for two fixed R_vision values of 50 and 250 m.
14 pages, 1759 KiB  
Article
Load and Wind Aware Routing of Delivery Drones
by Satoshi Ito, Keishi Akaiwa, Yusuke Funabashi, Hiroki Nishikawa, Xiangbo Kong, Ittetsu Taniguchi and Hiroyuki Tomiyama
Drones 2022, 6(2), 50; https://doi.org/10.3390/drones6020050 - 17 Feb 2022
Cited by 20 | Viewed by 4900
Abstract
Delivery drones have been attracting attention as one of the promising technologies for delivering packages. Several research studies on routing problems specifically for drone delivery scenarios have extended Vehicle Routing Problems (VRPs). Most existing VRPs are based on Traveling Salesman Problems (TSPs) that minimize the overall distance. VRPs for drone delivery, on the other hand, have been aware of energy consumption due to the consideration of battery capacity. Although hovering while loaded accounts for a large portion of energy consumption, since delivery drones need to hover with several packages, little research has been conducted on drone routing problems that aim to minimize overall flight time. In addition, flight time is strongly influenced by windy conditions such as headwinds and tailwinds. In this paper, we propose a VRP for drone delivery in which flight time depends on the weight of packages in a windy environment, called the Flight Speed-aware Vehicle Routing Problem with Load and Wind (FSVRPLW). Here, flight speed changes depending on the load and wind: a heavier package slows the drone down and a lighter package speeds it up; likewise, a headwind slows flight and a tailwind speeds it up. We mathematically formulated the problem and developed a dynamic programming algorithm to solve it. In the experiments, we investigate how much impact both the weight of packages and the wind have on the flight time. The experimental results indicate that taking load and wind into account is very effective in reducing flight times. Moreover, comparing the effects of load and wind indicates that flight time largely depends on the weight of packages. Full article
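A dynamic program for a load-dependent travel-time objective can follow the classic Held-Karp structure, with each leg's time determined by the load still on board. The sketch below is a minimal illustration in that spirit: the instance, the linear slowdown model, and the omission of wind are simplifications for the example, not the paper's formulation.

```python
import math
from itertools import combinations

# Hypothetical instance: depot 0 plus three customers, package weights,
# and a toy speed model where extra load slows the drone down.
coords = [(0, 0), (4, 0), (4, 3), (0, 3)]
weights = [0.0, 1.0, 2.0, 1.5]          # weight delivered at each node (kg)
BASE_SPEED, SLOWDOWN = 10.0, 0.15       # m/s, slowdown per kg of load

def dist(a, b):
    return math.dist(coords[a], coords[b])

def speed(load):
    return BASE_SPEED / (1.0 + SLOWDOWN * load)

n = len(coords)
total = sum(weights)
FULL = (1 << (n - 1)) - 1  # bitmask over customers 1..n-1

# dp[(mask, j)] = min time to leave the depot, deliver exactly the
# customers in `mask`, and end at customer j.
dp = {}
for j in range(1, n):
    dp[(1 << (j - 1), j)] = dist(0, j) / speed(total)  # full load

for size in range(2, n):
    for subset in combinations(range(1, n), size):
        mask = sum(1 << (j - 1) for j in subset)
        for j in subset:
            prev_mask = mask ^ (1 << (j - 1))
            delivered = sum(weights[k] for k in subset if k != j)
            load = total - delivered  # load while flying towards j
            dp[(mask, j)] = min(dp[(prev_mask, i)] + dist(i, j) / speed(load)
                                for i in subset if i != j)

# Close the tour: fly back to the depot empty.
answer = min(dp[(FULL, j)] + dist(j, 0) / speed(0.0) for j in range(1, n))
print(f"minimum total flight time: {answer:.2f} s")
```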
Figures:
Figure 1: An example problem.
Figure 2: Comparison of FSVRP and FSVRPLW: (a) flight distance of FSVRP, (b) flight distance of FSVRPLW, (c) flight time of FSVRP, and (d) flight time of FSVRPLW.
Figure 3: Effect of load: (a) drone without load and (b) drone with load w.
Figure 4: Drone flight under windy conditions.
Figure 5: Speed changed by wind.
Figure 6: Comparison of flight time.
Figure 7: Comparison of flight distance.
Figure 8: Algorithm runtime.
12 pages, 18782 KiB  
Article
Thrust Vector Observation for Force Feedback-Controlled UAVs
by Lennart Werner, Michael Strohmeier, Julian Rothe and Sergio Montenegro
Drones 2022, 6(2), 49; https://doi.org/10.3390/drones6020049 - 17 Feb 2022
Cited by 1 | Viewed by 5965
Abstract
This paper presents a novel approach to Thrust Vector Control (TVC) for small Unmanned Aerial Vehicles (UAVs). The difficulties associated with conventional feed-forward TVC are outlined, and a practical solution to overcome these challenges is derived. The solution relies on observing boom deformations that are created by different thrust vector directions and high-velocity air inflow. The paper describes the required measurement electronics as well as the implementation of a dedicated testbed that allows the evaluation of mid-flight force measurements. Wind-tunnel tests show that the presented method for active thrust vector determination is able to quantify the disturbances due to the incoming air flow. Full article
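One common way to turn boom-deformation measurements into a thrust-vector estimate is a linear least-squares calibration from gauge readings to force components. The sketch below illustrates that generic idea with invented calibration data; it is not the authors' measurement pipeline.

```python
import numpy as np

# Hypothetical calibration: apply known (horizontal, vertical) test
# forces to the boom and record the resulting readings of three gauges.
known_forces = np.array([[1.0, 0.0],
                         [0.0, 1.0],
                         [1.0, 1.0],
                         [2.0, 0.5]])             # N
gauge_readings = np.array([[0.11, 0.02, -0.05],
                           [0.01, 0.09,  0.12],
                           [0.12, 0.11,  0.07],
                           [0.23, 0.08, -0.04]])  # invented strain values

# Assuming a linear gauge response, fit a 3x2 calibration matrix M by
# ordinary least squares so that gauge_readings @ M ~= known_forces.
M, *_ = np.linalg.lstsq(gauge_readings, known_forces, rcond=None)

# In flight, an observed strain vector then maps directly to an estimate
# of the thrust vector in the boom frame.
strain_now = np.array([0.10, 0.05, 0.01])
print("estimated (horizontal, vertical) force:", strain_now @ M)
```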
(This article belongs to the Special Issue Conceptual Design, Modeling, and Control Strategies of Drones)
Figures:
Figure 1: Test rig concept with strain gauge placement.
Figure 2: Complete arm.
Figure 3: Closeup of the gauges.
Figure 4: Assembled nacelle.
Figure 5: Exploded view of the nacelle.
Figure 6: Schematic of the strain gauge measurement setup.
Figure 7: Setup of the evaluation tests: the red arrows indicate the positive direction of the horizontal and vertical axes of the stationary coordinate frame.
Figure 8: Evaluation test data at different inflow speeds.
Figure 9: Strain distribution in the 45° thrust scenario with strain gauge placement.
Figure 10: Backward tilting due to incoming flow.
18 pages, 14743 KiB  
Article
High-Density UAV-LiDAR in an Integrated Crop-Livestock-Forest System: Sampling Forest Inventory or Forest Inventory Based on Individual Tree Detection (ITD)
by Ana Paula Dalla Corte, Ernandes M. da Cunha Neto, Franciel Eduardo Rex, Deivison Souza, Alexandre Behling, Midhun Mohan, Mateus Niroh Inoue Sanquetta, Carlos Alberto Silva, Carine Klauberg, Carlos Roberto Sanquetta, Hudson Franklin Pessoa Veras, Danilo Roberti Alves de Almeida, Gabriel Prata, Angelica Maria Almeyda Zambrano, Jonathan William Trautenmüller, Anibal de Moraes, Mauro Alessandro Karasinski and Eben North Broadbent
Drones 2022, 6(2), 48; https://doi.org/10.3390/drones6020048 - 16 Feb 2022
Cited by 18 | Viewed by 6136
Abstract
Lidar point clouds have been used frequently in forest inventories, and higher point densities provide a better representation of trees in forest plantations. We developed a new approach for the integrated crop-livestock-forest system, the sampling forest inventory, which applies the principles of individual tree detection under different plot arrangements. We used a UAV-lidar system (GatorEye) to scan an integrated crop-livestock-forest system with Eucalyptus benthamii seed forest plantations. On the high-density UAV-lidar point cloud (>1400 pts m−2), we compared two forest inventory approaches: Sampling Forest Inventory (SFI), with circular (1380 m² and 2300 m²) and linear (15 trees and 25 trees) plots, and Individual Tree Detection (ITD). The parametric population values came from the approach with measurements taken in the field, called forest inventory (FI). Basal area and volume estimates were performed considering the field heights and the heights measured in the LiDAR point clouds. We compared the variables number of trees, basal area, and volume per hectare. The variables by scenario were submitted to analysis of variance to verify whether the averages are different or equivalent. The RMSE (%) was calculated to explain the deviation between the measured (field) and estimated (LiDAR) volume values of these variables. Additionally, we calculated rRMSE, standard error, AIC, R², bias, and residual charts. The basal area values ranged from 7.40 m² ha−1 (C1380) to 8.14 m² ha−1 (C2300), about 5.9% less than the real value (8.65 m² ha−1). The C2300 scenario was the only one whose confidence interval (CI) limits included the real basal area. For the total stand volume, the ITD scenario was the one that presented the closest values (689.29 m³) to the real total value (683.88 m³), with the real value positioned in the CI. Our findings indicate that for the stand conditions under study, the SFI approach (C2300), which considers an area of 2300 m², is adequate to generate estimates at the same level as the ITD approach. Thus, our study should be able to assist in the selection of an optimal plot size to generate estimates with minimized errors and a gain in processing time. Full article
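The RMSE(%), bias, and related deviation measures reported above are straightforward to compute once field-measured and LiDAR-estimated values are paired per plot. A minimal sketch with invented per-plot volumes:

```python
import numpy as np

def accuracy_metrics(field, estimated):
    """RMSE, relative RMSE (% of the field mean), and bias between
    field-measured and LiDAR-estimated values."""
    field, estimated = np.asarray(field), np.asarray(estimated)
    resid = estimated - field
    rmse = np.sqrt(np.mean(resid**2))
    rrmse = 100.0 * rmse / field.mean()
    bias = resid.mean()
    return rmse, rrmse, bias

# Invented per-plot volumes (m3 ha-1): field measurements vs estimates.
field = [210.0, 195.5, 230.2, 205.8]
lidar = [205.1, 199.0, 224.7, 210.3]
rmse, rrmse, bias = accuracy_metrics(field, lidar)
print(f"RMSE={rmse:.2f}  rRMSE={rrmse:.1f}%  bias={bias:.2f}")
```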
(This article belongs to the Special Issue Feature Papers for Drones in Ecology Section)
Figures:
Figure 1: Study area location and stand characterization in the State of Paraná, southern Brazil.
Figure 2: Methodological approaches tested and illustrative diagram of UAV-LiDAR point cloud and ITD processing. (a) UAV-LiDAR point cloud pre-processing; (b) generation of the Digital Terrain Model (DTM) and Digital Surface Model (DSM); (c) generation of the Canopy Height Model (CHM); (d) individual tree detection (ITD); (e) LiDAR-derived tree height (hLiDAR) using ITD data; (f) extraction of hLiDAR; (g) clipping of sample plots.
Figure 3: Point cloud processing diagram for the different approaches and analyses performed.
Figure 4: Box plots for the variables diameter (cm), transverse area (m²), total height (m), and volume (m³) for the stand.
Figure 5: Box plots for the individual tree volumes in different scenarios, where black points represent outlier values, red points represent average values, and pink points represent the observations in the scenarios. IF = forest inventory based on a complete census; ITD = individual tree detection; C2300 = circular plots with areas of 2300 m²; L25 = linear plots with a fixed number of 25 trees; C1380 = circular plots with areas of 1380 m²; L15 = linear plots with a fixed number of 15 trees.
Figure 6: Dispersion counts between the observed values (IF) and the tested scenarios for transverse area and individual tree volume, where the counts map the number of cases to the hexagon fill.
Figure 7: Differences between each scenario evaluated compared to the field data (FI).
Figure 8: Means and confidence intervals for the variables basal area, volume per hectare, and total volume. The red line represents the parametric values (based on a complete census).
14 pages, 18047 KiB  
Article
Water Hyacinth (Eichhornia crassipes) Detection Using Coarse and High Resolution Multispectral Data
by Luís Pádua, Ana M. Antão-Geraldes, Joaquim J. Sousa, Manuel Ângelo Rodrigues, Verónica Oliveira, Daniela Santos, Maria Filomena P. Miguens and João Paulo Castro
Drones 2022, 6(2), 47; https://doi.org/10.3390/drones6020047 - 15 Feb 2022
Cited by 28 | Viewed by 5657
Abstract
Efficient detection and monitoring procedures for invasive plant species are required. It is of crucial importance to deal with such plants in aquatic ecosystems, since they can affect biodiversity and, ultimately, ecosystem function and services. In this study, we aim to detect water hyacinth (Eichhornia crassipes) using multispectral data at different spatial resolutions. For this purpose, high-resolution data (<0.1 m) acquired from an unmanned aerial vehicle (UAV) and coarse-resolution data (10 m) from Sentinel-2 MSI were used. Three areas with a high incidence of water hyacinth located in the Lower Mondego region (Portugal) were surveyed. Different classifiers were used to perform a pixel-based detection of this invasive species in both datasets. Among the classifiers, the random forest classifier stood out with the best results (overall accuracy (OA): 0.94). On the other hand, support vector machine performed worst (OA: 0.87), followed by Gaussian naive Bayes (OA: 0.88), k-nearest neighbours (OA: 0.90), and artificial neural networks (OA: 0.91). The higher spatial resolution of the UAV-based data enabled us to detect small amounts of water hyacinth that could not be detected in Sentinel-2 data. However, and despite the coarser resolution, satellite data analysis enabled us to identify water hyacinth coverage that compared well with the UAV-based survey. Combining both datasets, and even considering the different resolutions, it was possible to observe the temporal and spatial evolution of water hyacinth. This approach proved to be an effective way to assess the effects of the mitigation/control measures taken in the study areas. Thus, it can be applied to detect invasive species in aquatic environments and to monitor their changes over time. Full article
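As a rough sketch of the pixel-based detection step, a random forest can be trained on per-pixel band reflectances and scored for overall accuracy. The data below are synthetic stand-ins for the labelled pixels, not the study's samples.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for labelled pixels: five band reflectances per
# pixel (as from a five-band multispectral sensor), label 1 = hyacinth.
rng = np.random.default_rng(7)
water = rng.normal(0.05, 0.02, (500, 5))
hyacinth = rng.normal(0.35, 0.05, (500, 5))
X = np.vstack([water, hyacinth])
y = np.repeat([0, 1], 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=7)
clf = RandomForestClassifier(n_estimators=200, random_state=7).fit(X_tr, y_tr)
print(f"overall accuracy: {clf.score(X_te, y_te):.2f}")
```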
(This article belongs to the Special Issue Feature Papers for Drones in Ecology Section)
Figures:
Figure 1: Overview of the study area: (a) its location within mainland Portugal; (b) overview of the Lower Mondego main water canals (in blue); (c) location of the three analysed areas, within the red polygon highlighted in (b); and (d,e) photographs of study areas 1 and 2, respectively.
Figure 2: Red, green, blue (RGB) composite representation of the orthorectified multispectral data acquired by an unmanned aerial vehicle in the three study areas.
Figure 3: Average reflectance of water hyacinth samples in the three surveyed areas for the five bands acquired by the MicaSense RedEdge-MX sensor on the unmanned aerial vehicle.
Figure 4: Classification of water hyacinth in the three study areas using multispectral data from an unmanned aerial vehicle (a) and from Sentinel-2 (b).
Figure 5: Water hyacinth coverage in the water channels of the surveyed areas on three consecutive dates. Red pixels mark the potential presence of water hyacinth.
26 pages, 11764 KiB  
Article
DRONET: Multi-Tasking Framework for Real-Time Industrial Facility Aerial Surveillance and Safety
by Simeon Okechukwu Ajakwe, Vivian Ukamaka Ihekoronye, Dong-Seong Kim and Jae Min Lee
Drones 2022, 6(2), 46; https://doi.org/10.3390/drones6020046 - 15 Feb 2022
Cited by 31 | Viewed by 8410
Abstract
The security of key and critical infrastructures is crucial for the uninterrupted industrial process flow needed in strategic management, as these facilities are major targets of invaders. The emergence of non-military use of drones, especially for logistics, comes with the challenge of redefining the anti-drone approach for determining a drone's harmful status in the airspace, based on certain metrics, before countering it. In this work, a vision-based multi-tasking anti-drone framework is proposed that detects drones, identifies airborne objects, determines their harmful status through perceived threat analysis, and checks their proximity in real time prior to taking an action. The model is validated using 5460 manually generated drone samples from six (6) drone models under sunny, cloudy, and evening scenarios, and 1709 airborne object samples of seven (7) classes under different environments, scenarios (blur, scales, low illumination), and heights. The proposed model was compared with seven (7) other object detection models in terms of accuracy, sensitivity, F1-score, latency, throughput, reliability, and efficiency. The simulation results reveal that, overall, the proposed model achieved a superior multi-drone detection accuracy of 99.6%, an attached-object identification sensitivity of 99.80%, and an F1-score of 99.69%, with minimal error, low latency, and the low computational complexity needed for effective industrial facility aerial surveillance. A benchmark dataset is also provided for subsequent performance evaluation of other object detection models. Full article
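The accuracy, sensitivity, and F1-score figures quoted above follow from the standard detection metrics. A minimal illustration with invented confusion counts:

```python
def detection_metrics(tp: int, fp: int, fn: int):
    """Precision, recall (sensitivity), and F1-score for a detector."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Invented confusion counts for a drone-detection run.
p, r, f1 = detection_metrics(tp=498, fp=2, fn=1)
print(f"precision={p:.4f}  recall={r:.4f}  F1={f1:.4f}")
```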
(This article belongs to the Section Drone Design and Development)
Figures:
Figure 1: Components of an anti-drone system (adapted from [5]).
Figure 2: Vision-based drone detection and aerial communication system illustrating airborne object detection around an industrial facility.
Figure 3: Proposed aerial drone detection design for industrial facility inspection, highlighting the flow of the input, process, and output components.
Figure 4: Compact drone detector model showing backbone, neck, and head as its components, networks, and convolutions.
Figure 5: Path Aggregation Network showing the underlying components and process: FPN backbone, bottom-up path augmentation, adaptive feature pooling, box branch, and fully connected fusion.
Figure 6: Area mapping for adaptive drone neutralization depicting the detection measurement logic, with point A as the position of a mounted electro-optical camera, point B as the airspace boundary, and points C, D, and E representing the detected drones in the airspace.
Figure 7: Drone detection dataset distribution for the models' training, showing various drone models and climatic scenarios.
Figure 8: Payload recognition dataset distribution showing the different drone models and attached objects under different climatic conditions and altitudes.
Figure 9: Precision graph of the proposed model across different batch sizes.
Figure 10: Recall graph of the proposed model across different batch sizes.
Figure 11: Confusion matrix graphs showing drone classification by DRONET under different climatic conditions with different hyper-parameters: (a) drone classification at a batch size of 8; (b) drone classification at a batch size of 16.
Figure 12: Confusion matrix graphs showing drone classification by DRONET under different climatic conditions with different hyper-parameters: (a) drone classification at a batch size of 32; (b) drone classification at a batch size of 64.
Figure 13: Confusion matrix graph showing drone classification by DRONET under different climatic conditions with different hyper-parameters: drone classification at a batch size of 128.
Figure 14: DRONET vs. variants of the YOLO model.
Figure 15: Performance evaluation of DRONET vs. other models showing precision, recall, and F1-score values.
Figure 16: Loss graph results of DRONET vs. other models.
Figure 17: Reliability graph of DRONET across different batch sizes showing minimal errors in prediction.
Figure 18: Efficiency graph of DRONET and other object detection models.
15 pages, 659 KiB  
Article
Prioritized User Association for Sum-Rate Maximization in UAV-Assisted Emergency Communication: A Reinforcement Learning Approach
by Abdul Basit Siddiqui, Iraj Aqeel, Ahmed Alkhayyat, Umer Javed and Zeeshan Kaleem
Drones 2022, 6(2), 45; https://doi.org/10.3390/drones6020045 - 15 Feb 2022
Cited by 18 | Viewed by 4288
Abstract
Unmanned air vehicles (UAVs) used as aerial base stations (ABSs) can provide communication services in areas where the cellular network is not functional due to a calamity. ABSs provide high coverage and high data rates to users because of the advantage of a high altitude. ABSs can be static or mobile; they can adjust their position according to the real-time location of ground users and maintain a good line-of-sight link with them. In this paper, a reinforcement learning framework is proposed to maximize the number of served users by optimizing the ABS 3D location and power. We also design a reward function that prioritizes the emergency users when establishing connections with the ABS using Q-learning. Simulation results reveal that the proposed scheme clearly outperforms the baseline schemes. Full article
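A tabular Q-learning loop with a reward that weights emergency users more heavily captures the gist of the approach. The sketch below is illustrative only: the state/action discretization, the placeholder environment, and the weight of 5.0 are invented, not the paper's formulation.

```python
import numpy as np

rng = np.random.default_rng(3)
n_states, n_actions = 50, 6        # discretized ABS 3D position/power
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.1, 0.9, 0.1  # learning rate, discount, exploration

def reward(served_normal: int, served_emergency: int) -> float:
    # Weight emergency users more heavily so the ABS prioritizes them.
    return served_normal + 5.0 * served_emergency

def step(state: int, action: int):
    # Placeholder environment: in the paper's setting this would move
    # the ABS / adjust its power and count the users it can serve.
    next_state = (state + action) % n_states
    return next_state, reward(rng.integers(0, 10), rng.integers(0, 3))

state = 0
for _ in range(5000):
    action = (rng.integers(n_actions) if rng.random() < eps
              else int(Q[state].argmax()))
    next_state, r = step(state, action)
    # Standard Q-learning temporal-difference update.
    Q[state, action] += alpha * (r + gamma * Q[next_state].max()
                                 - Q[state, action])
    state = next_state
print(f"best value at state 0: {Q[0].max():.1f}")
```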
Figures:
Figure 1: Deployment of multiple UAVs in wireless communications.
Figure 2: RL categories: model-free and model-based approaches.
Figure 3: System model.
Figure 4: Optimized 3D deployment of ABSs and users using the proposed scheme.
Figure 5: Comparison of sum-rate performance under the proposed scheme and the conventional SNR-based user association scheme.
Figure 6: Outage performance comparison.
39 pages, 4843 KiB  
Article
A Control Algorithm for Early Wildfire Detection Using Aerial Sensor Networks: Modeling and Simulation
by André M. Rocha, Pedro Casau and Rita Cunha
Drones 2022, 6(2), 44; https://doi.org/10.3390/drones6020044 - 11 Feb 2022
Cited by 5 | Viewed by 3437
Abstract
This work presents an algorithm for an Aerial Sensor Network (ASN) composed of fixed-wing Unmanned Aerial Vehicles (UAVs) that performs surveillance and detects the early signs of a wildfire in a given territory. The main goal is to cover a given area while prioritizing areas of higher fire hazard risk. The proposed algorithm is scalable to any number of aircraft and can use any kind of fire hazard risk map as long as it contains bounded and nonnegative values. Two different dynamical models associated with the movement of fixed-wing UAVs are proposed, tested, and compared through simulations. Lastly, we propose a workflow to size the ASN in order to maximize the probability of detection of wildfires for a particular risk profile. Full article
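Coverage controllers of this kind typically work against a spectral description of the target distribution. As a loose illustration of one ingredient, the sketch below projects an invented risk map onto a truncated cosine basis; it is not the paper's algorithm, and the map, domain, and K = 15 truncation are example choices only.

```python
import numpy as np

# Hypothetical bounded, nonnegative fire-hazard risk map on a unit square.
N, K = 64, 15
x = np.linspace(0.0, 1.0, N)
X, Y = np.meshgrid(x, x, indexing="ij")
risk = np.exp(-((X - 0.3) ** 2 + (Y - 0.7) ** 2) / 0.05)  # invented hotspot

# Coefficients of the risk map in a truncated cosine basis: the kind of
# spectral description a coverage controller can steer the agents'
# time-averaged positions against.
coeffs = np.zeros((K, K))
for k1 in range(K):
    for k2 in range(K):
        basis = np.cos(np.pi * k1 * X) * np.cos(np.pi * k2 * Y)
        coeffs[k1, k2] = np.mean(risk * basis) / np.sqrt(np.mean(basis ** 2))

print(f"largest coefficient magnitude: {np.abs(coeffs).max():.3f}")
```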
(This article belongs to the Special Issue Feature Papers for Drones in Ecology Section)
Figures:
Figure 1: Illustration of a Dubin's Vehicle trajectory and of the local coordinate frame.
Figure 2: Adapted vehicle's coordinate frame.
Figure 3: Base risk function for simulation.
Figure 4: (a) The Fourier Transform of the fire hazard risk distribution for K = 15 in a larger domain. (b) The trajectories obtained by running the simulation from Section 3.3 without countermeasures for the agents leaving the domain.
Figure 5: Risk function with an extended domain.
Figure 6: Fire hazard probability function and its Fourier approximation with K = 15. (a) Fire hazard probability. (b) Fourier Transform of the fire hazard probability.
Figure 7: Control scheme of a full aircraft.
Figure 8: Obtained trajectory for three UAVs (Unmanned Aerial Vehicles) using the Dubin's Vehicle model.
Figure 9: Control input sequence for the three UAVs in t = [500, 510] s.
Figure 10: Metric for uniform coverage for a one-hour simulation. (a) Metric with both axes in log scale. (b) Metric with the y axis in log scale.
Figure 11: Visualization of the desired and obtained probability density functions. (a) Heatmap of the desired probability density function. (b) Heatmap of the obtained probability density function.
Figure 12: Effect of the number of harmonics on computational time in the setup stage and in the simulation. (a) Computational time in the setup stage for different values of K. (b) Computational time in the simulation for different values of K.
Figure 13: Effect of the linear speed and heading turn rate on the Dubin model. (a) Effect of the linear speed. (b) Effect of the heading turn rate.
Figure 14: Effect of the number of UAVs on the metric for uniform coverage.
Figure 15: Trajectory and control input obtained for the regularized control input with γ = 10^14. (a) Trajectory. (b) Regularized control input.
Figure 16: Metric for uniform coverage for a regularized input with γ = 10^14.
Figure 17: Trajectory and control input obtained for the regularized control input with γ = 10^15. (a) Trajectory. (b) Regularized control input.
Figure 18: Metric for uniform coverage for a regularized input with γ = 10^15.
Figure 19: Obtained trajectory for three UAVs using the Adapted Dubin's Path model.
Figure 20: Control input for one UAV in t = [510, 520] s.
Figure 21: Metric for uniform coverage for a one-hour simulation with the Adapted Dubin's Vehicle model. (a) Metric with both axes in log scale. (b) Metric with the y axis in log scale.
Figure 22: Visualization of the desired and obtained probability density functions for the Adapted Dubin's Vehicle model. (a) Heatmap of the desired probability density function. (b) Heatmap of the obtained probability density function.
Figure 23: Effect of the tracked point's position on the trajectory of the Adapted Dubin's Path model. (a) Trajectory for d_1 = 2 m. (b) Trajectory for d_1 = 0.5 m.
Figure 24: Effect of the tracked point's position on the metric for uniform coverage of the Adapted Dubin's Path model. (a) Metric for d_1 = 2 m. (b) Metric for d_1 = 0.5 m.
Figure 25: Comparison between the metrics for uniform coverage of the Dubin's Vehicle and the Adapted Dubin's Vehicle.
Figure 26: Planck's Law curves for ideal blackbody radiation at high temperatures between 300 K and 1200 K. Smaller wavelengths produce higher peaks.
Figure 27: Probability of the sensor detecting an ignition at a distance of R meters from the sensor.
Figure 28: Probability of detecting an ignition as a function of the altitude.
Figure 29: Probability of detecting an ignition as a function of the number of agents.
Figure 30: Percentage increase over the individual probability value with the number of UAVs, for a variety of base probabilities.
Figure 31: Probability of detecting an ignition as a function of the domain size.
26 pages, 12018 KiB  
Article
Drone Control in AR: An Intuitive System for Single-Handed Gesture Control, Drone Tracking, and Contextualized Camera Feed Visualization in Augmented Reality
by Konstantinos Konstantoudakis, Kyriaki Christaki, Dimitrios Tsiakmakis, Dimitrios Sainidis, Georgios Albanis, Anastasios Dimou and Petros Daras
Drones 2022, 6(2), 43; https://doi.org/10.3390/drones6020043 - 10 Feb 2022
Cited by 15 | Viewed by 13336
Abstract
Traditional drone handheld remote controllers, although well-established and widely used, are not a particularly intuitive control method. At the same time, drone pilots normally watch the drone video feed on a smartphone or another small screen attached to the remote. This forces them to constantly shift their visual focus from the drone to the screen and vice-versa. This can be an eye-and-mind-tiring and stressful experience, as the eyes constantly change focus and the mind struggles to merge two different points of view. This paper presents a solution based on Microsoft’s HoloLens 2 headset that leverages augmented reality and gesture recognition to make drone piloting easier, more comfortable, and more intuitive. It describes a system for single-handed gesture control that can achieve all maneuvers possible with a traditional remote, including complex motions; a method for tracking a real drone in AR to improve flying beyond line of sight or at distances where the physical drone is hard to see; and the option to display the drone’s live video feed in AR, either in first-person-view mode or in context with the environment. Full article
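A toy version of the palm-angle interpretation sketched in Figure 7 maps the tilt of a palm-normal vector, relative to an upright reference pose, to bounded pitch and roll commands. The dead zone, scaling, and axis conventions below are invented for the example and are not the paper's implementation.

```python
import numpy as np

def palm_to_commands(normal, dead_zone_deg=10.0, max_tilt_deg=40.0):
    """Map a palm-normal vector (reference pose = straight up, [0, 0, 1])
    to pitch/roll commands in [-1, 1]. Tilts inside the dead zone are
    ignored so that a resting hand sends no command."""
    n = np.asarray(normal, dtype=float)
    n /= np.linalg.norm(n)
    # Tilt of the palm about each horizontal axis, in degrees.
    pitch_deg = np.degrees(np.arctan2(n[0], n[2]))  # forward/backward
    roll_deg = np.degrees(np.arctan2(n[1], n[2]))   # left/right

    def scale(angle):
        if abs(angle) < dead_zone_deg:
            return 0.0
        a = np.clip(angle, -max_tilt_deg, max_tilt_deg)
        return (a - np.sign(a) * dead_zone_deg) / (max_tilt_deg - dead_zone_deg)

    return scale(pitch_deg), scale(roll_deg)

# A palm tilted ~20 degrees forward commands a gentle forward pitch.
print(palm_to_commands([np.sin(np.radians(20)), 0.0, np.cos(np.radians(20))]))
```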
(This article belongs to the Special Issue Feature Papers of Drones)
Figures:
Figure 1: System architecture and data flow between components.
Figure 2: First Person View (FPV) into the Augmented Reality environment. Real and virtual drones are shown in the lower right corner.
Figure 3: The AR Egocentric View mode. The AR user can be shown in collinearity between the user and the drone's camera in the right corner.
Figure 4: Hand-tracking as visualized in our AR application. Cubes represent hand joints. (The displacement of the joints is due to the location of the camera capturing the screenshots; the user of the app sees the virtual joints on top of his/her actual joints.)
Figure 5: The Android UAV interface app. Left: the app's configurator page, with settings regarding connectivity, gesture command sensitivity, and infrared view display. Right (rotated): the app's main view, showing a live view of the drone's camera feed.
Figure 6: The palm gesture vocabulary used for drone navigation.
Figure 7: Palm vectors and angles used to interpret gesture commands. Interpretation is always relative to the reference position, whose normal vectors form the axes x, y, and z. The current position's normals are translated to the y-axis and analyzed into components. Pitch, roll, yaw, and throttle values are then computed according to the angles and distances between these component vectors and the reference axes.
Figure 8: Axis translation and rotation via the two-step manual calibration process. Left: with the first tap on the drone, the translation offset is captured by the HoloLens. Right: after moving the drone a small distance, the second tap captures the drone's position in both coordinate systems (HoloLens and drone), allowing the calculation of the rotation angle φ.
Figure 9: Diagrams of different streaming methods.
Figure 10: Overview of the gesture selection user study results.
Figure 11: Drone response time measurements according to broker location.
Figure 12: IMU measurements vs. ground truth for various pitch values, including linear trendlines and a no-error line for reference. Note how lower pitch values result in greater errors, and how the angle between the trendlines and the no-error line increases for slower speeds.
Figure 13: IMU measurement compensation. Left: MSE of the error without compensation (solid lines) and with compensation (dashed lines) for different pitch values; note that the vertical axis is in logarithmic scale. Right: comparison of average MSE values with and without compensation for different pitch values. Compensation has a drastic impact on slower speeds, where the error is greatest.
Figure 14: Present and future components of the AR part of the solution. Solid lines indicate completed modules, dashed lines work in progress, and dotted lines future plans.
11 pages, 4257 KiB  
Communication
Drone Technology for Monitoring Protected Areas in Remote and Fragile Environments
by Barbara Bollard, Ashray Doshi, Neil Gilbert, Ceisha Poirot and Len Gillman
Drones 2022, 6(2), 42; https://doi.org/10.3390/drones6020042 - 9 Feb 2022
Cited by 26 | Viewed by 8322
Abstract
Protected Areas are established to protect significant ecosystems and historical artefacts. However, many are subject to little structured monitoring to assess whether the attributes for which they have been protected are being maintained or degraded. Ground surveys of sensitive areas risk damaging the values for which the areas are protected, are usually based on limited sampling, and often convey insufficient detail for understanding ecosystem change. Therefore, there is a need for quick and accurate vegetation surveys that are low impact, cost effective, and repeatable with high precision. Here we use drone technology to map protected areas in Antarctica at ultra-high resolution and provide baseline data for future monitoring. Our methods can measure micro-scale changes, are less expensive than ground-based sampling, and can be applied to any protected area where fine-scale monitoring is desirable. Drone-based surveys should therefore become standard practice for protected areas in remote fragile environments. Full article
(This article belongs to the Special Issue Drones for Biodiversity Conservation)
Figure 1: Location of the three Antarctic Specially Protected Areas mapped by drone. (A) McMurdo Sound; (B) photo of Botany Bay ASPA 154 taken by Ashray Doshi; (C) photo of Canada Glacier ASPA 131 from the top of the ASPA, taken by Len Gillman; and (D) photo of Cape Evans ASPA 155 taken by Len Gillman.
Figure 2: (A) Crustose and foliose lichen on rocks in the Botany Bay ASPA; (B) bryophytes found on rock platforms in the Botany Bay ASPA; and (C) Granite House, designated as Historic Site and Monument (HSM) No. 67, in the Botany Bay ASPA.
Figure 3: Botany Bay ASPA 154 (management plan maps from the Antarctic Treaty Secretariat’s Antarctic Protected Area database, https://documents.ats.aq/recatt/att652_e.pdf, accessed on 2 February 2022): (A) drone survey maps of total vegetation density (% within 1 m² cells); (B) bryophyte density (% within 1 m² cells); (C) drone photo of 4 m² showing an RGB image of bryophytes amongst rocks; (D) classified moss beds in green for the same area as in (C).
Figure 4: Canada Glacier ASPA 131 management map showing vegetation classification from the drone surveys (management plan and maps taken from the Antarctic Treaty Secretariat’s Antarctic Protected Area database, https://www.ats.aq/documents/recatt%5Catt683_e.pdf, accessed on 2 February 2022).
Figure 5: (A) RGB orthomosaic of Terra Nova Hut (Scott’s hut) at Cape Evans, Ross Island, Antarctica, based on drone imagery; (B) close-up of the hut showing artefacts surrounding the building, in addition to footprints on the snow all around the buildings; (C) close-up of the chain that was used to tie the sled dogs; and (D) close-up of the cache of mutton bones that was exposed during the summer season of 2015/2016.
26 pages, 2429 KiB  
Review
A Review on Software-Based and Hardware-Based Authentication Mechanisms for the Internet of Drones
by Emmanouel T. Michailidis and Demosthenes Vouyioukas
Drones 2022, 6(2), 41; https://doi.org/10.3390/drones6020041 - 8 Feb 2022
Cited by 33 | Viewed by 7935
Abstract
During the last few years, a wide variety of Internet of Drones (IoD) applications have emerged with numerous heterogeneous aerial and ground network elements interconnected and equipped with advanced sensors, computation resources, and communication units. The evolution of IoD networks presupposes the mitigation of several security and privacy threats. Thus, robust authentication protocols should be implemented in order to attain secure operation within the IoD. However, owing to the inherent features of the IoD and the limitations of Unmanned Aerial Vehicles (UAVs) in terms of energy, computational, and memory resources, designing efficient and lightweight authentication solutions is a non-trivial and complicated process. Recently, the development of authentication mechanisms for the IoD has received unprecedented attention. In this paper, up-to-date research studies on authentication mechanisms for IoD networks are presented. To this end, the adoption of conventional technologies and methods, such as the widely used hash functions, Public Key Infrastructure (PKI), and Elliptic-Curve Cryptography (ECC), is discussed along with emerging technologies, including Mobile Edge Computing (MEC), Machine Learning (ML), and Blockchain. Additionally, this paper provides a review of effective hardware-based solutions for the identification and authentication of network nodes within the IoD that are based on Trusted Platform Modules (TPMs), Hardware Security Modules (HSMs), and Physically Unclonable Functions (PUFs). Finally, future directions in these relevant research topics are given, stimulating further work.
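As a concrete reference point for the hash-based schemes such reviews cover, the sketch below shows the textbook HMAC challenge-response pattern between a verifier and a drone; it is a generic illustration, not a protocol proposed in the paper:

```python
import hmac, hashlib, secrets

shared_key = secrets.token_bytes(32)   # pre-shared between drone and ground station

def challenge() -> bytes:
    return secrets.token_bytes(16)     # fresh nonce issued by the verifier

def respond(key: bytes, nonce: bytes, drone_id: bytes) -> bytes:
    # The drone proves knowledge of the key without revealing it.
    return hmac.new(key, drone_id + nonce, hashlib.sha256).digest()

def verify(key: bytes, nonce: bytes, drone_id: bytes, tag: bytes) -> bool:
    expected = hmac.new(key, drone_id + nonce, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)   # constant-time comparison

nonce = challenge()
tag = respond(shared_key, nonce, b"UAV-01")
assert verify(shared_key, nonce, b"UAV-01", tag)
```

PUF-based variants of this pattern replace the stored key with a response derived from the device’s physical fingerprint, which avoids keeping long-term secrets in drone memory.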
Figure 1: Challenges and threats within the Internet of Drones (IoD).
Figure 2: Simple representation of an authentication process in an IoD network.
Figure 3: Key enablers for secure authentication within the IoD.
Figure 4: Classification of the recently proposed authentication schemes for the IoD.
24 pages, 1560 KiB  
Article
Robust Hierarchical Formation Control of Unmanned Aerial Vehicles via Neural-Based Observers
by Yang Fei, Yuan Sun and Peng Shi
Drones 2022, 6(2), 40; https://doi.org/10.3390/drones6020040 - 6 Feb 2022
Cited by 4 | Viewed by 3667
Abstract
Herein, we investigate the robust formation control problem for a group of unmanned aerial vehicles (UAVs) with system uncertainty. A hierarchical formation control strategy is introduced to ensure the uniform ultimate boundedness of each UAV’s reference tracking error. First, a group of saturated high-level virtual agents are defined to act as the trajectory planners that offer feasible position references to the actual UAVs. A sliding mode neural-based observer is then constructed to estimate the nonlinear uncertainty in the UAV model. Furthermore, sliding mode controllers are designed for both the position loop and the attitude loop of the UAV. To attenuate the chattering phenomenon in the control input, a saturated and smoothed differentiator is proposed along with an observation introduction function. The effectiveness of the proposed control scheme is validated by both the Lyapunov stability theory and numerical simulations based on a multiple-UAV system.
(This article belongs to the Special Issue Conceptual Design, Modeling, and Control Strategies of Drones)
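For orientation, the following single-axis sketch combines a sliding-mode tracking law with an online RBF-network estimate of the model uncertainty, using a smooth tanh() term in place of the discontinuous sign() to attenuate chattering. It is a toy double-integrator illustration under our own gains, not the paper's controller:

```python
import numpy as np

dt, T = 1e-3, 20.0
lam, k, gamma, eps = 2.0, 5.0, 20.0, 0.05
centers = np.linspace(-2.0, 2.0, 9)      # RBF centers spanning the state range
W = np.zeros_like(centers)               # adaptive RBF weights

def phi(x):                              # Gaussian RBF features
    return np.exp(-(x - centers) ** 2 / 0.5)

x, xdot = 0.0, 0.0
for i in range(int(T / dt)):
    t = i * dt
    xd, xd_dot, xd_ddot = np.sin(t), np.cos(t), -np.sin(t)   # reference trajectory
    e, edot = x - xd, xdot - xd_dot
    s = edot + lam * e                   # sliding variable
    f_hat = W @ phi(x)                   # network estimate of the uncertainty
    u = xd_ddot - lam * edot - f_hat - k * np.tanh(s / eps)
    W += dt * gamma * s * phi(x)         # adaptation law from the Lyapunov argument
    f = 0.5 * np.sin(2.0 * x) + 0.3      # "unknown" model uncertainty
    xdot += dt * (u + f)
    x += dt * xdot

print("final tracking error:", abs(x - np.sin(T)))
```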
Figure 1: Hierarchical control diagram.
Figure 2: Communication topology.
Figure 3: Norms of the high-level reference tracking errors.
Figure 4: Norms of the high-level sliding variables.
Figure 5: Effectiveness of the neural-based observer.
Figure 6: Effectiveness of the low-level controller.
Figure 7: Control inputs of TOC.
Figure 8: Control inputs of TSC.
Figure 9: Norm of the overall reference tracking error.
Figure 10: Illustration of the system formation and individual trajectories.
Figure 11: Trends of the angular states.
18 pages, 968 KiB  
Article
Drones in B5G/6G Networks as Flying Base Stations
by Georgios Amponis, Thomas Lagkas, Maria Zevgara, Georgios Katsikas, Thanos Xirofotos, Ioannis Moscholios and Panagiotis Sarigiannidis
Drones 2022, 6(2), 39; https://doi.org/10.3390/drones6020039 - 5 Feb 2022
Cited by 54 | Viewed by 10424
Abstract
Advances in the fields of networking, broadband communications and demand for high-fidelity low-latency last-mile communications have rendered as-efficient-as-possible relaying methods more necessary than ever. This paper investigates the possibility of the utilization of cellular-enabled drones as aerial base stations in next-generation cellular networks. Flying ad hoc networks (FANETs) acting as clusters of deployable relays for the on-demand extension of broadband connectivity constitute a promising scenario in the domain of next-generation high-availability communications. Matters of mobility, handover efficiency, energy availability, optimal positioning and node localization as well as respective multi-objective optimizations are discussed in detail, with their core ideas defining the structure of the work at hand. This paper examines improvements to the existing cellular network core to support novel use-cases and lower the operation costs of diverse ad hoc deployments.
(This article belongs to the Special Issue Feature Papers of Drones)
Figure 1: High-level structure of the presented work.
Figure 2: 3GPP-compliant 5G architecture.
Figure 3: Terrestrial network coverage enhancement: a drone-BS-supported firefighting scenario.
Figure 4: Drone-BS-assisted vehicular communications scenario.
Figure 5: Beamforming: a drone-BS-supported intercell interference mitigation scenario.
Figure 6: High-level overview of the virtual-physical infrastructure compartmentalization and key component interfaces.
29 pages, 2392 KiB  
Article
Propeller Position Effects over the Pressure and Friction Coefficients over the Wing of an UAV with Distributed Electric Propulsion: A Proper Orthogonal Decomposition Analysis
by José Ramón Serrano, Luis Miguel García-Cuevas, Pau Bares and Pau Varela
Drones 2022, 6(2), 38; https://doi.org/10.3390/drones6020038 - 29 Jan 2022
Cited by 10 | Viewed by 5349
Abstract
New propulsive architectures, with high interactions with the aerodynamic performance of the platform, are an attractive option for reducing the power consumption, increasing the resilience, reducing the noise and improving the handling of fixed-wing unmanned air vehicles. Distributed electric propulsion with boundary layer ingestion over the wing introduces extra complexity to the design of these systems, and extensive simulation and experimental campaigns are needed to fully understand the flow behaviour around the aircraft. This work studies the effect of different combinations of propeller positions and angles of attack over the pressure coefficient and skin friction coefficient distributions over the wing of a 25 kg fixed-wing remotely piloted aircraft. To get more information about the main trends, a proper orthogonal decomposition of the coefficient distributions is performed, which may be even used to interpolate the results to non-simulated combinations, giving more information than an interpolation of the main aerodynamic coefficients such as the lift, drag or pitching moment coefficients.
(This article belongs to the Special Issue Feature Papers of Drones)
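A minimal sketch of snapshot POD of the kind described in the abstract, assuming the simulated pressure-coefficient distributions are stacked column-wise (one column per angle-of-attack/propeller-position combination); the data below are placeholders:

```python
import numpy as np

# X: (n_chord_points, n_cases) matrix of C_p distributions, one column per
# simulated (angle of attack, propeller position) combination.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 30))        # placeholder snapshot matrix

X_mean = X.mean(axis=1, keepdims=True)
U, S, Vt = np.linalg.svd(X - X_mean, full_matrices=False)

energy = S**2 / np.sum(S**2)              # fraction of energy captured per mode (cf. Figure 7)
modes = U[:, :4]                          # first four spatial modes (cf. Figures 8 and 9)
coeffs = modes.T @ (X - X_mean)           # configuration coefficients A1..A4 per case

# Rank-k reconstruction of one case (cf. Figure 10):
k = 3
cp_rec = X_mean[:, 0] + U[:, :k] @ (S[:k] * Vt[:k, 0])
```

Interpolating the configuration coefficients over the (angle of attack, propeller position) grid then yields reconstructed distributions for non-simulated combinations, which is the surrogate-model use the abstract describes.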
Figure 1: Simulated aircraft, as shown in [24]. This sketch is for a configuration of 12 propellers.
Figure 2: Side-view sketch of the computational grid used for the current calculations.
Figure 3: Airfoil side view with principal heights (not to scale).
Figure 4: Maximum and minimum propeller heights above the trailing edge. (a) Section of wing simulated with a virtual disc for modelling the propeller. (b) 0% position. (c) 100% position.
Figure 5: Front section of the analysed airfoil section, marked in blue.
Figure 6: Pressure coefficient distribution at two angles of attack for a Reynolds number of 5 × 10⁵ and a propeller position of 75%.
Figure 7: TKE of the different C_p modes for both the suction side (in blue) and the pressure side (in red).
Figure 8: First four modes of the pressure coefficient on the suction side.
Figure 9: First four modes of the pressure coefficient on the pressure side.
Figure 10: Pressure coefficient over the airfoil for an angle of attack of 3° and a propeller position of 50%. The reconstructions with the first 3 and 9 eigenvectors are also included.
Figure 11: Configuration coefficients for the first two modes of the pressure coefficient over the suction side of the airfoil, as a function of the angle of attack and the relative propeller position. (a) Coefficient A1 of the first mode of C_p over the suction side. (b) Coefficient A2 of the second mode of C_p over the suction side.
Figure 12: Contribution of the first two modes in the case of the maximum relative height of the propeller over the trailing edge and the maximum simulated angle of attack.
Figure 13: Configuration coefficients for the first three modes of the pressure coefficient over the pressure side of the airfoil, as a function of the angle of attack and the relative propeller position. (a) Coefficient A1 of the first mode. (b) Coefficient A2 of the second mode. (c) Coefficient A3 of the third mode.
Figure 14: Friction coefficient over the airfoil with the propeller in the 50% position.
Figure 15: TKE of the friction coefficient.
Figure 16: Friction coefficient reconstruction over the airfoil, for an angle of attack of 3° and a propeller relative height of 50%. (a) Suction side. (b) Pressure side.
Figure 17: Configuration coefficients for the first two modes of the friction coefficient over the suction side of the airfoil, as a function of the angle of attack and the relative propeller position. (a) Coefficient A1 of the first mode. (b) Coefficient A2 of the second mode.
Figure 18: Configuration coefficients for the first two modes of the friction coefficient over the pressure side of the airfoil, as a function of the angle of attack and the relative propeller position. (a) Coefficient A1 of the first mode. (b) Coefficient A2 of the second mode.
Figure 19: Lift coefficient for each propeller position above the trailing edge and angle of attack.
Figure 20: Drag coefficient for each propeller position above the trailing edge and angle of attack.
Figure 21: Fraction of the lift coefficient computed using from 1 to 9 modes, using both the pressure coefficient and friction coefficient distributions, for angles of attack of (a) 1°, (b) 5°, and (c) 9°.
Figure 22: Fraction of the drag coefficient computed using from 1 to 9 modes, using both the pressure coefficient and friction coefficient distributions, for angles of attack of (a) 1°, (b) 5°, and (c) 9°.
Figure 23: Pressure coefficient reconstructed at a propeller position not used to fit the surrogate model, compared with data from a CFD simulation. (a) Angle of attack of 3° and propeller position of 30%. (b) Angle of attack of 3° and propeller position of 65%.
Figure 24: Pressure coefficient reconstructed at a propeller position and angle of attack not used to fit the surrogate model, compared with data from a CFD simulation. (a) Angle of attack of 5.5° and propeller position of 50%. (b) Angle of attack of 5.5° and propeller position of 65%.
Figure 25: Friction coefficient reconstructed at a propeller position not used to produce the surrogate model, compared with data from a CFD simulation. (a) Angle of attack of 3° and propeller position of 30%. (b) Angle of attack of 3° and propeller position of 65%.
Figure 26: Friction coefficient reconstructed at a propeller position and angle of attack not used to fit the surrogate model, compared with data from a CFD simulation. (a) Angle of attack of 5.5° and propeller position of 50%. (b) Angle of attack of 5.5° and propeller position of 65%.
6 pages, 165 KiB  
Editorial
Acknowledgment to Reviewers of Drones in 2021
by Drones Editorial Office
Drones 2022, 6(2), 37; https://doi.org/10.3390/drones6020037 - 29 Jan 2022
Viewed by 2191
Abstract
Rigorous peer reviews are the basis of high-quality academic publishing [...]
20 pages, 801 KiB  
Article
Real-Time Improvement of a Trajectory-Tracking Control Based on Super-Twisting Algorithm for a Quadrotor Aircraft
by Iván González-Hernández, Sergio Salazar, Rogelio Lozano and Oscar Ramírez-Ayala
Drones 2022, 6(2), 36; https://doi.org/10.3390/drones6020036 - 26 Jan 2022
Cited by 14 | Viewed by 5313
Abstract
This article addresses the development and experimental validation of a trajectory-tracking control for a miniature autonomous Quadrotor helicopter system (X4-prototype) in outdoor environments, using a robust control algorithm based on the second-order sliding mode technique, also known as the super-twisting algorithm. This nonlinear control strategy guarantees convergence in finite time to a desired path r(t) in the presence of external disturbances or model uncertainties affecting the appropriate behavior of the Quadrotor helicopter. For this purpose, a polynomial smooth curve trajectory is selected as the reference signal, where the corresponding derivatives of the function are bounded. Moreover, we consider disturbances due to wind gusts acting on the aerial vehicle, and the reference signal is pre-programmed in an advanced autopilot system. The proposed solution consists of implementing a real-time control law based on super-twisting control, using GPS measurements to obtain the position in the xy-plane and accomplish the desired trajectory. Simulation and experimental results of the trajectory-tracking control are presented to demonstrate the performance and robustness of the proposed nonlinear controller in windy conditions.
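Below is a sketch of the generic super-twisting law applied to a disturbed double integrator tracking a smooth reference; the gains, reference, and disturbance are illustrative, and the quadrotor-specific loops of the paper are omitted:

```python
import numpy as np

dt, T = 1e-3, 10.0
lam, k1, k2 = 1.0, 2.0, 2.0          # sliding-surface slope and super-twisting gains
x, xdot, v = 0.0, 0.0, 0.0           # v integrates the discontinuous term

for i in range(int(T / dt)):
    t = i * dt
    r = np.sin(0.5 * t)              # smooth reference r(t) with bounded derivatives
    r_dot = 0.5 * np.cos(0.5 * t)
    r_ddot = -0.25 * np.sin(0.5 * t)
    s = (xdot - r_dot) + lam * (x - r)                      # sliding variable
    u = r_ddot - lam * (xdot - r_dot) \
        - k1 * np.sqrt(abs(s)) * np.sign(s) + v             # super-twisting law
    v += dt * (-k2 * np.sign(s))
    d = 0.3 * np.sin(2.0 * t)        # bounded wind-gust-like disturbance
    xdot += dt * (u + d)
    x += dt * xdot

print("final tracking error:", abs(x - np.sin(0.5 * T)))
```

Because the discontinuous sign() term is integrated before reaching the plant, the applied control stays continuous, which is what attenuates chattering relative to first-order sliding mode.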
Figure 1: Quadrotor helicopter coordinate system scheme.
Figure 2: Cascading integrators scheme for the trajectory-tracking control on the y-axis.
Figure 3: PD trajectory-tracking control response on the x-axis. Tracking on this axis oscillates around the desired reference signal.
Figure 4: PD trajectory-tracking control response on the y-axis. Tracking on this axis oscillates around the desired reference signal.
Figure 5: Super-twisting trajectory-tracking control response on the x-axis. Tracking converges faster to the desired reference signal.
Figure 6: Super-twisting trajectory-tracking control response on the y-axis. Tracking converges faster to the desired reference signal.
Figure 7: PD trajectory-tracking error response on the x-axis.
Figure 8: PD trajectory-tracking error response on the y-axis.
Figure 9: Super-twisting trajectory-tracking error response on the x-axis.
Figure 10: Super-twisting trajectory-tracking error response on the y-axis.
Figure 11: X4-prototype setup with the embedded control system.
Figure 12: x-trajectory tracking response of the Quadrotor helicopter using the PD controller in an outdoor environment. The current position oscillates around the reference signal and does not converge adequately to it.
Figure 13: y-trajectory tracking response of the Quadrotor helicopter using the PD controller in an outdoor environment. The current position oscillates around the reference signal and does not converge adequately to it.
Figure 14: x-trajectory tracking behavior of the Quadrotor helicopter using the super-twisting algorithm in an outdoor environment. The current position converges correctly to the reference signal.
Figure 15: y-trajectory tracking behavior of the Quadrotor helicopter using the super-twisting algorithm in an outdoor environment. The current position converges correctly to the reference signal.
16 pages, 4792 KiB  
Article
Assessment of Android Network Positioning as an Alternative Source of Navigation for Drone Operations
by Dong-Kyeong Lee, Filip Nedelkov and Dennis M. Akos
Drones 2022, 6(2), 35; https://doi.org/10.3390/drones6020035 - 23 Jan 2022
Cited by 4 | Viewed by 4198
Abstract
Applications of drones have increased significantly in the past decade for both indoor and outdoor operations. In order to assist autonomous drone navigation, there are numerous sensors installed onboard the vehicles. These include Global Navigation Satellite Systems (GNSS) chipsets, inertial sensors, barometer, lidar, radar and vision sensors. The two sensors used most often by drone autopilot controllers for absolute positioning are the GNSS chipsets and barometer. Although, for most outdoor operations, these sensors provide accurate and reliable position information, their accuracy, availability, and integrity deteriorate for indoor applications and in the presence of radio frequency interference (RFI), such as GNSS spoofing and jamming. In such cases, network-based locations can instead be derived from Wi-Fi and cellular transmissions. Although there have been many theoretical studies on network positioning, few resources document the quantitative performance that can be expected from these positioning methodologies. In this paper, the authors investigate both the horizontal and vertical accuracy of the Android network location engines in rural, suburban, and urban environments. The paper determines the horizontal location accuracy to be approximately 1637 m, 38 m, and 32 m in terms of 68% circular error probable (CEP) for rural, suburban, and urban environments, respectively, and the vertical accuracy to be 1.2 m and 4.6 m in terms of 68% CEP for suburban and urban environments, respectively. In addition, the availability and latency of the location engines are explored. Furthermore, the paper assesses the accuracy of the Android network location accuracy indicator for various drone operation environments. The assessed accuracies of the network locations provide deeper insight into their potential for drone navigation.
(This article belongs to the Special Issue Advances in SLAM and Data Fusion for UAVs/Drones)
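For reference, a 68% CEP figure such as those quoted above can be computed as the 68th percentile of the per-fix horizontal error radii; a sketch with synthetic errors standing in for logged fixes:

```python
import numpy as np

rng = np.random.default_rng(2)
east_err = rng.normal(0, 20, 500)     # per-fix east error vs. surveyed truth (m)
north_err = rng.normal(0, 20, 500)    # per-fix north error (m)

horizontal_err = np.hypot(east_err, north_err)      # radial horizontal error per fix
cep68 = np.percentile(horizontal_err, 68)           # radius containing 68% of fixes
print(f"68% CEP: {cep68:.1f} m")
```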
Figure 1: Wi-Fi fingerprinting methodology. Access points are received by a user, and the corresponding identifiers and locations are saved to the database. When another device receives a set of identifiers and the corresponding powers, they are sent to a server to be converted into a location solution.
Figure 2: Illustration of localization using cell towers. If the locations of the cell towers are known and the distance to each cell tower can be estimated, a position solution can be computed.
Figure 3: The routes used for each of the test scenarios. (A) Erie, Colorado, with limited Wi-Fi and cell coverage; (B) downtown Boulder, Colorado, with extensive residential network coverage; (C) downtown Denver, Colorado, with extensive corporate and business network coverage; (D) a loop with large elevation changes.
Figure 4: Representation of the route in black, available NLP fixes as colored dots, and the number of access points in the respective colors. (A) is the rural scenario, (B) is the suburban drive, and (C) is the urban test in downtown Denver.
Figure 5: Cumulative distribution function plot of the horizontal errors observed for the three scenarios: rural, suburban, and urban. The urban scenario had the smallest horizontal errors, and the rural scenario had the largest.
Figure 6: Relationship between the number of access points and the observed horizontal error. When only cell positioning was used (rural), the horizontal error was the greatest. For the suburban and urban scenarios, where Wi-Fi positioning was used, more access points meant smaller horizontal errors, but the advantage plateaus once enough access points are available.
Figure 7: Accuracy of the altitude measurements from the NLP versus truth. For the altitude-variation and suburban scenarios, the altitude estimates match closely. For the urban scenario, the NLP altitude is sometimes positively skewed due to over-estimation.
Figure 8: Relationship between the number of Wi-Fi access points and the observed altitude error.
Figure 9: Observed network location error versus the Android-estimated error level. For horizontal accuracies, the Wi-Fi errors matched the Android estimates well, but the errors exceeded the estimates significantly for the urban scenario when only cell positioning was used. For vertical accuracies, the Wi-Fi errors were well bounded for the suburban scenario but were underestimated for the urban environment.
17 pages, 3326 KiB  
Article
DAGmap: Multi-Drone SLAM via a DAG-Based Distributed Ledger
by Seongjoon Park and Hwangnam Kim
Drones 2022, 6(2), 34; https://doi.org/10.3390/drones6020034 - 20 Jan 2022
Cited by 9 | Viewed by 4490
Abstract
Simultaneous localization and mapping (SLAM) by unmanned vehicles, such as drones, has great usability potential in versatile applications. When operating SLAM in multi-drone scenarios, collecting and sharing the map data and deriving converged maps are major issues (regarded as the bottleneck of the system). This paper presents a novel approach that utilizes the concepts of distributed ledger technology (DLT) to enable online map convergence for multiple drones without a centralized station. As DLT allows each agent to secure a collective database of valid transactions, DLT-powered SLAM can let each drone secure global 3D map data and utilize these data for navigation. However, block-based DLT—the so-called blockchain—may not fit multi-drone SLAM well, due to its restricted data structure, discrete consensus, and high power consumption. Thus, we designed a multi-drone SLAM system, named DAGmap, that constructs a DAG-based map database and sifts the noisy 3D points based on the DLT philosophy. Considering the differences between currency transactions and data construction, we designed a new strategy for data organization, validation, and a consensus framework under the philosophy of DAG-based DLT. We carried out a numerical analysis of the proposed system with an off-the-shelf camera and drones.
(This article belongs to the Special Issue Advances in SLAM and Data Fusion for UAVs/Drones)
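A toy sketch of the DAG-ledger bookkeeping the abstract implies: each feature transaction approves earlier ones, and a feature is treated as confirmed once enough later transactions reference it. The class fields and the confirmation threshold below are illustrative, not DAGmap's actual design:

```python
import hashlib
from dataclasses import dataclass

@dataclass
class FeatureTx:
    point: tuple       # observed 3D feature (x, y, z)
    parents: tuple     # hashes of the earlier transactions it validates
    approvals: int = 0 # cumulative weight gained from later transactions

ledger: dict[str, FeatureTx] = {}

def tx_hash(point, parents) -> str:
    return hashlib.sha256(repr((point, parents)).encode()).hexdigest()

def issue(point, parent_hashes):
    h = tx_hash(point, parent_hashes)
    ledger[h] = FeatureTx(point, parent_hashes)
    for p in parent_hashes:            # approving a parent raises its weight
        if p in ledger:
            ledger[p].approvals += 1
    return h

genesis = issue((0.0, 0.0, 0.0), ())
a = issue((1.0, 0.2, 0.5), (genesis,))
b = issue((1.1, 0.2, 0.5), (genesis, a))
confirmed = [t.point for t in ledger.values() if t.approvals >= 1]
```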
Figure 1: Conceptual overview of DAGmap.
Figure 2: Technical relationship between the components of DLT, SLAM, and DAGmap.
Figure 3: Feature transaction issuance, validation, and confirmation procedure.
Figure 4: Adjacency DAG composed of 3D features.
Figure 5: Validation and feature bundle transaction.
Figure 6: Finding the true feature via the confirmation process.
Figure 7: Map consensus algorithm.
Figure 8: Implementation scheme of DAGmap.
Figure 9: Search areas of the DAGmap agents.
Figure 10: Experimental results of DAGmap.
Figure 11: N_f traffic of the DAGmap network with respect to time.
34 pages, 9688 KiB  
Article
Distributed 3D Navigation of Swarms of Non-Holonomic UAVs for Coverage of Unsteady Environmental Boundaries
by Alexey S. Matveev and Anna A. Semakova
Drones 2022, 6(2), 33; https://doi.org/10.3390/drones6020033 - 20 Jan 2022
Cited by 4 | Viewed by 3164
Abstract
A team of non-holonomic, constant-speed, under-actuated unmanned aerial vehicles (UAVs) with lower-limited turning radii travel in 3D. The space hosts an unknown and unpredictably varying scalar environmental field. A space direction is given; this direction and the coordinate along it are conditionally termed the “vertical” and “altitude”, respectively. All UAVs should arrive at the moving and deforming isosurface where the field assumes a given value. They should also distribute themselves evenly over a pre-specified range of “altitudes” and repeatedly encircle the entirety of the isosurface while remaining on it, each at its own altitude. Every UAV measures only the field intensity at its current location and both the Euclidean and altitudinal distances to the objects (including the top and bottom of the altitudinal range) within a finite range of visibility, and has access to its own speed and the vertical direction. The UAVs carry no communication facilities, are anonymous to one another, and cannot play distinct roles in the team. A distributed control law is presented that solves this mission under minimal and partly inevitable assumptions. This law is justified by a mathematically rigorous global convergence result; computer simulation tests confirm its performance.
(This article belongs to the Special Issue Conceptual Design, Modeling, and Control Strategies of Drones)
Figure 1: Unmanned aerial vehicle.
Figure 2: Block diagram of the control system for the ith UAV.
Figure 3: (a) Classic example of a repelling surface S; (b,c) examples of surfaces that cause both attraction and repulsion.
Figure 4: Locating and sweep coverage of a slowly translating isosurface with singularities.
Figure 5: Locating and sweep coverage of a slowly rotating isosurface.
Figure 6: Sweep coverage of a moving and deforming isosurface with a dropout of team members.
Figure 7: Sweep coverage of a deforming isosurface with the admission of extra team members.
26 pages, 3718 KiB  
Article
Examining New Zealand Unmanned Aircraft Users’ Measures for Mitigating Operational Risks
by Isaac Levi Henderson
Drones 2022, 6(2), 32; https://doi.org/10.3390/drones6020032 - 20 Jan 2022
Cited by 9 | Viewed by 4783
Abstract
While the potential risks of unmanned aircraft have received significant attention, there is little in the academic literature that examines how operational risks are mitigated by users. This study examines the prevalence of key operational risk mitigations amongst a sample of 812 unmanned aircraft users in New Zealand, their confidence levels in identifying and complying with airspace requirements, and their ability to read visual navigation charts (VNCs) and use AirShare (a local tool that shows airspace requirements). Significant differences exist between the number and type of mitigations applied, users’ confidence levels in identifying and complying with airspace requirements, and users’ ability to read VNCs and use AirShare based upon user characteristics. Education, practical assessment, membership of a professional body, professional/semi-professional use, and operating for a certificated organisation all improve risk mitigation (greater number and variety of risk mitigations applied). The only risk mitigation employed by almost all users was conducting a pre-flight check of their aircraft, identifying the need for users to view risk mitigation more holistically. The findings support policy directions related to educational requirements, the ability for member-based organisations and professional bodies to self-regulate, and the fitness of the current regulatory system in New Zealand.
Figure 1: Example of a JSA [41].
Figure 2: Users’ typical pre-flight risk mitigations, along with the groups associated with being more or less likely to use each pre-flight risk mitigation.
Figure 3: Differences in the number of risk mitigations typically applied based upon user characteristics. Note: * and *** indicate statistical significance at the <0.05 and <0.0001 levels, respectively.
Figure 4: Use of air band radio in general and in specific scenarios.
Figure 5: Users’ confidence in being able to identify and comply with airspace requirements. Note: *, **, and *** indicate statistical significance at the <0.05, <0.01, and <0.0001 levels, respectively.
Figure 6: Users’ ability to read VNCs and use AirShare. Note: *, **, and *** indicate statistical significance at the <0.05, <0.01, and <0.0001 levels, respectively.
Figure A1: Excerpt from a VNC showing a low-fly zone (L163) that was presented to participants.
Figure A2: Excerpt from a VNC showing a danger area for explosive hazards that was presented to participants.
Figure A3: Excerpt from AirShare showing a military operating area that was presented to participants.
Figure A4: Excerpt from AirShare showing a control zone that was presented to participants.
16 pages, 8619 KiB  
Article
UAV Photogrammetry and GIS Interpretations of Extended Archaeological Contexts: The Case of Tacuil in the Calchaquí Area (Argentina)
by Carolina Orsini, Elisa Benozzi, Veronica Williams, Paolo Rossi and Francesco Mancini
Drones 2022, 6(2), 31; https://doi.org/10.3390/drones6020031 - 20 Jan 2022
Cited by 14 | Viewed by 4485
Abstract
The scope and scientific purpose of this paper focus on multiscale (aerial and terrestrial) photogrammetry as a support to investigations and interpretations at a multi-component archaeological site located in the Argentinian Cordillera (Calchaquí, Salta), known as Tacuil. Due to its scarce accessibility, as well as long-term problems associated with the interpretation of the visibility of this type of settlement, aerial surveying was combined with the reconstruction of structures and complex soil morphologies using modern photogrammetric approaches (3D models and orthophotos). This dataset was complemented by a terrestrial survey to obtain extremely high-resolution and detailed representations of archaeological features, which were integrated into a GIS database. The outcome of the photogrammetric surveying was fundamental in supporting the debate on the functionality of the site and its integration into a complex, socially constructed, ancient landscape. Finally, the present paper introduces the first complete map of Tacuil.
(This article belongs to the Special Issue (Re)Defining the Archaeological Use of UAVs)
Figure 1: (a) Location map of the study area; the rectangle encloses the extent investigated by UAV-based and terrestrial photogrammetry. (b) Top view of a hamlet of Tacuil; long shadows point out some remains and the presence of only sparse and low vegetation. (c,d) Panoramic views of the site, the two hilltops, the topography, and the surrounding vegetation.
Figure 2: (a) Map of the Tacuil archaeological area; dashed polygons state the location and names of the sites investigated within this study. (b) Panoramic view of Pukara 1. (c) Panoramic view of Pukara 2. (d) Focus on Pukara 1 archaeological remains (frame extracted from a panoramic UAV flight).
Figure 3: Acquired datasets: (a) UAV dataset (images and GCPs), with a representation of the errors on GCPs after the orientation of the acquired images; (b) ground dataset, with an example of the image acquisition scheme and target (GCP) positioning.
Figure 4: Workflow of data processing, divided into field operations, photogrammetric processing, and GIS analysis steps.
Figure 5: Sectors of the investigated areas as represented in the virtual model: (a) view of the textured 3D model obtained for the Tacuil area; (b,c) views of two structures in the lower hamlet documented through a ground photogrammetry approach.
Figure 6: High-resolution orthophoto (left) of the lower hamlet with interpretation of archaeological features. Digital elevation models of two sectors belonging to Pukara 1 (right, below) and Pukara 2 (right, above) with visible 3D human-made structures.
Figure 7: Visibility analysis performed over large extents with respect to Pukara 1 (left) and Pukara 2 (right), at distances of 8 km (yellow areas) and 15 km (orange areas), respectively.
Figure 8: Visibility analysis at the local scale, based on the high-resolution digital elevation models from the UAV photogrammetry. Red triangles labelled 1 to 4 locate the points from which the analysis is carried out. Areas of visibility from the observation points are represented as grey polygons with transparency.
Figure 9: General map of the site with the structures mentioned in the text.
15 pages, 14861 KiB  
Article
Accuracy Assessment of a UAV Direct Georeferencing Method and Impact of the Configuration of Ground Control Points
by Xiaoyu Liu, Xugang Lian, Wenfu Yang, Fan Wang, Yu Han and Yafei Zhang
Drones 2022, 6(2), 30; https://doi.org/10.3390/drones6020030 - 20 Jan 2022
Cited by 53 | Viewed by 7672
Abstract
Unmanned aerial vehicles (UAVs) can obtain high-resolution topography data flexibly and efficiently at low cost. However, the georeferencing process involves the use of ground control points (GCPs), which limits time and cost effectiveness. Direct georeferencing, using onboard positioning sensors, can significantly improve work efficiency. The purpose of this study was to evaluate the accuracy of the Global Navigation Satellite System (GNSS)-assisted UAV direct georeferencing method and the influence of the number and distribution of GCPs. A FEIMA D2000 UAV was used to collect data, and several photogrammetric projects were established. Among them, the number and distribution of GCPs used in the bundle adjustment (BA) process were varied. Two parameters were considered when evaluating the different projects: the ground-measured checkpoint (CP) root mean square error (RMSE) and the Multiscale Model to Model Cloud Comparison (M3C2) distance. The results show that the vertical and horizontal RMSE of the direct georeferencing were 0.087 and 0.041 m, respectively. As the number of GCPs increased, the RMSE gradually decreased until a specific GCP density was reached. GCPs should be uniformly distributed in the study area and include at least one GCP near the center of the domain. Additionally, as the distance to the nearest GCP increased, the local accuracy of the DSM decreased. In general, UAV direct georeferencing has an acceptable positional accuracy level.
(This article belongs to the Special Issue Unconventional Drone-Based Surveying)
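For reference, checkpoint RMSE values such as the 0.041 m (horizontal) and 0.087 m (vertical) figures above are computed as sketched below; the data here are placeholders standing in for surveyed checkpoints and their DSM/DOM counterparts:

```python
import numpy as np

rng = np.random.default_rng(3)
gnss_cp = rng.normal(0, 1, (120, 3))                  # surveyed checkpoint XYZ (m)
model_cp = gnss_cp + rng.normal(0, 0.05, (120, 3))    # same points read off the DSM/DOM

diff = model_cp - gnss_cp
rmse_h = np.sqrt(np.mean(diff[:, 0]**2 + diff[:, 1]**2))   # horizontal (XY) RMSE
rmse_v = np.sqrt(np.mean(diff[:, 2]**2))                   # vertical (Z) RMSE
print(f"horizontal RMSE: {rmse_h:.3f} m, vertical RMSE: {rmse_v:.3f} m")
```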
Figure 1: Workflow of the methods, starting with the route planning of a UAV campaign and ending with the determination of DSM and orthomosaic accuracy (RMSE and M3C2 distance).
Figure 2: Study area with the GCP and checkpoint distributions. Projection coordinate system: WGS_1984_UTM_zone_49N; (a) DOM and (b) route map.
Figure 3: Experimental settings. (a) FEIMA D2000 UAV. (b) Distribution of GCPs and CPs in the bundle adjustment. (c) Measurement of field control points. (d) Display of control points in the image. (e) The number of overlapping images used in each pixel calculation; red and yellow areas indicate low overlap, while green areas indicate more than 5 images per pixel.
Figure 4: Schemes with different distributions of GCPs. (a) Five distributions of one GCP; (b) two distributions of four GCPs. The distributions are shown in different colours.
Figure 5: Effect of GCP density on the RMSE of (a) the DSM vertical accuracy and (b) the DOM horizontal accuracy. GCP density was calculated by dividing the number of GCPs used by the measurement area of 0.5 km²; d represents the density of point clouds on the horizontal axis. Error bars show ±1 standard error at GCP densities of 2, 4, and 8, respectively.
Figure 6: Statistical analysis of the UAV survey based on 120 checkpoints. (a–c) Histograms of the elevation differences between the DSM and RTK checkpoints obtained with 0, 1, or 2 GCPs, and their Gaussian fits. (d) Linear fitting function between RTK checkpoint elevations and DSM elevations obtained with 0 GCPs.
Figure 7: M3C2 distance between the reference point cloud (16 GCPs) and point clouds obtained from different photogrammetric projects. (a) Average difference (accuracy). (b) Standard deviation (precision). Error bars show ±1 standard error at 1, 2, and 4 GCPs, respectively.
Figure 8: M3C2 distance between the point clouds obtained using different numbers of GCPs as control points and the reference point cloud (obtained using all 16 GCPs). (a–d) 0, 3, 6, and 9 GCPs with Gaussian distributions. Solid black dots indicate the locations of GCPs; the vertical red lines are zero-error lines.
Figure 9: Boxplots of the vertical errors obtained by seven layout schemes, grouped by the number of GCPs (the horizontal line represents the median, and the lower and upper hinges represent the 25th and 75th percentiles). The whiskers represent 1.5× IQR (interquartile range) in both directions. The black squares and circles represent the means and outliers, respectively. Refer to the schematic diagram of the different GCP distributions in Figure 4; in the label 1-k1, 1 and k1 represent the number and code of the control points, respectively, and the other labels are similar.
Figure 10: Relationship between local DSM accuracy and the nearest GCP. (a) Differential DSM; (b) scatterplot and nonlinear curve fit obtained by sampling from the differential DSM.
22 pages, 1869 KiB  
Article
Robotic Herding of Farm Animals Using a Network of Barking Aerial Drones
by Xiaohui Li, Hailong Huang, Andrey V. Savkin and Jian Zhang
Drones 2022, 6(2), 29; https://doi.org/10.3390/drones6020029 - 19 Jan 2022
Cited by 28 | Viewed by 8796
Abstract
This paper proposes a novel robotic animal herding system based on a network of autonomous barking drones. The objective of such a system is to replace traditional herding methods (e.g., dogs) so that a large number (e.g., thousands) of farm animals such as sheep can be quickly collected from a sparse status and then driven to a designated location (e.g., a sheepfold). In this paper, we particularly focus on the motion control of the barking drones. To this end, a computationally efficient sliding mode based control algorithm is developed, which navigates the drones to track the moving boundary of the animals’ footprint and enables the drones to avoid collisions with others. Extensive computer simulations, where the dynamics of the animals follow Reynolds’ rules, show the effectiveness of the proposed approach.
(This article belongs to the Special Issue Conceptual Design, Modeling, and Control Strategies of Drones)
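As a sketch of the kind of animal model simulated here, the following minimal 2D update applies Reynolds' cohesion, alignment, and separation rules plus a repulsion term from a barking drone; all constants and names are illustrative, not the paper's parameters:

```python
import numpy as np

def step(pos, vel, drone, dt=0.1, r_sep=1.0, r_nbr=5.0, r_bark=8.0):
    """One Reynolds-rules update for all animals, with drone repulsion."""
    new_vel = vel.copy()
    for i in range(len(pos)):
        d = np.linalg.norm(pos - pos[i], axis=1)
        nbrs = (d < r_nbr) & (d > 0)
        if nbrs.any():
            cohesion = pos[nbrs].mean(axis=0) - pos[i]    # steer toward neighbours
            alignment = vel[nbrs].mean(axis=0) - vel[i]   # match neighbours' velocity
        else:
            cohesion = alignment = np.zeros(2)
        close = (d < r_sep) & (d > 0)
        separation = (pos[i] - pos[close]).sum(axis=0) if close.any() else np.zeros(2)
        flee = pos[i] - drone                             # repulsion from the drone
        dist = np.linalg.norm(flee)
        repulsion = flee / dist**2 if dist < r_bark else np.zeros(2)
        new_vel[i] += dt * (0.05 * cohesion + 0.3 * alignment
                            + 1.0 * separation + 4.0 * repulsion)
    speed = np.linalg.norm(new_vel, axis=1, keepdims=True)
    new_vel = np.where(speed > 1.5, new_vel * 1.5 / np.maximum(speed, 1e-9), new_vel)
    return pos + dt * new_vel, new_vel

rng = np.random.default_rng(4)
pos, vel = rng.uniform(-10, 10, (50, 2)), np.zeros((50, 2))
for _ in range(200):
    pos, vel = step(pos, vel, drone=np.array([-15.0, 0.0]))
```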
Show Figures

Figure 1

Figure 1
<p>A basic unit of the proposed drone herding system.</p>
Full article ">Figure 2
<p>Illustration of the convex hull of the herding animals (<span class="html-fig-inline" id="drones-06-00029-i003"> <img alt="Drones 06 00029 i003" src="/drones/drones-06-00029/article_deploy/html/images/drones-06-00029-i003.png"/></span>) and the corresponding extended hull.</p>
Full article ">Figure 3
<p>Illustration of the effective broadcasting angle <math display="inline"><semantics> <mi>β</mi> </semantics></math> and distance <math display="inline"><semantics> <msub> <mi>R</mi> <mi>b</mi> </msub> </semantics></math> with (<b>a</b>) a small drone-to-animal distance <math display="inline"><semantics> <msub> <mi>d</mi> <mi>s</mi> </msub> </semantics></math> and (<b>b</b>) a larger drone-to-animal distance <math display="inline"><semantics> <msub> <mi>d</mi> <mi>s</mi> </msub> </semantics></math>, under the same animals’ distribution. Here (<span class="html-fig-inline" id="drones-06-00029-i001"> <img alt="Drones 06 00029 i001" src="/drones/drones-06-00029/article_deploy/html/images/drones-06-00029-i001.png"/></span>) stands for the animal repulsing by the barking drone (<span class="html-fig-inline" id="drones-06-00029-i002"> <img alt="Drones 06 00029 i002" src="/drones/drones-06-00029/article_deploy/html/images/drones-06-00029-i002.png"/></span>); (<span class="html-fig-inline" id="drones-06-00029-i003"> <img alt="Drones 06 00029 i003" src="/drones/drones-06-00029/article_deploy/html/images/drones-06-00029-i003.png"/></span>) stands for the animal outsides the barking broadcasting range; (<span class="html-fig-inline" id="drones-06-00029-i004"> <img alt="Drones 06 00029 i004" src="/drones/drones-06-00029/article_deploy/html/images/drones-06-00029-i004.png"/></span>) stands for <math display="inline"><semantics> <msub> <mi>C</mi> <mi>o</mi> </msub> </semantics></math>.</p>
Full article ">Figure 4
<p>Overview of the proposed method.</p>
Full article ">Figure 5
Figure 5: Illustration of the path planning for a barking drone from A to B, where A is a point on the plane of the extended hull, B is a vertex of the extended hull, and O is the reaching point of the drone on the extended hull. The animals' convex hull is denoted by blue lines. The green and black arrows are the planned trajectories for the clockwise and counterclockwise directions, respectively.
Figure 6: Illustration of (a) "Fly to edge" guidance with (p, q) < 0; (b) "Fly to edge" guidance with (p, q) < 0 and |q| ≥ |p|; (c) "Fly to edge" guidance otherwise; (d) "Fly on edge" guidance navigating the drone from d to O*.
Figure 7: Illustration of the motion TRANSFER.
Figure 8: Examples of (a) drones' positions on the z axis; (b) steering points' positions on the z axis; (c) steering points on the z* axis; (d) drones' positions, steering points' positions, and travel routes on the z* axis. The plotted markers stand for z_j, z'_j, and z*_j, respectively, and the blue arrows stand for the drones' travel routes.
Figure 9: Barking drone deployment for animal driving, where the star marker stands for the designated location G, the red dot stands for C'_o, and the dark red arrows stand for the side-to-side trajectories of the barking drones.
Figure 10: (a) Animals' footprint radius versus time t for herding 200 and 1000 animals with 4 barking drones using both the proposed method and the benchmark method; (b) initial distribution of the 1000 animals; (c–f) snapshots of the 1000 animals at t = 5 and 8 min for both methods.
Figure 11: Comparisons of the average gathering time for different values of (a) the side length of the initial square field; (b) the number of barking drones n_d.
Figure 12: Comparisons of the average gathering time for different values of (a) the maximum drone speed V_max; (b) the maximum animal speed V_animal.
Figure 13: Comparisons of the average gathering time for different values of (a) the barking cone radius R_b; (b) the drone-to-animal distance d_s; (c) the barking cone angle β.
Figure 14: Impact of measurement errors.
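">
The caption for Figure 5 above describes choosing between a clockwise and a counterclockwise route along the animals' extended hull. As a minimal illustrative sketch only (not the paper's guidance law), the shorter perimeter direction can be selected by comparing cumulative edge lengths; the hull is assumed to be stored as a counterclockwise vertex list, and all names below are hypothetical.

```python
import math

def perimeter_distance(hull, i, j):
    """Cumulative edge length walking from vertex i to vertex j in the
    storage order of `hull` (assumed counterclockwise)."""
    d, k = 0.0, i
    while k != j:
        nxt = (k + 1) % len(hull)
        d += math.dist(hull[k], hull[nxt])
        k = nxt
    return d

def shorter_direction(hull, a_idx, b_idx):
    """Return whichever walk from vertex a_idx to vertex b_idx is shorter
    along the hull boundary."""
    ccw = perimeter_distance(hull, a_idx, b_idx)
    cw = perimeter_distance(hull, b_idx, a_idx)  # remainder of the perimeter
    return "counterclockwise" if ccw <= cw else "clockwise"

# Toy example: unit square, travel from vertex 0 to vertex 3.
hull = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
print(shorter_direction(hull, 0, 3))  # "clockwise": one edge instead of three
```

Since the two walks partition the hull perimeter between them, a single pass plus one subtraction would also suffice; the second call is kept here for readability.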
19 pages, 923 KiB  
Article
Implementing Mitigations for Improving Societal Acceptance of Urban Air Mobility
by Ender Çetin, Alicia Cano, Robin Deransy, Sergi Tres and Cristina Barrado
Drones 2022, 6(2), 28; https://doi.org/10.3390/drones6020028 - 18 Jan 2022
Cited by 38 | Viewed by 8879
Abstract
The continuous development of technical innovations provides the opportunity to create new economic markets and a wealth of new services. However, these innovations sometimes raise concerns, notably in terms of societal, safety, and environmental impacts. This is the case for services related to the operation of unmanned aerial vehicles (UAV), which are emerging rapidly. Unmanned aerial vehicles, also called drones, date back to the first third of the twentieth century in the aviation industry, when they were mostly used for military purposes. Nowadays, drones of various types and sizes are used for many purposes, such as precision agriculture, search and rescue missions, aerial photography, and shipping and delivery. Having started in areas with low population density, drone operations are now expanding into urban and suburban areas, in what is called urban air mobility (UAM). However, this rapid growth of the drone industry creates a psychological fear of the unknown in some parts of society. Reducing this fear will play an important role in the public acceptance of drone operations in urban areas. This paper presents the main concerns of society with regard to drone operations, as already captured in several public surveys, and proposes a list of mitigation measures to reduce these concerns. The proposed list is then analyzed, and its applicability to individual, urban, very large demonstration flights is explained, using feedback from the CORUS-XUAM project. CORUS-XUAM will organize a set of very large drone flight demonstrations across seven European countries to investigate how to safely integrate drone operations into the airspace with the support of U-space. Full article
(This article belongs to the Special Issue Feature Papers of Drones)
Show Figures
Figure 1: Drone market areas (size represents the market expectations).
Figure 2: Summary of the surveys per year, aim, and proposed vehicles.
Figure 3: Acceptance levels of drones and/or UAM per survey (in %).
Figure 4: Word clouds highlighting the public concerns in the different surveys.
Figure 5: Mitigation analysis process.
Figure 6: Mitigation measures classification procedure.
Figure 7: Examples of mitigations by category.
Figure 8: Examples of mitigations by ease of implementation.
Figure 9: Mitigation categories for the full list.
Figure 10: Percentages of total ease of implementation for the full list.
Figure 11: Mitigation scores for the full list of mitigation measures.
Figure 12: Mitigation categories for VLDs.
Figure 13: Percentages of total ease of implementation for VLDs.
Figure 14: Mitigation scores for mitigation measures applicable to VLDs.
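">
Figures 6–14 outline a pipeline of classifying mitigation measures by category and by ease of implementation and then scoring them. The paper's actual classification and scoring rules are not reproduced here; the snippet below is a purely hypothetical sketch of how such a ranked list could be organized, with invented names, categories, and weights.

```python
from dataclasses import dataclass

# Hypothetical three-level ease-of-implementation scale; the paper's
# actual scale and scoring rule are not reproduced here.
EASE_RANK = {"easy": 3, "moderate": 2, "hard": 1}

@dataclass
class Mitigation:
    name: str
    category: str            # e.g., "safety", "privacy", "noise"
    ease: str                # "easy", "moderate", or "hard"
    concerns_addressed: int  # how many surveyed concerns it covers

def score(m: Mitigation) -> int:
    """Toy rule: favor broad concern coverage, then ease of implementation."""
    return m.concerns_addressed * 10 + EASE_RANK[m.ease]

measures = [
    Mitigation("geofencing around sensitive areas", "safety", "easy", 2),
    Mitigation("quieter rotor designs", "noise", "hard", 1),
    Mitigation("on-board data minimisation", "privacy", "moderate", 3),
]
for m in sorted(measures, key=score, reverse=True):
    print(f"{m.name} ({m.category}): score={score(m)}")
```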