Search Results (9,284)

Search Parameters:
Keywords = UAV

29 pages, 1064 KiB  
Review
Impact of 3D Digitising Technologies and Their Implementation
by Paula Triviño-Tarradas, Diego Francisco García-Molina and José Ignacio Rojas-Sola
Technologies 2024, 12(12), 260; https://doi.org/10.3390/technologies12120260 (registering DOI) - 14 Dec 2024
Viewed by 87
Abstract
In recent years, 3D digitalisation has experienced significant growth, revolutionising the way we capture, process and use geometric data. Initially conceived for industrial applications, these technologies have expanded to multiple fields, offering unprecedented accuracy and versatility. The choice of 3D digitalisation technique depends on the accuracy, efficiency and analytical capacity required in a given field of application. This review examines the application of 3D scanning techniques according to the implementation sector, studying the optimal techniques for capturing and processing 3D geometric data in each case, as well as their limitations.
(This article belongs to the Section Assistive Technologies)
23 pages, 1524 KiB  
Article
Robust Task Offloading and Trajectory Optimization for UAV-Mounted Mobile Edge Computing
by Runhe Wang, Yang Huang, Yiwei Lu, Pu Xie and Qihui Wu
Drones 2024, 8(12), 757; https://doi.org/10.3390/drones8120757 (registering DOI) - 14 Dec 2024
Viewed by 98
Abstract
Mobile edge computing (MEC) deployed on unmanned aerial vehicles (UAVs) has shown particular strength in enhancing computational capacity and prolonging the battery lives of terrestrial user equipment (UE). Nevertheless, current research lacks studies of robust offloading scheduling and trajectory planning under random terrestrial channels. State-of-the-art joint task-offloading and trajectory-planning optimization techniques for UAV-mounted MEC focus on scenarios where only air–ground channels exist rather than time-varying terrestrial channels. By contrast, this paper considers the scenario where both time-varying/random terrestrial channels and line-of-sight air–ground channels occur. Aiming at robust resource scheduling for energy-efficient UAV-assisted MEC, we formulate a novel joint optimization of UAV trajectory planning and task offloading, which, however, is highly nonconvex. As a countermeasure, the original optimization is recast as subproblems related to task offloading and trajectory planning and solved by a novel robust iterative optimization algorithm that combines the weighted minimum mean square error method, the S-procedure, successive convex approximation, and related techniques. Numerical results indicate that, compared to various baselines, the proposed algorithm can effectively reduce energy consumption and optimize the trajectory in the presence of a large number of input tasks. In addition, in terms of stability and effectiveness, the proposed robust iterative optimization algorithm reduces energy consumption more stably in time-varying/random channels than non-robust schemes.
29 pages, 3304 KiB  
Article
Enhancing Automated Maneuvering Decisions in UCAV Air Combat Games Using Homotopy-Based Reinforcement Learning
by Yiwen Zhu, Yuan Zheng, Wenya Wei and Zhou Fang
Drones 2024, 8(12), 756; https://doi.org/10.3390/drones8120756 - 13 Dec 2024
Viewed by 267
Abstract
In the field of real-time autonomous decision-making for Unmanned Combat Aerial Vehicles (UCAVs), reinforcement learning is widely used to enhance decision-making capabilities in high-dimensional spaces. These enhanced capabilities allow UCAVs to better respond to the maneuvers of various opponents, with the win rate often serving as the primary optimization metric. However, relying solely on the terminal outcome of victory or defeat as the optimization target, without incorporating additional rewards throughout the process, poses significant challenges for reinforcement learning due to the sparse reward structure inherent in these scenarios. While algorithms enhanced with densely distributed artificial rewards show potential, they risk deviating from the primary objectives. To address these challenges, we introduce a novel approach: the homotopy-based soft actor–critic (HSAC) method. This technique gradually transitions from auxiliary tasks enriched with artificial rewards to the main task characterized by sparse rewards through homotopic paths. We demonstrate the consistent convergence of the HSAC method and its effectiveness through deployment in two distinct scenarios within a 3D air combat game simulation: attacking horizontally flying UCAVs and a combat scenario involving two UCAVs. Our experimental results reveal that HSAC significantly outperforms traditional algorithms that rely solely on sparse rewards or on artificially supplemented rewards.
(This article belongs to the Special Issue Path Planning, Trajectory Tracking and Guidance for UAVs: 2nd Edition)
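The homotopic transition described in the abstract can be illustrated with a short sketch: a dense auxiliary reward is blended into the sparse terminal reward along a path parameter that moves from 0 to 1 over training. The linear schedule, function name, and reward arguments below are illustrative assumptions, not the paper's actual HSAC formulation.

```python
import numpy as np

def homotopy_reward(r_aux: float, r_sparse: float, step: int, total_steps: int) -> float:
    """Blend a dense auxiliary reward into the sparse main-task reward.

    lam moves from 0 (pure auxiliary task) to 1 (pure sparse task) as training
    progresses; a linear schedule is assumed here purely for illustration.
    """
    lam = min(1.0, step / total_steps)
    return (1.0 - lam) * r_aux + lam * r_sparse

# Early in training the auxiliary reward dominates; late in training the sparse
# win/loss signal does.
print(homotopy_reward(r_aux=0.4, r_sparse=0.0, step=1_000, total_steps=100_000))
print(homotopy_reward(r_aux=0.4, r_sparse=1.0, step=95_000, total_steps=100_000))
```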
25 pages, 1912 KiB  
Article
Rapid Integrated Design Verification of Vertical Take-Off and Landing UAVs Based on Modified Model-Based Systems Engineering
by Zhuo Bai, Bangchu Zhang, Mingli Song and Zhong Tian
Drones 2024, 8(12), 755; https://doi.org/10.3390/drones8120755 - 13 Dec 2024
Viewed by 280
Abstract
Unmanned Aerial Vehicle (UAV) development has garnered significant attention, yet a major challenge in the field is how to rapidly iterate the overall design scheme of a UAV to meet actual needs, thereby shortening development cycles and reducing costs. This study integrates a "Decision Support System" and a "Live Virtual Construct (LVC) environment" into the existing Model-Based Systems Engineering framework, proposing a Modified Model-Based Systems Engineering methodology for the full-process development of UAVs. By constructing a decision support system and a hybrid reality space (which includes pure digital modeling and simulation analysis software, semi-physical simulation platforms, real flight environments, and virtual UAVs), we demonstrate this method through the development of the electric vertical take-off and landing fixed-wing UAV DB1. This method allows for rapid, on-demand iteration in a fully digital environment, with feasibility validated by comparing actual flight test results with mission indicators. The results show that this approach significantly accelerates UAV development while reducing costs, achieving rapid development from the "demand side" to the "design side" under a "zero-loss" setting. The DB1 platform can carry a 2.5 kg payload, achieve over 40 min of flight time, and cover a range of more than 70 km. This work provides a valuable reference for UAV enterprises aiming to reduce costs and increase efficiency in the rapid commercialization of UAV applications.
(This article belongs to the Section Drone Design and Development)
26 pages, 18107 KiB  
Article
Tree Species Classification for Shelterbelt Forest Based on Multi-Source Remote Sensing Data Fusion from Unmanned Aerial Vehicles
by Kai Jiang, Qingzhan Zhao, Xuewen Wang, Yuhao Sheng and Wenzhong Tian
Forests 2024, 15(12), 2200; https://doi.org/10.3390/f15122200 - 13 Dec 2024
Viewed by 239
Abstract
Accurately understanding the stand composition of shelter forests is essential for the construction and benefit evaluation of shelter forest projects. This study explores classification methods for dominant tree species in shelter forests using UAV-derived RGB, hyperspectral, and LiDAR data. It also investigates the impact of individual tree crown (ITC) delineation accuracy, crown morphological parameters, and various data sources and classifiers. First, as a result of the overlap and complex structure of tree crowns in shelterbelt forests, existing ITC delineation methods often lead to over-segmentation or segmentation errors. To address this challenge, we propose a watershed and multi-feature-controlled spectral clustering (WMF-SCS) algorithm for ITC delineation based on UAV RGB and LiDAR data, which offers clearer and more reliable classification objects, features, and training data for tree species classification. Second, spectral, texture, structural, and crown morphological parameters were extracted using UAV hyperspectral and LiDAR data combined with ITC delineation results. Twenty-one classification images were constructed using RF, SVM, MLP, and SAMME for tree species classification. The results show that (1) the proposed WMF-SCS algorithm demonstrates significant performance in ITC delineation in complex mixed forest scenarios (Precision = 0.88, Recall = 0.87, F1-Score = 0.87), resulting in a 1.85% increase in overall classification accuracy; (2) the inclusion of crown morphological parameters derived from LiDAR data improves the overall accuracy of the random forest classifier by 5.82%; (3) compared to using LiDAR or hyperspectral data alone, the classification accuracy using multi-source data improves by an average of 7.94% and 7.52%, respectively; (4) the random forest classifier combined with multi-source data achieves the highest classification accuracy and consistency (OA = 90.70%, Kappa = 0.8747).
(This article belongs to the Section Forest Inventory, Modeling and Remote Sensing)
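As a rough illustration of the classification step described above, training a random forest on per-crown features that mix hyperspectral and LiDAR variables, a minimal sketch follows. The file names, feature layout, and split are assumptions for demonstration, not the authors' data or code.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, cohen_kappa_score
from sklearn.model_selection import train_test_split

# Hypothetical per-crown feature table: spectral/texture variables from hyperspectral
# data plus structural/crown-morphology variables from LiDAR, one row per delineated crown.
X = np.load("crown_features.npy")      # shape: (n_crowns, n_features) -- assumed file
y = np.load("crown_species.npy")       # integer species labels -- assumed file

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
rf = RandomForestClassifier(n_estimators=500, random_state=0)
rf.fit(X_tr, y_tr)
pred = rf.predict(X_te)
print("OA:", accuracy_score(y_te, pred), "Kappa:", cohen_kappa_score(y_te, pred))
```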
Figures: (1) Schematic diagram of the study area; (2) flowchart of the technical roadmap; (3) flowchart of the WMF-SCS algorithm for individual tree crown delineation; (4–6) delineation results for sites 1–3 using the CHM-based watershed segmentation algorithm, the WMF-SCS algorithm, and their overlap; (7) spectral response curves of tree species from the full hyperspectral bands and the CV-SVM-RFE algorithm; (8) screening results and importance of variables; (9–11) box plots of selected LiDAR, hyperspectral, and combined feature variables; (12–14) relationship between the number of LiDAR, hyperspectral, and combined feature variables and classification accuracy; (15) LiDAR point clouds of different tree species; (16) tree species classification results for sites 1–3.
31 pages, 28782 KiB  
Article
Reducing the Maximum Amplitudes of Forced Vibrations of a Quadcopter Arm Using an Aerodynamic Profile Adapter
by Andra Tofan-Negru, Amado Ștefan and Maria Casapu
Drones 2024, 8(12), 754; https://doi.org/10.3390/drones8120754 - 13 Dec 2024
Viewed by 249
Abstract
This research focuses on the dynamic response analysis of a quadcopter arm, both without an adapter and with aerodynamic profile adapters mounted to enhance drone performance. Nine different adapters were simulated to assess their impact on the arm's dynamic behavior during various motor operating regimes. The pressure force distribution from the airflow around the quadcopter arm was analyzed to determine the optimal adapter configuration. Numerical simulations revealed the best geometry for the adapter, which significantly reduced maximum displacement amplitudes compared to the arm without an adapter. The study also examined the effects of static imbalance from the rotor-propeller assembly, leading to the calculation of an eccentricity value of 0.022 mm for the point of application of the inertial force. Experimental tests validated the numerical findings, with laser vibrometer measurements confirming improved dynamic responses with Adapter 8 across most operating regimes. Overall, the study shows the advantages of using better aerodynamic designs in quadcopter arms to improve stability and performance, contributing to advancements in drone technology through improved structural designs.
(This article belongs to the Section Drone Design and Development)
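The inertial loading caused by the static imbalance mentioned in the abstract follows the standard rotating-unbalance relation F = m·ω²·e. The sketch below uses the 0.022 mm eccentricity reported above; the rotor-propeller mass and motor speed are assumed placeholder values, not figures from the paper.

```python
import numpy as np

m = 0.060        # kg -- assumed rotor-propeller mass (not from the paper)
rpm = 8000.0     # rev/min -- assumed motor speed (not from the paper)
e = 0.022e-3     # m -- eccentricity value reported in the abstract

omega = 2.0 * np.pi * rpm / 60.0   # angular speed in rad/s
force = m * omega**2 * e           # rotating inertial (unbalance) force
print(f"unbalance force ~ {force:.2f} N at {rpm:.0f} rpm")
```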
Figures: (1) Maximum transverse vibration amplitudes and velocities at the free end of the quadcopter arm (one-second measurements); (2) characteristic elements of symmetrical airfoils; (3) geometric parameters of the adapters; (4) geometries of the nine adapters with airfoils in cross-section, modelled in CATIA V5-6R 2019 (EPPLER 862 STRUT, EPPLER 863 STRUT, GRIFFITH 30% SUCTION, GOE 460, GOE 775, JOUKOVSKY F = 0% T = 21%, NACA 0021, NACA 0024, GOE 776); (5) numerical analysis setup considering the air jet and the rotor-propeller static imbalance; (6) inertial force variation and pressure distribution on the arm; (7) measurement of the maximum amplitudes of transverse vibrations; (8) experimental configuration for dynamic response testing with the mounted adapter and instrumentation; (9) air velocity v_z and relative pressure on the arm and propeller as a blade passes over the arm with Adapters 5 and 8; (10) maximum absolute displacement of the node at the free end of the arm (Adapter 5 data truncated); (11) eccentricity of the inertial force application point due to static imbalance; (12) experimentally determined displacement over time at the free end of the arm without an adapter; (13) maximum absolute displacement at the free end without an adapter and with Adapter 8; (14) experimental setup with Adapter 8 mounted; (15) vibrations recorded with the laser vibrometer; (16–17) experimental displacement measurements at the free end without an adapter and with Adapter 8; (18–21) displacements at the free end with and without Adapter 8 for the 60%, 70%, 80%, and 90% motor operating regimes; (22) maximum absolute amplitudes with and without Adapter 8 across the four regimes; (23) mean sample values with and without Adapter 8 across the four regimes; (24–25) maximum absolute amplitudes without an adapter and with Adapter 8 from experimental testing and numerical analysis across the four regimes.
19 pages, 10993 KiB  
Article
Observation Angle Effect of Near-Ground Thermal Infrared Remote Sensing on the Temperature Results of Urban Land Surface
by Xu Yuan, Zhi Lv, Kati Laakso, Jialiang Han, Xiao Liu, Qinglin Meng and Sihan Xue
Land 2024, 13(12), 2170; https://doi.org/10.3390/land13122170 - 13 Dec 2024
Viewed by 226
Abstract
During the process of urbanization, a large number of impervious land surfaces are replacing the biologically active surface. Land surface temperature is a key factor reflecting the urban thermal environment and a crucial factor affecting city livability and resident comfort. Therefore, the accurate measurement of land surface temperature is of great significance. Thermal infrared remote sensing is widely applied to study the urban thermal environment due to its distinctive advantages of high sensitivity, wide coverage, high resolution, and continuous measurement. Low-altitude remote sensing, performed using thermal infrared sensors carried by unmanned aerial vehicles (UAVs), is a common method of land surface observation. However, thermal infrared sensors may experience varying degrees of sway due to wind, affecting the quality of the data. It is still uncertain to what degree angle changes affect thermal infrared data in urban environments. To investigate this effect, a near-ground remote sensing experiment was conducted to observe three common urban land surfaces, namely marble tiles, cement tiles and grasses, at observation angles of 15°, 30°, 45°, and 60° using a thermal infrared imager. This was accompanied by synchronous ground temperature measurements conducted with iButton digital thermometers. Our results suggest that the temperature differences between the remote sensing data of the land surface and the corresponding ground truth data increase with the observation angle for all three land surfaces. Furthermore, the differences are minor when the observation angle changes by no more than 15°, and the changes are not the same for different land surfaces. Our findings improve the current understanding of the effects of different angles on thermal infrared remote sensing in urban land surface temperature monitoring.
(This article belongs to the Section Land – Observation and Monitoring)
Figures: (1) Map of study area locations (UTM-WGS 1984, EPSG: 4326): Guangzhou City within Guangdong Province, the study area in Tianhe district, and the Wushan Campus of South China University of Technology, with area A (plaza in front of the Liwu Science and Technology Building) and area B (courtyard of the No. 2 Graduate Student Dormitory Building); (2) measuring points for ground temperature measurements on cement tiles, marble tiles, and grasses; (3) thermal infrared remote sensing data collection, with the observation angle of the infrared thermal imager adjusted according to reference lines on white paper; (4) visible images of the observation area and the ranges of the different thermal infrared observation angles; (5) thermal infrared images of the different urban land surfaces at different observation angles; (6) thermal infrared remote sensing data (T_inf), ground data (T_iBu), and the temperature difference between them for cement tiles (30° and 45°), marble tiles (45° and 60°), and grasses (15° and 30°); (7) box plots of the temperature differences between T_inf and T_iBu for the same surfaces and angles; (8) average T_inf and T_iBu as a function of observation angle and material; (9) normalized T_inf and T_iBu as a function of observation angle and land surface; (10) temperature differences between T_inf and T_iBu, shown as percentages, as a function of observation angle and land surface, with the percentage difference between results from different angles.
16 pages, 269 KiB  
Article
The Impact of Digital Technology Application on Agricultural Low-Carbon Transformation—A Case Study of the Pesticide Reduction Effect of Plant Protection Unmanned Aerial Vehicles (UAVs)
by Qian Deng, Yuhan Zhang, Zhuyu Lin, Xueping Gao and Zhenlin Weng
Sustainability 2024, 16(24), 10920; https://doi.org/10.3390/su162410920 - 12 Dec 2024
Viewed by 442
Abstract
Reducing pesticide use is a crucial step toward achieving the green and low-carbon transformation of agriculture. Analyzing the role and mechanisms of agricultural digital technologies—particularly plant protection unmanned aerial vehicles (UAVs) for aerial spraying—is essential for identifying viable strategies to reduce pesticide application intensity among farming households. This analysis is critical for facilitating the low-carbon transformation of rice production and advancing sustainable agricultural development. This study, using survey data from 455 farming households in Jiangxi Province, China, employs Ordinary Least Squares (OLS) and Propensity Score Matching (PSM) methods to investigate the relationship between plant protection UAVs and pesticide application intensity. The findings reveal that adopting plant protection UAVs significantly reduces pesticide application intensity in rice production by 24.9%. Further analysis indicates that the reduction effect is more pronounced among non-aged, large-scale, and part-time farming households. To achieve the low-carbon transformation of rice production, it is vital to enhance agricultural support policies and develop effective market promotion and application mechanisms to encourage the adoption of UAV-based aerial spraying and other digital agricultural technologies.
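A minimal sketch of the propensity score matching (PSM) step implied by the abstract: estimate each household's propensity to adopt plant-protection UAVs from observed covariates, match each adopter to the nearest non-adopter on that score, and compare pesticide application intensity. The file name, covariates, and column names are illustrative assumptions, not the survey's actual variables.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("households.csv")  # hypothetical survey file
covars = ["age", "farm_size", "education", "off_farm_income"]

# Propensity score: probability of UAV adoption given household covariates.
model = LogisticRegression(max_iter=1000).fit(df[covars], df["uav_adopter"])
df["ps"] = model.predict_proba(df[covars])[:, 1]

treated = df[df["uav_adopter"] == 1]
control = df[df["uav_adopter"] == 0]

# One-to-one nearest-neighbour matching on the propensity score.
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
att = (treated["pesticide_intensity"].values
       - control.iloc[idx.ravel()]["pesticide_intensity"].values).mean()
print("Estimated effect of UAV adoption on pesticide intensity:", att)
```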
20 pages, 13662 KiB  
Article
Unmanned Aerial Vehicle (UAV) Hyperspectral Imagery Mining to Identify New Spectral Indices for Predicting the Field-Scale Yield of Spring Maize
by Yue Zhang, Yansong Wang, Hang Hao, Ziqi Li, Yumei Long, Xingyu Zhang and Chenzhen Xia
Sustainability 2024, 16(24), 10916; https://doi.org/10.3390/su162410916 - 12 Dec 2024
Viewed by 435
Abstract
A nondestructive approach for accurate crop yield prediction at the field scale is vital for precision agriculture. Considerable progress has been made in the use of the spectral index (SI) derived from unmanned aerial vehicle (UAV) hyperspectral images to predict crop yields before harvest. However, few studies have explored the most sensitive wavelengths and SIs for crop yield prediction, especially for different nitrogen fertilization levels and soil types. This study aimed to investigate the appropriate wavelengths and their combinations to explore the ability of new SIs derived from UAV hyperspectral images to predict yields during the growing season of spring maize. In this study, the hyperspectral canopy reflectance measurement method, a field-based high-throughput method, was evaluated in three field experiments (Wang-Jia-Qiao (WJQ), San-Ke-Shu (SKS), and Fu-Jia-Jie (FJJ)) since 2009 with different soil types (alluvial soil, black soil, and aeolian sandy soil) and various nitrogen (N) fertilization levels (0, 168, 240, 270, and 312 kg/ha) in Lishu County, Northeast China. The measurements of canopy spectral reflectance and maize yield were conducted at critical growth stages of spring maize, including the jointing, silking, and maturity stages, in 2019 and 2020. The best wavelengths and new SIs, including the difference spectral index, ratio spectral index, and normalized difference spectral index forms, were obtained from the contour maps constructed by the coefficient of determination (R2) from the linear regression models between the yield and all possible SIs screened from the 450 to 950 nm wavelengths. The new SIs and eight selected published SIs were subsequently used to predict maize yield via linear regression models. The results showed that (1) the most sensitive wavelengths were 640–714 nm at WJQ, 450–650 nm and 750–950 nm at SKS, and 450–700 nm and 750–950 nm at FJJ; (2) the new SIs established here were different across the three experimental fields, and their performance in maize yield prediction was generally better than that of the published SIs; and (3) the new SIs presented different responses to various N fertilization levels. This study demonstrates the potential of exploring new spectral characteristics from remote sensing technology for predicting the field-scale crop yield in spring maize cropping systems before harvest.
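The contour-map procedure described in the abstract amounts to an exhaustive two-band search: for every wavelength pair, build the spectral index, regress it against yield, and record R2. A compact sketch of that search for the NDSI form follows; the reflectance array and yield vector are assumed inputs, and the other index forms (DSI, RSI) only change the index formula.

```python
import numpy as np

def r2_linear(x: np.ndarray, y: np.ndarray) -> float:
    """R^2 of a simple linear regression of y on x."""
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    return 1.0 - resid.var() / y.var()

def best_ndsi(refl: np.ndarray, yield_: np.ndarray):
    """Search all band pairs for the NDSI most correlated with yield.

    refl: (n_plots, n_bands) canopy reflectance; yield_: (n_plots,) plot yields.
    """
    best_r2, best_pair = -np.inf, None
    n_bands = refl.shape[1]
    for i in range(n_bands):
        for j in range(i + 1, n_bands):
            ndsi = (refl[:, i] - refl[:, j]) / (refl[:, i] + refl[:, j] + 1e-9)
            r2 = r2_linear(ndsi, yield_)
            if r2 > best_r2:
                best_r2, best_pair = r2, (i, j)
    return best_pair, best_r2
```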
Figures: (1) Location of the study area, UAV hyperspectral images, and nitrogen application rates of the three experimental fields (WJQ, SKS, and FJJ); (2) mean canopy reflectance spectra of spring maize under different N treatments across three growth stages in the three fields; (3–5) contour maps of the linear models between the DSI, RSI, and NDSI and maize yield at the jointing, silking, and maturity stages for the WJQ, SKS, and FJJ fields; (6) scatter plots of measured versus predicted yield for the new SIs: NDSI (690, 710) at WJQ, RSI (906, 546) at SKS, and DSI (698, 922) at FJJ; (7) response of maize yield to different N application rates in the three fields; (8) response of the new SIs (DSI, RSI, and NDSI forms) to different N treatments in the three fields.
26 pages, 2478 KiB  
Article
Enhanced Nutcracker Optimization Algorithm with Hyperbolic Sine–Cosine Improvement for UAV Path Planning
by Shuhao Jiang, Shengliang Cui, Haoran Song, Yizi Lu and Yong Zhang
Biomimetics 2024, 9(12), 757; https://doi.org/10.3390/biomimetics9120757 - 12 Dec 2024
Viewed by 291
Abstract
Three-dimensional (3D) path planning is a crucial technology for ensuring the efficient and safe flight of UAVs in complex environments. Traditional path planning algorithms often struggle in complex obstacle environments, making it difficult to quickly identify the optimal path. To address these challenges, this paper introduces a Nutcracker Optimizer integrated with Hyperbolic Sine–Cosine (ISCHNOA). First, the exploitation process of the sinh cosh optimizer is incorporated into the foraging strategy to enhance the efficiency of the nutcracker in locating high-quality food sources within the search area. Secondly, a nonlinear function is designed to improve the algorithm's convergence speed. Finally, a sinh cosh optimizer that incorporates historical positions and dynamic factors is introduced to enhance the influence of the optimal position on the search process, thereby improving the accuracy of the nutcracker in retrieving stored food. The performance of the ISCHNOA algorithm is tested on 14 classical benchmark functions as well as the CEC2014 and CEC2020 suites and applied to UAV path planning models. The experimental results demonstrate that the ISCHNOA algorithm outperforms the other algorithms across the three test suites, with the total cost of the planned UAV paths being lower.
Figures: (1) Threat cost visualization; (2) flight altitude visualization; (3) turning and climbing angle calculation; (4–5) convergence curves of each algorithm for the test functions; (6) top view of each algorithm's result across eight different obstacle classes; (7) convergence plots of each algorithm for the eight obstacle class cases.
30 pages, 12451 KiB  
Article
A Method Coupling NDT and VGICP for Registering UAV-LiDAR and LiDAR-SLAM Point Clouds in Plantation Forest Plots
by Fan Wang, Jiawei Wang, Yun Wu, Zhijie Xue, Xin Tan, Yueyuan Yang and Simei Lin
Forests 2024, 15(12), 2186; https://doi.org/10.3390/f15122186 - 12 Dec 2024
Viewed by 239
Abstract
The combination of UAV-LiDAR and LiDAR-SLAM (Simultaneous Localization and Mapping) technology can overcome the scanning limitations of individual platforms and obtain comprehensive 3D structural information of forest stands. To address the challenges of traditional registration algorithms, such as high initial-value requirements and susceptibility to local optima, we propose in this paper a high-precision, robust NDT-VGICP registration method that integrates voxel features to register UAV-LiDAR and LiDAR-SLAM point clouds at the forest stand scale. First, the point clouds are voxelized, and their normal vectors and normal distribution models are computed; the initial transformation matrix is then quickly estimated from the point pair distribution characteristics to achieve preliminary alignment. Second, high-dimensional feature weighting is introduced, and the iterative closest point (ICP) algorithm is used to optimize the distance between matching point pairs, iteratively adjusting the transformation matrix to reduce registration errors. Finally, the algorithm converges when the iteration conditions are met, yielding an optimal transformation matrix and achieving precise point cloud registration. The results show that the algorithm performs well in Chinese fir forest stands of different age groups (average RMSE—horizontal: 4.27 cm; vertical: 3.86 cm) and achieves high accuracy in single-tree crown vertex detection and tree height estimation (average F-score: 0.90; R2 for tree height estimation: 0.88). This study demonstrates that the NDT-VGICP algorithm can effectively fuse and collaboratively apply multi-platform LiDAR data, providing a methodological reference for accurately quantifying individual tree parameters and efficiently monitoring 3D forest stand structures.
(This article belongs to the Section Forest Inventory, Modeling and Remote Sensing)
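To make the "match, estimate transform, apply" loop at the heart of the fine-registration stage concrete, here is a minimal point-to-point ICP iteration in NumPy/SciPy. This is only a generic illustration of the iterative refinement idea; the paper's VGICP stage operates on voxelized normal-distribution models with feature weighting rather than raw point pairs.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(source: np.ndarray, target: np.ndarray, iters: int = 30):
    """Rigidly align source (N, 3) to target (M, 3) with point-to-point ICP."""
    src = source.copy()
    T = np.eye(4)
    tree = cKDTree(target)
    for _ in range(iters):
        _, idx = tree.query(src)                 # nearest-neighbour correspondences
        tgt = target[idx]
        mu_s, mu_t = src.mean(axis=0), tgt.mean(axis=0)
        H = (src - mu_s).T @ (tgt - mu_t)        # cross-covariance matrix
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:                 # guard against reflections
            Vt[-1, :] *= -1
            R = Vt.T @ U.T
        t = mu_t - R @ mu_s
        src = src @ R.T + t                      # apply the incremental transform
        step = np.eye(4)
        step[:3, :3], step[:3, 3] = R, t
        T = step @ T
    return T, src
```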
Figures: (1) Location of the study area: Fujian Province, Nanping City, the topographic map of Shunchang County, an aerial view of site distribution, and the UAV-LiDAR, LiDAR-SLAM, and ground data survey; (2) stand conditions for young-growth, half-mature, near-mature, mature, and over-mature forests; (3) technical flowchart; (4) NDT coarse registration algorithm flowchart; (5) schematic of the VGICP precision registration algorithm: voxel grid construction, downsampling of the source and target point clouds, calculation of voxel normal vectors, and construction of the point-voxel transformation field; (6) workflow of the improved individual tree segmentation method combining the rasterized canopy height model (CHM) and point cloud clustering; (7) single-tree segmentation process based on horizontal distance and edge distance, using rolling segmentation blocks; (8) registration effects of the three algorithms on Chinese fir plantations across age groups (plots Y-1, H-3, N-1, M-2, and O-1 as examples); (9) registration effects of the three algorithms on individual Chinese fir trees of different age groups (white points: registered UAV-LiDAR data; colour-rendered points: LiDAR-SLAM data); (10) differential analysis of individual tree crown delineation apex detection based on the NDT-ICP, NDT-GICP, and NDT-VGICP algorithms; (11) main effects of age groups and the three registration algorithms on the ITCD F-score and tree height RMSE using Tukey's test; (12) comparison of the optimized and traditional registration algorithms across age groups; (13) accuracy evaluation of remote sensing-derived tree height at the individual tree and stand scales; (14) accuracy evaluation of remote sensing-derived tree height for the different age groups.
15 pages, 6009 KiB  
Article
Positioning Method for Unmanned Aerial Vehicle (UAV) Based on Airborne Two-Dimensional Laser Doppler Velocimeter: Experiment and Dead Reckoning
by Lanjian Chen, Chongbin Xi, Shilong Jin and Jian Zhou
Drones 2024, 8(12), 751; https://doi.org/10.3390/drones8120751 - 12 Dec 2024
Viewed by 251
Abstract
In the autonomous navigation of drones, obtaining highly accurate flight velocity information is of significant importance for improving positioning accuracy. Traditional microwave and acoustic velocity measurement methods suffer from poor precision and susceptibility to interference. In this study, an unmanned aerial vehicle (UAV)-mounted two-dimensional laser Doppler velocimeter (LDV) was developed and investigated, and a drone flight navigation and positioning experiment was carried out. The UAV-mounted two-dimensional LDV prototype developed in this study applies a dual-beam measurement scheme in which the two measurement beams share a focusing lens group. After process integration, the performance of the prototype was measured: velocity measurements with a high signal-to-noise ratio can be achieved using the two measurement probe beams within a working distance range of 40 m to 60 m. In the flight experiment, the flight trajectory calculated from the LDV-measured velocity data was compared with the global navigation satellite system (GNSS)-recorded trajectory. The result shows that the LDV can achieve an odometer accuracy of 4.8‰. This study validates the feasibility of the laser Doppler velocimeter in drone navigation and positioning, providing a novel method for reliable and high-precision velocity measurement in autonomous drone navigation.
(This article belongs to the Special Issue Drones Navigation and Orientation)
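The dead-reckoning step behind the trajectory comparison can be sketched as a simple integration: rotate the LDV's body-frame velocity components into the navigation frame using the heading angle and accumulate displacement at each sample. The planar layout, array names, and sampling interval below are illustrative assumptions, not the paper's processing chain.

```python
import numpy as np

def dead_reckon(vx_body, vz_body, heading, dt):
    """Integrate body-frame velocities into a planar trajectory.

    vx_body, vz_body: body-frame velocity samples (m/s); heading: heading angle (rad);
    dt: sampling interval (s). All arrays share the same length.
    """
    x = np.zeros(len(vx_body) + 1)
    y = np.zeros(len(vx_body) + 1)
    for k, (vx, vz, psi) in enumerate(zip(vx_body, vz_body, heading)):
        vn = vx * np.cos(psi) - vz * np.sin(psi)   # rotate into the navigation frame
        ve = vx * np.sin(psi) + vz * np.cos(psi)
        x[k + 1] = x[k] + vn * dt                  # accumulate displacement
        y[k + 1] = y[k] + ve * dt
    return x, y
```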
Figures: (1) Airframe state of a UAV in flight and the geometric relationship between velocity components; (2) schematic diagram of the UAV's coordinate system (m-system); (3) configuration of the multiplex transmitter for the two measurement beams in the 2D LDV (external and inner structure); (4) internal optical path structure of the 2D airborne LDV (top and side views); (5) beam waist position calibration (field image and calibration process); (6) Doppler quality factor measured at different working distances during waist position calibration for beams 1 and 2; (7) directions of the UAV velocity components; (8) triangle flight route diagram; (9) flight experiment site; (10) flight experiment results: raw 2D LDV velocity data, variation of UAV altitude, comparison of LDV- and GNSS-measured x-axis and z-axis velocities after data processing, and the Doppler quality factor of the two beams during flight; (11) experimental site photographs of the UAV mounted with the 2D LDV and in the takeoff and hovering phases; (12) flight experiment results: raw two-channel LDV output and comparison of LDV- and GNSS-measured x-axis, z-axis, and resultant velocities; (13) dead reckoning results: flight path derived from DR versus the GNSS-recorded trajectory, and the variation in positional error during flight; (14) variation in heading angle during flight.
18 pages, 6258 KiB  
Article
Rice Yield Estimation Based on Cumulative Time Series Vegetation Indices of UAV MS and RGB Images
by Jun Li, Weiqiang Wang, Yali Sheng, Sumera Anwar, Xiangxiang Su, Ying Nian, Hu Yue, Qiang Ma, Jikai Liu and Xinwei Li
Agronomy 2024, 14(12), 2956; https://doi.org/10.3390/agronomy14122956 - 12 Dec 2024
Viewed by 181
Abstract
Timely and accurate yield estimation is essential for effective crop management and the grain trade. Remote sensing has emerged as a valuable tool for monitoring rice yields; however, many studies concentrate on a single period or simply aggregate multiple periods, neglecting the complexities underlying yield formation. This study enhances yield estimation by integrating cumulative time series vegetation indices (VIs) from multispectral (MS) and RGB (Red, Green, Blue) sensors to identify optimal combinations of growth periods. We utilized two unmanned aerial vehicles to capture spectral information from rice canopies with MS and RGB sensors. By analyzing the correlations between vegetation indices from the different sensors and rice yields, the optimal MS-VIs and RGB-VIs for each period were identified. Following this, the relationship between the cumulative time series of MS-VIs, RGB-VIs, and rice yields was further examined. The results demonstrate that the booting stage is a crucial growth period influencing rice yield, with the correlation between VIs and yield increasing to a peak during this stage before declining. For the MS sensor, the rice yield model based on the cumulative time series of MS-VIs from the tillering stage to the panicle initiation stage achieves the best accuracy (R2 = 0.722, RRMSE = 0.555). For the RGB sensor, the rice yield model based on the cumulative time series of RGB-VIs from the tillering stage to the grain-filling stage yields the highest accuracy (R2 = 0.727, RRMSE = 0.526). In comparison, the multi-sensor rice yield model, which combines the cumulative time series of MS-VIs from the tillering stage and RGB-VIs from the panicle initiation to grain-filling stages, achieves the highest accuracy, with R2 = 0.759 and RRMSE = 0.513. These findings suggest that cumulative time series VIs and the integration of multiple sensors enhance yield prediction accuracy, providing a comprehensive approach for estimating rice yield dynamics and supporting precision agriculture and informed crop management.
(This article belongs to the Section Precision and Digital Agriculture)
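The cumulative time-series idea described above can be sketched as follows: sum a vegetation index over a span of consecutive growth stages and regress the cumulative value against measured yield. Stage names, array shapes, and the placeholder data below are assumptions for illustration, not the study's measurements.

```python
import numpy as np

STAGES = ["tillering", "jointing", "booting", "heading", "filling", "maturity"]

def cumulative_vi(vi_by_stage: dict, start: str, end: str) -> np.ndarray:
    """Sum per-plot VI values over the stages from `start` to `end` (inclusive)."""
    span = STAGES[STAGES.index(start): STAGES.index(end) + 1]
    return np.sum([vi_by_stage[s] for s in span], axis=0)

# Illustrative placeholder data: 48 plots, one VI value per plot per stage.
rng = np.random.default_rng(0)
vi_by_stage = {s: rng.random(48) for s in STAGES}
yield_kg_ha = 6000 + 3000 * rng.random(48)

cum = cumulative_vi(vi_by_stage, "tillering", "filling")
slope, intercept = np.polyfit(cum, yield_kg_ha, 1)      # simple linear yield model
print(f"yield ~ {slope:.1f} * cumVI + {intercept:.1f}")
```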
Figures: (1) Study location and field experimental layout; (2) schematic diagram of ∑(MS-VIs & RGB-VIs); (3) correlation between vegetation indices and yield across six growth stages of rice (** indicates the 0.01 significance level); (4) coefficient of determination R2 and relative root mean square error RRMSE between rice yield and cumulative multispectral vegetation indices (∑MS-VIs) at different growth stages (T: tillering; J: jointing; B: booting; H: heading; F: filling; M: maturity; diagonal values are single-stage results, other values include data from previous stages up to the indicated stage, best result highlighted in red); (5) R2 and RRMSE of rice yield and ∑RGB-VIs (same legend); (6) R2 and RRMSE of rice yield and the cumulative time series MS-VIs & RGB-VIs; (7) rice yield maps from field measurements and from the optimal cumulative time series of vegetation indices derived from MS, RGB, and fused MS-RGB data; (8) rice yield estimation models based on the optimal period combinations of multispectral, RGB, and multi-source sensor data.
22 pages, 6400 KiB  
Article
A Novel Spherical Shortest Path Planning Method for UAVs
by Fan Liu, Pengchuan Wang, Aniruddha Bhattacharjya and Qianmu Li
Drones 2024, 8(12), 749; https://doi.org/10.3390/drones8120749 - 12 Dec 2024
Viewed by 325
Abstract
As a central subdivision of the low-altitude economy industry, industrial and consumer drones have broad market application prospects and are becoming the primary focus of the low-altitude economy; however, with increasing aircraft density, effective planning of reasonable flight paths and avoiding conflicts between [...] Read more.
As a central subdivision of the low-altitude economy industry, industrial and consumer drones have broad market application prospects and are becoming the primary focus of the low-altitude economy; however, with increasing aircraft density, effective planning of reasonable flight paths and avoiding conflicts between flight paths have become critical issues in UAV clustering. Current UAV path planning often concentrates on 2D and 3D scenes, which does not meet the actual requirements of realistic spherical paths. This paper proposes a Gradient-Based Optimization algorithm built on a State Transition function (STGBO) to address the spherical path planning problem for UAV clusters. The state transition function is applied at medium- and high-dimensional city scales, balancing the stability and efficiency of the algorithm. Through evaluations and comparisons with many mainstream meta-heuristic algorithms, STGBO demonstrates superior performance and effectiveness in solving Medium-Altitude Unmanned Aerial Vehicle (MUAV) path planning problems on three-dimensional spherical surfaces, contributing to the development of the low-altitude economy. Full article
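To make the spherical setting concrete, the sketch below evaluates the objective such a planner must minimize: the total great-circle length of a tour visiting cities on a sphere, with cities parameterized by (u, v) as in Figure 1 of the article. This is only the cost function under illustrative assumptions (random cities on a unit sphere, a random visiting order); the STGBO algorithm itself, with its state transition function and gradient-based search, is not reproduced here.

```python
# Minimal sketch of the objective a spherical path planner must minimize:
# the total great-circle length of a tour visiting "cities" on a sphere.
import numpy as np

def to_xyz(u: np.ndarray, v: np.ndarray, r: float = 1.0) -> np.ndarray:
    """Map spherical parameters u (azimuth, 0..2pi) and v (polar, 0..pi) to 3D points."""
    return r * np.stack([np.sin(v) * np.cos(u),
                         np.sin(v) * np.sin(u),
                         np.cos(v)], axis=-1)

def great_circle(p: np.ndarray, q: np.ndarray, r: float = 1.0) -> float:
    """Shortest on-sphere distance between two points (arc length, not chord)."""
    cos_angle = np.clip(np.dot(p, q) / r**2, -1.0, 1.0)
    return r * np.arccos(cos_angle)

def tour_length(points: np.ndarray, order: np.ndarray, r: float = 1.0) -> float:
    """Total great-circle length of a closed tour visiting points in the given order."""
    return sum(
        great_circle(points[order[i]], points[order[(i + 1) % len(order)]], r)
        for i in range(len(order))
    )

# Toy instance: 50 random cities on a unit sphere, evaluated with a random tour.
rng = np.random.default_rng(1)
u = rng.uniform(0.0, 2 * np.pi, 50)
v = np.arccos(rng.uniform(-1.0, 1.0, 50))   # polar angle for uniform sampling
cities = to_xyz(u, v)
order = rng.permutation(50)
print(f"random tour length: {tour_length(cities, order):.3f}")
```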
Show Figures
Figure 1: Example of candidate solution generation. (a) Sphere; (b) different u and v coordinate positions; (c) the shortest distance from p1 to p2.
Figure 2: Spanning tree corresponding to the Prufer code.
Figure 3: Gradient estimation using x_n and its neighboring positions.
Figure 4: Algorithm flowchart for solving the spherical optimal path.
Figure 5: Randomly generated cities (100 cities).
Figure 6: Convergence curves (50 cities).
Figure 7: Convergence curves (100 cities).
Figure 8: Minimum values across 30 runs (50 cities).
Figure 9: Minimum values across 30 runs (100 cities).
Figure 10: Convergence curves (200 cities).
Figure 11: Minimum values across 30 runs (200 cities).
Figure 12: Best route for 50 cities.
Figure 13: Best route for 100 cities.
Figure 14: Convergence curves (500 cities).
Figure 15: Convergence curves (1000 cities).
Figure 16: Minimum values across 30 runs (500 cities).
Figure 17: Minimum values across 30 runs (1000 cities).
23 pages, 7072 KiB  
Article
Prescribed-Time Cooperative Guidance Law for Multi-UAV with Intermittent Communication
by Wenhui Ma and Xiaowen Guo
Drones 2024, 8(12), 748; https://doi.org/10.3390/drones8120748 - 11 Dec 2024
Viewed by 435
Abstract
Networked cooperation of multiple unmanned aerial vehicles (UAVs) is significant, but their cooperative guidance presents challenges due to weak cross-domain communication. Given the difficulties in securing information transmission, this paper proposes a prescribed-time cooperative guidance law (PTCGL) for networked multiple UAVs with intermittent communication. Supported by [...] Read more.
Networked cooperation of multiple unmanned aerial vehicles (UAVs) is significant, but their cooperative guidance presents challenges due to weak cross-domain communication. Given the difficulties in securing information transmission, this paper proposes a prescribed-time cooperative guidance law (PTCGL) for networked multiple UAVs with intermittent communication. Supported by directed internal communication, the proposed PTCGL ensures simultaneous arrival even when the pinning external communication is time-triggered and intermittent. In the first stage, PTCGL is designed by combining a time-scaling function with second-order intermittent pinning consensus. With the desired convergence performance, the convergence time can, in theory, be pre-specified independently of initial conditions and parameter tuning. In the second stage, starting from the suitable initial conditions reached at the prescribed convergence time, the multiple UAVs are organized by proportional navigation guidance so as to ensure cooperation despite weak communication. Finally, simulations are conducted to verify the effectiveness and robustness of the proposed cooperative guidance law. Full article
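For readers unfamiliar with prescribed-time control, the sketch below illustrates the generic time-scaling idea in its simplest form: a first-order consensus protocol whose gain grows as mu(t) = T/(T - t), driving agreement by a user-chosen time T regardless of initial states. The graph, the first-order dynamics, and this particular scaling function are illustrative assumptions; the paper's PTCGL instead builds on second-order intermittent pinning consensus followed by proportional navigation guidance.

```python
# Minimal sketch of the generic prescribed-time scaling idea (not the paper's PTCGL).
import numpy as np

T = 5.0                     # prescribed convergence time [s]
dt = 1e-3                   # integration step
k = 2.0                     # base consensus gain

# Directed ring over 4 agents: Laplacian L with zero row sums.
L = np.array([[ 1, -1,  0,  0],
              [ 0,  1, -1,  0],
              [ 0,  0,  1, -1],
              [-1,  0,  0,  1]], dtype=float)

x = np.array([3.0, -1.0, 0.5, 4.0])     # arbitrary initial states

t = 0.0
while t < 0.995 * T:                    # stop just before the scaling blows up
    mu = T / (T - t)                    # time-scaling function, unbounded as t -> T
    x = x + dt * (-k * mu * (L @ x))    # forward-Euler step of x_dot = -k*mu*L*x
    t += dt

# Disagreement should be driven close to zero by the prescribed time T.
print("disagreement near t = T:", np.round(x - x.mean(), 6))
```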
Show Figures
Figure 1: 3-D cooperative guidance geometry.
Figure 2: Schematic diagram of intermittent pinning communication.
Figure 3: Block diagram of PTCGL.
Figure 4: Communication topology.
Figure 5: Simulation results of Case 1.
Figure 6: Consumption indicator and commands with τ_p = 0.5 s, 1 s, and 2 s.
Figure 7: Simulation results of Case 2.
Figure 8: Consumption indicator and commands with τ_p = 0.5 s, 1 s, and 2 s.
Figure 9: Intermittent sequence of Case 3.
Figure 10: Simulation results of Case 3.
Figure 11: Simulation results of Case 4.
Figure 12: Simulation results of Case 5.