Article

Flight Altitude and Sensor Angle Affect Unmanned Aerial System Cotton Plant Height Assessments

Oluwatola Adedeji, Alwaseela Abdalla, Bishnu Ghimire, Glen Ritchie and Wenxuan Guo

1 Department of Plant and Soil Science, Texas Tech University, Lubbock, TX 79409, USA
2 Department of Soil and Crop Sciences, Texas A&M AgriLife Research, Lubbock, TX 79403, USA
* Author to whom correspondence should be addressed.
Drones 2024, 8(12), 746; https://doi.org/10.3390/drones8120746
Submission received: 12 November 2024 / Revised: 3 December 2024 / Accepted: 4 December 2024 / Published: 10 December 2024
(This article belongs to the Special Issue Advances of UAV Remote Sensing for Plant Phenology)

Abstract

Plant height is a critical biophysical trait indicative of plant growth and developmental conditions and is valuable for biomass estimation and crop yield prediction. This study examined the effects of flight altitude and camera angle on quantifying cotton plant height using unmanned aerial system (UAS) imagery. The study was conducted in a field with a sub-surface irrigation system in Lubbock, Texas, between 2022 and 2023. Images were collected with a DJI Phantom 4 RTK at two altitudes (40 m and 80 m) and three sensor angles (45°, 60°, and 90°) at different growth stages, yielding six scenarios of UAS altitude and camera angle. Plant height was derived as the vertical difference between the apical region of the plant and the ground elevation. Linear regression compared UAS-derived heights to manual measurements from 96 plots. The lower altitude (40 m) outperformed the higher altitude (80 m) across all dates. For the early season (4 July 2023), the 40 m altitude had r2 = 0.82–0.86 and RMSE = 2.02–2.16 cm, compared to r2 = 0.66–0.68 and RMSE = 7.52–8.76 cm at 80 m. Oblique angles (45°) yielded higher accuracy than nadir (90°) images, especially in the late season (24 October 2022; r2 = 0.96, RMSE = 2.95 cm vs. r2 = 0.92, RMSE = 3.54 cm). These findings guide the selection of optimal UAS parameters for plant height measurement.

1. Introduction

Plant height is a vital indicator of crop productivity since it primarily reflects a crop’s growth and health status [1]. It is a crucial phenotypic trait in crop breeding and a critical parameter for estimating biomass and productivity in crops including cotton [2,3,4], barley [5], rice [6,7], wheat [1,8], and maize [9,10,11]. Understanding plant height dynamics throughout the growing season is essential for evaluating genetic traits, plant physiological characteristics, and environmental effects [11,12,13]. For instance, Sui et al. (2012) highlighted plant height as a crucial indicator of plant health and yield potential, demonstrating its invaluable role in optimizing agricultural practices [12]. Furthermore, Sharma and Ritchie (2015) investigated the relationship between cotton plant height and yield, providing evidence of the strong effect of plant height on cotton lint yield [13]. These studies highlight plant height estimation as a vital component of precision agriculture, as it can enhance effective and efficient field management practices and help assess crop performance, particularly for commercial crops like cotton [4,14].
Plant height is traditionally measured manually [1], which is laborious and time-intensive, especially at large scales, and hence inadequate for meeting the demands of high-throughput phenotyping [11]. Moreover, reading and recording errors are common, especially under unfavorable weather conditions [1]. Proximal and remote sensing technologies have been widely deployed to acquire high-spatial-resolution data for estimating plant height [13,14]. Although ground-based sensing techniques often generate high-quality data, the acquisition process is time-intensive and has limited coverage [2]. Unmanned aerial systems (UASs) with various sensors have been widely applied in high-throughput plant phenotyping, including estimating plant height [15,16,17,18,19], as they obtain high-spatial-resolution and high-temporal-resolution images at low cost, with flexible revisit frequency and low flying altitude [2,9,15]. For instance, Bendig et al. (2014) estimated summer barley height from crop surface models using UAS-based data [5]. Zhou et al. (2020) quantitatively assessed the changes in plant height of lodged maize at different growth stages using a UAS equipped with a light detection and ranging (LiDAR) sensor [10]. Xu et al. (2019) developed a methodology to estimate cotton phenotypic traits, such as plant height, canopy cover, and flowers, using multispectral images obtained from a UAS [3].
Despite the numerous advantages of UASs, many factors affect their performance in plant phenotyping, including flight parameters, image quality, image processing algorithms, ground control points, and vegetation structure [17,20]. Among these, flight altitude and camera angle significantly impact phenotyping accuracy, particularly for plant height. Numerous studies have explored these parameters, revealing that low-altitude flights and oblique camera angles enhance phenotyping accuracy [16,18,19]. For instance, Fujiwara et al. (2022) observed that oblique (tilted) camera angles provide better corn plant height estimations than nadir (directly downward) angles [19]. Generally, low-altitude flights produce higher image resolutions, improving target representation and measurement accuracy.
Several studies have investigated the impact of UAS altitude and camera angle on crop plant height estimation [18,19,21,22,23,24], but they typically examine either flight altitude or camera angle in isolation. For instance, Sadeghi and Sohrabi (2019) evaluated the effect of five UAS altitudes (60, 80, 100, 120, and 140 m) on the accuracy of tree height extraction [22], while Fujiwara et al. (2022) compared UAS images at two camera angles (60° and 90°) for predicting corn plant height [19]. No studies have explored the interaction effects of UAS altitude and camera angle when estimating cotton plant height.
The image-processing method also strongly influences the accuracy and replicability of plant height estimation. In processing UAS images, studies have derived plant height by acquiring plant canopy images during the growing season and obtaining ground elevation by processing off-season images (pre-germination or post-harvest) to generate a digital terrain model (DTM) [25,26,27,28]. However, this method necessitates an additional flight and is often inaccurate [19]. Moreover, models derived from different flights tend to deviate due to uncertainties in coordinates and distortions in the 3D models. This study adopts a method that extracts plant height from image scenes containing both plant canopy and soil surface within the in-season digital surface model (DSM) [29]. This approach has been shown to enhance efficiency and reduce processing time [17,30]. By adopting this method and analyzing the relationship between acquisition parameters and plant height estimation, valuable insights can be gained to optimize UAS-based phenotyping for cotton and other crops. Therefore, the objective of this study was to assess the effects of sensor angle and flight altitude on the performance of estimating cotton plant height using UAS images.

2. Materials and Methods

2.1. Experimental Site

This study was conducted in a research field (33°41′36.45″ N, 101°54′18.61″ W) in Lubbock County, Texas, in 2022 and 2023 (Figure 1). The region is semi-arid, with annual precipitation of approximately 500 mm falling mainly between May and September. The average wind speed is about 5.5 m s⁻¹, with high winds occurring during winter and spring. The dominant soil type of the field is Pullman clay loam (fine, mixed, superactive, thermic Torrertic Paleustolls) with good drainage and moderately high saturated hydraulic conductivity.
Variations in plant height at different growth stages mainly resulted from the irrigation and cotton variety treatments. Three irrigation rates, replacing 30% (low), 60% (medium), and 90% (high) of evapotranspiration (ET), were implemented using a sub-surface drip irrigation system. Four cotton varieties were planted with four replications: FM 1730GLTP, FM 2398GLTP, FM 1830GLT, and ST 5471GLTP (BASF, Ludwigshafen, Germany). The experiment consisted of 96 plots, each with four rows of cotton, ~4 m wide and ~8 m long. An alley of ~1.5 m separated adjacent ranges.

2.2. Data Acquisition

A DJI Phantom 4 RTK (real-time kinematic) UAS (DJI, Shenzhen, China) with a 4K camera was used to acquire RGB images (Figure 2). The UAS has a 2-axis gimbal that maintains the camera orientation independently of platform movement. It is controlled via a 2.4 GHz bidirectional transmission link that relays data including Global Positioning System (GPS) status and the distance and height from the home point where the UAS takes off. The RTK system includes two D-RTK 2 High-Precision GNSS receivers, one on a mobile station (DJI, Shenzhen, China) and the other on the UAS (Figure 2c). The RTK Global Navigation Satellite System (GNSS) receiver has an accuracy of 1.5 cm vertically and 1.0 cm horizontally. Each flight plan was created using the DJI GS Pro software (DJI, Shenzhen, China) on the controller.
UAS images were acquired around local solar noon on 28 August and 24 October 2022 and on 4 July and 2 August 2023, each under light-to-moderate wind conditions. These dates represent growth stages with various plant sizes in the early, mid, and late seasons. Missions were flown at altitudes of 40 m and 80 m with camera angles of 45°, 60°, and 90° relative to the land surface (Figure 3). The UAS was flown at 2.4 m s⁻¹ with 80% front and side overlap. The resulting images represented six scenarios of UAS altitudes and camera angles (Table 1).
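For reference, the image resolutions in Table 1 follow directly from the camera geometry. The sketch below computes the nominal nadir ground sampling distance (GSD) in R; the sensor width, focal length, and image width are assumed Phantom 4 RTK specifications, not values reported in this study.

```r
# Nominal ground sampling distance (GSD) for a nadir image:
#   GSD (cm/px) = sensor_width * altitude * 100 / (focal_length * image_width_px)
# Camera specs below are assumed DJI Phantom 4 RTK values, not reported in this study.
sensor_width_mm <- 13.2   # 1-inch CMOS sensor width
focal_length_mm <- 8.8    # lens focal length
image_width_px  <- 5472   # image width in pixels

gsd_cm <- function(altitude_m) {
  (sensor_width_mm * altitude_m * 100) / (focal_length_mm * image_width_px)
}

gsd_cm(40)   # ~1.10 cm/px, close to the 1.21 cm reported for 40 m-90° in Table 1
gsd_cm(80)   # ~2.19 cm/px, close to the 2.21 cm reported for 80 m-90° in Table 1
```

The slightly coarser resolutions reported for the oblique scenarios in Table 1 (e.g., 1.48 cm at 40 m-45°) are consistent with the longer slant viewing distance away from nadir.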

2.3. UAS Image Processing

Figure 4 presents the processing workflow for estimating cotton plant height from UAS images. Figure 4a outlines the steps in analyzing the raw RGB images captured at various flight altitudes and camera angles using Pix4D Mapper software (Pix4D S.A., Prilly, Switzerland). Pix4D performed an initial processing step of aligning the images and generating a high-density point cloud, which was then used to create a digital surface model (DSM) (Figure 4b,c). The software used Structure-from-Motion (SfM), a photogrammetry technique that creates a three-dimensional point cloud from two-dimensional image sequences; SfM incorporates GPS information to enhance the accuracy and georeferencing of the 3D point cloud and DSM [31]. Additionally, the software generated orthomosaic images of the field by blending and stitching images using the georeferenced information from the dense point clouds (Figure 4d). The DSM and orthomosaic images were produced from the raw images for each date.
The mosaicked images were then segmented into two classes, cotton canopy and soil, using the maximum likelihood algorithm of the supervised classification method in ArcGIS Pro (Figure 4e). Regions where cotton canopy and soil intersected were excluded to mitigate edge effects and ensure an accurate representation of plant height. The two classes were superimposed onto the DSM so that elevation values could be extracted separately for each class (Figure 4f).
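To make the maximum likelihood rule concrete, the base-R sketch below assigns a pixel’s RGB vector to whichever class gives it the higher Gaussian log-likelihood. This is only a minimal illustration of the decision rule with made-up training pixels; the actual classification was performed in ArcGIS Pro, and none of the variable names or values come from the study.

```r
# Minimal illustration of the maximum likelihood classification rule
# (the study used ArcGIS Pro; training pixels here are hypothetical).
set.seed(1)
canopy_train <- matrix(rnorm(300, mean = c(60, 120, 50), sd = 10), ncol = 3, byrow = TRUE)
soil_train   <- matrix(rnorm(300, mean = c(150, 120, 90), sd = 12), ncol = 3, byrow = TRUE)

gauss_loglik <- function(x, mu, sigma) {
  # multivariate normal log-density (constant term omitted)
  d <- x - mu
  -0.5 * log(det(sigma)) - 0.5 * sum(d * solve(sigma, d))
}

# Class parameters estimated from the training pixels
params <- list(
  canopy = list(mu = colMeans(canopy_train), sigma = cov(canopy_train)),
  soil   = list(mu = colMeans(soil_train),   sigma = cov(soil_train))
)

classify_pixel <- function(px, params) {
  ll <- sapply(params, function(p) gauss_loglik(px, p$mu, p$sigma))
  names(params)[which.max(ll)]   # class with the highest likelihood
}

classify_pixel(c(70, 115, 55), params)  # expected: "canopy"
```

With equal class priors, choosing the class with the highest Gaussian log-likelihood is exactly the maximum likelihood decision used to produce the canopy/soil map.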
Plant height was determined from the upper region of the DSM within the canopy class, represented by a high percentile of the DSM values. Representative percentiles used in various studies range from the 90th to the 99th, as in Holman et al. (2016) and Tirado et al. (2020) [29,32]. These percentiles effectively capture the apical region of cotton plants, providing a more robust representation than the mean and less susceptibility to noise than the maximum. After evaluating different percentiles, the 99th percentile was selected to avoid misrepresenting plant height and to ensure an accurate depiction of the upper boundary of the canopy class.
The DSM values in the soil class, serving as the digital terrain model (DTM), were averaged to represent the ground elevation of each plot; averaging was justified by the insignificant elevation variation within each plot. Studies have shown that this approach, distinct from obtaining the DTM from a separate pre-planting flight, increases efficiency and reduces processing time during the crop season [17,30]. Cotton plant height was computed as the vertical difference between the plant’s apical region and the ground elevation.
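The per-plot height computation thus reduces to a percentile-minus-mean operation on the classified DSM values. A minimal R sketch, assuming the DSM pixels have already been split by class; the vectors and the 920 m datum are hypothetical stand-ins for the per-plot raster values:

```r
# Per-plot plant height from classified DSM values (hypothetical inputs).
# canopy_dsm: DSM elevations (m) of pixels classified as cotton canopy
# soil_dsm:   DSM elevations (m) of pixels classified as bare soil
plot_height_cm <- function(canopy_dsm, soil_dsm, pct = 0.99) {
  apex   <- quantile(canopy_dsm, probs = pct, names = FALSE)  # 99th percentile of canopy
  ground <- mean(soil_dsm)                                    # average soil elevation (DTM)
  (apex - ground) * 100                                       # height in cm
}

# Example with simulated elevations around a 920 m field datum:
set.seed(42)
canopy <- 920 + runif(500, 0.10, 0.55)   # canopy surface 10-55 cm above datum
soil   <- 920 + rnorm(200, 0, 0.01)      # nearly flat ground
plot_height_cm(canopy, soil)             # ~54 cm
```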

2.4. Plant Height Measurements

On each day of UAS image acquisition, cotton plant height was measured manually using a yardstick. For each plot, the average plant height was determined from plants within a predefined 1 m² sampling point, which was marked in the field to allow precise identification of the sampling area. Within each sampling point, plants exhibiting uniform growth and no significant gaps were chosen, ensuring that the measurements represented the canopy without interference from empty spaces. The RTK system ensured high positional accuracy when linking the ground-based measurements to the corresponding locations in the UAS-derived orthomosaic, minimizing alignment errors and enabling reliable integration of the datasets.

2.5. Statistical Analyses

The Zonal Statistics tool in the Spatial Analyst extension of ArcGIS Pro (Esri, Redlands, CA, USA) was applied to summarize the image data for each plot. In total, 70% of the data were used to build the models, while the remaining 30% of the measured plant heights were used to validate the UAS-derived plant heights. Statistical analyses were performed using the R language [33]. Analysis of variance (ANOVA) was conducted using the aov function to test for significant differences in cotton plant height measurements across camera angles and flight altitudes. Subsequently, Tukey’s post hoc test was conducted to explore pairwise differences among the means of the camera angles, providing a more in-depth examination of the variation identified by the ANOVA. Simple linear regression models were fitted to relate the UAS-derived plant heights to the manual measurements. Model accuracy was evaluated using the coefficient of determination (R2), root mean square error (RMSE), and the slope of the regression line to assess how well the predicted values matched the measured plant heights. The plant height error was calculated as the difference between the UAS-derived and measured plant heights, providing a direct measure of the prediction error. Furthermore, the best-performing UAS scenario, whose model was developed using 70% of the data, was validated by predicting plant height across different dates using the remaining 30% of the dataset. This approach assessed the model’s performance across the growth stages to demonstrate its reliability.
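The following condensed sketch illustrates this pipeline in R. The data frame, column names, and simulated values are hypothetical stand-ins (the study does not publish its code); only the techniques named in the text (aov, Tukey’s HSD via TukeyHSD, and simple linear regression via lm) are taken from the source.

```r
set.seed(7)
# Hypothetical per-plot data; in the study these come from Zonal Statistics output.
d <- data.frame(
  measured = runif(96, 30, 70),                 # manual heights (cm), 96 plots
  altitude = factor(rep(c(40, 80), each = 48)), # flight altitude (m)
  angle    = factor(rep(c(45, 60, 90), times = 32))
)
d$derived <- d$measured + rnorm(96, 0, 2) - ifelse(d$altitude == "80", 2, 0)
d$error   <- d$derived - d$measured             # negative = underestimation

# ANOVA with altitude x angle interaction, then Tukey's post hoc test
fit_aov <- aov(error ~ altitude * angle, data = d)
summary(fit_aov)
TukeyHSD(fit_aov, "angle")

# 70/30 split for model building and validation
idx   <- sample(nrow(d), size = floor(0.7 * nrow(d)))
model <- lm(measured ~ derived, data = d[idx, ])
pred  <- predict(model, newdata = d[-idx, ])
r2    <- cor(pred, d$measured[-idx])^2
rmse  <- sqrt(mean((pred - d$measured[-idx])^2))
c(r2 = r2, rmse = rmse)
```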

3. Results

Figure 5 presents the plant height measurements on 28 August and 24 October 2022 and 4 July and 2 August 2023. Cotton plots were grouped into low, medium, and high sizes, corresponding to low, medium, and high irrigation rates, respectively. The mean height for the high category was 22.8 cm, 51.8 cm, 69.8 cm, and 70.3 cm for the four dates, respectively. The mean height for the medium category was 29.5 cm, 42.9 cm, 60.8 cm, and 59.4 cm, respectively. For the low category, the mean plant heights were 23.8 cm, 34.1 cm, 42.8 cm, and 40.8 cm on these dates, respectively. Overall, on each date, plants in the low category showed greater variation than those in the high and medium categories, as indicated by the standard deviation and coefficient of variation values.
Figure 6 shows the errors in UAS-derived cotton plant heights at different flight altitudes and camera angles on the four dates. The scenario with a flight altitude of 40 m and a camera angle of 45 degrees exhibited lower errors in estimating plant height than other scenarios. For example, on 4 July 2023 (early season), a flight altitude of 40 m and a camera angle of 45 degrees had an average plant height error of 0.2 cm, while the scenario with a flight altitude of 80 m and a camera angle of 90 degrees had a mean plant height error of −2.2 cm. A flight altitude of 80 m and a camera angle of 90 degrees consistently had the highest error range in plant height estimation for all the dates.
Figure 7 shows the interaction of flight altitude and camera angle on the errors of UAS-derived plant height. The interaction between flight altitude and camera angle had a significant effect on plant height estimation for all dates (p < 0.05), indicating that the combination of flight altitude and camera angle can significantly affect the accuracy of plant height measurements. UAS-derived plant height was underestimated at the higher altitude. For example, the mean plant height error at altitudes of 40 m and 80 m at an angle of 45 degrees was 0.05 cm and −1.70 cm, respectively, on 4 July 2023 (early season). The lower altitude and moderate camera angles, such as 45 degrees, resulted in more accurate estimations (−0.13 cm to 0.14 cm), whereas the higher altitude and larger camera angles introduced more substantial errors (−2.77 cm to −5.51 cm).
Figure 8 shows the results of Tukey’s post hoc test for the camera angles at each flight altitude, evaluating errors in plant heights derived from UAS images on the different dates. As altitude increased, the differences among camera angles generally became less pronounced. On 4 July 2023 (early season), at an altitude of 40 m, the analysis revealed a significant difference between the two oblique angles (45° and 60°), as well as between 45 and 90 degrees (p < 0.001). At an altitude of 80 m, however, there was no significant difference between the two oblique angles, and the significance level between 45 and 90 degrees decreased (p < 0.005). Similar trends were observed on 2 August and 28 August (mid and mid–late seasons), where no significant difference was found between the oblique angles at either the 40 m or 80 m altitude. Nevertheless, there was a significant difference in plant height error between the 60-degree and 90-degree angles at 40 m (p < 0.05), while no significant difference was observed at 80 m. Consistent significant differences were found between the 45-degree and 90-degree angles at both altitudes for all dates.
Simple linear regression analyses revealed significant relationships between UAS-derived plant height and measured plant height across the six UAS altitude (40 and 80 m) and angle (45°, 60°, 90°) scenarios, with r2 values ranging from 0.80 to 0.97 for the observation dates (Figure 9). The corresponding root mean square error (RMSE) values ranged from 2.01 cm to 3.85 cm, indicating the relatively high accuracy of the estimations.
On 4 July 2023 (early season), the 40 m-45° scenario exhibited the strongest relationship and highest accuracy (r2 = 0.86, RMSE = 2.01 cm). The 40 m-60° and 40 m-90° scenarios also displayed strong relationships and high accuracies (r2 = 0.85, RMSE = 2.14 cm; r2 = 0.82, RMSE = 2.16 cm, respectively). At 80 m, the relationships were relatively weaker than at 40 m (r2 = 0.66–0.68, RMSE = 7.52–8.76 cm). On 2 August 2023 (mid-season), the relationship between the measured and UAS-derived plant heights was stronger than on 4 July 2023, with r2 values ranging from 0.88 to 0.90, but accuracy was lower, with RMSE values ranging from 3.71 cm to 3.90 cm.
On 28 August 2022 (mid–late season), the 40 m-45° scenario exhibited the strongest relationship and highest accuracy (r2 = 0.97, RMSE = 2.10 cm). The 40 m-60° and 40 m-90° scenarios also displayed stronger relationships and higher accuracies (r2 = 0.94, RMSE = 3.12 cm; r2 = 0.93, RMSE = 3.13 cm, respectively) than the scenarios at the 80 m altitude (r2 = 0.90–0.93, RMSE = 3.31–3.85 cm). Similarly, on 24 October 2022 (late season), the 40 m-45° scenario demonstrated the strongest relationship and highest accuracy (r2 = 0.95, RMSE = 2.94 cm), whereas the 80 m-45° scenario exhibited a relatively weaker relationship and lower accuracy (r2 = 0.94, RMSE = 3.24 cm).
Additionally, the slopes of the linear models ranged from 0.91 to 1.13 across the dates. Scenarios at the 40 m altitude exhibited steeper slopes than the corresponding 80 m scenarios, and the nadir camera angle (90 degrees) had a relatively greater slope than the oblique angles. The results further showed that plant height was slightly underestimated at the higher altitude and nadir angle during the mid and late seasons compared to the lower altitude and oblique angles.
Overall, the results highlight the sensitivity of UAS-derived plant height estimations to both flight altitude and sensor angle. The 40 m altitude consistently yielded stronger relationships and higher plant height estimation accuracies than the 80 m altitude across all camera angles (45, 60, and 90 degrees). The 40 m-45° scenario consistently had the strongest relationship and the highest accuracy throughout the different cotton growth stages. These findings emphasize the importance of carefully selecting flight parameters to optimize the accuracy of UAS-based plant height assessments.
The cross-validation indicated that the model for the 40 m-45° scenario performed best in predicting plant heights for all dates. The relationship between the measured plant heights (30% test dataset) and those predicted by the model developed with the 70% training data for this scenario is presented in Figure 10. The comparison indicates consistently high accuracy for all dates: the linear models predicted plant heights well in the early season (r2 = 0.88, RMSE = 2.20 cm), mid-season (r2 = 0.95, RMSE = 3.20 cm), mid–late season (r2 = 0.96, RMSE = 2.40 cm), and late season (r2 = 0.96, RMSE = 2.56 cm).

4. Discussion

The findings of this study suggest that the accuracy of plant height derived from UAS images can be improved by flying at lower altitudes and using oblique sensor angles. Flight altitude plays a crucial role in determining the spatial resolution and quality of the resulting 3D model of a target. For a sensor with a given field of view, capturing images at lower altitudes produces higher-resolution images, which can enhance the photogrammetric process used to generate the 3D coordinates of a surface. Other studies reported similar results in quantifying plant height using UAS images [18,19,34,35]. Fujiwara et al. (2022) explained that data acquisition at higher altitudes lowers resolution and results in the mixing of neighboring pixels containing both soil and plant features [19]. Most plant phenotyping studies require images acquired at relatively low altitudes for detailed plant structure retrieval [35]. Næsset (2004) investigated the effects of different flying altitudes on biophysical stand properties using a small-footprint airborne scanning laser [36] and found that flying altitude and footprint diameter may affect the canopy height and density metrics derived from laser data. This observation was corroborated by Weiss and Baret (2017) in their investigation of 3D point clouds generated from UAS RGB imagery to characterize the 3D macro-structure of vineyards [37]. Their findings underscore the necessity of optimal flight configuration and camera settings for accurately estimating plant characteristics. Moreover, lower spatial resolution can introduce measurement errors for small features, such as cotton plants, reflected in a systematic shift in the plant height measured from higher altitudes.
This study has also demonstrated that oblique camera angles in UAS imagery can yield better estimations of cotton plant height than nadir angles. As noted in prior research, nadir images captured directly above the target tend to underestimate plant height, compromising estimation accuracy [4,32]. Additionally, oblique imagery captures more intricate details with denser point clouds, enabling more precise differentiation of plant profiles. Several studies have confirmed that nadir imagery provides limited information on plant canopy surfaces. Fujiwara et al. (2022) predicted maize plant height using UAS images at camera angles of 60 degrees and 90 degrees [19]; prediction was better at the 60-degree angle, with lower mean absolute errors, and they concluded that an oblique camera angle could reduce obstruction error and better capture the 3D structure of plants. Li et al. (2021) investigated the impact of camera viewing angles on the accuracy of estimating leaf parameters of wheat plants using 3D point clouds [18], demonstrating that oblique imagery accurately modeled wheat plants and generated denser point clouds than nadir imagery. These findings align with previous research showing that oblique angles can mitigate systematic errors in plant height estimation and generate high-quality 3D models [38,39]. Overall, these results highlight the potential of oblique imagery to improve crop monitoring practices.
Although oblique angles showed the potential to improve plant height estimation accuracy, they have some drawbacks. The complexity of oblique imagery compared to nadir imagery stems from intricate flight path planning: oblique imagery requires multiple passes from different angles, making flight planning and execution more demanding and time-consuming. Mosaicking oblique images is also more complex, as the varying angles require precise alignment and integration to create a seamless final product, leading to longer processing times. In addition, oblique angles can result in more occlusion of plant features, particularly those on the far side of the canopy, leading to incomplete data and lower accuracy [11,18]. Finally, oblique angles may require more post-processing and analysis time than nadir images, as the point clouds generated from oblique images can be more complex and require additional steps to convert into usable data [11]. Despite these drawbacks, oblique angles remain a promising approach for improving the accuracy of plant height estimation from UAS imagery. However, flying at a higher altitude substantially reduces the accuracy of oblique-angle height estimates because of the combined effects of reduced image resolution, increased ground sampling distance, shadow effects, a lower signal-to-noise ratio, and limitations in sensor resolution.
This study also found that UAS images acquired at higher altitudes and nadir angles slightly underestimated crop heights compared to lower altitudes and oblique angles, consistent with previous studies [40,41]. Overall, however, the results showed an improvement in accuracy compared to similar studies [5,21,41]. Further research is needed to determine the underlying causes of the over- and underestimations for different plant sizes.
Based on the observed effects of sensor angle and flight altitude on estimating cotton plant height, this study emphasizes the importance of carefully selecting flight parameters to optimize the accuracy of UAS-based plant height assessments. It also highlights the practical implications of using UAS imagery in plant phenotyping for crop improvement. Adjusting UAS flight parameters, such as flying at lower altitudes and using oblique angles, can improve the accuracy of plant height estimation, although this may increase flight and processing time. Highly accurate plant height measurements nonetheless provide valuable insights into plant growth and development, aiding the identification of plant stress and diseases and supporting breeding programs for improved crop varieties. This information can assist farmers and researchers in making informed decisions regarding irrigation, fertilizer application, and other management practices.

5. Conclusions

This study demonstrated the significant impact of flight altitude and sensor angle on cotton plant height assessment using UAS images. The findings indicate that the oblique sensor angles used in this study (45 degrees and 60 degrees) improved the accuracy of the plant height measurement. In contrast, increasing the flight altitude from 40 m to 80 m decreased the accuracy.
This study provides a valuable foundation for future research, including exploring the impact of flight altitude and sensor angle on other crop traits, such as leaf area index and biomass, while considering the influence of environmental conditions. Future research will explore other imaging equipment, such as LiDAR, multispectral, and thermal cameras. For example, LiDAR can offer high-resolution 3D mapping for architectural traits. This approach will broaden the scope for collecting different crop phenotype metrics, providing spatial and temporal resolutions typically unavailable to plant scientists. By advancing our knowledge in these areas, we can further harness the potential of UAS technology for precision agriculture, facilitating sustainable and efficient cotton production. Integrating remote sensing techniques and different machine learning algorithms is pivotal to unlocking the full benefits of UAS-based assessments, ultimately contributing to improved agricultural practices and crop management strategies.

Author Contributions

Conceptualization, O.A. and W.G.; methodology, O.A. and W.G.; software, O.A. and W.G.; validation, O.A. and W.G.; formal analysis, O.A.; investigation, O.A., B.G., A.A. and W.G.; writing—original draft preparation, O.A.; writing—review and editing, O.A., A.A., B.G., G.R., and W.G.; visualization, O.A.; supervision, W.G.; project administration, W.G.; funding acquisition, W.G. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Cotton Incorporated (grant number: 16-252TX), USDA NIFA (grant number: 2023-70001-40993), and USDA NIFA Hatch funding (Project number: 9898).

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors upon request.

Acknowledgments

We appreciate the financial support from Cotton Incorporated (grant number: 16-252TX), USDA NIFA (grant number: 2023-70001-40993), and USDA NIFA Hatch funding (Project number: 9898) for this study.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Yuan, W.; Li, J.; Bhatta, M.; Shi, Y.; Baenziger, P.S.; Ge, Y. Wheat height estimation using LiDAR in comparison to ultrasonic sensor and UAS. Sensors 2018, 18, 3731. [Google Scholar] [CrossRef] [PubMed]
  2. Chu, T.; Chen, R.; Landivar, J.A.; Maeda, M.M.; Yang, C.; Starek, M.J. Cotton growth modeling and assessment using unmanned aircraft system visual-band imagery. J. Appl. Remote Sens. 2016, 10, 036018. [Google Scholar] [CrossRef]
  3. Xu, R.; Li, C.; Paterson, A.H. Multispectral imaging and unmanned aerial systems for cotton plant phenotyping. PLoS ONE 2019, 14, e0205083. [Google Scholar] [CrossRef] [PubMed]
  4. Thompson, A.L.; Thorp, K.R.; Conley, M.M.; Elshikha, D.M.; French, A.N.; Andrade-Sanchez, P.; Pauli, D. Comparing nadir and multi-angle view sensor technologies for measuring in-field plant height of upland cotton. Remote Sens. 2019, 11, 700. [Google Scholar] [CrossRef]
  5. Bendig, J.; Bolten, A.; Bennertz, S.; Broscheit, J.; Eichfuss, S.; Bareth, G. Estimating biomass of barley using crop surface models (CSMs) derived from UAV-based RGB imaging. Remote Sens. 2014, 6, 10395–10412. [Google Scholar] [CrossRef]
  6. Li, S.; Ding, X.; Kuang, Q.; Ata-Ul-Karim, S.T.; Cheng, T.; Liu, X.; Cao, Q. Potential of UAV-based active sensing for monitoring rice leaf nitrogen status. Front. Plant Sci. 2018, 9, 1834. [Google Scholar] [CrossRef] [PubMed]
  7. Lu, W.; Okayama, T.; Komatsuzaki, M. Rice Height Monitoring between Different Estimation Models Using UAV Photogrammetry and Multispectral Technology. Remote Sens. 2021, 14, 78. [Google Scholar] [CrossRef]
  8. Belton, D.; Helmholz, P.; Long, J.; Zerihun, A. Crop height monitoring using a consumer-grade camera and UAV technology. PFG—J. Photogramm. Remote Sens. Geoinf. Sci. 2019, 87, 249–262. [Google Scholar] [CrossRef]
  9. Calou, V.B.; Teixeira, A.D.S.; Moreira, L.C.; Rocha, O.C.D.; Silva, J.A.D. Estimation of maize biomass using unmanned aerial vehicles. Eng. Agrícola 2019, 39, 744–752. [Google Scholar] [CrossRef]
  10. Zhou, L.; Gu, X.; Cheng, S.; Yang, G.; Shu, M.; Sun, Q. Analysis of plant height changes of lodged maize using UAV-LiDAR data. Agriculture 2020, 10, 146. [Google Scholar] [CrossRef]
  11. Che, Y.; Wang, Q.; Xie, Z.; Zhou, L.; Li, S.; Hui, F.; Ma, Y. Estimation of maize plant height and leaf area index dynamics using an unmanned aerial vehicle with oblique and nadir photography. Ann. Bot. 2020, 126, 765–773. [Google Scholar] [CrossRef]
  12. Sui, R.; Thomasson, J.A.; Ge, Y. Development of sensor systems for precision agriculture in cotton. Int. J. Agric. Biol. Eng. 2012, 5, 1–14. [Google Scholar]
  13. Sharma, B.; Ritchie, G.L. High-throughput phenotyping of cotton in multiple irrigation environments. Crop Sci. 2015, 55, 958–969. [Google Scholar] [CrossRef]
  14. Feng, A.; Sudduth, K.; Vories, E.; Zhang, M.; Zhou, J. Cotton yield estimation based on plant height from UAV-based imagery data. In 2018 ASABE Annual International Meeting; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2018; p. 1. [Google Scholar]
  15. Leitão, D.A.H.d.S.; Sharma, A.K.; Singh, A.; Sharma, L.K. Yield and plant height predictions of irrigated maize through unmanned aerial vehicle in North Florida. Comput. Electron. Agric. 2023, 215, 108374. [Google Scholar] [CrossRef]
  16. Nguyen, T.T.; Slaughter, D.C.; Townsley, B.T.; Carriedo, L.; Maloof, J.N.; Sinha, N. In-field plant phenotyping using multiview reconstruction: An investigation in eggplant. In Proceedings of the 13th International Conference on Precision Agriculture, Monticello, IL, USA, 31 July–3 August 2016; International Society of Precision Agriculture: Monticello, IL, USA, 2016. [Google Scholar]
  17. Zhang, H.; Sun, Y.; Chang, L.; Qin, Y.; Chen, J.; Qin, Y.; Du, J.; Yi, S.; Wang, Y. Estimation of Grassland Canopy Height and Aboveground Biomass at the Quadrat Scale Using Unmanned Aerial Vehicle. Remote Sens. 2018, 10, 851. [Google Scholar] [CrossRef]
  18. Li, M.; Shamshiri, R.R.; Schirrmann, M.; Weltzien, C. Impact of Camera Viewing Angle for Estimating Leaf Parameters of Wheat Plants from 3D Point Clouds. Agriculture 2021, 11, 563. [Google Scholar] [CrossRef]
  19. Fujiwara, R.; Kikawada, T.; Sato, H.; Akiyama, Y. Comparison of Remote Sensing Methods for Plant Heights in Agricultural Fields Using Unmanned Aerial Vehicle-Based Structure from Motion. Front. Plant Sci. 2022, 13, 886804. [Google Scholar] [CrossRef]
  20. Swayze, N.C.; Tinkham, W.T.; Creasy, M.B.; Vogeler, J.C.; Hoffman, C.M.; Hudak, A.T. Influence of UAS Flight Altitude and Speed on Aboveground Biomass Prediction. Remote Sens. 2022, 14, 1989. [Google Scholar] [CrossRef]
  21. Bareth, G.; Bendig, J.; Tilly, N.; Hoffmeister, D.; Aasen, H.; Bolten, A. A comparison of UAV-and TLS-derived plant height for crop monitoring: Using polygon grids for the analysis of crop surface models (CSMs). Photogramm. Fernerkund. Geoinf 2016, 2016, 85–94. [Google Scholar] [CrossRef]
  22. Sadeghi, S.; Sohrabi, H. The effect of UAV flight altitude on the accuracy of individual tree height extraction in a broad-leaved forest. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, 42, W18. [Google Scholar]
  23. Dhami, H.; Yu, K.; Xu, T.; Zhu, Q.; Dhakal, K.; Friel, J.; Tokekar, P. Crop height and plot estimation for phenotyping from unmanned aerial vehicles using 3D LiDAR. In Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 24 October 2020–24 January 2021; IEEE: New York, NY, USA, 2020; pp. 2643–2649. [Google Scholar]
  24. Xie, T.; Li, J.; Yang, C.; Jiang, Z.; Chen, Y.; Guo, L.; Zhang, J. Crop height estimation based on UAV images: Methods, errors, and strategies. Comput. Electron. Agric. 2021, 185, 106155. [Google Scholar] [CrossRef]
  25. Roth, L.; Streit, B. Predicting cover crop biomass by lightweight UAS-based RGB and NIR photography: An applied photogrammetric approach. Precis. Agric. 2018, 19, 93–114. [Google Scholar] [CrossRef]
  26. Ziliani, M.G.; Parkes, S.D.; Hoteit, I.; McCabe, M.F. Intra-season crop height variability at commercial farm scales using a fixed-wing UAV. Remote Sens. 2018, 10, 2007. [Google Scholar] [CrossRef]
  27. Jiang, Q.; Fang, S.H.; Peng, Y.; Gong, Y.; Zhu, R.S.; Wu, X.T.; Ma, Y.; Duan, B.; Liu, J. UAV-based biomass estimation for rice-combining spectral, TIN-based structural, and meteorological features. Remote Sens. 2019, 11, 890. [Google Scholar] [CrossRef]
  28. Kawamura, K.; Asai, H.; Yasuda, T.; Khanthavong, P.; Soisouvanh, P.; Phongchanmixay, S. Field phenotyping of plant height in an upland rice field in Laos using low-cost small unmanned aerial vehicles (UAVs). Plant Prod. Sci. 2020, 23, 452–465. [Google Scholar] [CrossRef]
  29. Tirado, S.B.; Hirsch, C.N.; Springer, N.M. UAV-based imaging platform for monitoring maize growth throughout development. Plant Direct 2020, 4, e00230. [Google Scholar] [CrossRef]
  30. Hassan, M.A.; Yang, M.; Fu, L.; Rasheed, A.; Zheng, B.; Xia, X.; Xiao, Y.; He, Z. Accuracy assessment of plant height using an unmanned aerial vehicle for quantitative genomic analysis in bread wheat. Plant Methods 2019, 15, 37. [Google Scholar] [CrossRef]
  31. Colomina, I.; Molina, P. Unmanned Aerial Systems for Photogrammetry and Remote Sensing: A Review. ISPRS J. Photogramm. Remote Sens. 2014, 92, 79–97. [Google Scholar] [CrossRef]
  32. Holman, F.H.; Riche, A.B.; Michalski, A.; Castle, M.; Wooster, M.J.; Hawkesford, M.J. High Throughput Field Phenotyping of Wheat Plant Height and Growth Rate in Field Plot Trials Using UAV Based Remote Sensing. Remote Sens. 2016, 8, 1031. [Google Scholar] [CrossRef]
  33. R Core Team. R: A Language and Environment for Statistical Computing; R Foundation for Statistical Computing: Vienna, Austria, 2020. [Google Scholar]
  34. Jay, S.; Rabatel, G.; Hadoux, X.; Moura, D.; Gorretta, N. In-field crop row phenotyping from 3D modeling performed using structure from motion. Comput. Electron. Agric. 2015, 110, 70–77. [Google Scholar] [CrossRef]
  35. Maimaitijiang, M.; Ghulam, A.; Sidike, P.; Hartling, S.; Maimaitiyiming, M.; Peterson, K.; Shavers, E.; Fishman, J.; Peterson, J.; Kadam, S.; et al. Unmanned Aerial System (UAS)-based phenotyping of soybean using multi-sensor data fusion and extreme learning machine. ISPRS J. Photogramm. Remote Sens. 2017, 134, 43–58. [Google Scholar] [CrossRef]
  36. Næsset, E. Effects of different flying altitudes on biophysical stand properties estimated from canopy height and density measured with a small-footprint airborne scanning laser. Remote Sens. Environ. 2004, 91, 243–255. [Google Scholar] [CrossRef]
  37. Weiss, M.; Baret, F. Using 3D point clouds derived from UAV RGB imagery to describe vineyard 3D macro-structure. Remote Sens. 2017, 9, 111. [Google Scholar] [CrossRef]
  38. James, M.R.; Robson, S. Mitigating systematic error in topographic models derived from UAV and ground-based image networks. Earth Surf. Process. Landf. 2014, 39, 1413–1420. [Google Scholar] [CrossRef]
  39. Gerke, M.; Nex, F.; Remondino, F.; Jacobsen, K.; Kremer, J.; Karel, W.; Huf, H.; Ostrowski, W. Orientation of oblique airborne image sets—Experiences from the ISPRS/Eurosdr benchmark on multi-platform photogrammetry. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. ISPRS Arch. 2016, 2016, 185–191. [Google Scholar]
  40. Aasen, H.; Burkart, A.; Bolten, A.; Bareth, G. Generating 3D hyperspectral information with lightweight UAV snapshot cameras for vegetation monitoring: From camera calibration to quality assurance. ISPRS J. Photogramm. Remote Sens. 2015, 108, 245–259. [Google Scholar] [CrossRef]
  41. Willkomm, M.; Bolten, A.; Bareth, G. Non-destructive monitoring of rice by hyperspectral in-field spectrometry and UAV-based remote sensing: A case study of field-grown rice in North Rhine-Westphalia, Germany. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, 41, 1071–1077. [Google Scholar] [CrossRef]
Figure 1. Study site on a research farm in Lubbock County, Texas, in 2022 and 2023.
Figure 2. DJI Phantom 4 RTK and GNSS mobile station for acquiring RGB images in a research field in Lubbock, Texas, 2022. (a) DJI Phantom 4 RTK UAS platform (left), (b) Phantom 4 UAS controller (middle), and (c) D-RTK 2 High-Precision GNSS Mobile Station (right) (source: https://www.dji.com, accessed on 5 January 2024).
Figure 3. Image acquisitions at two flight altitudes (40 m and 80 m) and three camera angles (45°, 60°, and 90°) using a UAS in a cotton field in Lubbock, Texas.
Figure 4. Workflow for processing unmanned aerial system (UAS) images to estimate plant height.
Figure 5. Boxplot of plant height measurements in a research field in Lubbock, Texas, on 4 July and 2 August 2023 and 28 August and 24 October 2022.
Figure 6. Errors in UAS-derived cotton plant height at two UAS flight altitudes and three camera angles on (a) 4 July 2023, (b) 2 August 2023, (c) 28 August 2022, and (d) 24 October 2022.
Figure 7. Interactions between flight altitude and camera angle for errors in plant heights derived from UAS images on (a) 4 July 2023, (b) 2 August 2023, (c) 28 August 2022, and (d) 24 October 2022.
Figure 8. Tukey’s post hoc test for different camera angles (45°, 60°, 90°) at different flight altitudes for errors in plant heights derived from UAS images on (a) 4 July 2023, (b) 2 August 2023, (c) 28 August 2022, and (d) 24 October 2022. Significance levels: * p < 0.05, ** p < 0.01, and *** p < 0.001; n.s. represents non-significant results.
Figure 9. Relationship between measured plant height and UAS-derived plant height from different UAS altitudes and angles in a research field in Lubbock, Texas: (a) 4 July 2023, (b) 2 August 2023, (c) 28 August 2022, and (d) 24 October 2022.
Figure 10. Relationship between measured and UAS-derived plant heights using 30% test data for a flight altitude of 40 m and a camera angle of 45° for 4 July and 2 August 2023 and 28 August and 24 October 2022.
Table 1. Six scenarios of flight altitudes and camera angles for quantifying cotton plant height using UAS images.

Dataset   Flight Altitude (m)   Sensor Angle (°)   Scenario (Altitude–Angle)   Image Resolution (cm)
1         40                    45                 40 m-45°                    1.48
2         40                    60                 40 m-60°                    1.25
3         40                    90                 40 m-90°                    1.21
4         80                    45                 80 m-45°                    2.85
5         80                    60                 80 m-60°                    2.52
6         80                    90                 80 m-90°                    2.21
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

