Article

Mapping Reflectance Anisotropy of a Potato Canopy Using Aerial Images Acquired with an Unmanned Aerial Vehicle

by Peter P. J. Roosjen 1,*, Juha M. Suomalainen 1,2, Harm M. Bartholomeus 1, Lammert Kooistra 1 and Jan G. P. W. Clevers 1

1 Laboratory of Geo-Information Science and Remote Sensing, Wageningen University & Research, Droevendaalsesteeg 3, 6708 PB Wageningen, The Netherlands
2 Finnish Geospatial Research Institute, National Land Survey of Finland, Geodeetinrinne 1, 02430 Masala, Finland
* Author to whom correspondence should be addressed.
Remote Sens. 2017, 9(5), 417; https://doi.org/10.3390/rs9050417
Submission received: 10 March 2017 / Revised: 12 April 2017 / Accepted: 22 April 2017 / Published: 29 April 2017
Graphical abstract

Figure 1. The eight experimental plots (A–H) in the potato field with different initial nitrogen (N) fertilization levels (horizontal zones), and additional nitrogen fertilization levels (vertical zones). The red line indicates the flight path flown during Flight 1 on 9 June 2016 (Table 2). The base map is an RGB composite (R = 674 nm, G = 559 nm, and B = 500 nm), produced with the data collected during the flight.

Figure 2. An example of the observation geometry of the pixels of one of the georeferenced images. (a) An RGB composite (R = 674 nm, G = 559 nm, and B = 500 nm) of the study area, collected on 9 June 2016, showing the location of the current image (black line) and the position of the unmanned aerial vehicle (UAV) (sun-shaped symbol); (b) the reflectance factors at 864 nm of the current image; (c) the relative azimuth angles of the pixels of the current image, where 0° is the forward scattering direction and 180° is the backward scattering direction. The solar principal plane is indicated by the black line; (d) the view zenith angle of the pixels of the current image. Projection of (a–d) is WGS84 UTM 31N.

Figure 3. Normalized difference vegetation index (NDVI) values of the study area on 9 June 2016 (a) and 19 July 2016 (b), both in WGS84 UTM 31N projection. Estimated leaf area index (LAI) in the experimental plots based on the weighted difference vegetation index (WDVI), calculated from Cropscan measurements (c). The dates at which the additional sensor-based nitrogen (N) fertilization was applied to Plots B, D, F, and H are indicated by the solid lines. The date at which all plots received potassium (K) fertilization is indicated by the dashed line.

Figure 4. The number of observations per pixel, captured during the flight on 9 June 2016, at 658 nm (a). The camera positions (numbers in white circles) from which an example pixel (white square between Plots C and D) was captured (b). The red lines indicate the flight path of the UAV and the black dots indicate the camera positions that did not capture the example pixel. Both (a) and (b) are in WGS84 UTM 31N projection.

Figure 5. Linearly interpolated polar plots of the anisotropy factors (ANIFs) observed for the example pixel in Figure 4 at 658 nm (a) and 848 nm (b), respectively. The numbers indicate the camera positions and correspond to the numbers in Figure 4b. The azimuth angles are relative azimuth angles, where 0° is the forward scattering direction and 180° is the backward scattering direction. The vertical dashed line indicates the solar principal plane.

Figure 6. ρ0, k, and Θ parameters obtained by fitting the Rahman–Pinty–Verstraete (RPV) model through the measurements of Flight 1 (9 June 2016) at 658 nm (a–c) and at 848 nm (d–f). All figures are in WGS84 UTM 31N projection.

Figure 7. ρ0, k, and Θ parameters obtained by fitting the RPV model through the measurements of Flight 2 (19 July 2016) at 658 nm (a–c) and at 848 nm (d–f). All figures are in WGS84 UTM 31N projection.

Figure 8. The reflectance anisotropy derived from all pixels within experimental Plot C (top) and Plot E (bottom), shown as linearly inter- and extrapolated polar graphs at 658 nm and 848 nm, collected on 9 June 2016 (left) and 19 July 2016 (right), respectively. The white stars indicate the position of the sun during data collection and the black dots indicate the measurement positions.

Figure 9. Average ρ0, k, and Θ parameters obtained by fitting the RPV model through each individual pixel in the experimental plots on both dates. The colored surfaces indicate the standard deviations.

Figure 10. The relation between the ρ0, k, and Θ parameters of the RPV model of all pixels in each experimental plot (indicated by letters A–H) and the canopy cover, at 658 nm (black circles) and 848 nm (gray triangles), respectively, for both dates. The error bars indicate the standard deviation.

Figure 11. Ranking correlation (Kendall's tau) between the ρ0, k, and Θ parameters of the RPV model and the canopy cover (a) and Cropscan LAI (b) for the flights of Day 1 and Day 2 separately and for the flights of both days combined. The gray-shaded areas indicate the significance levels: values above or below the dark-gray areas were significant for the analysis of both dates (n = 16) and values above or below the light-gray areas were significant for the analysis of the separate days (n = 8), both at the 5% significance level.

Abstract

Viewing and illumination geometry has a strong influence on optical measurements of natural surfaces due to their anisotropic reflectance properties. Typically, cameras on-board unmanned aerial vehicles (UAVs) are affected by this because of their relatively large field of view (FOV) and thus large range of viewing angles. In this study, we investigated the magnitude of reflectance anisotropy effects in the 500–900 nm range, captured by a frame camera mounted on a UAV during a standard mapping flight. After orthorectification and georeferencing of the images collected by the camera, we calculated the viewing geometry of all observations of each georeferenced ground pixel, forming a dataset with multi-angular observations. We performed UAV flights on two days during the summer of 2016 over an experimental potato field where different zones in the field received different nitrogen fertilization treatments. These fertilization levels caused variation in potato plant growth and thereby differences in structural properties such as leaf area index (LAI) and canopy cover. We fitted the Rahman–Pinty–Verstraete (RPV) model through the multi-angular observations of each ground pixel to quantify, interpret, and visualize the anisotropy patterns in our study area. The Θ parameter of the RPV model, which controls the proportion of forward and backward scattering, showed strong correlation with canopy cover, where in general an increase in canopy cover resulted in a reduction of backward scattering intensity, indicating that reflectance anisotropy contains information on canopy structure. In this paper, we demonstrated that anisotropy data can be extracted from measurements using a frame camera, collected during a typical UAV mapping flight. Future research will focus on how to use the anisotropy signal as a source of information for estimation of physical vegetation properties.

Graphical Abstract">

Graphical Abstract

1. Introduction

In optical remote sensing, the sensors are mostly passive systems, and their measured signal therefore depends on the position and orientation of the sensor and the position of the sun relative to the observed surface. Differences in the intensity of the measured signal of natural surfaces as a function of viewing and illumination geometry are the result of their anisotropic reflectance behavior. In its idealized form, this is commonly referred to as the Bidirectional Reflectance Distribution Function (BRDF) [1]. Anisotropic reflectance effects are the result of optical and structural properties of the surface. In the case of vegetation, these effects can, for example, be the result of the different proportions of shadowed and sunlit parts of the canopy at different observation geometries [2,3], differences in the proportion of the background visible at different observation geometries [4], canopy heterogeneity and architecture [5,6,7], and/or the orientation of leaves [8,9]. Knowledge of the anisotropic reflectance characteristics of a surface is important for the correction and normalization of effects due to viewing and illumination geometry [10,11,12]. In addition, it can be considered an information source: information on anisotropic reflectance effects has been shown to improve classification results [13,14,15,16] and the retrieval of parameters such as leaf area index (LAI) [17,18], canopy height [19], and canopy clumping [20,21].
Traditionally, anisotropic reflectance characteristics are studied by performing multi-angular reflectance measurements using goniometer setups. Various goniometers have been developed over the past decades to perform these measurements under controlled laboratory conditions [3,10,22,23] or under field conditions [24,25,26,27,28,29]. Typically, goniometers are ground-based, static devices that allow accurate control of the positioning of a sensor to measure the reflectance in predefined directions. Goniometer measurements can sample only small surface areas due to the relatively short distance between target and sensor, making it difficult to capture a large enough representative area of natural, heterogeneous targets [30]. Moreover, in situ conditions often complicate the deployment of a goniometer setup, for example in poorly accessible terrain, in snow [31,32], or above water [33].
Unmanned aerial vehicles (UAVs) provide an alternative platform to field goniometer systems for performing multi-angular reflectance measurements [34,35,36]. An advantage of UAV-based multi-angular reflectance measurements is that no ground-based device is needed, so the studied surface is not disturbed. In addition, the sensor–target distance can be larger because no moving arm is needed to position the sensor at a specific spot above the field, which means that surfaces composed of larger objects, such as forests, can also be studied [37]. Theoretically, the sensor–target distance can easily be hundreds of meters if such UAV flight altitudes are allowed by local legislation [38].
Recently, several methods and sensors for UAV-based multi-angular sampling have been explored. By using dedicated flight patterns to maneuver a UAV around a target while using a gimbal to point a spectrometer or camera towards the center of the target, multi-angular measurements can be acquired [34,35]. In a different strategy, multi-angular observations were obtained by extracting the viewing geometry of measurements made by a pushbroom spectrometer mounted on a UAV [36]. By hovering the UAV above fields covered with a single crop and rotating it around its vertical axis, multi-angular measurements were collected within the FOV of the spectrometer. Several other studies have demonstrated how to use a frame camera on-board a UAV to acquire multi-angular measurements [39,40,41]. The angular coverage of multi-angular measurement methods, i.e., the range of azimuth and zenith angles that are sampled, depends on the position, orientation, and FOV of the sensor, as well as on the number of samples that are taken. Orthorectification and georeferencing by photogrammetric processing of frame-based camera imagery also rely on multiple views of the same pixel. Thus, by determining the observation geometry of pixels that are captured in multiple images, it is possible to collect multi-angular data and thereby location- and pixel-specific anisotropy information.
In this paper, we used a frame camera to study pixel-wise anisotropy effects based on the overlapping pixels—and therefore multi-angular observations—that are collected during a typical UAV flight for mapping purposes. We define a mapping flight as a flight where several parallel flight lines are flown in a block pattern with sufficient forward and sideways overlap of the collected images for photogrammetric processing to construct a map of the covered area. We present the geometrical equations that allow for the calculation of the observation and illumination geometry of georeferenced ground pixels. In addition, we quantify, visualize, and interpret the anisotropy effects based on the parameters of the semi-empirical Rahman–Pinty–Verstraete (RPV) model [42], obtained by fitting the model through our measurements. Moreover, we correlate the RPV parameters with LAI and canopy cover in a potato field where different nitrogen fertilization treatments were applied to sections in the field to study the effect of canopy development on reflectance anisotropy.

2. Materials and Methods

2.1. Study Area

The study area was a field with consumer potatoes (Solanum tuberosum L., cultivar Fontane) of approximately 450 × 200 m, located south of the village of Reusel (51°59′47.9′′N, 5°9′34.5′′E) on the Dutch–Belgian border, in the province of Noord-Brabant (The Netherlands). The planting date was 16 April 2016. In this field, an experiment was performed to evaluate the effect of split-level fertilization on potato yield. Different initial fertilization levels (before planting) and additional sensor-based fertilization (during crop growth) were applied to zones in the field (Figure 1 and Table 1). The sensor-based variable-rate fertilization was applied based on crop reflectance measurements performed by Fritzmeier ISARIA crop sensors (Fritzmeier Umwelttechnik, Großhelfendorf, Germany) mounted in front of the tractor, providing information on the nitrogen need of the crop on a weekly basis. The variable nitrogen rates were based on indices measured by the crop sensor and on decision rules derived from dedicated fertilization experiments between 2010 and 2013 [43], and were applied using a GNSS-guided fertilizer spreader. Eight 30 × 30 m plots (Figure 1: A–H) were created in the different fertilization zones, where weekly measurements of leaf area index (LAI) and leaf chlorophyll content were performed to monitor the status of the potato plants. Sensor-based nitrogen fertilization was applied to half of the plots on 28 June 2016, 15 July 2016, and 9 August 2016. In addition, on 7 July 2016, potassium fertilization of 60 kg/ha was applied to all plots.
The ground-based apparent LAI (hereafter referred to as LAI) was estimated from weekly measurements with a Cropscan Multispectral Radiometer, using the weighted difference vegetation index (WDVI) [44] to estimate LAI. Calibration curves between the WDVI and LAI were determined by performing both Cropscan measurements and LAI2000 Plant Canopy Analyser measurements in the field at the same farm for potatoes grown in the years 2010–2014 [45]. In 2016, only a limited number of LAI2000 measurements was available, but validation with these measurements showed that estimating LAI from Cropscan data was accurate. The percentage of canopy cover was estimated by classifying the pixels of high-spatial-resolution orthophotos of the study area (~8 cm ground sampling distance (GSD)) into a vegetation and a non-vegetation class. For this, we calculated the normalized difference vegetation index (NDVI) (Equation (1)):
$$\mathrm{NDVI} = \frac{R_{848\,\mathrm{nm}} - 0.5\,(R_{658\,\mathrm{nm}} + R_{674\,\mathrm{nm}})}{R_{848\,\mathrm{nm}} + 0.5\,(R_{658\,\mathrm{nm}} + R_{674\,\mathrm{nm}})} \quad (1)$$
where $R_{658\,\mathrm{nm}}$, $R_{674\,\mathrm{nm}}$, and $R_{848\,\mathrm{nm}}$ are the reflectance factors at the subscripted wavelengths. We chose these bands because they are closest to the red Band 4 (665 nm) and NIR Band 8 (842 nm) of the Multi-Spectral Imager (MSI) onboard ESA's Sentinel-2 satellite, which are used for NDVI calculation [46]. The average of the Rikola bands at 658 and 674 nm was calculated to approximate the reflectance of MSI Band 4 at 665 nm. Pixels with an NDVI > 0.7 were classified as vegetation. We chose this threshold value since, based on visual inspection, it separated the vegetation and non-vegetation classes well.
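As an illustration of this classification step, the following minimal Python/NumPy sketch computes the NDVI of Equation (1) from the three Rikola bands and derives a canopy cover fraction with the 0.7 threshold; the array names and the function interface are illustrative and not part of the original processing chain.

```python
import numpy as np

def canopy_cover(r658, r674, r848, threshold=0.7):
    """Classify vegetation pixels with the NDVI of Equation (1).

    r658, r674, r848: 2-D arrays of reflectance factors at the subscripted wavelengths.
    Returns the NDVI map and the fraction of pixels classified as vegetation.
    """
    red = 0.5 * (r658 + r674)           # approximates Sentinel-2 MSI Band 4 (665 nm)
    ndvi = (r848 - red) / (r848 + red)  # Equation (1)
    vegetation = ndvi > threshold       # NDVI > 0.7 -> vegetation class
    return ndvi, vegetation.mean()
```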

2.2. UAV Flights

Two UAV flights, with an Altura AT8 octocopter, were performed during the growing season in the summer of 2016 (Table 2); one flight at an early growing phase (9 June 2016) and one flight at a late growing phase after the potato canopy was fully developed (19 July 2016). Between flights on the two dates, additional sensor-based fertilization was given to half of the plots (Table 1). The flights were performed under clear-sky conditions to assure the strongest anisotropy effects [47]. The UAV was programmed to fly at 120 m above ground level with a horizontal speed of 4 m/s from waypoint to waypoint, resulting in four flight-lines that were spaced 20 m apart (Figure 1). Images were collected continuously at an interval of 2.4 s. Flying at the aforementioned height and speed with the camera that we used in our study (Section 2.3) resulted in forward and sideways overlap of approximately 88% and 75%, respectively, with a GSD of approximately 8 cm.
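The reported overlap and GSD figures can be reproduced approximately from these flight parameters together with the camera geometry described in Section 2.3 (36.5° FOV, 1010 × 1010 pixels); the short sketch below is an illustrative back-of-the-envelope check, not part of the original flight planning.

```python
import math

altitude = 120.0      # flight height above ground level (m)
fov_deg = 36.5        # horizontal/vertical FOV of the camera (Section 2.3)
speed = 4.0           # horizontal speed (m/s)
interval = 2.4        # image acquisition interval (s)
line_spacing = 20.0   # distance between flight lines (m)

footprint = 2 * altitude * math.tan(math.radians(fov_deg / 2))  # ~79 m swath on the ground
forward_overlap = 1 - (speed * interval) / footprint            # ~0.88
side_overlap = 1 - line_spacing / footprint                     # ~0.75
gsd = footprint / 1010                                          # ~0.08 m per pixel
print(f"{forward_overlap:.0%} forward, {side_overlap:.0%} sideways, GSD {gsd*100:.0f} cm")
```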

2.3. Spectral Measurements

The sensor used for our measurements was a frame-based hyperspectral camera (Rikola Ltd., Oulu, Finland), which has a horizontal and vertical FOV of 36.5° and collects images of 1010 × 1010 pixels. The camera uses Fabry–Perot interferometer (FPI) technology to capture a series of snapshot images on programmatically selectable spectral bands in the visible–near-infrared (VIS–NIR) wavelength range. An FPI system uses an optical filter with two semi-transparent mirror films facing each other. The wavelength of the transmitted light depends on the distance between these two films (the air gap). Thus, by taking a sequence of images while varying the distance between the films, a complete spectrum can be captured for each pixel. As the images at different wavelengths are collected at different times, each wavelength covers a slightly different extent and has a slightly different orientation due to the movement of the UAV during the collection of a full datacube. The number of spectral bands that can be sampled, and their center wavelengths, can be selected according to the user's preference. For our measurements, we sampled 16 bands in the 500–900 nm range (Table 3). A full datacube of 16 bands in raw format is 32.0 MB. The memory card used in our configuration, a 32.0 GB SanDisk CompactFlash card, can store 999 such 16-band images.
Prior to take-off for each flight, a dark-current measurement (an image with the lens cap on) and an image of a reflectance reference panel (a gray 50% Spectralon panel (LabSphere Inc., North Sutton, NH, USA)) were taken. The HyperspectralImager 2.0 software by Rikola was used to convert the raw images to radiances. The radiance images were then transformed into reflectance factor images using the empirical line method and the measurement of the Spectralon panel. As the images were taken under clear atmospheric conditions from a relatively low altitude, an advanced atmospheric correction of the images was not considered necessary.
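A minimal sketch of such a one-point empirical line conversion is given below, assuming a single per-band gain derived from the dark-current and Spectralon panel images; the function and variable names are illustrative, and the exact procedure used by the authors may differ.

```python
import numpy as np

def radiance_to_reflectance(radiance, dark_current, panel_radiance, panel_reflectance=0.5):
    """One-point empirical line: scale dark-corrected radiance by the factor that
    maps the Spectralon panel radiance onto its known reflectance (~50%).

    All inputs are per-band arrays; the gain is derived from the pre-flight
    panel and dark-current images.
    """
    gain = panel_reflectance / (panel_radiance - dark_current)  # per-band scaling factor
    return (radiance - dark_current) * gain
```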

2.4. Orthorectification and Measurement Geometry

The reflectance factor images were aligned and ortho- and georectified in PhotoScan Professional 1.2.6 (AgiSoft LLC, St. Petersburg, Russia). Each of the 16 spectral bands had to be processed individually due to the movement of the UAV during the collection of a full datacube [48]. Per measured spectral band, the reflectance factor images were aligned (accuracy setting: highest) using the GPS data collected by the Rikola sensor during the flights as a reference. Thereafter, all bands were merged, and 12 natural ground control points (GCPs) (Figure 1) were used for accurate georeferencing of the data. The GCPs were manually selected in each of the 16 spectral bands. Unfortunately, no dedicated Real-Time Kinematic (RTK) or GPS/GNSS ground control markers were available at the time of the flights. The imagery was used to calculate and export a Digital Surface Model (DSM) of the target area at a 5 m ground pixel size in the WGS84 UTM 31N coordinate system. Each reflectance factor image was georectified and resampled separately to the same 5 m ground pixels and exported as a separate GeoTIFF. We specifically chose a 5 m ground pixel size because a pixel of this size is large enough to capture a representative proportion of potato rows and of the exposed soil between the rows, while it is small enough to capture the spatial variation caused by the different fertilization treatments applied to the plots in the field. Finally, the photogrammetric positions of the camera $(x_{cam}, y_{cam}, z_{cam})$ during each image acquisition were exported as an ASCII file. The PhotoScan processing was performed on a computer running Windows 7 Enterprise (64 bit), with an Intel(R) Xeon(R) E5-1650 CPU at 3.20 GHz (12 CPUs), an AMD FirePro V5900 2 GB graphics card, and 8.00 GB of installed memory. The processing time of the aforementioned steps was approximately 3.5 h per flight.
After the PhotoScan processing, the locations of the ground pixels $(x_{px}, y_{px}, z_{px})$ were known. Based on the pixel and camera locations, we calculated the view azimuth angle (VAA) and view zenith angle (VZA) for each ground pixel and reflectance factor image combination. The solar illumination angle of each pixel relative to the surface normal, $\beta(x_{px}, y_{px})$, was calculated using
$$\cos \beta(x_{px}, y_{px}) = \cos\theta_s \cos\theta_n(x_{px}, y_{px}) + \sin\theta_s \sin\theta_n(x_{px}, y_{px}) \cos\left[\varphi_s - \varphi_n(x_{px}, y_{px})\right] \quad (2)$$
where $\theta_s$ and $\varphi_s$ are the sun zenith and sun azimuth angle (Table 2), respectively. $\theta_n$ and $\varphi_n$ are the slope and aspect of the pixel, respectively, which were calculated from the DSM. The VZA ($\theta_v$) was calculated using
$$\theta_v(x_{px}, y_{px}) = \arctan\left(\frac{z_{cam} - z_{px}}{\sqrt{(x_{cam} - x_{px})^2 + (y_{cam} - y_{px})^2}}\right) \quad (3)$$
The VAA ($\varphi_v$) of each pixel was calculated using
$$\varphi_v(x, y) = \arctan\left(\frac{y_{px} - y_{cam}}{x_{px} - x_{cam}}\right) \quad (4)$$
Equation (3) assumes that the measured surface is flat. To correct for variations in slope due to height differences in the field, Equations (3) and (4) were substituted into Equation (2), resulting in
$$\cos\theta_{v,corrected}(x_{px}, y_{px}) = \cos\theta_v(x_{px}, y_{px})\cos\theta_n(x_{px}, y_{px}) + \sin\theta_v(x_{px}, y_{px})\sin\theta_n(x_{px}, y_{px})\cos\left[\varphi_v(x_{px}, y_{px}) - \varphi_n(x_{px}, y_{px})\right] \quad (5)$$
Throughout the rest of this article, the angle $\theta_{v,corrected}$ obtained from Equation (5) is referred to as the VZA. By applying Equations (2)–(5), we obtained the observation and illumination geometry of each individual pixel of each image at each spectral band (Figure 2). Besides a spatial location and a reflectance factor (Figure 2b), each pixel now also has a VAA and a VZA (Figure 2c,d). Pixels that were captured by multiple cameras thus have multiple observation angles as well.
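The following Python/NumPy sketch implements Equations (2)–(5) for a single ground pixel and camera position. The function signature and variable names are our own, and arctan2 is used in place of arctan merely to resolve the quadrant of the azimuth; it is a sketch of the geometry described above, not the original processing code.

```python
import numpy as np

def observation_geometry(cam, px, slope, aspect, sun_zenith, sun_azimuth):
    """Per-pixel observation and illumination geometry following Equations (2)-(5).

    cam, px                 : (x, y, z) of the camera and of the ground pixel (map coordinates, m)
    slope, aspect           : terrain slope and aspect of the pixel from the DSM (degrees)
    sun_zenith, sun_azimuth : solar angles at acquisition time (degrees, Table 2)
    Returns the illumination angle beta, the slope-corrected VZA, and the VAA (degrees).
    """
    xc, yc, zc = cam
    xp, yp, zp = px
    t_n, p_n = np.radians(slope), np.radians(aspect)
    t_s, p_s = np.radians(sun_zenith), np.radians(sun_azimuth)

    # Equation (2): illumination angle relative to the surface normal of the pixel
    cos_beta = np.cos(t_s) * np.cos(t_n) + np.sin(t_s) * np.sin(t_n) * np.cos(p_s - p_n)

    # Equations (3) and (4): view angles for a flat surface
    t_v = np.arctan2(zc - zp, np.hypot(xc - xp, yc - yp))  # Equation (3)
    p_v = np.arctan2(yp - yc, xp - xc)                     # Equation (4), quadrant-resolved

    # Equation (5): slope-corrected view zenith angle
    cos_vza = np.cos(t_v) * np.cos(t_n) + np.sin(t_v) * np.sin(t_n) * np.cos(p_v - p_n)

    return (np.degrees(np.arccos(cos_beta)),
            np.degrees(np.arccos(cos_vza)),
            np.degrees(p_v) % 360)
```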

2.5. Data Analysis and Visualization

After determining the observation geometry for all ground pixel and aerial image combinations, we fitted the RPV model [42] to each ground pixel in the study area. The RPV parameters obtained for each pixel were used to create parameterized anisotropy maps of the study area and to analyze the anisotropy patterns observed in the different experimental plots. We chose the RPV model because it has been successfully used in a wide variety of anisotropy studies at different spatial scales, ranging from the centimeter scale in laboratory studies [49,50] to the kilometer scale in space-borne studies [51]. The RPV model describes the reflectance anisotropy with four parameters: the ρ0 parameter controls the amplitude, the k parameter controls the bowl (k < 1) or bell (k > 1) shape of the anisotropy curve, the Θ parameter controls forward/backward scattering, and the ρc parameter controls the hotspot effect around the direct backscattering angle. Mathematically, the RPV model is defined as
$$R_s(\theta_i, \varphi_i; \theta_r, \varphi_r) = \rho_0 \, \frac{\cos^{k-1}\theta_i \, \cos^{k-1}\theta_r}{(\cos\theta_i + \cos\theta_r)^{1-k}} \, F(g)\,[1 + R(G)] \quad (6)$$

where

$$F(g) = \frac{1 - \Theta^2}{\left[1 + \Theta^2 - 2\Theta\cos(\pi - g)\right]^{1.5}} \quad (7)$$

$$\cos g = \cos\theta_i \cos\theta_r + \sin\theta_i \sin\theta_r \cos(\varphi_i - \varphi_r) \quad (8)$$

$$1 + R(G) = 1 + \frac{1 - \rho_c}{1 + G} \quad (9)$$

$$G = \left[\tan^2\theta_i + \tan^2\theta_r - 2\tan\theta_i \tan\theta_r \cos(\varphi_i - \varphi_r)\right]^{1/2} \quad (10)$$
To fit the model to the dataset of each ground pixel, we applied an iterative optimization method that uses the Levenberg–Marquardt algorithm to estimate the parameters by minimizing the sum of squared residuals (i.e., the differences between the measured reflectance factors at all observation angles and the reflectance factors modeled by the RPV model at the same observation angles). During the fitting procedure, the range of possible solutions for the Θ parameter was limited to between −1 and 1; the ranges of the ρ0 and k parameters were left free. The hotspot parameter ρc was fixed at 1, which disables the hotspot effect, since we acquired no observations close to the hotspot.
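A compact sketch of such a per-pixel fit is shown below, using SciPy's least-squares optimizer. Note that, for simplicity, the sketch uses a bounded trust-region solver to keep Θ within [−1, 1], whereas the original analysis used the Levenberg–Marquardt algorithm; the starting values are arbitrary and the function names are ours.

```python
import numpy as np
from scipy.optimize import least_squares

def rpv(params, sza, vza, raa):
    """RPV model (Equations (6)-(8)) with the hotspot parameter rho_c fixed at 1,
    so that the hotspot term 1 + R(G) reduces to 1."""
    rho0, k, theta = params
    ti, tr, dphi = np.radians(sza), np.radians(vza), np.radians(raa)
    cos_g = np.cos(ti) * np.cos(tr) + np.sin(ti) * np.sin(tr) * np.cos(dphi)
    g = np.arccos(np.clip(cos_g, -1.0, 1.0))
    f_g = (1 - theta**2) / (1 + theta**2 - 2 * theta * np.cos(np.pi - g))**1.5
    m = (np.cos(ti)**(k - 1) * np.cos(tr)**(k - 1)) / (np.cos(ti) + np.cos(tr))**(1 - k)
    return rho0 * m * f_g

def fit_rpv(refl, sza, vza, raa):
    """Estimate rho_0, k, and Theta from the multi-angular observations of one ground pixel."""
    residuals = lambda p: rpv(p, sza, vza, raa) - refl
    fit = least_squares(residuals, x0=[0.1, 0.7, -0.2],
                        bounds=([-np.inf, -np.inf, -1.0], [np.inf, np.inf, 1.0]))
    return fit.x  # rho_0, k, Theta
```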

3. Results

3.1. Crop Development

In the data collected on 9 June 2016, the different initial fertilization levels were clearly visible as horizontal zones in the NDVI map of the potato field (Figure 3a). Plot C and Plot D, the plots in the zone that did not receive any initial fertilization, stood out with clearly lower NDVI values. During the period between the two measurement days, the potato plants in Plots B, D, F, and H received additional sensor-based nitrogen fertilization on two occasions. This resulted in higher NDVI values in these plots compared to the plots that did not receive any additional fertilization. The effect of the different additional fertilization levels on the NDVI is most clearly visible in the difference between Plots C and D on 19 July (Figure 3b), where Plot D has clearly higher NDVI values.
Figure 3c shows the development of the LAI based on Cropscan measurements throughout the growing season. During Flight 1 on 9 June 2016, the potato crop was still at an early stage of growth, with a haulm (leaves and stems) length of 0.4–0.6 m. The potato plants were standing upright above the ridges, showing a clear row structure. The LAI of all plots increased until the last week of June (Figure 3c). During the first two weeks of July, the LAI decreased, which can be explained by growth-reducing effects caused by a dry period with high temperatures in the area of the study site. Insufficient water availability due to the lack of rain, combined with high temperatures, can affect the stomatal capacity of leaves, which in turn can cause a reduction in biomass, and thus in LAI [52]. Flight 2 on 19 July 2016 took place after the potato plants had flowered, when the haulm length had increased to 0.6–0.9 m. Due to the drought and high temperatures, stems had fallen over and started filling up the space between the potato rows. This resulted in an increase in canopy cover at the date of the second flight, while the LAI remained more or less equal or, for some plots, even became lower (Table 4). After the second flight, the improved water availability due to rainfall, combined with the fertilizer addition of 15 July (Table 1), resulted in an increase in LAI, with the largest long-term effect for Plots B, D, F, and H (Figure 3c).

3.2. View Angle Coverage

Based on Equations (2)–(5), we determined the VAAs and VZAs of each pixel that was georeferenced during the Agisoft PhotoScan processing. Pixels that were captured from multiple camera positions were thus also viewed from multiple observation geometries. As an example, Figure 4a shows from how many camera positions each pixel in the study area was captured at 658 nm during the flight on 9 June 2016. An example pixel (the white square in between Plot C and Plot D) was captured from 32 different camera positions in all four flight lines (Figure 4b).
The nadir-normalized reflectance factors (anisotropy factors, ANIFs) [3] observed from the different camera positions at 658 nm for the example pixel indicated a strong anisotropy effect, where the highest ANIFs were observed in the backward scattering direction in the solar principal plane (Figure 5a). This backward scattering anisotropy was weaker for the same pixel at 848 nm (Figure 5b). Due to the movement of the UAV during the collection of a full datacube, the camera positions were slightly different for the two bands. In addition, the example pixel was not captured from all the same camera positions at both wavelengths. For example, the example pixel was not captured from Camera Positions 12 and 147 at 848 nm. Conversely, the example pixel was not captured from Camera Position 58 at 658 nm, while it was at 848 nm (compare Figure 5a,b).
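For reference, the ANIFs used here are simply the per-pixel reflectance factors normalized by a nadir reference; a minimal sketch, assuming that the observation with the smallest VZA serves as that reference, is:

```python
import numpy as np

def anif(reflectance, vza):
    """Anisotropy factors: reflectance factors normalized by the (near-)nadir value.

    reflectance, vza: 1-D arrays of the multi-angular observations of one pixel;
    the observation with the smallest view zenith angle is taken as the nadir reference.
    """
    return reflectance / reflectance[np.argmin(vza)]
```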

3.3. Anisotropy Maps

For every georeferenced ground pixel in the study area at each measured wavelength, multi-angular observations as displayed in Figure 5 were available. By fitting the RPV model through these multi-angular observations at every pixel, we created anisotropy maps that show the spatial distribution of the model parameters. Figure 6 shows the maps of the ρ0, k, and Θ parameters at 658 nm and at 848 nm on 9 June 2016. The ρ0 parameter, which is closely related to the nadir reflectance, follows a typical vegetation trend, with low values at 658 nm (Figure 6a) and higher values at 848 nm (Figure 6d). Plots C and D, which did not receive any initial fertilization and therefore had a lower canopy cover, showed clearly lower ρ0 values at 848 nm. In general, pixels in both bands had k < 1, indicating a bowl-shaped anisotropy pattern (Figure 6b,e). Both wavelengths showed clear backward scattering anisotropy (Θ < 0) over the full study area. This backward scattering was particularly strong at 658 nm (Figure 6c) and less pronounced at 848 nm (Figure 6f).
On 19 July 2016, the potato canopy was fully developed, resulting in generally higher ρ0 values at 848 nm (Figure 7d). The pixels of Plot C, the plot in the zone that received neither initial fertilization nor additional sensor-based fertilization, had clearly lower ρ0 values at 848 nm than the other plots. The k parameter remained more or less the same as on the first measuring day. The Θ parameter, in contrast, showed a strong increase at both wavelengths (i.e., the values became less negative), indicating a reduction in backward scattering intensity (compare Figure 6c,f and Figure 7c,f).

3.4. Plot Statistics

Figure 8 shows the interpolated polar plots of the ANIFs at 658 nm and 848 nm for all pixels within Plot C and Plot E, two plots that varied strongly in LAI and canopy cover. Pixels within a 3 m distance of the tractor tracks, which ran through the center of the plots (Figure 1), were excluded from this analysis. At both wavelengths, there was a clear backward scattering anisotropy in the plots, which was most pronounced at 658 nm. During the flight on 9 June 2016, we observed a weaker backward scattering anisotropy in Plot C at 658 nm than in Plot E, as indicated by the density of the isolines. This was likely because the potato rows in Plot C were not continuous, either because of missing plants in the rows or because of the small size of the plants (Figure 4b). This in turn resulted in fewer strong shadows and thereby a weaker backward scattering intensity. The rows of potato plants in Plot E, on the other hand, had fewer gaps and therefore cast stronger shadows, resulting in a stronger backward scattering intensity. During the flight on 19 July 2016, the canopy of Plot E had fully developed into a closed surface in which the row structure was hardly visible anymore. This resulted in less pronounced shadows cast by the potato rows and thus in weaker backward scattering on this date. On both dates, the backward scattering intensity at 848 nm remained more or less similar for both plots.
The average RPV parameters, based on the RPV parameter values obtained individually for every pixel within the experimental plots, are shown in Figure 9. Again, pixels within a 3 m distance of the tractor tracks were excluded. The ρ0 parameter followed the pattern of a typical vegetation spectrum. In general, a higher ρ0 value was observed in the green and NIR wavelength regions for plots that had a higher canopy cover on 19 July 2016. This is most clear for the plots where there was a strong increase in canopy cover between the two dates (Table 4). The k parameter was in general <1 at all wavelengths, indicating a bowl-shaped anisotropy pattern over the whole sampled wavelength region. The Θ parameter, like the ρ0 parameter, followed the shape of a vegetation spectrum. The strongest backward scattering effects (the lowest Θ values) were observed in the visible wavelength region, with a minimum at 674 nm, where shadow effects are strongest and multiple scattering is absent due to the absorption of radiation by chlorophyll. In the NIR region, where shadow effects are diminished due to the high reflectance and transmittance of vegetation, the Θ values were in general less negative than in the visible region. The Θ values obtained for the flight on Day 2 were consistently higher (less negative) than those obtained for the flight on Day 1, indicating a reduced backward scattering intensity with increased canopy cover.

3.5. RPV Parameters vs. Canopy Cover and LAI

Figure 10 shows the relation between the average RPV parameters of all pixels within each experimental plot and the canopy cover at 658 nm and 848 nm, respectively. On both dates, an increased canopy cover resulted in a decrease in the ρ0 parameter at 658 nm and an increase at 848 nm. This trend can be explained by the lower reflectance of vegetation at 658 nm compared to the soil background reflectance at this wavelength: a higher canopy cover therefore resulted in a lower reflectance, and thus a lower ρ0 value. The opposite holds at 848 nm, where the soil reflectance was lower than the vegetation reflectance. On 9 June 2016 at 658 nm and on 19 July 2016 at 848 nm, the relations between ρ0 and canopy cover were strong, as indicated by R² values of 0.764 and 0.753, respectively.
The relation between the k parameter and canopy cover was less obvious. On Day 1 at 658 nm, we observed a slight decrease in the k parameter with increased canopy cover (R² = 0.478). On both dates at 848 nm, the relation between the k parameter and canopy cover was very weak, as indicated by R² values of 0.262 and 0.211 on Day 1 and Day 2, respectively.
On Day 1, we observed a strong decrease in the Θ parameter (indicating increased backward scattering intensity) with an increase in canopy cover at 658 nm. On Day 2, on the contrary, we observed an increase in the Θ parameter with increasing canopy cover. The decrease in the Θ parameter on Day 1 was most likely due to the aforementioned gaps in the potato rows in the plots with low canopy cover on this day: gaps in the rows result in weaker shadows and thus weaker backward scattering. On Day 2, when there were hardly any gaps left in the rows, we observed a decrease in backward scattering intensity with increasing canopy cover. This is likely due to the disappearance of the row structure as the potato rows grew into each other, narrowing and eventually closing the space between the rows that causes the strong shadow effects.
Finally, we studied the correlation of the RPV parameters with the canopy cover and the LAI, respectively, for the different experimental plots at all measured spectral bands (Figure 11). We calculated Kendall's tau ranking correlation coefficient, since not all canopy cover, LAI, and RPV parameter values were normally distributed, and the number of plots, and thus the number of observations (n = 8), was rather low. Kendall's tau takes values between −1 and +1, where a positive value indicates that the ranks of the two variables increase together and a negative value indicates that the rank of one variable increases while the rank of the other decreases. The closer Kendall's tau gets to +1 or −1, the stronger the correlation between the variables. Since the flight on Day 1 was performed at an early phase of the potato crop, before it reached maximum LAI, and the flight on Day 2 at a late phase, after maximum LAI had been reached, both flights were first analyzed separately and then combined.
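For illustration, such a ranking correlation can be computed with SciPy's kendalltau function; the per-plot values below are hypothetical placeholders, not data from this study.

```python
import numpy as np
from scipy.stats import kendalltau

# Hypothetical per-plot values (n = 8 plots, A-H) for a single spectral band:
theta_param  = np.array([-0.28, -0.25, -0.18, -0.22, -0.30, -0.27, -0.24, -0.26])
canopy_cover = np.array([55, 62, 35, 48, 71, 66, 58, 64])  # percent, hypothetical

tau, p_value = kendalltau(theta_param, canopy_cover)
print(f"Kendall's tau = {tau:.2f}, p = {p_value:.3f}")  # significant if p < 0.05
```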
The correlation of the ρ0 parameter with canopy cover (Figure 11a, upper graph) on Day 1 was negative in the visible part of the spectrum, whereas it was positive in the NIR. On Day 2, this correlation was only negative in the red. A significant correlation at the 5% significance level (p = 0.05) was only observed in the NIR for both dates. The pattern of the correlation between ρ0 and LAI was quite similar to that found for canopy cover on both dates (compare Figure 11b with Figure 11a, upper graphs).
The correlation of the k parameter with canopy cover (Figure 11a, middle graph) was strongest (negative) on Day 1 in the visible part of the spectrum. Only the correlation at 658 nm was significant. The correlations between the k parameter and LAI (Figure 11b, middle graph) were quite similar to those for canopy cover; none of them were significant at the 5% level.
The correlation between the Θ parameter and canopy cover was negative in the visible part of the spectrum and positive in the NIR on Day 1, but not significant (Figure 11a, lower graph). On Day 2, the correlations were close to zero, except for positive, non-significant correlations in the visible part. The correlations between the Θ parameter and LAI were similar to those for canopy cover. For LAI, the correlations in the visible region were significant on Day 1 (Figure 11b, lower graph).
When the data collected on both days were combined, the correlation between the ρ0 parameter and canopy cover increased slightly over the whole spectrum (Figure 11a, upper graph). In contrast, the correlation between the ρ0 parameter and LAI decreased when both days were combined, which indicates that the ρ0 parameter was more sensitive to canopy cover than to LAI, and that the structural differences of the potato crop between the two dates were not well represented by this parameter. This was likely because the LAI did not change much between the two dates, while there was a strong increase in canopy cover (Table 4). Over the whole spectrum, there was a positive correlation between the Θ parameter and canopy cover when the data of both dates were combined (Figure 11a, lower graph), suggesting that, in general, the backward scattering intensity decreased (the Θ parameter increased) as the canopy cover increased.

4. Discussion

In this paper, we demonstrated how to extract multi-angular observations from overlapping pixels captured by a frame-based camera during a typical UAV mapping flight. After orthorectification and georeferencing of the individual images taken at all measured spectral bands, we were able to calculate the observation geometry of all georeferenced ground pixels based on the pixel locations and the camera positions from which the pixels were observed.
During the two flights performed during the summer of 2016 over an experimental potato field, pixels were captured in up to 40 different images collected from different positions along the flight path of the UAV (Figure 4), resulting in up to 40 different observation geometries per pixel. Although the angular sampling coverage of these observations was limited, strong anisotropy effects occurred (Figure 5a,b). This highlights the importance of taking reflectance anisotropy into account when data are collected at off-nadir viewing angles. Either the off-nadir data need to be corrected and/or normalized to a standardized observation geometry [53], or the exact illumination and observation geometry needs to be considered in the further analysis of the data [54].
In this study, we sampled a relatively large area, which resulted in a relatively low angular sampling coverage. To achieve a higher sampling density, we suggest focusing on smaller areas and designing flight paths with higher overlap and/or introducing flight lines in multiple azimuth directions over the area of interest, as done, for example, by Honkavaara et al. [55] (p. 78, Figure 4). Our focus, however, was to derive anisotropy information during a typical UAV mapping flight, and we therefore did not test any flight pattern other than the one displayed in Figure 1. Another option to sample larger VZAs would be to mount the camera tilted under the UAV. However, this could complicate the photogrammetric processing steps.
Several other methods to perform multi-angular reflectance measurements of vegetation targets using UAVs have recently been proposed, ranging from spectrometers controlled by a gimbal [34,35] to the exploitation of the wide field of view of a hyperspectral line scanner [36]. Using these methods, clear anisotropy effects were observed for different vegetation targets at off-nadir viewing angles, where typically higher reflectance factors were observed in the backward scattering direction, with emphasis on view angles close to the hotspot position [34,36]. The drawback of these methods is that the measurements provide only the anisotropy information of a small area or the average anisotropy of a larger surface, respectively. Both methods lack information on the spatial distribution of the observed anisotropy effects. The method used in this paper does provide spatially explicit information on anisotropy. By fitting BRDF models, as we did with the RPV model, information on anisotropy can be stored and visualized in maps.
In this study, we used an FPI frame-based camera, which complicates the photogrammetric processing due to the movement of the UAV during the collection of a datacube [48]. In addition, the movement of the UAV during the flight caused the angular coverage per pixel to be slightly different for each spectral band (Figure 5). For the fitting of a BRDF model, this does not have to be a problem as long as a large number of observation angles is available per pixel.
To capture a representative number of potato rows and spaces between the potato rows in each ground pixel, we resampled the original ground pixels (~8 cm) to 5 m ground pixels. For more homogeneous surfaces or surfaces with smaller height differences, a smaller or even the original ground pixel size can be used. The methodology proposed in this study can easily be applied to multispectral camera systems, which are more commonly available. We used the Rikola camera system because it was the camera we had available. Multispectral cameras, which in general have a higher spatial resolution, allow for analysis at a higher level of detail. If local legislation allows it, these cameras can be flown at greater heights above ground level to cover a larger surface area while maintaining a high spatial resolution. Multispectral camera systems with bands in both the visible and the NIR capture the most pronounced changes in anisotropy that relate to increased canopy cover in a potato crop, as can be seen in the Θ and ρ0 parameter spectra in Figure 9 and the correlation spectra in Figure 11.
We studied the magnitude of anisotropic reflectance effects by analyzing the anisotropy parameters obtained by fitting the RPV model through the multi-angular observations of each georeferenced ground pixel. We observed a backward scattering anisotropy over the whole study area, which was most pronounced in the visible wavelength region and less pronounced in the NIR region, as is typical for vegetation targets [2,3].
In addition, we studied the effects of canopy cover and LAI on reflectance anisotropy. On the first measuring day, we observed an increase in backward scattering intensity with increased canopy cover at visible wavelengths. This might be related to the larger number of gaps in the potato rows on this day, which resulted in weaker shadows and thus a weaker backward scattering intensity. A higher canopy cover on this day occurred in plots where the potato rows had fewer gaps, which resulted in stronger shadow effects. On the second measuring day, when there were fewer gaps in the rows, we observed a decrease in backward scattering intensity with an increase in canopy cover. This decrease is likely the result of the space between consecutive rows filling up as the potato plants grew into each other (the potato plants had fallen over due to long and unstable haulms in combination with unfavorable weather conditions), which reduced the shadows created by the rows. A decrease in backward scattering intensity with canopy closure was also observed in a previous UAV-based anisotropy study of potato canopy development [36].
This study demonstrates that pixel-wise multi-angular observations can be obtained from frame-based cameras by calculating the VAAs and VZAs of georeferenced ground pixels that were captured from multiple camera positions. The information contained in these multi-angular observations is valuable for the correction of anisotropic reflectance effects, but it also provides information on vegetation characteristics that complements the spectral data. Moreover, knowing the observation geometry of every pixel makes it possible to select and analyze only observations that were taken from, for example, the backward scattering direction. Observations from this direction have been demonstrated to be more useful for LAI prediction [56]. Besides this, in the backward scattering direction shadows are minimized and observations are more sensitive to vegetation pigments [57], which makes frame-based measurements in combination with their observation geometry also interesting for the study of biochemical vegetation parameters.

5. Conclusions

The overlap of the images collected during a typical UAV mapping flight with a frame camera provides multi-angular views of georeferenced ground pixels. In this paper, we described how to extract the observation geometry of these multi-angular views. The results demonstrate that, under clear-sky illumination conditions, strong anisotropy effects occur in off-nadir observations made with a frame camera.
By fitting the RPV model through the multi-angular measurements, we parameterized and interpreted the anisotropic reflectance effects captured during two UAV flights in the growing season of 2016 for a potato field that contained eight experimental plots with different LAI and canopy cover levels. Analysis of the anisotropy patterns indicated that in general the potato crop had a higher reflectance in the backward scattering direction. An increase in canopy cover during the growing season resulted in a reduction of this backward scattering, which can be attributed to the diminishing row structure with increasing canopy cover, and thereby weaker row-induced shadow effects.
The results of this study indicate that the parameters describing the anisotropy with the RPV model contain information on structural characteristics of the potato crop such as LAI and canopy cover. This suggests that including such anisotropy information may lead to more accurate estimates of these characteristics. Future research will focus on how to use this anisotropy information to better estimate vegetation parameters, for example, by using radiative transfer models. Moreover, since the method described in this paper can in theory be used to study the anisotropy of any surface, we will apply it to other vegetation, crop, and soil targets as well. Special focus will be on the influence of canopy development on the anisotropy signal.

Acknowledgments

This research was funded by a grant from the User Support Programme Space Research (GO/12-15) in The Netherlands. The authors would like to thank the Wageningen University & Research Unmanned Aerial Remote Sensing Facility (UARSF) for providing the UAV and sensor system used in this study. In addition, the authors thank Jacob van den Borne for the preparation of and access to the experimental field and Marnix van den Brande for doing the LAI measurements.

Author Contributions

P.P.J.R., J.M.S., and L.K. conceived and designed the experiments; P.P.J.R. and J.M.S. performed the experiments; P.P.J.R., J.M.S., H.M.B., and J.G.P.W.C. analyzed the data; P.P.J.R., J.M.S., H.M.B., and J.G.P.W.C. contributed reagents/materials/analysis tools; P.P.J.R., J.M.S., H.M.B., L.K., and J.G.P.W.C. wrote the paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Schaepman-Strub, G.; Schaepman, M.E.; Painter, T.H.; Dangel, S.; Martonchik, J.V. Reflectance quantities in optical remote sensing-definitions and case studies. Remote Sens. Environ. 2006, 103, 27–42. [Google Scholar] [CrossRef]
  2. Kimes, D.S. Dynamics of directional reflectance factor distributions for vegetation canopies. Appl. Opt. 1983, 22, 1364–1372. [Google Scholar] [CrossRef] [PubMed]
  3. Sandmeier, S.; Müller, C.; Hosgood, B.; Andreoli, G. Physical mechanisms in hyperspectral BRDF data of grass and watercress. Remote Sens. Environ. 1998, 66, 222–233. [Google Scholar] [CrossRef]
  4. Sun, T.; Fang, H.; Liu, W.; Ye, Y. Impact of water background on canopy reflectance anisotropy of a paddy rice field from multi-angle measurements. Agric. For. Meteorol. 2017, 233, 143–152. [Google Scholar] [CrossRef]
  5. Hill, M.J.; Averill, C.; Jiao, Z.; Schaaf, C.B.; Armston, J.D. Relationship of MISR RPV parameters and MODIS BRDF shape indicators to surface vegetation patterns in an australian tropical savanna. Can. J. Remote Sens. 2008, 34 (Suppl. S2), S247–S267. [Google Scholar] [CrossRef]
  6. Lavergne, T.; Kaminski, T.; Pinty, B.; Taberner, M.; Gobron, N.; Verstraete, M.M.; Vossbeck, M.; Widlowski, J.L.; Giering, R. Application to MISR land products of an RPV model inversion package using adjoint and hessian codes. Remote Sens. Environ. 2007, 107, 362–375. [Google Scholar] [CrossRef]
  7. Widlowski, J.L.; Pinty, B.; Gobron, N.; Verstraete, M.M.; Diner, D.J.; Davis, A.B. Canopy structure parameters derived from multi-angular remote sensing data for terrestrial carbon studies. Clim. Chang. 2004, 67, 403–415. [Google Scholar] [CrossRef]
  8. Bousquet, L.; Lachérade, S.; Jacquemoud, S.; Moya, I. Leaf BRDF measurements and model for specular and diffuse components differentiation. Remote Sens. Environ. 2005, 98, 201–211. [Google Scholar] [CrossRef]
  9. Huang, W.; Niu, Z.; Wang, J.; Liu, L.; Zhao, C.; Liu, Q. Identifying crop leaf angle distribution based on two-temporal and bidirectional canopy reflectance. IEEE Trans. Geosci. Remote Sens. 2006, 44, 3601–3608. [Google Scholar] [CrossRef]
  10. Feingersh, T.; Ben-Dor, E.; Filin, S. Correction of reflectance anisotropy: A multi-sensor approach. Int. J. Remote Sens. 2010, 31, 49–74. [Google Scholar] [CrossRef]
Figure 1. The eight experimental plots (A–H) in the potato field with different initial nitrogen (N) fertilization levels (horizontal zones), and additional nitrogen fertilization levels (vertical zones). The red line indicates the flight path flown during Flight 1 on 9 June 2016 (Table 2). The base map is an RGB composite (R = 674 nm, G = 559 nm, and B = 500 nm), produced with the data collected during the flight.
Figure 2. An example of the observation geometry of the pixels of one of the georeferenced images. (a) An RGB composite (R = 674 nm, G = 559 nm, and B = 500 nm) of the study area, collected on 9 June 2016, showing the location of the current image (black line) and the position of the unmanned aerial vehicle (UAV) (sun-shaped symbol); (b) the reflectance factors at 864 nm of the current image; (c) the relative azimuth angles of the pixels of the current image, where 0° is the forward scattering direction and 180° is the backward scattering direction. The solar principal plane is indicated by the black line; (d) the view zenith angle of the pixels of the current image. Projection of (a–d) is WGS84 UTM 31N.
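The per-pixel observation geometry shown in panels (c) and (d) can be derived from the camera position and each pixel's ground coordinates. The following Python sketch illustrates one way to do this; the function name, the easting/northing/height inputs, and the azimuth-folding convention are assumptions for illustration, not the authors' implementation, and whether 0° maps to the forward or the backward scattering direction depends on the convention adopted.

import numpy as np

def observation_geometry(cam_enu, pix_enu, solar_azimuth_deg):
    """View zenith angle and relative azimuth angle (degrees) for one pixel.
    cam_enu / pix_enu: (easting, northing, height) of the camera and the ground
    pixel in a metric projection such as WGS84 UTM 31N; solar_azimuth_deg is the
    solar azimuth measured clockwise from north."""
    d = np.asarray(cam_enu, float) - np.asarray(pix_enu, float)   # pixel-to-camera vector
    vza = np.degrees(np.arctan2(np.hypot(d[0], d[1]), d[2]))      # view zenith angle
    vaa = np.degrees(np.arctan2(d[0], d[1])) % 360.0              # view azimuth, clockwise from north
    raa = abs(vaa - solar_azimuth_deg) % 360.0
    if raa > 180.0:                                               # fold into the 0-180 degree range
        raa = 360.0 - raa
    return vza, raa

# Example: UAV 120 m above the ground, pixel about 30 m to the north-east of the camera.
print(observation_geometry((0.0, 0.0, 120.0), (21.2, 21.2, 0.0), 145.0))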
Figure 3. Normalized difference vegetation index (NDVI) values of the study area on 9 June 2016 (a) and 19 July 2016 (b), both in WGS84 UTM 31N projection. Estimated leaf area index (LAI) in the experimental plots based on the weighted difference vegetation index (WDVI), calculated from Cropscan measurements (c). The dates at which the additional sensor-based nitrogen (N) fertilization was applied to Plots B, D, F, and H are indicated by the solid lines. The date at which all plots received potassium (K) fertilization is indicated by the dashed line.
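For context on how panels like these are derived: NDVI follows its standard definition, and the Cropscan-based LAI was estimated through the WDVI. The Python sketch below is a minimal illustration only; the soil-line slope and the LAI–WDVI calibration constants are placeholder values, not those used in this study.

import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index."""
    return (nir - red) / (nir + red)

def wdvi(nir, red, soil_slope=1.3):
    """Weighted difference vegetation index; soil_slope is the NIR/red slope of
    the bare-soil line (placeholder value)."""
    return nir - soil_slope * red

def lai_from_wdvi(w, wdvi_inf=0.6, alpha=0.4):
    """Inverse exponential LAI-WDVI relation (CLAIR-type model); wdvi_inf and
    alpha are hypothetical calibration constants, not values from this study."""
    return -(1.0 / alpha) * np.log(1.0 - w / wdvi_inf)

# Plausible canopy reflectance factors near 674 nm (red) and 848 nm (NIR).
red, nir = 0.04, 0.45
print(ndvi(nir, red), wdvi(nir, red), lai_from_wdvi(wdvi(nir, red)))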
Figure 4. The number of observations per pixel, captured during the flight on 9 June 2016, at 658 nm (a). The camera positions (numbers in white circles) from which an example pixel (white square between Plot C and D) was captured (b). The red lines indicate the flight path of the UAV and the black dots indicate the camera positions that did not capture the example pixel. Both (a) and (b) are in WGS84 UTM 31N projection.
Figure 5. Linearly interpolated polar plots of the anisotropy factors (ANIFs) observed for the example pixel in Figure 4 at 658 nm (a) and 848 nm (b), respectively. The numbers indicate the camera positions and correspond to the numbers in Figure 4b. The azimuth angles are relative azimuth angles, where 0° is the forward scattering direction and 180° is the backward scattering direction. The vertical dashed line indicates the solar principal plane.
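The anisotropy factor plotted here is conventionally defined as the reflectance factor in a given viewing direction normalized by the nadir reflectance factor (this is the usual definition; the exact normalization used in the study is described in its methods):

ANIF(\lambda, \theta_v, \varphi) = \frac{R(\lambda, \theta_v, \varphi)}{R_{nadir}(\lambda)},

where \theta_v is the view zenith angle and \varphi the relative azimuth angle, so ANIF > 1 marks viewing directions that appear brighter than nadir and ANIF < 1 directions that appear darker.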
Figure 6. ρ0, k, and Θ parameters obtained by fitting the Rahman–Pinty–Verstraete (RPV) model through the measurements of Flight 1 (9 June 2016) at 658 nm (a–c) and at 848 nm (d–f). All figures are in WGS84 UTM 31N projection.
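For reference, the three-parameter RPV model whose parameters are mapped in Figures 6 and 7 is commonly written as follows (the standard formulation after Rahman, Pinty, and Verstraete; conventions in the study may differ slightly, and in the three-parameter version the hot-spot parameter is set equal to \rho_0):

\rho(\theta_s, \theta_v, \varphi) = \rho_0 \, \frac{\cos^{k-1}\theta_s \, \cos^{k-1}\theta_v}{(\cos\theta_s + \cos\theta_v)^{1-k}} \, \frac{1 - \Theta^2}{\left(1 + 2\Theta\cos g + \Theta^2\right)^{3/2}} \left[ 1 + \frac{1 - \rho_0}{1 + G} \right],

with \cos g = \cos\theta_s \cos\theta_v + \sin\theta_s \sin\theta_v \cos\varphi and G = \sqrt{\tan^2\theta_s + \tan^2\theta_v - 2\tan\theta_s \tan\theta_v \cos\varphi}. Here \rho_0 scales the overall reflectance level, k describes the bowl (k < 1) or bell (k > 1) shape of the anisotropy, and \Theta is the asymmetry parameter of the Henyey–Greenstein phase function, controlling the relative strength of forward versus backward scattering.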
Figure 7. ρ0, k, and Θ parameters obtained by fitting the RPV model through the measurements of Flight 2 (19 July 2016) at 658 nm (a–c) and at 848 nm (d–f). All figures are in WGS84 UTM 31N projection.
Figure 8. The reflectance anisotropy derived from all pixels within experimental Plot C (top) and Plot E (bottom), shown as linearly inter- and extrapolated polar graphs at 658 nm and 848 nm, collected on 9 June 2016 (left) and 19 July 2016 (right), respectively. The white stars indicate the position of the sun during data collection and the black dots indicate the measurement positions.
Figure 9. Average ρ0, k, and Θ parameters obtained by fitting the RPV model through each individual pixel in the experimental plots on both dates. The colored surfaces indicate the standard deviations.
Figure 10. The relation between the ρ0, k, and Θ parameters of the RPV model of all pixels in each experimental plot (indicated by letters A–H) and the canopy cover, at 658 nm (black circles) and 848 nm (gray triangles), respectively, for both dates. The error bars indicate the standard deviation.
Figure 11. Ranking correlation (Kendall's tau) between the ρ0, k, and Θ parameters of the RPV model and the canopy cover (a) and Cropscan LAI (b) for the flights of Day 1 and Day 2 separately and for the flights of both days combined. The gray-shaded areas indicate the significance thresholds: values above or below the dark-gray areas were significant for the analysis of both dates combined (n = 16) and values above or below the light-gray areas were significant for the analysis of the separate days (n = 8), both at the 5% significance level.
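The ranking correlations summarized in this figure can be reproduced with a standard Kendall's tau implementation. The sketch below uses the 9 June 2016 canopy cover values from Table 4; the per-plot RPV k values are hypothetical placeholders, not results from the study.

from scipy.stats import kendalltau

# Canopy cover (%) for Plots A-H on 9 June 2016 (Table 4).
canopy_cover = [61.8, 67.7, 31.5, 46.5, 62.5, 69.3, 61.5, 58.7]
# Hypothetical per-plot mean RPV k parameters at 848 nm (illustrative only).
k_848 = [0.71, 0.68, 0.93, 0.85, 0.70, 0.66, 0.72, 0.75]

tau, p_value = kendalltau(canopy_cover, k_848)
print(f"Kendall's tau = {tau:.2f}, p = {p_value:.3f}")  # n = 8; significant if p < 0.05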
Table 1. Overview of the nitrogen (N) fertilization applied to the various plots within the potato experimental field in 2016.
Plot | Initial Fertilization, Before Planting (kg N/ha) | Additional Fertilization, 28 June 2016 (kg N/ha) | Additional Fertilization, 15 July 2016 (kg N/ha)
A | 40 | 0 | 0
B | 40 | 42 | 30
C | 0 | 0 | 0
D | 0 | 140 | 30
E | 70 | 0 | 0
F | 70 | 0 | 49
G | 25 | 0 | 0
H | 25 | 84 | 38
Table 2. Flights, times, and solar geometry (solar azimuth angle (SAA) and solar zenith angle (SZA)). Time is local time (UTC + 2 h). The start time indicates the time at which the first used image was taken and the end time indicates the time at which the last used image was taken.
Flight # | Date | Start Time | End Time | SAA (°) | SZA (°)
1 | 9 June 2016 | 12:18 | 12:25 | 144–147 | 32–32
2 | 19 July 2016 | 12:29 | 12:37 | 147–150 | 34–33
Table 3. Center wavelengths and Full-Width-at-Half-Maximum (FWHM) of the 16 spectral Rikola bands used in this study.
Band | Center Wavelength (nm) | FWHM (nm)
1 | 500.2 | 14.8
2 | 547.0 | 13.2
3 | 558.8 | 13.0
4 | 568.8 | 12.9
5 | 657.6 | 13.0
6 | 673.6 | 13.2
7 | 705.8 | 13.1
8 | 739.0 | 19.4
9 | 782.8 | 18.5
10 | 791.6 | 18.4
11 | 810.3 | 18.1
12 | 829.0 | 17.8
13 | 847.8 | 17.6
14 | 864.7 | 17.4
15 | 878.7 | 17.3
16 | 894.7 | 17.1
Table 4. Leaf area index based on Cropscan measurements and the canopy cover during the two measurement days.
Plot | Cropscan (LAI), 9 June 2016 | Canopy Cover (%), 9 June 2016 | Cropscan (LAI), 19 July 2016 | Canopy Cover (%), 19 July 2016
A | 3.35 | 61.8 | 3.43 | 91.9
B | 4.31 | 67.7 | 4.08 | 93.8
C | 2.45 | 31.5 | 2.19 | 62.1
D | 3.20 | 46.5 | 3.49 | 91.0
E | 4.20 | 62.5 | 3.73 | 92.8
F | 4.48 | 69.3 | 3.79 | 92.8
G | 3.62 | 61.5 | 2.99 | 88.5
H | 3.79 | 58.7 | 3.19 | 89.8
