Article

Detection of New Zealand Kauri Trees with AISA Aerial Hyperspectral Data for Use in Multispectral Monitoring

1 Environmental Remote Sensing and Geoinformatics, Trier University, D-54296 Trier, Germany
2 Te Kura Ngahere | School of Forestry, University of Canterbury, Christchurch 8041, New Zealand
3 Manaaki Whenua | Landcare Research, Palmerston North 4472, New Zealand
* Author to whom correspondence should be addressed.
Remote Sens. 2019, 11(23), 2865; https://doi.org/10.3390/rs11232865
Submission received: 11 October 2019 / Revised: 18 November 2019 / Accepted: 28 November 2019 / Published: 2 December 2019
(This article belongs to the Section Forest Remote Sensing)
Figure 1. Kauri growth classes used in this study, depending on the mean crown diameter (cdm). (Photos: [39]).
Figure 2. (a) Location of the Waitakere Ranges on the North Island of New Zealand, west of Auckland City. The general area with naturally occurring kauri in New Zealand [2] is marked as hatched. (b) Study sites in the Waitakere Ranges with the reference crowns marked in red (background map: [42]).
Figure 3. Reference crowns (total 3165) used in the analysis, per class and diameter.
Figure 4. Mean spectra of the target classes “kauri”, “dead/dying” and “other” with standard deviations (stdev).
Figure 5. Jeffries–Matusita separability [61] of the three target classes for different spectral ranges. A value larger than 1.9 indicates a high separability. The analysis was based on MNF transformations for all bands in the different spectral ranges.
Figure 6. Mean spectra of kauri (thick black line) and the six other canopy species (grey) that were most easily confused with kauri. The number of pixels (pix) used to generate the mean spectra is given in parentheses. The spectra of these species show the lowest separability from the kauri spectrum in this study (see Table A2).
Figure 7. Mean spectra of kauri (black) and five other canopy species (grey) that have the highest separabilities from the kauri spectrum in this study (see Table A2). The number of pixels (pix) used to generate the mean spectra is given in parentheses.
Figure 8. Mean spectra of the target classes “kauri”, “dead/dying” and “other” with standard deviations (“stdev”). Below: band positions of 13 selected spectral indices.
Figure 9. Performance of selected indices and index combinations in identifying the class “dead/dying” (light grey) and in distinguishing between “kauri” and “other vegetation” (dark grey) with an RF classification (five-fold random split, 20 repetitions). Note that the x-axis starts at 55%.
Figure 10. Performance of the final 4–8-band index combinations in distinguishing the three target classes “kauri”, “dead/dying” and “other” canopy vegetation (RF, five-fold random split, 20 repetitions). Note that the y-axis starts at 89%.
Figure 11. RGB images of the first three bands of MNF transformations [49] from: (a) the VIS to NIR1 spectral range (431–970 nm); (b) VIS to NIR2 (431–1327 nm); and (c) the full spectral range from VIS to SWIR (431–2337 nm). The importance of the NIR2 and SWIR spectrum is visible in the higher colour contrast of kauri crowns compared to the VNIR image. The numbers in the kauri polygons indicate the stress symptom class for the crown, with 1 = non-symptomatic and 5 = dead.
Figure 12. Histograms of selected indices on sunlit pixels for all crown diameters, with the class “kauri” marked in light blue, the class “dead/dying” in red and the class “other” in dark blue. (a) Histogram of the mNDWI-Hyp index, which performed best at separating the class “kauri” from other vegetation by capturing distinctive features in the NIR2 region. For the separation of the class “dead/dying”, indices in the RED/NIR1 region are better suited, such as (b) the SR800 and (c) the NDNI index (see Table A3 for descriptions of these indices).
Figure 13. Overall accuracies for two selected sets of six and eight bands in the visible to NIR1 range. The accuracies are calculated for two and three target classes, both with and without an additional CHM layer. The results are based on an RF classification with a three-fold split in 10 repetitions on 94,971 pixel values, including small crowns (<3 m diameter). The standard deviations vary from 0.12 to 0.2.
Figure 14. Combined results of 10 RF classifications with a five-fold stratified random split with different seed values. Overview (left) and detailed maps (right) for the Cascades (a,b), Maungaroa (c,d) and Kauri Grove areas (e,f). The numbers indicate the symptom classes in kauri crowns (1 = non-symptomatic, 5 = dead).

Abstract

The endemic New Zealand kauri trees (Agathis australis) are of major importance for the forests in the northern part of New Zealand. The mapping of kauri locations is required for the monitoring of the deadly kauri dieback disease (Phytophthora agathidicida (PTA)). In this study, we developed a method to identify kauri trees by optical remote sensing that can be applied in an area-wide campaign. Dead and dying trees were separated into one class, and the remaining trees with no to medium stress symptoms were assigned to the two classes “kauri” and “other”. The reference dataset covers a representative selection of 3165 precisely located crowns of kauri and 21 other canopy species in the Waitakere Ranges west of Auckland. The analysis is based on an airborne hyperspectral AISA Fenix image (437–2337 nm, 1 m2 pixel resolution). The kauri spectra show characteristically steep reflectance and absorption features in the near-infrared (NIR) region with a distinct long descent at 1215 nm, which can be parameterised with a modified Normalised Difference Water Index (mNDWI-Hyp). With a Jeffries–Matusita separability over 1.9, the kauri spectra can be well separated from those of 21 other canopy species. The Random Forest classifier performed slightly better than Support Vector Machine. A combination of the mNDWI-Hyp index with four additional spectral indices based on three red to NIR bands resulted in an overall pixel-based accuracy (OA) of 91.7% for crowns larger than 3 m in diameter. While the user’s and producer’s accuracies for the class “kauri” of 94.6% and 94.8% are suitable for management purposes, the separation of “dead/dying trees” from “other” canopy vegetation poses the main challenge. The OA can be improved to 93.8% by combining “kauri” and “dead/dying” trees into one class, performing separate classifications for low and high forest stands and binning to 10 nm bandwidths. Additional wavelengths and their respective indices improved the OA only by up to 0.6%. The method developed in this study allows an accurate location of kauri trees for area-wide mapping with a five-band multispectral sensor in a representative selection of forest ecosystems.

Graphical Abstract

1. Introduction

New Zealand kauri (Agathis australis (D.Don) Lindl. ex Loudon) are an important component of New Zealand’s northern indigenous forests. The overall distribution of kauri is well known [1,2], but there is an urgent need to locate kauri crowns in more detail for monitoring the deadly kauri dieback disease (Phytophthora agathidicida (PTA)). Current methods for mapping kauri rely on the manual interpretation of nadir and oblique aerial images and photos taken from a helicopter [3,4]. These manual interpretations are elaborate and only suitable for smaller areas. Remote sensing enables large-area coverage with a more objective approach [5]. This study analysed the spectral characteristics of kauri crowns and developed a method to identify their exact position in an area-wide, pixel-based analysis.

1.1. Research Context

Remote sensing technology allows for automatic tree species discrimination based on reflectance signals from passive optical sensors and on structural crown characteristics from active LiDAR sensors [6,7]. Multispectral sensors tend to have a limited number of bands, typically up to six, in the visible (VIS) to near-infrared (NIR) spectral range, with a bandwidth of at least 10 nm, usually broader. Typically, they use two-dimensional staring arrays mounted in the focal plane, such as charge-coupled device (CCD) elements [8]. Hyperspectral sensors can cover the whole spectral continuum up to the short wave infrared (SWIR) range with a high number of narrow bands [9]. The bandwidth of hyperspectral sensors for airborne acquisitions is typically around 3 nm in the visible (VIS) to first near-infrared (NIR1) part of the spectrum (Table 1) and around 10 nm at longer wavelengths. Airborne hyperspectral sensors usually use moving scanner lines in push broom or whisk broom systems, although the first snapshot hyperspectral cameras with staring arrays for airborne use are coming onto the market [8,9,10].
Airborne hyperspectral remote sensing has proven useful in the analysis and identification of individual tree crowns in boreal and temperate [12,13,14,15] and subtropical to tropical forests [16,17,18,19,20]. The NIR bands from 700 to 1327 nm are important for species classification in tropical forests [16,17,18], which are perhaps more similar to kauri forests in terms of structural complexity [7]. The reduction of dimensionality and correlation in hyperspectral datasets can significantly improve the accuracy [21,22,23], as can the extraction of the sunlit part of the crown [24,25]. An object-based classification can increase the accuracy by balancing within-crown variation, noise and illumination effects, and it allows the integration of additional structural and spatial crown statistics [26,27,28]. However, the whole processing chain for individual tree crown identification is complex, processing-intensive and error-prone, which can compromise the advantages of an object-based approach [12,29]. The importance of additional LiDAR data describing height and structural crown characteristics has been confirmed in many studies [13,21,30,31]. However, texture measures can also be integrated based on optical data alone [32].
Airborne multispectral sensors are, for the time being, better suited to covering large areas than hyperspectral sensors, with a wider field of view, a larger signal-to-noise ratio and a more robust technical setup. They are often operated in combination with LiDAR data for tree species classification [33,34,35]. Fassnacht et al. [34] recommended linking the analytical findings in the hyperspectral space with the operational advantages of multispectral sensors.
Most of the kauri in the study area grow in a more diverse second-growth forest [36]. The young growth form of this evergreen conifer is conical with dense foliage (Figure 1). Young kauri often develop under the protection of angiosperm nurse trees. Kauri leaves are linear, 2–5 cm long, with a smooth, leather-like surface. They form a spiky foliage surface with single branches protruding. The leaf colour is less characteristic, with variants from yellow-green to blue-green [37]. Stressed foliage shows all stages of decline, from yellow to brown foliage and bare branches. The spring aspect features the bright green new growth of kauri leaves [2] and an asynchronous flowering of other canopy species. The spectrally more stable summer aspect can be affected in drought years by early ageing and dropping of leaves [38]. Dying stands of infected trees, climbers, vines and epiphytes add to the spectral and spatial complexity of kauri forests, which are more similar to subtropical and tropical forests than to other temperate forests [11,33].
The natural distribution of the endemic New Zealand conifer kauri extends over the warm temperate lowland forests of the upper North Island (Figure 2), although its abundance has been severely reduced by logging associated with European settlement [2]. In the remaining patches of mature kauri forest, the upper canopy is dominated by large dome-shaped kauri with an open crown structure and scattered foliage (Figure 1).

1.2. Objectives and Approach

The overall aim of this study was to develop a cost-efficient method to identify the location of kauri trees in New Zealand’s kauri forests based on optical remote sensing. The method should be applicable for wall-to-wall large-area monitoring with multispectral sensors. Dead and dying trees were mapped in a separate class, since it was not possible to determine spectrally whether they are kauri. Moreover, forest management needs to document the location of dead trees before they are overgrown or fall. The resulting “kauri mask” can then be used for further applications such as a detailed analysis of stress symptoms.
The main objectives of this study were:
  • Objective 1: Identify and compare the spectra of kauri and associated canopy tree species with no to medium stress symptoms and analyse their spectral characteristics and separability.
  • Objective 2: Identify and describe the best spectral indices for the separation of the three target classes “kauri”, “dead/dying trees” and “other” canopy vegetation (see class description below).
  • Objective 3: Define an efficient non-parametric classification method to differentiate the three target classes that is applicable for large area monitoring with multispectral sensors.
We chose a pixel-based approach as it did not require a prior crown segmentation.

2. Materials and Methods

2.1. Study Area

Three sites in the Waitakere Ranges Heritage Area, northwest of central Auckland (Figure 2), cover a representative range of kauri stands of all sizes and stages of stress [36]. The Cascade area (10.3 km2) contains patches of old established kauri stands, the Maungaroa area (5.4 km2) includes mainly second-growth kauri forests, and a diverse selection of mature crowns can be found in the Kauri Grove Valley (1.1 km2). Rough terrain characterises the ranges, with elevations from sea level to a maximum of 336 m in the study sites and 474 m at the highest peak [40]. The climate is warm-temperate and influenced by the proximity of the sea [41].

2.2. Data and Data Preparation

LiDAR data (RIEGL LMS-Q1560 sensor, on average 35 returns/m2 with circa 0.5 ground returns/m2) and RGB aerial images (15 cm) were flown for the three study sites in one acquisition on 30 January 2016. A pit-free digital terrain model (DTM), a digital surface model (DSM) and a canopy height model (CHM) were generated with LAStools [43]. The aerial image was orthorectified in two versions, on the DTM and on the DSM. An additional 7.5 cm RGB aerial image was acquired in summer 2017 [44].
The airborne hyperspectral image was acquired on 15 March 2017 with an AISA Fenix hyperspectral sensor at 1 m pixel resolution and was delivered in 23 stripes in radiance units. The sensor features 448 spectral bands with an average bandwidth of 3.6 nm in the VNIR1 region and 10 nm in the NIR2/SWIR region. The flight conditions were cloud-free but windy, with a high amount of moisture in the forest after recent rain. Reflectance measurements with an ASD field spectrometer were taken as a reference during the flight on homogenous flat areas (grass, gravel, tarmac) and on black and white sheets of 5 m × 5 m.
The atmospheric correction was performed using ATCOR 4 [14] with a variable water vapour correction on the 1130 nm spectral region and a “maritime” atmosphere setting for the aerosol parameters. The spectral distortions of the push broom sensor were addressed by developing a sensor model with an adapted shift in the bandwidths. The parameters for the shift were empirically derived for each sensor part (VNIR1 and SWIR) from atmospheric gas absorption features on a homogenous part of the image. The O2 absorption bands at 760 and 820 nm could be sufficiently corrected by applying the sensor-shift in ATCOR. Remaining spikes and dips in the 940 and 1130 nm regions were removed by applying a non-linear interpolation. The ASD reflectance field measures were used as a reference to evaluate the parameters for the atmospheric correction, not for the analysis itself.
The original image showed some distinct, non-periodic single black and white “bad lines” in the image columns of the bands at the beginning and end of the spectrum and close to the transition between the VNIR and SWIR sensors. These lines were identified by comparing their mean values with the mean values of the directly neighbouring lines, following the local approach described in [45]. For the de-striping, the pixel values in these lines were replaced with the average of the neighbouring pixels.
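A minimal sketch of this de-striping step, assuming the image is held as a NumPy array of shape (rows, columns, bands), that the stripes run along image columns, and that a line counts as “bad” when its mean deviates from the mean of its two neighbours by more than a relative threshold; the threshold value is illustrative, not the one used in the study.

```python
import numpy as np

def destripe_columns(img, rel_threshold=0.15):
    """Replace 'bad' image columns whose per-band mean deviates strongly from
    the mean of their direct neighbours by the average of those neighbours.

    img: array of shape (rows, cols, bands); returns a corrected copy.
    rel_threshold: illustrative relative deviation threshold (assumption).
    """
    out = img.astype(float).copy()
    col_means = out.mean(axis=0)                       # shape (cols, bands)
    for b in range(out.shape[2]):
        for c in range(1, out.shape[1] - 1):
            neighbour_mean = 0.5 * (col_means[c - 1, b] + col_means[c + 1, b])
            # flag the column if it deviates strongly from its neighbours
            if abs(col_means[c, b] - neighbour_mean) > rel_threshold * abs(neighbour_mean):
                out[:, c, b] = 0.5 * (out[:, c - 1, b] + out[:, c + 1, b])
    return out
```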
The geographic distortions were corrected in a two-step approach: First, the basic corrections for the Global Navigation Satellite System (GNSS) position, altitude, roll, pitch, heading and offset between the inertial measurement unit and lens were applied in PARGE [46]. In a second step, the remaining distortions were corrected with an individual polynomial orthorectification per stripe in ERDAS Imagine based on over 2300 ground control points.
Ninety-six of the 448 bands that were most affected by noise and stripes were removed, leaving 352 usable bands for the analysis. The noisy bands were located at the beginning and end of the spectrum and in the water absorption bands.
The 23 corrected stripes were stitched together with “mosaic data seamlines” in ArcGIS into three smaller mosaics covering the three study areas, before they were combined into one large mosaic covering ca. 9 km².
During fieldwork in the 2015/2016 and 2016/2017 summer months, the reference crowns in denser stands were located with a mapping-grade GNSS (Trimble-GeoXH-3.5G) with distance and bearing in circular sampling plots of 20 and 30 m diameter. In open stands, crowns were delineated directly on aerial images and a CHM layer on a field tablet. Table A1 gives an overview of the reference data with scientific names and the priority of neighbouring canopy species, according to their resemblance to and association with kauri. A threshold of at least 40% dead branches visible in the crown area in the 2017 aerial image was defined to identify the class “dead/dying trees”. The sunlit parts were identified with a threshold on the average of the RGB-NIR bands [47]. The challenge was to define a brightness threshold that removes the core shadow areas, which carry no useful spectral information, while keeping the partly shaded inner-crown pixels that still contribute to the species identification. The threshold on the RGB average was defined by comparing the resulting areas with manually identified sunlit parts of the crowns. A brightness threshold was also calculated on the NIR band to match reduced band selections. Edge effects were reduced by removing an internal buffer of 10% of the crown diameter. The final reference set includes 3165 crowns with a total of 95,194 sunlit pixels at 1 m2 resolution (Table A1 and Figure 3).
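The sunlit-pixel masking can be sketched as below; the band layout (R, G, B, NIR as four layers) and the numerical threshold are illustrative assumptions, since the study derived its threshold empirically by comparison with manually delineated sunlit crown parts.

```python
import numpy as np

def sunlit_mask(img_rgbnir, threshold=0.12):
    """Boolean mask of 'sunlit' pixels based on the mean of the R, G, B and
    NIR bands; pixels darker than the threshold are treated as core shadow.

    img_rgbnir: reflectance array of shape (rows, cols, 4) ordered R, G, B, NIR.
    threshold: illustrative brightness cut-off, to be tuned against manually
               delineated sunlit crown parts (assumption, not the study's value).
    """
    brightness = img_rgbnir.mean(axis=2)
    return brightness > threshold
```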
The crown size classes used in this study refer to the mean crown diameter, defined as the average of the maximum and minimum diameter based on the “minimum bounding geometry – rectangle by width” tool in ArcGIS. The thresholds for the size classes were empirically defined from the field measurements to mark the transition from small kauri crowns (>3 m to 4.8 m diameter) to the more open medium crown sizes (>4.8 m to <12.2 m diameter) and the large dome-shaped crowns (>12.2 m diameter) (Figure 1). In addition, very small crowns of <3 m diameter, i.e. below the minimum object size for a 1 m pixel resolution, were marked as a separate class. The information about the crown size was used to interpret the results of the pixel-based classification, not as an attribute in the classification.
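As an illustration of how the mean crown diameter could be derived outside ArcGIS, the sketch below uses Shapely’s minimum rotated rectangle as an approximation of the “minimum bounding geometry – rectangle by width” tool; the size-class function simply encodes the thresholds stated above.

```python
from shapely.geometry import Polygon

def mean_crown_diameter(crown: Polygon) -> float:
    """Approximate the mean crown diameter as the average of the two side
    lengths of the minimum rotated bounding rectangle of the crown polygon."""
    rect = crown.minimum_rotated_rectangle
    x, y = rect.exterior.coords.xy
    sides = [((x[i + 1] - x[i]) ** 2 + (y[i + 1] - y[i]) ** 2) ** 0.5 for i in range(2)]
    return sum(sides) / 2.0

def size_class(cdm: float) -> str:
    """Crown size classes used in this study (thresholds in metres)."""
    if cdm < 3.0:
        return "very small"
    if cdm <= 4.8:
        return "small"
    if cdm < 12.2:
        return "medium"
    return "large"
```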
As a preparation for a separated analysis for different forest types, two forest stand categories “high” and “low” were segmented on the CHM in eCognition (scale 15 m, shape 0.3 and compactness 0.9 [48]) and defined by a mean height threshold of 21 m.
The crowns and thereby the reference pixels were sorted in three target classes for the analysis:
  • “dead/dying trees” with a minimum of 40% visible dead branches in the aerial image;
  • “kauri” that were not classified as “dead/dying”; and
  • “other” canopy vegetation that was not classified as “dead/dying”.
The crowns in the classes “kauri” and “other” showed no to medium stress symptoms with an intact crown architecture.

2.3. Extraction and Analysis of Spectra and Spectral Separabilities

Outliers caused by mixed pixels, single dead branches or patches of deviant plant material could be visually identified in ENVI’s n-D Visualizer using bands 1, 3 and 5 of a Minimum Noise Fraction (MNF) transformation on all 352 bands [49]. These outlier pixels were removed for each class for test purposes. The mean signatures of kauri and associated tree species, the standard deviations and the Jeffries–Matusita separability were calculated both with and without the removal of outlier pixels.
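The Jeffries–Matusita separability reported here can be computed from the Bhattacharyya distance between two class distributions; a small sketch, assuming multivariate normal class statistics estimated from the (e.g. MNF-transformed) pixel spectra of each class.

```python
import numpy as np

def jeffries_matusita(x1, x2):
    """Jeffries-Matusita separability of two pixel sets (n_pixels x n_bands),
    assuming multivariate normal class distributions. Ranges from 0 to 2;
    values above ~1.9 indicate a high separability."""
    m1, m2 = x1.mean(axis=0), x2.mean(axis=0)
    c1, c2 = np.cov(x1, rowvar=False), np.cov(x2, rowvar=False)
    c = 0.5 * (c1 + c2)
    d = m1 - m2
    # Bhattacharyya distance between the two class distributions
    b = (0.125 * d @ np.linalg.solve(c, d)
         + 0.5 * np.log(np.linalg.det(c)
                        / np.sqrt(np.linalg.det(c1) * np.linalg.det(c2))))
    return 2.0 * (1.0 - np.exp(-b))
```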
A Random Forest classification of kauri and 10 neighbouring tree species was calculated with a 10-fold cross-validation in 10 random repetitions. Only crowns larger than 5 m in diameter with no or slight stress symptoms were included, to reduce confusion caused by mixed pixels and declining foliage. A randomly spread subsample was extracted from the more frequent classes to match the species distribution in the study areas. The results are presented in a confusion matrix.

2.4. Band and Indices Selection

The aim of the selection process was to identify a set of 4–8 wavelengths and derived indices to distinguish the three target classes. Multispectral sensors usually feature up to six bands, but since an eight-band multispectral sensor was available, a maximum of eight bands for the index combinations was included in the analysis. This objective has two tasks:
  • separate “dead/dying trees” from less symptomatic “kauri” and “other” canopy vegetation; and
  • distinguish “kauri” from “other” canopy vegetation.
Initially, 52 indices were calculated on the 352 selected hyperspectral bands. In noisy areas of the spectrum, the values of three neighbouring bands were averaged. Indices with a high correlation (>0.98 or <−0.98) were removed by reducing the number of bands and keeping the best-performing indices. For the attribute selection, several ranker methods (Correlation, GainRatio, InfoGain, Symmetrical Uncertainty and Principal Components) were combined in WEKA [50] by applying a weight according to the ranking results. The final combinations with 4–8 bands were identified with a Wrapper Subset Evaluator and the attribute importance of a Random Forest classification. The same selection process was repeated with indices in only the visible to NIR1 spectral range (VNIR1), up to 970 nm.
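A minimal sketch of the correlation-based pruning step, assuming the candidate indices are held in a pandas DataFrame of pixel values; the combined ranker weighting done in WEKA is represented here only by a generic score series, purely for illustration.

```python
import pandas as pd

def drop_correlated(indices: pd.DataFrame, scores: pd.Series, r_max: float = 0.98) -> pd.DataFrame:
    """Remove indices whose absolute pairwise Pearson correlation exceeds r_max,
    always keeping the better-scoring index of a correlated pair.

    indices: one column per candidate index, one row per reference pixel.
    scores:  ranking score per index (higher is better); stand-in for the
             weighted WEKA ranker results (assumption).
    """
    ranked = list(scores.sort_values(ascending=False).index)  # best first
    corr = indices[ranked].corr().abs()
    selected = []
    for name in ranked:
        if all(corr.loc[name, kept] <= r_max for kept in selected):
            selected.append(name)
    return indices[selected]
```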

2.5. Selection and Parametrisation of the Classifier

Random Forest (RF) and Support Vector Machine (SVM) have been used successfully for tree classification in several studies [51,52,53,54]. As non-parametric classifiers, they do not require a normal distribution of the reference data and are well suited to handle a large number of attributes and high variability in the classes [55,56]. The SVM separates the classes by constructing a hyperplane based on support vectors at the outer class edges [57]. The parameters (cost: 1000, gamma: 0.1) were defined in WEKA with the GridSearch package [58]. The Sequential Minimal Optimization function in WEKA for the SVM analysis could handle the three target classes by using pairwise classification.
The RF classifier combines a large number of decision trees based on bootstrap samples with an ensemble learning algorithm [51]. A random selection of a given number of features is used to split each node in the RF implementation in WEKA. The final model is based on the number of similar outcomes (“votes”) from all decision trees [59]. The parameters were systematically tested, and the highest accuracies could be achieved with 500 trees, two attributes per node and a maximum tree depth of 40. The performances of both classifiers with the defined parameters were tested in a five-fold random split of all sunlit pixels with 20 repetitions. As expected, the use of alternative classifiers (Maximum Likelihood, J48 decision tree) yielded inferior results in comparison to RF and SVM.
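For readers working outside WEKA, the reported classifier settings translate roughly into the scikit-learn sketch below; the WEKA and scikit-learn implementations differ in details (e.g. feature sampling in RF and the exact meaning of the SVM cost and gamma parameters), so this is an approximation of the study’s setup rather than a re-implementation, and the feature matrix here is random placeholder data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score

# Placeholder for the study's reference pixels: X holds the selected index
# values per sunlit pixel, y the target class per pixel.
rng = np.random.default_rng(0)
X = rng.random((1000, 5))
y = rng.choice(["kauri", "dead/dying", "other"], size=1000)

rf = RandomForestClassifier(n_estimators=500, max_features=2, max_depth=40, n_jobs=-1)
svm = SVC(kernel="rbf", C=1000, gamma=0.1)

# five-fold split repeated 20 times, one interpretation of the study's
# "five-fold random split with 20 repetitions"
cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=20, random_state=42)
print("RF  mean OA: %.3f" % cross_val_score(rf, X, y, cv=cv).mean())
print("SVM mean OA: %.3f" % cross_val_score(svm, X, y, cv=cv).mean())
```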

2.6. Tests to Further Improve the Accuracy

The default classification was calculated on the defined band selections and parameters. Several tests were conducted to improve the accuracies:
  • resampling of the original bandwidths to 10 nm, 20 nm and 30 nm (a binning sketch is shown after this list);
  • addition of three selected texture measures on the 800 nm band (data range (kernel size 7), variance (kernel size 7) and second moment (kernel size 3)), following the procedure for the index selection;
  • addition of a LiDAR CHM as a layer for the classification;
  • separate classifications for low and high stands; and
  • removal or reclassification of outlier pixels in the training set.
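The resampling to broader bandwidths can be sketched as a simple averaging of all original bands whose centre wavelengths fall into each target bin; this assumes the band centre wavelengths are available as an array and ignores the sensor’s true spectral response functions.

```python
import numpy as np

def bin_spectra(spectra, wavelengths, bin_width=10.0):
    """Average hyperspectral bands into bins of `bin_width` nanometres.

    spectra: array (n_pixels, n_bands); wavelengths: band centres in nm.
    Returns (binned_spectra, bin_centre_wavelengths).
    """
    edges = np.arange(wavelengths.min(), wavelengths.max() + bin_width, bin_width)
    binned, centres = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (wavelengths >= lo) & (wavelengths < hi)
        if in_bin.any():
            binned.append(spectra[:, in_bin].mean(axis=1))
            centres.append(0.5 * (lo + hi))
    return np.column_stack(binned), np.array(centres)
```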
The final accuracies were calculated pixel-wise on test pixels in all crowns. Producer’s and user’s accuracies were determined for the three target classes as the mean values over all repetitions of the RF classifications.
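The producer’s and user’s accuracies per class follow directly from the confusion matrix; a small sketch, assuming reference classes along the rows and predicted classes along the columns.

```python
import numpy as np

def class_accuracies(conf):
    """Producer's (recall) and user's (precision) accuracy per class from a
    confusion matrix with reference classes as rows and predictions as columns."""
    conf = np.asarray(conf, dtype=float)
    producers = np.diag(conf) / conf.sum(axis=1)   # correct / reference totals
    users = np.diag(conf) / conf.sum(axis=0)       # correct / predicted totals
    overall = np.trace(conf) / conf.sum()
    return producers, users, overall

# e.g. for the three target classes in the order ("kauri", "dead/dying", "other")
```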

3. Results and Interpretations

3.1. Results Objective 1: Kauri Spectrum

Compared to the mean spectra of other canopy vegetation, the mean spectrum of kauri pixels (Figure 4) shows a slightly lower reflectance in the green part of the spectrum and lower signals in all spectral regions. The most distinctive feature in the kauri spectrum is a steep ascent from 1000 nm to 1070 nm followed by a long descent to the absorption feature at 1215 nm. The bands of the NIR2 range are the most important for kauri identification, followed by the NIR1 and SWIR1 ranges (Figure 4 and Figure 5). The spectra of very small kauri crowns (<3 m diameter) differ slightly from those of larger kauri (>4.8 m diameter), with a Transformed Divergence value of 1.95 [60].
With a Jeffries–Matusita value of over 1.9 [61], the pixel-based spectra of the 21 other species can all be well separated from the pixel-based kauri spectrum (Table A2). The separability increases after the removal of outlier pixels. The main species that are incorrectly classified as kauri are rimu, tanekaha, rewarewa, tōtara, miro and kawaka (Table A2). These species show spectral features similar to the kauri spectrum, with the long descent in the NIR2 range and lower SWIR values (Figure 6). The spectra of species with a high spectral separability from kauri, such as flax, kanuka, tree fern and pohutukawa, have higher reflectance features in the VIS, NIR and SWIR ranges and a less pronounced descent to the 1215 nm water vapour window (Figure 7).
The overall high separability of kauri from neighbouring species was also confirmed in a classification of kauri and 10 other tree species on the full spectral range of the AISA image (Table 2). Only non-symptomatic crowns larger than 5 m in diameter were chosen, to avoid confusion caused by mixed pixels and declining foliage. The overall accuracy of 94.8% and user’s accuracies from 98.1% for rata to 98.7% for kauri confirm the high spectral separability of kauri and also among the selected 10 tree species. Most species show high producer’s accuracies of over 93%, with 99.1% for kauri. However, tōtara, rewarewa, tanekaha and miro have the lowest producer’s accuracies, from 58% for rewarewa to 77% for miro.
The category “dead/dying” was difficult to separate from the classes “kauri” and “other”, with a user’s accuracy of 80.3% and a producer’s accuracy of only 52.1% in the final setup. In a test with aggregated class percentages per reference crown, the producer’s accuracy for the class “dead/dying” could be improved to 75.5% for a minimum threshold of 15% of the crown area defined as “dead/dying”. The main characteristic features of the spectra of dead/dying trees are a lower chlorophyll absorption in the red region (around 670 nm), a lower reflectance from green leaf scattering in the NIR1 region (around 800 nm), a blue shift of the red edge point and overall high values in the SWIR region (Figure 5). Tests with separate classes for incorrectly classified pixels as well as the inclusion of shadow pixels gave no improvement for the “dead/dying” class.

3.2. Results Objective 2: Indices Selection

A preselection of the 13 best-performing indices over the whole spectrum is described in Table A3, and their positions in relation to the mean spectra of the target classes are illustrated in Figure 8. Figure 9 presents the performance of each index in identifying the class “dead/dying” and in distinguishing “kauri” from “other” canopy vegetation, with the best resulting combinations shown in Figure 10. A paired t-test on the resulting accuracies with a p-value of 0.05 confirmed that these results, and thereby the ranking of the index combinations, are significant.
For a four-band multispectral sensor, the highest performance of 90.1% OA (Figure 10) could be achieved with four indices based on bands in the VIS (670 nm), NIR1 (800 nm) and NIR2 regions (1074 and 1209 nm). The combination of three indices on the red and NIR1 bands helped to identify the class “dead/dying” (Figure 9). The NIR2 spectral range proved to be the most important for the identification of kauri, followed by the NIR1, SWIR1, VIS and SWIR2 spectral ranges (Figure 4 and Figure 11).
The best distinction between kauri and other canopy vegetation could be achieved with a normalised index (mNDWI-Hyp, Figure 12) that captures the distinctive long descent in the NIR2 spectrum. It was first described as an alternative to the Normalised Difference Water Index (NDWI) adapted to Hyperion data [59] and was further modified in this study by using natural logarithm values to address outliers (after [60]).
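A sketch of the mNDWI-Hyp computation as reconstructed in Table A3 (a normalised difference of the log-transformed inverse reflectances at 1074 nm and 1209 nm); the helper that picks the band closest to a target wavelength is an assumption about how the band centres are looked up.

```python
import numpy as np

def band(spectra, wavelengths, target_nm):
    """Reflectance of the band whose centre wavelength is closest to target_nm."""
    return spectra[:, np.argmin(np.abs(wavelengths - target_nm))]

def mndwi_hyp(spectra, wavelengths):
    """mNDWI-Hyp: normalised difference of log(1/R) at 1074 nm and 1209 nm,
    following the formulation reconstructed in Table A3."""
    a = np.log(1.0 / band(spectra, wavelengths, 1074))
    b = np.log(1.0 / band(spectra, wavelengths, 1209))
    return (a - b) / (a + b)
```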
Other indices that are useful to identify kauri like the Moisture Stress Index (MSI), NDWI and Water Band Index (WBI) (Figure 9) also include bands in the NIR1 and NIR2 spectral range. However, in combination with the best performing mNDWI-Hyp index, they did not increase the overall accuracy.
For a five-band sensor, an additional Simple Ratio index with an extra red edge band at 708 nm (SR708) increased the OA to 90.8% for all classes and all crown sizes. The combination of the SR708 with the other three indices on the red to NIR1 bands performed best to distinguish the class “dead/dying” (Figure 9). This five-band combination was considered the best trade-off between the number of bands and the resulting accuracy. It was therefore used as the default combination for the development of the final classification method. The inclusion of further bands and respective indices resulted in only slight improvements in accuracy.
An additional band at 970 nm for a six-band sensor allows the inclusion of a further Normalised Difference index (ND970) and resulted in an OA of 90.9% (Figure 10). This index was developed in this study to describe the characteristically steep ascent in the kauri spectrum from the first NIR water vapour window at 970 nm to the reflectance feature at 1074 nm (Figure 5, Table A3).
With seven multispectral bands, the best result of 91.3% OA could be achieved by adding the Normalised Difference Nitrogen Index (NDNI) with two bands in the SWIR1 region (Figure 9, Table A3). It relates the leaf nitrogen concentration measured in the 1510 nm band to the canopy foliar mass measured at 1680 nm, which in turn depends on the absorption by leaf and canopy water [62]. The kauri spectrum shows a lower magnitude in the slope between the 1510 nm band and the reflectance feature at 1680 nm compared to the mean spectra of the two other target classes (see Figure 5).
As an alternative for a seven-band sensor, the addition of a Photochemical Reflectance Index (PRI) [63], with two bands in the green region, results in an OA of 91.2% (StD 0.19). This index describes the photosynthetic light use efficiency by carbon dioxide uptake. It captures the slightly lower green reflectance feature in the kauri spectrum. A test confirmed its usability on a resampled 10 nm bandwidth.
With eight spectral bands available, the highest OA of 91.3% (StD 0.2) could be achieved by adding both the 970 nm band for the ND970 index and the two SWIR1 bands for the NDNI index to the five bands of the default setup.
In general, NIR2 indices are more important for distinguishing kauri than indices in the visible to NIR1 (VNIR1) range. The best-performing VNIR1 index combination for an eight-band sensor includes bands in the 550–970 nm spectral range (Table 3). This combination resulted in an OA of 84.6% for distinguishing the three target classes (Figure 13). If only six bands are available, three indices on red to NIR1 bands (675–970 nm) classified the three target classes with an OA of 78.4%.

3.3. Results Objective 3: Method Development

The final accuracies are based on an image with five wavelengths (10 nm bandwidth) and five derived indices including the NIR2 bands, according to the recommended index selection for the whole spectrum in the previous sections. It enabled the distinction of “kauri and dead/dying trees” from “other canopy vegetation” (two classes) with a pixel-based overall accuracy (OA) of 93.4%. The three classes, with “dead/dying crowns” as a separate category, could be identified with 91.3% OA (Table 4, Test E). The separation of the class “dead/dying” from the class “other” poses the main challenge, while the pixel-based user’s and producer’s accuracies for the class “kauri” are close to 95% (Table 4). These results are based on an RF classification. For the default setup, the RF classifier (90.9% OA) performed slightly better than the SVM classifier (89.5% OA), required about half the processing time and was easier to optimise. The resulting maps for the final setup applied to independent test crowns in the three study areas are shown in Figure 14. Crowns that were not chosen as test crowns in the 10 repetitions are marked as “unclassified”.
The accuracies for index combinations that include only bands in the visible to NIR1 range are significantly lower with 84.6% OA for three classes on eight bands and 78% on six bands. Combining the classes “kauri” and “dead/dying” improved the OA to 86.9% for the eight-band selection (see Figure 13). Further improvements of about 2% could be achieved by adding a CHM layer.
The full spectral range, represented by 25 MNF bands, resulted in an overall pixel-based accuracy of 93.9% for three classes and 96.2% for two classes. Attempts to remove mixed pixels and noise by applying an MNF forward and backward transformation did not improve the overall accuracy.
A binning to 10 nm helped to remove noise and redundancies (Table 4, Tests B1 and B2), while resampling to 20 and 30 nm proved too coarse to capture the small spectral windows of the selected indices.
A separate classification for low and high stands (Table 4, Test C) improved the accuracy by 1.5%. Adding a CHM layer achieved a similar improvement, but it was not used for the final setups because the LiDAR data do not match the hyperspectral image precisely enough for a direct pixel-based combination. Additional texture features based on the 800 nm NIR band gave a slight improvement in the classification of small crowns but lowered the overall accuracy for the larger crowns. In addition, the partial removal of outliers in the training set (Table 4, Test D) improved the OA slightly, by 0.7%. This method was not applied for the final accuracies because it is too elaborate for large-area applications.
Post-processing by reclassifying kauri pixels with a height lower than 4 m to the class “other” corrected wrongly classified lower shrub areas, but it requires a spatially matching CHM. Merging singular pixels with a majority kernel according to the stand situation improved the pixel-based accuracy and should be considered for further analysis.
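The two post-processing steps described here can be sketched with SciPy’s generic filter; the 3 × 3 kernel size and the integer class coding are assumptions made for illustration only.

```python
import numpy as np
from scipy.ndimage import generic_filter

KAURI, DEAD, OTHER = 1, 2, 3   # assumed integer coding of the three classes

def postprocess(class_map, chm, min_height=4.0, kernel=3):
    """Reclassify 'kauri' pixels below min_height (m) in the CHM to 'other',
    then smooth the map with a majority filter of the given kernel size."""
    cleaned = class_map.copy()
    cleaned[(cleaned == KAURI) & (chm < min_height)] = OTHER

    def majority(window):
        values, counts = np.unique(window, return_counts=True)
        return values[np.argmax(counts)]

    return generic_filter(cleaned, majority, size=kernel, mode="nearest")
```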

4. Discussion and Recommendations for Further Analysis

The use of a multispectral sensor with at least five bands in the VIS to NIR2 range is recommended for the detection of kauri and dead/dying trees. This study confirms the findings of Asner [68], Clark and Roberts [69] and Ferreira et al. [20] on the importance of the NIR spectrum for the identification of tree species in a diverse forest environment.
Index combinations with bands only in the visible to NIR1 range (up to 970 nm) perform significantly worse than index combinations that include NIR2 and SWIR bands. The overall accuracy was only 84.6% for three target classes on all crown sizes in the visible to NIR1 range, compared to 89.9% for a similar setup that includes bands in the NIR2 spectrum. If only bands in the VNIR1 spectral range are available, a combination with LiDAR attributes is recommended, ideally in an object-based approach as described in [26,27,28].
The characteristic high reflectance in the kauri spectrum at 1070 nm indicates a particularly high amount of scattering of radiation at air–cell–water interfaces in the complex structure of the kauri foliage and the thick kauri leaves [11,62,70]. The pronounced absorption feature at the 1215 nm water vapour window is caused by strong absorption from high leaf and crown water content. These results confirm the field observations that kauri crowns are more distinct in structural features than in colour. Since there was a lot of moisture in the forest on the day of the AISA flight, the performance of the selected indices should also be tested under drier conditions.
Other well-performing indices for identifying kauri, such as the MSI and NDWI, also have bands in the NIR1 and NIR2 ranges and confirm the importance of structural features and water content for kauri identification. The lower reflectance values of kauri in all spectral regions are most likely caused by the more open crown structure of medium and large kauri compared to neighbouring species.
The main species that are incorrectly classified as kauri tend to have a similarly “rough” foliage or needle-like leaves, such as rimu, tanekaha, rewarewa, tōtara, miro and kawaka (Table 4, Test B1). Species with similar conical shapes in smaller growth stages, such as tanekaha, rimu, kahikatea and rewarewa, are easily confused with small kauri. They show low producer’s accuracies, from 58% for rewarewa to 77% for miro, in the individual species classification (Table 2). While rata has overall high user’s and producer’s accuracies of 89.0% and 97.8%, it shares wrongly classified pixels with all other tree species, including kauri. This confusion is most likely caused by the fact that rata starts its growing cycle as an epiphyte and therefore occurs as part of the foliage of other trees.
The category “dead/dying” is difficult to define because of the gradual transition from the two other classes for trees with declining foliage, a higher amount of shadow and mixed reflectance with understorey layers. In addition, canopy vegetation with a high amount of dry fibrous material, such as flax and cabbage trees, wooden seed capsules on kānuka and older dry foliage on rimu, was wrongly classified as “dead/dying”. Specular reflections on the smooth waxy surface of kahikatea trees and the shiny leaves of tree ferns also cause confusion with the class “dead/dying”. The higher producer’s accuracy in a test with a crown-aggregated setup revealed that the misclassification of the class “dead/dying” is partly caused by single pixels on dead branch material in otherwise less symptomatic crowns. While the classification of these pixels is correct, they appear as wrongly classified in the confusion matrix because the reference is crown-based.
A separate classification for low and high stands (Table 4, Test C) improved the OA by about 1.5%. This can be explained by a reduced variability in the dataset after separating young trees with dense foliage and lower shrub layers from the mature trees in the higher stands. An alternative to considering different size classes is the direct inclusion of a CHM as an additional layer. For a pixel-based classification, this requires sub-pixel matching between the optical data and the CHM, which is difficult to achieve in varied topography with large trees.
The partial removal of outliers in the training set (Table 4, Test D) reduced the effect of mixed pixels, especially for small crowns, and improved the OA slightly, by 0.7%. However, this analysis is elaborate and should only be considered if it is not possible to include LiDAR data or to obtain optical data at a higher spatial resolution, which would reduce the number of mixed pixels.
The 1 m pixel size of the AISA Fenix image put some constraints on the analysis of crowns with a diameter smaller than 3 m, which reached an overall accuracy of only 66.6% in the final setup (Table 4, Test E). The identification of small crowns requires a higher spatial resolution, ideally ≤30 cm, to avoid the effect of mixed pixels.
While some spaceborne hyperspectral sensors cover the recommended bands in the NIR2, their spatial resolution (e.g. 30 m for the PRISMA [71] and EnMAP [72] missions) is too coarse for individual tree crown identification. For larger pixel sizes, such as those of Landsat and Sentinel satellite images, the detection of stands with younger kauri trees should be further investigated with a spectral unmixing approach for homogenous stand units in combination with LiDAR attributes. The potential of the bright green spring aspect for kauri identification could be analysed in a time series of high-resolution satellite data.
The Random Forest classifier is very efficient at handling classes with a high spectral variability; however, the resulting model is difficult to interpret. The clear separation of the “kauri” class in the histogram of the mNDWI-Hyp index (Figure 12) indicates that a manual decision tree could be developed, which would be easier to understand and to implement.
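Such a manual rule set could look like the toy sketch below; the threshold values and the choice of the SR800 index for the “dead/dying” branch are placeholders that would have to be read off histograms such as those in Figure 12, not values derived in the study.

```python
def classify_pixel(mndwi_hyp_value, sr800_value,
                   dead_max_sr800=4.0, kauri_min_mndwi=0.05):
    """Toy manual decision rules for a single sunlit pixel; all thresholds are
    illustrative placeholders, not values derived in the study."""
    if sr800_value < dead_max_sr800:       # low red/NIR simple ratio -> senescent material
        return "dead/dying"
    if mndwi_hyp_value > kauri_min_mndwi:  # strong NIR2 descent -> kauri-like spectrum
        return "kauri"
    return "other"
```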
While the large reference dataset of kauri in different growth and symptom stages is representative of the Waitakere Ranges, the indices and the model for kauri identification should be tested and, if necessary, readjusted in other kauri forests with a different amount and composition of neighbouring species.

5. Conclusions

This study is the first to analyse the spectra of kauri and the main neighbouring canopy tree species with an airborne hyperspectral sensor over the full VIS to SWIR spectral range. The main objectives were: (1) to describe the kauri spectrum and analyse its separability from neighbouring tree species; (2) to identify the best spectral indices to separate the class “kauri” from “other” and “dead/dying” canopy vegetation; and (3) to define a classification method for the three target classes that is applicable for large-area monitoring with multispectral sensors.
Kauri crowns have a characteristic spectrum with a steep reflectance feature in the NIR2 spectral region at 1070 nm, a distinct descent to the water vapour window at 1215 nm and lower reflectance in the green and SWIR spectral regions than other canopy vegetation. The spectral characteristics indicate that kauri crowns are more distinct in their structural than in their biochemical features. The high separability of the kauri spectra from those of 21 other tree species and canopy vegetation types, with a Jeffries–Matusita separability larger than 1.9, could be confirmed with a high OA of 94.8% for the classification of non-symptomatic crowns of kauri and 10 other tree species larger than 5 m in diameter.
For use with a five-band multispectral sensor, five indices (Table A3) in the VIS to NIR2 range performed best to distinguish the three target classes “kauri”, “dead/dying trees” and “other canopy vegetation”. They are suitable for multispectral area-wide forest mapping.
The Random Forest classifier performed slightly better than Support Vector Machine. The final result of 91.7% OA is based on a separate Random Forest classification of low and high forest stands, a binning to 10 nm bandwidth and the removal of very small crowns (<3 m diameter). The class “kauri” could be discriminated from other canopy vegetation with high user’s and producer’s accuracies of 94.6% and 94.7% by using the five selected bands from 670 nm in the red spectrum to 1215 nm in the NIR2 spectrum. The main challenge was the confusion between the classes “dead/dying” and “other” canopy vegetation. A further improvement to 93.8% OA could be achieved by combining “kauri” and “dead/dying” trees into one class as a “kauri mask” for further analysis, e.g. of stress symptoms. Additional indices enhance the overall accuracy only slightly, by up to 0.6% for an eight-band sensor.
The method for accurate, cost-efficient, wall-to-wall mapping of kauri trees presented in this study has important implications for the monitoring of kauri dieback disease and for the implementation of measures to control the disease over the entire distribution of New Zealand’s native kauri forests.

Author Contributions

Conceptualisation, J.J.M.; methodology, J.J.M.; software, J.J.M.; validation, J.J.M.; formal analysis, J.J.M.; investigation, J.J.M.; resources, J.J.M. and J.S.; data curation, J.J.M.; writing—original draft preparation, J.J.M.; writing—review and editing, J.J.M., H.B., J.H., J.S. and D.A.N.; visualisation, J.J.M.; supervision, H.B., J.H., J.S. and D.A.N.; project administration, J.J.M. and D.A.N.; and funding acquisition, J.J.M., D.A.N. and J.H.

Funding

The Ministry of Primary Industries funded most of the remote sensing data (agreement No. 17766), while the University of Canterbury, the University of Trier and FrontierSI (former CRCSI) Australia provided scholarships for living costs, fieldwork, equipment and additional LiDAR data. Digital Globe and Blackbridge helped with grants for satellite data. Auckland Council supported the fieldwork and supplied LiDAR data and aerial images and Landcare Research provided field equipment. Rapidlasso and Harris Geospatial helped with grants for software licenses. Henning Buddenbaum was supported within the framework of the EnMAP project (FKZ 50 EE 1530) by the German Aerospace Center (DLR) and the Federal Ministry of Economic Affairs and Energy. The publication was funded by the Open Access Fund of Universität Trier and the German Research Foundation (DFG) within the Open Access Publishing funding programme.

Acknowledgments

Our sincere thanks go to all people and institutions who supported this project. We are especially grateful to Nick Waipara, Lee Hill and Yue Chin Chew from Auckland Council, who helped to establish the project and provided data. Justin Morgenroth at the University of Canterbury helped with the initial budget setup. We would also like to thank Fredrik Hjelm from the Living Tree Company and Joanne Peace for their excellent support during the fieldwork. Jeanette Allen, Vicki Wilton and Nicole Gellner helped with the university administration.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Number of crowns and sunlit/shadow pixels for all reference data used in the analysis, sorted according to the main classes and species or vegetation groups.
Class | Common Name | Scientific Name | Crowns | Pixels (1)
kauri | kauri | Agathis australis (D.Don) Lindl. ex Loudon | 1483 | 57,700
kauri | kauri group/stand | Agathis australis (D.Don) Lindl. ex Loudon | 9 | 850
dead/dying | kauri dead/dying | Agathis australis (D.Don) Lindl. ex Loudon | 326 | 5329
dead/dying | unknown dead/dying | NN | 91 | 1937
dead/dying | other dead/dying | NN | 22 | 839
other 1. priority | kahikatea | Dacrycarpus dacrydioides (A.Rich.) de Laub. | 87 | 2932
other 1. priority | kanuka | Kunzea spp. | 218 | 4224
other 1. priority | miro | Prumnopitys ferruginea (D.Don) de Laub. | 21 | 780
other 1. priority | pohutukawa | Metrosideros excelsa Sol. ex Gaertn. | 52 | 2273
other 1. priority | puriri | Vitex lucens Kirk | 40 | 1741
other 1. priority | rata | Metrosideros robusta A.Cunn. | 102 | 6504
other 1. priority | rewarewa | Knightia excelsa R.Br. | 93 | 1082
other 1. priority | rimu | Dacrydium cupressinum Sol. ex G.Forst | 226 | 10,841
other 1. priority | tanekaha | Phyllocladus trichomanoides G.Benn ex D.Don | 126 | 964
other 1. priority | taraire | Beilschmiedia tarairi (A.Cunn.) Benth. & Hook.f. ex Kirk | 11 | 253
other 1. priority | taraire/puriri | NN | 3 | 79
other 1. priority | tōtara | Podocarpus totara D.Don | 37 | 1761
other 2. priority | broadleaf mix | NN | 16 | 370
other 2. priority | cabbage tree | Cordyline australis (G.Forst.) Endl. | 25 | 302
other 2. priority | coprosma sp. | Coprosma spp. | 56 | 790
other 2. priority | flax | Phormium tenax J.R.Forst. & G.Forst. | 3 | 91
other 2. priority | karaka | Corynocarpus laevigatus J.R.Forst. & G.Forst. | 4 | 73
other 2. priority | kowhai | Sophora spp. | 5 | 119
other 2. priority | kawaka | Libocedrus plumosa (D.Don) Sarg. | 4 | 84
other 2. priority | matai | Prumnopitys taxifolia (Sol. ex D.Don) de Laub. | 3 | 103
other 2. priority | nikau | Rhopalostylis sapida H.Wendl. & Drude | 27 | 431
other 2. priority | other pine trees | NN | 4 | 360
other 2. priority | pukatea | Laurelia novae-zelandiae A.Cunn. | 8 | 215
other 2. priority | shrub mix (nikau, tree fern, cabbage…) | NN | 13 | 1346
other 2. priority | tawa | Beilschmiedia tawa (A.Cunn.) Benth. & Hook.f. ex Kirk | 23 | 965
other 2. priority | tree fern | Cyathea spp. | 20 | 294
other 2. priority | other species (not kauri) | NN | 7 | 406
Total | | | 3165 | 106,028
(1) In total, 10,834 shadow pixels and 95,194 sunlit pixels.

Appendix B

Table A2. Spectral separability and confusion of the class “kauri” with “other tree species”. The Jeffries–Matusita separability is given both for all sunlit pixels and a pixel set with removed outliers. A value over 1.9 indicates a high spectral separability.
“Other” Classified as “Kauri” | JM Separability, Outliers Removed | JM Separability, All Sunlit Pixels | Mean No. of Confused Pixels (2) | Mean Percent Confused (2) | Mean No. of Test Pixels (2)
rimu (1) | 1.948 | 1.995 | 73.3 | 0.2% | 1821.1
totara (1) | 1.929 | 1.979 | 50.3 | 1.1% | 321.6
other pine species (1) (3) | 1.997 | 2.000 | 34.2 | 4.3% | 88.6
tanekaha (1) | 1.860 | 1.992 | 27.7 | 2.1% | 169.5
rata | 1.989 | 1.998 | 25.9 | 0.3% | 1189.3
rewarewa (1) | 1.968 | 1.995 | 8.8 | 2.1% | 175.6
miro | 1.960 | 1.995 | 8.4 | 2.6% | 139.8
kahikatea | 1.983 | 1.993 | 6.4 | 0.7% | 530.5
pohutukawa | 1.997 | 1.999 | 4.6 | 0.8% | 441.8
coprosma sp. | NN | NN | 4.3 | 3.4% | 143.3
kawaka | 1.999 | 2.000 | 3.6 | 4.1% | 709
tawa | 1.996 | 2.000 | 1 | 1.6% | 114.9
puriri | 1.990 | 1.999 | 0.9 | 0.9% | 235.7
scrub mix | 1.988 | 1.997 | 0.9 | 4.9% | 62.6
karaka | 2.000 | 2.000 | 0.7 | 15.8% | 9
nikau | 1.998 | 2.000 | 0.6 | 1.6% | 28.1
pukatea | NN | NN | 0.6 | 3.9% | 18.4
tree fern | 2.000 | 2.000 | 0.3 | 2.1% | 8.7
taraire | 1.990 | 1.999 | 0.2 | 2.1% | 7.7
broadleaf mix | NN | NN | 0.1 | 0.2% | 5.8
other (not kauri) | NN | NN | 0.1 | 0.4% | 2.6
kanuka | 1.996 | 2.000 | no confusion of kauri with these species
flax | 2.000 | 2.000 |
kanuka flowering | 2.000 | 2.000 |
kowhai | 2.000 | 2.000 |
(1) Main species that were confused with kauri; (2) mean values of a Random Forest classification in a five-fold split in 10 repetitions; (3) planted pine trees close to the Piha settlement, without species identification.

Appendix C

Table A3. Selected indices for the detection of kauri and dead/dying trees.
Name | Equation | Description (sensitive to…) | Literature
Selected indices for a 5-band sensor
SR800 (1) | SR800 = R_800 / R_670 | Simple Ratio 800/670; chlorophyll concentration and Leaf Area Index (LAI) | [73]
SR708 | SR708 = R_670 / R_708 | Simple Ratio 670/708; chlorophyll concentration and LAI | [66] (modified)
RDVI (1) | RDVI = (R_800 − R_670) / √(R_800 + R_670) | Renormalised Difference Vegetation Index; chlorophyll concentration and LAI | [64]
NDVI (1) | NDVI = (R_800 − R_670) / (R_800 + R_670) | Normalised Difference Vegetation Index; chlorophyll concentration and LAI | [74]
mNDWI-Hyp | mNDWI-Hyp = [log(1/R_1074) − log(1/R_1209)] / [log(1/R_1074) + log(1/R_1209)] | Modified Normalised Difference Water Index – Hyperion; vegetation canopy water content and canopy structure | [75]
Additional indices for a 6–8-band sensor
ND970 | ND970 = (R_1074 − R_970) / (R_1074 + R_970) | Normalised Difference 1074/970; vegetation canopy water content and canopy structure | This study
PRI | PRI = (R_531 − R_570) / (R_531 + R_570) | Photochemical Reflectance Index; photosynthetic light use efficiency of carotenoid pigments | [63]
NDNI | NDNI = [log(1/R_1510) − log(1/R_1680)] / [log(1/R_1510) + log(1/R_1680)] | Normalised Difference Nitrogen Index; canopy nitrogen | [76]
Other selected indices
WBI | WBI = R_970 / R_900 | Water Band Index; relative water content at leaf level | [67]
MSI | MSI = R_1599 / R_819 | Moisture Stress Index; moisture stress in vegetation | [77]
NDWI | NDWI = (R_860 − R_1240) / (R_860 + R_1240) | Normalised Difference Water Index; total water content | [78]
NDLI | NDLI = [log(1/R_1754) − log(1/R_1680)] / [log(1/R_1754) + log(1/R_1680)] | Normalised Difference Lignin Index; leaf and canopy lignin content | [79]
CAI | CAI = 0.5 (R_2000 + R_2200) − R_2100 | Cellulose Absorption Index; cellulose, dried plant material | [80]
(1) The value for the R_800 band was averaged with the values of the two neighbouring bands to reduce noise.

References

  1. MPI. Kauri Dieback Sampling Locations; Ministry of Primary Industries: Wellington, New Zealand, 2018.
  2. Ecroyd, C. Biological flora of New Zealand 8. Agathis australis (D. Don) Lindl. (Araucariaceae) Kauri. N. Z. J. Bot. 1982, 20, 17–36.
  3. Waipara, N.W.; Hill, S.; Hill, L.M.W.; Hough, E.G.; Horner, I.J. Surveillance methods to determine tree health, distribution of kauri dieback disease and associated pathogens. N. Z. Plant Prot. 2013, 66, 235–241.
  4. Jamieson, A.; Bassett, I.E.; Hill, L.M.W.; Hill, S.; Davis, A.; Waipara, N.W.; Hough, E.G.; Horner, I.J. Aerial surveillance to detect kauri dieback in New Zealand. N. Z. Plant Prot. 2014, 67, 60–65.
  5. Bock, C.H.; Poole, G.H.; Parker, P.E.; Gottwald, T.R. Plant disease severity estimated visually, by digital photography and image analysis, and by hyperspectral imaging. Crit. Rev. Plant Sci. 2010, 29, 59–107.
  6. Jones, H.G.; Vaughan, R.A. Remote Sensing of Vegetation: Principles, Techniques, and Applications; Oxford University Press: New York, NY, USA, 2010.
  7. Thenkabail, P.S.; Lyon, J.G.; Huete, A. Fundamentals, Sensor Systems, Spectral Libraries, and Data Mining for Vegetation; CRC Press: Boca Raton, FL, USA, 2018.
  8. Sandau, R. Digital Airborne Camera: Introduction and Technology; Springer Science & Business Media: New York, NY, USA, 2009.
  9. Petrie, G.; Walker, A.S. Airborne digital imaging technology: A new overview. Photogramm. Rec. 2007, 22, 203–225.
  10. Hagen, N.A.; Gao, L.S.; Tkaczyk, T.S.; Kester, R.T. Snapshot advantage: A review of the light collection improvement for parallel high-dimensional measurement systems. Opt. Eng. 2012, 51, 111702.
  11. Asner, G.P. Hyperspectral remote sensing of canopy chemistry, physiology, and biodiversity in tropical rainforests. In Hyperspectral Remote Sensing of Tropical and Sub-Tropical Forests; CRC Press: Boca Raton, FL, USA, 2008; pp. 261–296.
  12. Dalponte, M.; Ørka, H.O.; Ene, L.T.; Gobakken, T.; Næsset, E. Tree crown delineation and tree species classification in boreal forests using hyperspectral and ALS data. Remote Sens. Environ. 2014, 140, 306–317.
  13. Jones, T.G.; Coops, N.C.; Sharma, T. Assessing the utility of airborne hyperspectral and LiDAR data for species distribution mapping in the coastal Pacific Northwest, Canada. Remote Sens. Environ. 2010, 114, 2841–2852.
  14. Richter, R.; Schläpfer, D. ATCOR-4 User Guide, Version 7.3.0, April 2019. In Atmospheric/Topographic Correction for Airborne Imagery; ReSe Applications LLC: Wil, Switzerland, 2019.
  15. Trier, Ø.D.; Salberg, A.-B.; Kermit, M.; Rudjord, Ø.; Gobakken, T.; Næsset, E.; Aarsten, D. Tree species classification in Norway from airborne hyperspectral and airborne laser scanning data. Eur. J. Remote Sens. 2018, 51, 336–351.
  16. Asner, G.P.; Martin, R.E. Airborne spectranomics: Mapping canopy chemical and taxonomic diversity in tropical forests. Front. Ecol. Environ. 2009, 7, 269–276.
  17. Carlson, K.M.; Asner, G.P.; Hughes, R.F.; Ostertag, R.; Martin, R.E. Hyperspectral remote sensing of canopy biodiversity in Hawaiian lowland rainforests. Ecosystems 2007, 10, 536–549.
  18. Clark, M.L. Identification of Canopy Species in Tropical Forests Using Hyperspectral Data. In Hyperspectral Remote Sensing of Vegetation: Biophysical and Biochemical Characterisation and Plant Species Studies, 2nd ed.; Thenkabail, P.S., Lyon, J.G., Huete, A., Eds.; CRC Press: Boca Raton, FL, USA, 2018; Volume 3, p. 423.
  19. Féret, J.-B.; Asner, G.P. Tree species discrimination in tropical forests using airborne imaging spectroscopy. IEEE Trans. Geosci. Remote Sens. 2013, 51, 73–84.
  20. Ferreira, M.P.; Zortea, M.; Zanotta, D.C.; Shimabukuro, Y.E.; de Souza Filho, C.R. Mapping tree species in tropical seasonal semi-deciduous forests with hyperspectral and multispectral data. Remote Sens. Environ. 2016, 179, 66–78.
  21. Dalponte, M.; Bruzzone, L.; Gianelle, D. Fusion of hyperspectral and LIDAR remote sensing data for classification of complex forest areas. IEEE Trans. Geosci. Remote Sens. 2008, 46, 1416–1427.
  22. Peerbhay, K.Y.; Mutanga, O.; Ismail, R. Commercial tree species discrimination using airborne AISA Eagle hyperspectral imagery and partial least squares discriminant analysis (PLS-DA) in KwaZulu–Natal, South Africa. ISPRS J. Photogramm. Remote Sens. 2013, 79, 19–28.
  23. Shen, X.; Cao, L. Tree-species classification in subtropical forests using airborne hyperspectral and LiDAR data. Remote Sens. 2017, 9, 1180.
  24. Asner, G.P.; Warner, A.S. Canopy shadow in IKONOS satellite observations of tropical forests and savannas. Remote Sens. Environ. 2003, 87, 521–533.
  25. Kempeneers, P.; Vandekerkhove, K.; Devriendt, F.; van Coillie, F. Propagation of shadow effects on typical remote sensing applications in forestry. In Proceedings of the 2013 5th Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing (WHISPERS), Gainesville, FL, USA, 26–28 June 2013.
  26. Blaschke, T. Object based image analysis for remote sensing. ISPRS J. Photogramm. Remote Sens. 2010, 65, 2–16.
  27. Heumann, B.W. An object-based classification of mangroves using a hybrid decision tree—Support vector machine approach. Remote Sens. 2011, 3, 2440–2460.
  28. Machala, M.; Zejdová, L. Forest mapping through object-based image analysis of multispectral and LiDAR aerial data. Eur. J. Remote Sens. 2014, 47, 117–131.
  29. Leckie, D.; Gougeon, F.; Hill, D.; Quinn, R.; Armstrong, L.; Shreenan, R. Combined high-density lidar and multispectral imagery for individual tree crown analysis. Can. J. Remote Sens. 2003, 29, 633–649.
  30. Ghosh, A.; Ewald Fassnacht, F.; Joshi, P.K.; Koch, B. A framework for mapping tree species combining hyperspectral and LiDAR data: Role of selected classifiers and sensor across three spatial scales. Int. J. Appl. Earth Obs. Geoinf. 2014, 26, 49–63.
  31. Zhang, C.; Qiu, F. Mapping individual tree species in an urban forest using airborne lidar data and hyperspectral imagery. Photogramm. Eng. Remote Sens. 2012, 78, 1079–1087.
  32. Buddenbaum, H.; Schlerf, M.; Hill, J. Classification of coniferous tree species and age classes using hyperspectral data and geostatistical methods. Int. J. Remote Sens. 2005, 26, 5453–5465.
  33. Baldeck, C.A.; Asner, G.P.; Martin, R.E.; Anderson, C.B.; Knapp, D.E.; Kellner, J.R.; Wright, S.J. Operational tree species mapping in a diverse tropical forest with airborne imaging spectroscopy. PLoS ONE 2015, 10, e0118403. [Google Scholar] [CrossRef] [PubMed]
  34. Fassnacht, F.E.; Latifi, H.; Stereńczak, K.; Modzelewska, A.; Lefsky, M.; Waser, L.T.; Straub, C.; Ghosh, A. Review of studies on tree species classification from remotely sensed data. Remote Sens. Environ. 2016, 186, 64–87. [Google Scholar] [CrossRef]
  35. Holmgren, J.; Persson, Å.; Söderman, U. Species identification of individual trees by combining high resolution LiDAR data with multi-spectral images. Int. J. Remote Sens. 2008, 29, 1537–1552. [Google Scholar] [CrossRef]
  36. Singers, N.; Osborne, B.; Lovegrove, T.; Jamieson, A.; Boow, J.; Sawyer, J.; Hill, K.; Andrews, J.; Hill, S.; Webb, C. Indigenous terrestrial and wetland ecosystems of Auckland; Auckland Council: Auckland, New Zealand, 2017. Available online: http://www.knowledgeauckland.org.nz (accessed on 20 July 2019).
  37. Steward, G.A.; Beveridge, A.E. A review of New Zealand kauri (Agathis australis (D. Don) Lindl.): Its ecology, history, growth and potential for management for timber. N. Z. J. For. Sci. 2010, 40, 33–59. [Google Scholar]
  38. Macinnis-Ng, C.; Schwendenmann, L. Litterfall, carbon and nitrogen cycling in a southern hemisphere conifer forest dominated by kauri (Agathis australis) during drought. Plant Ecol. 2015, 216, 247–262. [Google Scholar] [CrossRef]
  39. Meiforth, J. Photos, Waitakere Ranges. Photos taken during fieldwork in January to March 2016. 2016. [Google Scholar]
  40. Jongkind, A.; Buurman, P. The effect of kauri (Agathis australis) on grain size distribution and clay mineralogy of andesitic soils in the Waitakere Ranges, New Zealand. Geoderma 2006, 134, 171–186. [Google Scholar] [CrossRef]
  41. Chappell, P.R. The Climate and Weather of Auckland; Niwa Science and Technology Series; NIWA: Auckland, New Zealand, 2012. [Google Scholar]
  42. LINZ. NZ Topo50. Topographical Map for New Zealand. 2019. Available online: https://www.linz.govt.nz/land/maps/topographic-maps/topo50-maps (accessed on 20 July 2019).
  43. Khosravipour, A.; Skidmore, A.K.; Isenburg, M. Generating spike-free digital surface models using LiDAR raw point clouds: A new approach for forestry applications. Int. J. Appl. Earth Obs. Geoinf. 2016, 52, 104–114. [Google Scholar] [CrossRef]
  44. Auckland Council, A. Auckland 0.075m Urban Aerial Photos (2017), RGB, Waitakere Ranges. 2017. Available online: https://data.linz.govt.nz/layer/95497-auckland-0075m-urban-aerial-photos-2017/ (accessed on 12 April 2019).
  45. Datt, B.; McVicar, T.R.; van Niel, T.G.; Jupp, D.L.B.; Pearlman, J.S. Preprocessing EO-1 Hyperion hyperspectral data to support the application of agricultural indexes. IEEE Trans. Geosci. Remote Sens. 2003, 41, 1246–1259. [Google Scholar] [CrossRef] [Green Version]
  46. Schlaepfer, D. PARGE—Parametric Geocoding & Orthorectification for Airborne Optical Scanner Data. Available online: http://www.rese.ch/products/parge/ (accessed on 21 March 2019).
  47. Adeline, K.R.M.; Chen, M.; Briottet, X.; Pang, S.K.; Paparoditis, N. Shadow detection in very high spatial resolution aerial images: A comparative study. ISPRS J. Photogramm. Remote Sens. 2013, 80, 21–38. [Google Scholar] [CrossRef]
  48. Trimble. eCognition® Developer 9.3. User Guide; Trimble Germany GmbH: Munich, Germany, 2018. [Google Scholar]
  49. Green, A.A.; Berman, M.; Switzer, P.; Craig, M.D. A transformation for ordering multispectral data in terms of image quality with implications for noise removal. IEEE Trans. Geosci. Remote Sens. 1988, 26, 65–74. [Google Scholar] [CrossRef] [Green Version]
  50. Witten, I.H.; Frank, E.; Hall, M.A.; Pal, C.J. Data Mining: Practical Machine Learning Tools and Techniques; Morgan Kaufmann: San Francisco, CA, USA, 2016. [Google Scholar]
  51. Belgiu, M.; Drăguţ, L. Random forest in remote sensing: A review of applications and future directions. ISPRS J. Photogramm. Remote Sens. 2016, 114, 24–31. [Google Scholar] [CrossRef]
  52. Dalponte, M.; Ørka, H.O.; Gobakken, T.; Gianelle, D.; Næsset, E. Tree species classification in boreal forests with hyperspectral data. IEEE Trans. Geosci. Remote Sens. 2013, 51, 2632–2645. [Google Scholar] [CrossRef]
  53. Fassnacht, F.E.; Neumann, C.; Förster, M.; Buddenbaum, H.; Ghosh, A.; Clasen, A.; Joshi, P.K.; Koch, B. Comparison of feature reduction algorithms for classifying tree species with hyperspectral data on three central European test sites. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 2547–2561. [Google Scholar] [CrossRef]
  54. Raczko, E.; Zagajewski, B. Comparison of support vector machine, random forest and neural network classifiers for tree species classification on airborne hyperspectral APEX images. Eur. J. Remote Sens. 2017, 50, 144–154. [Google Scholar] [CrossRef] [Green Version]
  55. Bollandsås, O.M.; Maltamo, M.; Gobakken, T.; Næsset, E. Comparing parametric and non-parametric modelling of diameter distributions on independent data using airborne laser scanning in a boreal conifer forest. Forestry 2013, 86, 493–501. [Google Scholar] [CrossRef] [Green Version]
  56. James, G.; Witten, D.; Hastie, T.; Tibshirani, R. An Introduction to Statistical Learning; Springer: New York, NY, USA, 2013; Volume 112. [Google Scholar]
  57. Bruzzone, L.; Chi, M.; Marconcini, M. A novel transductive SVM for semisupervised classification of remote-sensing images. IEEE Trans. Geosci. Remote Sens. 2006, 44, 3363–3373. [Google Scholar] [CrossRef] [Green Version]
  58. Chang, C.-C.; Lin, C.-J. LIBSVM: A Library for Support Vector Machines [EB/OL]. 2001. Available online: https://www.csie.ntu.edu.tw/~cjlin/libsvm/ (accessed on 6 May 2019).
  59. Breiman, L. Random forests. Mach. Learn. 2001, 45, 5–32. [Google Scholar] [CrossRef] [Green Version]
  60. Richards, J.A. Remote Sensing Digital Image Analysis; Springer: Berlin/Heidelberg, Germany, 1999; Volume 3. [Google Scholar]
  61. Jeffreys, H. An invariant form for the prior probability in estimation problems. Proc. R. Soc. Lond. Ser. A. Math. Phys. Sci. 1946, 186, 453–461. [Google Scholar]
  62. Serrano, L.; Penuelas, J.; Ustin, S.L. Remote sensing of nitrogen and lignin in Mediterranean vegetation from AVIRIS data: Decomposing biochemical from structural signals. Remote Sens. Environ. 2002, 81, 355–364. [Google Scholar] [CrossRef]
  63. Gamon, J.; Penuelas, J.; Field, C. A narrow-waveband spectral index that tracks diurnal changes in photosynthetic efficiency. Remote Sens. Environ. 1992, 41, 35–44. [Google Scholar] [CrossRef]
  64. Roujean, J.-L.; Breon, F.-M. Estimating PAR absorbed by vegetation from bidirectional reflectance measurements. Remote Sens. Environ. 1995, 51, 375–384. [Google Scholar] [CrossRef]
  65. Datt, B. A new reflectance index for remote sensing of chlorophyll content in higher plants: Tests using Eucalyptus leaves. J. Plant Physiol. 1999, 154, 30–36. [Google Scholar] [CrossRef]
  66. Datt, B. Remote sensing of chlorophyll a, chlorophyll b, chlorophyll a + b, and total carotenoid content in eucalyptus leaves. Remote Sens. Environ. 1998, 66, 111–121. [Google Scholar] [CrossRef]
  67. Peñuelas, J.; Filella, I.; Biel, C.; Serrano, L.; Savé, R. The reflectance at the 950–970 nm region as an indicator of plant water status. Int. J. Remote Sens. 1993, 14, 1887–1905. [Google Scholar] [CrossRef]
  68. Asner, G.P. Biophysical and biochemical sources of variability in canopy reflectance. Remote Sens. Environ. 1998, 64, 234–253. [Google Scholar] [CrossRef]
  69. Clark, M.L.; Roberts, D.; Clark, D. Hyperspectral discrimination of tropical rain forest tree species at leaf to crown scales. Remote Sens. Environ. 2005, 96, 375–398. [Google Scholar] [CrossRef]
  70. Hill, J. State-of-the-Art and Review of Algorithms with Relevance for Retrieving Biophysical and Structural Information on Forests and Natural Vegetation with Hyper-Spectral Remote Sensing Systems. In Hyperspectral algorithms: report in the frame of EnMAP Preparation Activities; Kaufmann, H., Ed.; Scientific Technical Report (STR); 10/08; Deutsches GeoForschungsZentrum GFZ: Potsdam, Germany, 2010. [Google Scholar]
  71. Loizzo, R.; Guarini, R.; Longo, F.; Scopa, T.; Formaro, R.; Facchinetti, C.; Varacalli, G. PRISMA: The Italian hyperspectral mission. In Proceedings of the IGARSS 2018—2018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain, 22–27 July 2018. [Google Scholar]
  72. Guanter, L.; Kaufmann, H.; Segl, K.; Foerster, S.; Rogass, C.; Chabrillat, S.; Kuester, T.; Hollstein, A.; Rossner, G.; Chlebek, C.; et al. The EnMAP spaceborne imaging spectroscopy mission for earth observation. Remote Sens. 2015, 7, 8830–8857. [Google Scholar] [CrossRef] [Green Version]
  73. Birth, G.S.; McVey, G.R. Measuring the Color of Growing Turf with a Reflectance Spectrophotometer 1. Agron. J. 1968, 60, 640–643. [Google Scholar] [CrossRef]
  74. Rouse, J.W., Jr.; Haas, R.H.; Schell, J.A.; Deering, D.W. Monitoring the Vernal Advancement and Retrogradation (Green Wave Effect) of Natural Vegetation; NASA Technical Report; Texas A&M University: College Station, TX, USA, 1973. [Google Scholar]
  75. Ustin, S.L.; Roberts, D.A.; Gardner, M.; Dennison, P. Evaluation of the potential of Hyperion data to estimate wildfire hazard in the Santa Ynez Front Range, Santa Barbara, California. In Proceedings of the IEEE International Geoscience and Remote Sensing Symposium, Toronto, ON, Canada, 24–28 June 2002. [Google Scholar]
  76. Fourty, T.; Baret, F.; Jacquemoud, S.; Schmuck, G.; Verdebout, J. Leaf optical properties with explicit description of its biochemical composition: Direct and inverse problems. Remote Sens. Environ. 1996, 56, 104–117. [Google Scholar] [CrossRef]
  77. Hunt, E.R., Jr.; Rock, B.N. Detection of changes in leaf water content using near-and middle-infrared reflectances. Remote Sens. Environ. 1989, 30, 43–54. [Google Scholar]
  78. Gao, B.-C. NDWI—A normalized difference water index for remote sensing of vegetation liquid water from space. Remote Sens. Environ. 1996, 58, 257–266. [Google Scholar] [CrossRef]
  79. Melillo, J.M.; Aber, J.D.; Muratore, J.F. Nitrogen and lignin control of hardwood leaf litter decomposition dynamics. Ecology 1982, 63, 621–626. [Google Scholar] [CrossRef]
  80. Nagler, P.L.; Inoue, Y.; Glenn, E.P.; Russ, A.L.; Daughtry, C.S.T. Cellulose absorption index (CAI) to quantify mixed soil–plant litter scenes. Remote Sens. Environ. 2003, 87, 310–325. [Google Scholar] [CrossRef]
Figure 1. Kauri growth classes used in this study, depending on the mean crown diameter (cdm). (Photos: [39]).
Figure 2. (a) Location of the Waitakere Ranges on the North Island of New Zealand west of Auckland City. The general area with naturally occurring kauri in New Zealand [2] is marked as hatched. (b) Study sites in the Waitakere Ranges with the reference crowns marked in red (background map: [42]).
Figure 3. Reference crowns used in the analysis (3165 in total), per class and diameter.
Figure 4. Mean spectra of the target classes “kauri”, “dead/dying” and “other” with standard deviations (stdev).
Figure 5. Jeffries–Matusita separability [61] of the three target classes for different spectral ranges. A value larger than 1.9 indicates a high separability. The analysis was based on MNF transformations for all bands in the different spectral ranges.
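For reference, the Jeffries–Matusita separability used in Figure 5 is conventionally computed from the Bhattacharyya distance B under a multivariate Gaussian assumption; the standard formulation below is provided for orientation and is not quoted from the article:

$$ JM_{ij} = 2\left(1 - e^{-B_{ij}}\right), \qquad B_{ij} = \frac{1}{8}(\mu_i - \mu_j)^{\mathrm T}\left(\frac{\Sigma_i + \Sigma_j}{2}\right)^{-1}(\mu_i - \mu_j) + \frac{1}{2}\ln\frac{\left|\frac{\Sigma_i + \Sigma_j}{2}\right|}{\sqrt{|\Sigma_i|\,|\Sigma_j|}} $$

where μ_i and Σ_i are the mean vector and covariance matrix of class i. JM is bounded by 2, which is why values above 1.9 indicate near-complete separability.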
Figure 6. Mean spectra of kauri (thick black line) and six selected other canopy species (grey) that were most easily confused with kauri. The number of pixels (pix) used to generate the mean spectra is given in parentheses. The spectra of these species show the lowest separability from the kauri spectrum in this study (see Table A2).
Figure 7. Mean spectra of kauri (black) and five other canopy species (grey) that have the highest separabilities from the kauri spectrum in this study (see Table A2). The number of pixels (pix) used to generate the mean spectra is given in parentheses.
Figure 8. Mean spectra of the target classes “kauri”, “dead/dying” and “other” with standard deviations (“stdev”). Below: Band positions of 13 selected spectral indices.
Figure 9. Performance of selected indices and index combinations to identify the class “dead/dying” (light grey) and to distinguish between “kauri” and “other vegetation” (dark grey) with an RF classification (five-fold random split, 20 repetitions). Please note that the x-axis starts at 55%.
Figure 10. Performance of the final 4–8-band index combinations to distinguish the three target classes “kauri”, “dead/dying” and “other” canopy vegetation. (RF, five-fold random split, 20 repetitions). Please note that the y-axis starts at 89%.
Figure 11. RGB images of the first three bands of MNF transformations [49] from: (a) the VIS to NIR1 spectral range (431–970 nm); (b) VIS to NIR2 (431–1327 nm); and (c) the full spectral range from VIS to SWIR (431–2337 nm). The importance of the NIR2 and SWIR spectrum is visible in the higher colour contrast of the kauri crowns compared to the VNIR image. The numbers in the kauri polygons indicate the stress symptom class for the crown, with 1 = non-symptomatic and 5 = dead.
Figure 12. Histograms for selected indices on sunlit pixels for all crown diameters, with the class “kauri” marked in light blue, the class “dead/dying” in red and the class “other” in dark blue. (a) Histogram for the mNDWI-Hyp index, which performed best at separating the class “kauri” from other vegetation by capturing distinctive features in the NIR2 region. For the separation of the class “dead/dying”, indices in the RED/NIR1 region are better suited, such as (b) the SR800 and (c) the NDNI index (see Table A3 for descriptions of these indices).
Figure 13. Overall accuracies for two selected sets of six and eight bands in the visible to NIR1 range. The accuracies are calculated for two and three target classes both with and without an additional CHM layer. The results are based on an RF classification with a three-fold split in 10 repetitions on 94,971 pixel values, including small crowns (<3 m diameter). The standard deviations vary from 0.12 to 0.2.
Figure 14. Combined results of 10 RF classifications with a 5-fold stratified random split with different seed values. Overview (left) and detailed maps (right) for the Cascades (a,b), Maungaroa (c,d) and Kauri Grove area (e,f). The numbers indicate the symptom classes in kauri crowns (1 = non-symptomatic, 5 = dead).
Table 1. Spectral ranges with wavelengths used in this study (adapted from [11]).
Spectral Range | Electromagnetic Wavelengths
Visible (VIS) | 437–700 nm 1
1st near-infrared (NIR1) | 700–ca. 970 nm 2
2nd near-infrared (NIR2) | 970–1327 nm
1st short wave infrared (SWIR1) | 1467–1771 nm
2nd short wave infrared (SWIR2) | 1994–2337 nm 1
1 The useable bands of the AISA Fenix sensor cover the range between 437 and 2337 nm; 2 The NIR1 range marks the shift between the two sensor parts at 970 nm.
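Where the band grouping of Table 1 needs to be applied programmatically, a simple lookup such as the following Python sketch can be used; the function name is a hypothetical helper, and band centres falling into the excluded atmospheric water-vapour gaps return None.

```python
from typing import Optional

# Spectral ranges of Table 1: (label, lower bound in nm, upper bound in nm)
SPECTRAL_RANGES = [
    ("VIS",   437, 700),
    ("NIR1",  700, 970),
    ("NIR2",  970, 1327),
    ("SWIR1", 1467, 1771),
    ("SWIR2", 1994, 2337),
]

def spectral_range(wavelength_nm: float) -> Optional[str]:
    """Return the Table 1 range label for a band centre wavelength,
    or None if the band falls into one of the excluded gaps."""
    for label, lower, upper in SPECTRAL_RANGES:
        if lower <= wavelength_nm <= upper:
            return label
    return None

print(spectral_range(860))   # NIR1
print(spectral_range(1400))  # None (gap between NIR2 and SWIR1)
```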
Table 2. Confusion matrix and user’s and producer’s accuracies for an RF classification of kauri and ten neighbouring tree species on the full hyperspectral range of the AISA image (first 25 bands of a 35-band MNF transformation), evaluated with a 10-fold cross-validation for the seed value 1. Only sunlit pixels of trees with a minimum diameter of 5 m were chosen to avoid shadows and to reduce the effects of mixed pixels. The selected crowns were either non-symptomatic or showed only mild symptoms of stress.
Reference \ Classified as --> | Kauri | Kahikatea | Totara | Kanuka | Rimu | Rewarewa | Tanekaha | Rata | Miro | Puriri | Pohutukawa | Total | Producer's Accuracy
ReferenceKauri7412120500113000747999.1
Kahikatea42043110653011470214895.1
Totara221190354921182658119475.6
Kanuka36131912102650117330796.5
Rimu253661444465097870464495.7
Rewarewa1338571822908116440257.0
Tanekaha691131502043800028671.3
Rata674241104988152411509997.8
Miro962112530493811450076.2
Puriri0805230341144022151595.0
Pohutukawa70045300460441964210993.1
Total | 7507 | 2165 | 954 | 3283 | 4735 | 246 | 208 | 5604 | 416 | 1535 | 2030 | 28,683 | –
User's Accuracy | 98.7 | 94.4 | 94.7 | 97.2 | 93.9 | 93.1 | 98.1 | 89.0 | 91.6 | 93.8 | 96.7 | – | 94.8
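The user's and producer's accuracies in Table 2 follow the usual definitions: correctly classified pixels of a class divided by the column total (user's accuracy) or by the row total (producer's accuracy). A minimal NumPy sketch, assuming the confusion matrix is arranged as in Table 2 with reference classes as rows and classified classes as columns; the example matrix is illustrative, not the study's data.

```python
import numpy as np

def class_accuracies(cm: np.ndarray):
    """cm[i, j]: number of pixels of reference class i assigned to class j."""
    correct = np.diag(cm).astype(float)
    producers = 100.0 * correct / cm.sum(axis=1)  # per reference class (row totals)
    users = 100.0 * correct / cm.sum(axis=0)      # per classified class (column totals)
    overall = 100.0 * correct.sum() / cm.sum()
    return producers, users, overall

# Small illustrative 3-class example
cm = np.array([[50, 2, 1],
               [3, 40, 4],
               [0, 5, 45]])
producers, users, overall = class_accuracies(cm)
print(np.round(producers, 1), np.round(users, 1), round(overall, 1))
```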
Table 3. Overview of selected indices for the identification of the three target classes in the visible to NIR1 spectral range (448–970 nm).
Index Abbrev. | Name | Equation | Wavelengths | Literature
RDVI (1) (2) | Renormalised Difference Vegetation Index | RDVI = (R800 − R675)/√(R800 + R675) | 675, 800 | [64]
GM1 | Gitelson and Merzlyak Index 1 | GM1 = R750/R550 | 550, 750 | [65]
SRb2 (2) | Simple Ratio Chlorophyll b2 | SRchlb2 = R675/R710 | 675, 710 | [66]
LCI (1) (2) | Leaf Chlorophyll Index | LCI = (R850 − R710)/(R850 + R675) | 675, 710, 850 | [65]
WBI (1) | Water Band Index | WBI = R900/R970 | 900, 970 | [67]
(1) Selection with three indices; (2) the original wavelengths of the index were slightly modified to reduce the number of bands.
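A compact way to compute the five indices of Table 3 from band reflectances is sketched below; the function name and dictionary layout are illustrative, and the inputs may be scalars or NumPy arrays of reflectance (0–1) at the listed bands.

```python
import numpy as np

def vnir_indices(r550, r675, r710, r750, r800, r850, r900, r970):
    """Spectral indices of Table 3, computed from reflectance at the listed bands."""
    return {
        "RDVI":    (r800 - r675) / np.sqrt(r800 + r675),
        "GM1":      r750 / r550,
        "SRchlb2":  r675 / r710,
        "LCI":     (r850 - r710) / (r850 + r675),
        "WBI":      r900 / r970,
    }

# Illustrative reflectance values for a green canopy pixel (not measured data)
print(vnir_indices(0.08, 0.04, 0.10, 0.30, 0.45, 0.46, 0.44, 0.42))
```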
Table 4. Overall accuracies with standard deviations of the default, test and final setups. The classifications are pixel-based with training and test data selected on all crowns (RF, five-fold stratified random split in 10 repetitions).
Setup | Description | 2 Classes, All DM | 2 Classes, ≥3 m | 2 Classes, <3 m | 3 Classes, All DM | 3 Classes, ≥3 m | 3 Classes, <3 m | User's Acc. Kauri | User's Acc. Dead/Dying | User's Acc. Other | Producer's Acc. Kauri | Producer's Acc. Dead/Dying | Producer's Acc. Other
Default
Test A | Training and test: all outliers included. Image on original bandwidths for the default 5 indices on 5 bands | 92.1 (0.1) | 92.5 (0.1) | 66.9 (1.7) | 89.9 (0.2) | 90.3 (0.2) | 64.6 (1.7) | 93.1 (0.9) | 78.7 (1.6) | 86.7 (0.8) | 94.3 (0.3) | 45.0 (1.3) | 93.0 (1.0)
Test B1 | Resampling to 10 nm | 93.0 (0.1) | 93.4 (0.1) | 67.2 (1.1) | 91.0 (0.1) | 91.4 (0.1) | 65.0 (1.5) | – | – | – | – | – | –
Test B2 | Resampling to 20 nm | 92.8 (0.2) | 93.2 (0.2) | 67.0 (2.8) | 90.7 (0.2) | 91.1 (0.2) | 64.7 (3.0) | – | – | – | – | – | –
Test C | Separate classification for low and high stands | 93.4 (0.1) | 93.7 (0.1) | 70.8 (1.9) | 91.4 (0.1) | 91.7 (0.1) | 67.5 (1.8) | – | – | – | – | – | –
Test D | Outliers removed in the training set that confuse “kauri” with “other” and pixels that cause confusion with “dead/dying” < 3 m diameter | 92.6 (0.1) | 93.0 (0.1) | 68.3 (1.4) | 90.6 (0.1) | 91.0 (0.1) | 65.8 (1.3) | – | – | – | – | – | –
Final
Test E | Training and test: all outliers included. 5 bands (10 nm), 5 indices; no textures, low and high stands separated. No post-processing | 93.4 (0.1) | 93.8 (0.1) | 69.0 (2.1) | 91.3 (0.1) | 91.7 (0.1) | 66.6 (2.0) | 94.6 (0.2) | 80.3 (0.7) | 88.3 (0.3) | 94.8 (0.2) | 52.1 (1.4) | 94.7 (0.3)
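For readers who wish to reproduce an evaluation scheme of this kind, the sketch below shows one plausible implementation of a stratified five-fold random split repeated ten times with different seed values, using scikit-learn; the feature matrix X, the labels y and the forest parameters are placeholders, not the settings used in the study. In the Table 4 setup, the rows of X would correspond to crown pixels described by the selected index bands and y to the target classes.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import StratifiedKFold

def repeated_stratified_rf(X, y, n_splits=5, n_repeats=10, n_trees=500):
    """Mean and standard deviation of the overall accuracy over repeated
    stratified k-fold splits with different seed values."""
    scores = []
    for repeat in range(n_repeats):
        skf = StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=repeat)
        for train_idx, test_idx in skf.split(X, y):
            rf = RandomForestClassifier(n_estimators=n_trees, random_state=repeat)
            rf.fit(X[train_idx], y[train_idx])
            scores.append(accuracy_score(y[test_idx], rf.predict(X[test_idx])))
    return np.mean(scores), np.std(scores)
```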
