Remote Sens., Volume 11, Issue 15 (August-1 2019) – 101 articles

Cover Story (view full-size image): Understanding how different crops use water over time is essential for planning and managing water resources and agricultural production. The main objective of this paper is to characterize the spatiotemporal dynamics of crop water use in the Central Valley of California using Landsat-derived actual evapotranspiration (ETa). Crop water use for 10 crops has been characterized for the entire Central Valley since 2008 and, in the case of Kern County, since 1999. A Mann–Kendall trend analysis revealed a significant increase in the area cultivated with almonds and in their water use, at a rate of 13,488 ha-m per year, while alfalfa showed a significant decline of 13,901 ha-m per year over the same period. This study demonstrates the useful application of historical Landsat ETa to produce water management information. View this paper.
  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and use the free Adobe Reader to open it.
22 pages, 4678 KiB  
Article
Modeling Mid-Season Rice Nitrogen Uptake Using Multispectral Satellite Data
by James Brinkhoff, Brian W. Dunn, Andrew J. Robson, Tina S. Dunn and Remy L. Dehaan
Remote Sens. 2019, 11(15), 1837; https://doi.org/10.3390/rs11151837 - 6 Aug 2019
Cited by 31 | Viewed by 6577
Abstract
Mid-season nitrogen (N) application in rice crops can maximize yield and profitability. This requires accurate and efficient methods of determining rice N uptake in order to prescribe optimal N amounts for topdressing. This study aims to determine the accuracy of using remotely sensed multispectral data from satellites to predict N uptake of rice at the panicle initiation (PI) growth stage, with a view to providing optimum variable-rate N topdressing prescriptions without the need for physical sampling. Field experiments spanning 4 years, 4–6 N rates, 4 varieties and 2 sites were conducted, with at least 3 replicates of each plot. One WorldView satellite image was acquired for each year, close to the date of PI. Numerous single- and multi-variable models were investigated. Among single-variable models, the square of the NDRE vegetation index was shown to be a good predictor of N uptake (R² = 0.75, RMSE = 22.8 kg/ha for data pooled from all years and experiments). For multi-variable models, Lasso regularization was used to ensure an interpretable and compact model was chosen and to avoid overfitting. Using combinations of remotely sensed reflectances and spectral indexes, as well as variety, climate and management data, as input variables for model training achieved R² > 0.9 and RMSE < 15 kg/ha for the pooled data set. The ability of remotely sensed data to predict N uptake in new seasons, where no physical sample data have yet been obtained, was also tested. A methodology to extract models that generalize well to new seasons was developed, avoiding model overfitting. Lasso regularization selected four or fewer input variables, and yielded R² better than 0.67 and RMSE better than 27.4 kg/ha over four test seasons that were not used to train the models. Full article
(This article belongs to the Special Issue Remote Sensing for Precision Nitrogen Management)
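The single-variable model highlighted in the abstract, N uptake regressed on the square of the NDRE index, can be sketched in a few lines. All reflectance and N-uptake values below are synthetic illustrations, not the paper's data:

```python
import numpy as np

def ndre(nir, red_edge):
    """Normalized Difference Red Edge index: (NIR - RE) / (NIR + RE)."""
    nir = np.asarray(nir, dtype=float)
    red_edge = np.asarray(red_edge, dtype=float)
    return (nir - red_edge) / (nir + red_edge)

# Synthetic per-plot mean reflectances (hypothetical values).
nir = np.array([0.42, 0.48, 0.55, 0.60])
red_edge = np.array([0.30, 0.28, 0.26, 0.24])

x = ndre(nir, red_edge) ** 2               # squared NDRE, the single predictor
y = np.array([40.0, 80.0, 120.0, 160.0])   # sampled N uptake, kg/ha (made up)

# Ordinary least squares fit of y = a * NDRE^2 + b.
a, b = np.polyfit(x, y, 1)
pred = a * x + b
rmse = np.sqrt(np.mean((y - pred) ** 2))
```

In the study itself the fit was performed on plot-mean reflectances pooled over years and experiments; this sketch only shows the shape of the computation.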
Figure 1. WorldView image areas: 21 December 2014 and 30 December 2015 (red), 7 January 2017 (green) and 9 January 2018 (blue). The red-green-blue image of the 2014 capture is also shown. The study sites are shown in yellow (LFS in the center and YAI on the right).
Figure 2. Training-validation-test model extraction methods with multi-season data. (a) The three-fold cross-validation procedure used to select the best Lasso α and evaluate the model against held-back test data. (b) Randomly assigning training/validation and test data points from all seasons' data. (c) Randomly assigning training/validation and test data points from three seasons' data and testing the extracted model on the fourth season. (d) Training a model on two seasons' data and validating on a third season (repeating this three times for all combinations of training/validation data), then testing the extracted model on the fourth season. Note: only the first of the three folds in the cross-validation procedure of (a) is shown in (b,d).
Figure 3. Coefficient of determination R² between sampled nitrogen uptake and derived image indexes. (a) Ratios of image bands (b1/b2). (b) Normalized difference ratios of image bands ((b1 − b2)/(b1 + b2)).
Figure 4. Nitrogen uptake vs. NDVI (a) and NDRE (b).
Figure 5. Comparison of fitting equations: (a) N vs. NDRE; (b) ln(N) vs. NDRE; (c) N vs. NDRE².
Figure 6. Comparison of sampled and predicted N uptake from the 2017 experiment at the LFS site. Red indicates 0 kg/ha and green indicates 200 kg/ha N uptake. (a) Sampled. (b) Predicted. (c) Sampled vs. predicted N uptake per plot.
Figure 7. Regression of N uptake vs. NDRE² per year.
Figure 8. RMSE for the Lasso model as a function of α, including all variables (last row of Table 5).
25 pages, 5953 KiB  
Article
Mapping Irrigated Areas Using Sentinel-1 Time Series in Catalonia, Spain
by Hassan Bazzi, Nicolas Baghdadi, Dino Ienco, Mohammad El Hajj, Mehrez Zribi, Hatem Belhouchette, Maria Jose Escorihuela and Valérie Demarez
Remote Sens. 2019, 11(15), 1836; https://doi.org/10.3390/rs11151836 - 6 Aug 2019
Cited by 80 | Viewed by 8064
Abstract
Mapping irrigated plots is essential for better water resource management. Today, the free and open-access Sentinel-1 (S1) and Sentinel-2 (S2) data, with their high revisit time, offer a powerful tool for irrigation mapping at plot scale. To date, few studies have used S1 and S2 data to provide approaches for mapping irrigated plots. This study proposes a method to map irrigated plots using S1 SAR (synthetic aperture radar) time series. First, a dense temporal series of S1 backscattering coefficients was obtained at plot scale in VV (vertical-vertical) and VH (vertical-horizontal) polarizations over a study site located in Catalonia, Spain. In order to remove the ambiguity between rainfall and irrigation events, the S1 signal obtained at plot scale was used conjointly with the S1 signal obtained at grid scale (10 km × 10 km). Next, two mathematical transformations, principal component analysis (PCA) and the wavelet transformation (WT), were applied to the SAR temporal series obtained in both VV and VH polarizations. Irrigated areas were then classified using the principal component (PC) dimensions and the WT coefficients in two different random forest (RF) classifiers. Another classification approach using a one-dimensional convolutional neural network (CNN) was also applied to the obtained S1 temporal series. The results derived from the RF classifiers with S1 data show high overall accuracy using the PC values (90.7%) and the WT coefficients (89.1%). By applying the CNN approach to the SAR data, a significant overall accuracy of 94.1% was obtained. The potential of optical images to map irrigated areas by means of a normalized difference vegetation index (NDVI) temporal series was also tested with both the RF and the CNN approaches. The overall accuracy obtained using the NDVI reached 89.5% with the RF classifier and 91.6% with the CNN. The combined use of optical and radar data slightly enhanced the classification in the RF classifier but did not significantly change the accuracy obtained in the CNN approach using S1 data. Full article
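A minimal sketch of the PCA-plus-random-forest branch of this workflow, with a synthetic stand-in for the S1 backscatter time series (plot counts, dates, and the injected irrigation signal are all made up, not the study's data):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic VV backscatter time series (dB) for 200 plots x 60 S1 dates.
# Irrigated plots get an extra periodic wetting signal (hypothetical).
n_plots, n_dates = 200, 60
series = -12.0 + rng.normal(0.0, 1.0, size=(n_plots, n_dates))
labels = rng.integers(0, 2, size=n_plots)            # 1 = irrigated
series[labels == 1] += 2.0 * np.sin(np.linspace(0, 6 * np.pi, n_dates))

# Reduce each series to its leading principal components...
pca = PCA(n_components=10)
features = pca.fit_transform(series)

# ...and classify irrigated vs. non-irrigated with a random forest.
X_tr, X_te, y_tr, y_te = train_test_split(features, labels, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```

The study additionally concatenates the plot-minus-grid difference series and VH polarization before the PCA step; this sketch keeps only the core transform-then-classify structure.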
Figure 1. (a) Location of the study site (in black), Catalonia, Spain; (b) Sentinel-1 footprints over Catalonia used in the study; (c) digital elevation model (DEM) from shuttle radar topography mission (SRTM) data; (d) agricultural areas of Catalonia derived from the geographical information system for agricultural parcels (SIGPAC) data. The hatched area represents the zone finally used for classification.
Figure 2. Precipitation and temperature records from a local meteorological station in Tornabous, in the interior plain of Catalonia, Spain.
Figure 3. Distribution of the number of agricultural plots per class of crop type.
Figure 4. Workflow overview using the random forest (RF) and the convolutional neural network (CNN).
Figure 5. Architecture of the one-dimensional (1D) CNN model (CNN1D) used for classification of irrigated/non-irrigated plots using SAR and optical data.
Figure 6. Temporal evolution of the SAR backscattering coefficient σ⁰ in VV polarization at plot scale (green curve) and 10 km grid scale (red curve), with precipitation data recorded at a local meteorological station, for (a) a non-irrigated plot and (b) an irrigated plot.
Figure 7. Scatter plot of a random sample of 2000 irrigated and 2000 non-irrigated plots using different combinations of important principal component (PC) variables. Irrigated plots are presented in blue and non-irrigated plots in red. (a) PC1 of σ⁰_P,VV with PC1 of σ⁰_P,VH; (b) PC16 of σ⁰_P,VV with PC1 of σ⁰_P,VH; (c) PC1 of σ⁰_P,VH with PC16 of σ⁰_P,VH; (d) PC1 of σ⁰_P,VH with PC2 of Δσ⁰_PG,VV; (e) PC16 of σ⁰_P,VH with PC2 of Δσ⁰_PG,VH; and (f) PC5 of Δσ⁰_PG,VV with PC2 of Δσ⁰_PG,VV. Δσ⁰_PG,VV = σ⁰_P,VV − σ⁰_G,VV and Δσ⁰_PG,VH = σ⁰_P,VH − σ⁰_G,VH. "P" denotes plot scale and "G" denotes grid scale.
Figure 8. Reconstruction of the SAR signal in VV polarization at plot scale through linear combinations of the 'Haar' wavelet coefficients using (a) 2, (b) 4, (c) 8, (d) 16, (e) 32, and (f) 64 coefficients.
Figure 9. Scatter plot of a random sample of 2000 irrigated and 2000 non-irrigated plots using different combinations of important wavelet transformation (WT) coefficients. Irrigated plots are presented in blue and non-irrigated plots in red. (a) WC61 of σ⁰_P,VH with WC62 of σ⁰_P,VH; (b) WC62 of σ⁰_P,VH with WC53 of Δσ⁰_PG,VH; (c) WC62 of σ⁰_P,VH with WC53 of Δσ⁰_PG,VH. Δσ⁰_PG,VV = σ⁰_P,VV − σ⁰_G,VV and Δσ⁰_PG,VH = σ⁰_P,VH − σ⁰_G,VH. "P" denotes plot scale, "G" denotes grid scale, and WC denotes wavelet coefficient.
Figure 10. Irrigation mapping using (a) the WT-RF model, (b) the PC-RF model and (c) the CNN model. Irrigated areas are presented in blue and non-irrigated areas in red. A zoomed view of the yellow box in each map is provided to better visualize the different classification results.
Figure 11. Comparison of accuracy indices between the RF and CNN classifications in three different scenarios: (a) using S1 SAR data; (b) using S2 optical data; and (c) using S1 SAR and S2 optical data.
23 pages, 8806 KiB  
Article
Evaluation of Grass Quality under Different Soil Management Scenarios Using Remote Sensing Techniques
by Mohammad Sadegh Askari, Timothy McCarthy, Aidan Magee and Darren J. Murphy
Remote Sens. 2019, 11(15), 1835; https://doi.org/10.3390/rs11151835 - 6 Aug 2019
Cited by 42 | Viewed by 7070
Abstract
Hyperspectral and multispectral imagery have been demonstrated to have considerable potential for near-real-time monitoring and mapping of grass quality indicators. The objective of this study was to evaluate the efficiency of remote sensing techniques for quantification of aboveground grass biomass (BM) and crude protein (CP) in a temperate European climate such as Ireland's. The experiment was conducted on 64 plots and 53 paddocks with varying quantities of nitrogen applied. Hyperspectral imagery (HSI) and multispectral imagery (MSI) were analyzed to develop the prediction models. The MSI data used in this study were captured using an unmanned aerial vehicle (UAV) and the Sentinel-2 satellite, while the HSI data were obtained using a handheld hyperspectral camera. The prediction models were developed using partial least squares regression (PLSR) and stepwise multi-linear regression (MLR). Finally, the spatial distribution of grass biomass over plots and paddocks was mapped to assess the within-field variability of grass quality metrics. Excellent accuracy was achieved for the prediction of BM and CP using HSI (RPD > 2.5 and R² > 0.8), and good accuracy was obtained via MSI-UAV (2 < RPD < 2.5 and R² > 0.7) for the grass quality indicators. The accuracy of the models calculated using MSI-Sentinel-2 was reasonable for BM prediction and insufficient for CP estimation. The red-edge range of wavelengths had the greatest impact on the predictability of grass BM, and the NIR range had the greatest influence on the estimation of grass CP. Both the PLSR and MLR techniques were found to be sufficiently robust for spectral modelling of aboveground BM and CP, with PLSR yielding a slightly better model than MLR. This study suggests that remote sensing techniques can be used as a rapid and reliable approach for near-real-time quantitative assessment of fresh grass quality under a temperate European climate. Full article
(This article belongs to the Section Remote Sensing in Agriculture and Vegetation)
Figure 1. Location of the studied paddocks. The red boundary shows the paddocks studied in 2017 and the blue boundary shows the paddocks studied in 2018.
Figure 2. Location of the studied plots. The red boundary shows the plots used for the hyperspectral imagery (HSI) test and the black boundary shows the plots used for the multispectral imagery (MSI) test.
Figure 3. The biomass models (BM) and crude protein models (CPM) calculated using partial least squares regression (PLSR) and stepwise multi-linear regression (MLR). HSI: hyperspectral imagery; MSI: multispectral imagery; Sat: satellite; UAV: unmanned aerial vehicle.
Figure 4. Important wavelengths identified using the HSI dataset for predicting biomass (BM-1) and crude protein (CPM-1). Blue bars indicate significant wavelengths with p < 0.05.
Figure 5. Important bands and spectral indices identified using the MSI-UAV dataset (BM-2) and MSI-Sentinel-2 dataset (BM-3) for predicting biomass. Blue bars indicate significant bands and indices with p < 0.05.
Figure 6. Important bands and spectral indices identified using the MSI-UAV dataset (CPM-2) and MSI-Sentinel-2 dataset (CPM-3) for predicting CP. Blue bars indicate significant bands and indices with p < 0.05.
18 pages, 3726 KiB  
Article
Comparison of Changes in Urban Land Use/Cover and Efficiency of Megaregions in China from 1980 to 2015
by Shu Zhang, Chuanglin Fang, Wenhui Kuang and Fengyun Sun
Remote Sens. 2019, 11(15), 1834; https://doi.org/10.3390/rs11151834 - 6 Aug 2019
Cited by 20 | Viewed by 4019
Abstract
Urban land use/cover and efficiency are important indicators of the degree of urbanization. However, research comparing their changes at the megaregion level is relatively rare. In this study, we depicted the differences and inequalities of urban land and efficiency among megaregions in China using China's Land Use/cover Dataset (CLUD) and China's Urban Land Use/cover Dataset (CLUD-Urban). Furthermore, we analyzed regional inequality using the Theil index. The results indicated that the Guangdong-Hong Kong-Macao Greater Bay Area had the highest proportion of urban land (8.03%), while the Chengdu-Chongqing Megaregion had the highest proportion of developed land (64.70%). The proportion of urban impervious surface area was highest in the Guangdong-Hong Kong-Macao Greater Bay Area (75.16%) and lowest in the Chengdu-Chongqing Megaregion (67.19%). Furthermore, the highest urban expansion occurred in the Yangtze River Delta (260.52 km²/a), and the fastest period was 2000–2010 (298.19 km²/a). The decreasing Theil index values for urban population density and economic density were 0.305 and 1.748, respectively, over 1980–2015. This study depicts the development trajectories of different megaregions and is expected to provide valuable insights and new knowledge on reasonable urban growth modes and sustainability goals in urban planning and management. Full article
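The Theil index used for the regional-inequality analysis has a compact closed form; a small sketch with hypothetical megaregion density values (not the study's data):

```python
import numpy as np

def theil_index(values):
    """Theil T index of inequality for positive values.

    T = mean( (x / mean(x)) * ln(x / mean(x)) ); 0 means perfect equality,
    and larger values mean greater inequality.
    """
    x = np.asarray(values, dtype=float)
    ratio = x / x.mean()
    return float(np.mean(ratio * np.log(ratio)))

# Hypothetical urban economic densities for five megaregions.
equal = theil_index([10.0, 10.0, 10.0, 10.0, 10.0])   # perfect equality -> 0
unequal = theil_index([2.0, 4.0, 8.0, 16.0, 70.0])    # concentrated -> positive
```

A decreasing Theil index over time, as reported above for population and economic density, indicates that the megaregions became more alike on that measure.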
Figure 1. Study area and the location of the megaregions.
Figure 2. Flowchart of the analysis of artificial construction intensity and urban land efficiency.
Figure 3. Urban expansion in the five megaregions.
Figure 4. Urban impervious density from CLUD-Urban for cities in 2015.
Figure 5. Regional divergence of urban land expansion.
Figure 6. Changes in the Theil index in different megaregions.
20 pages, 3965 KiB  
Article
A Novel Coarse-to-Fine Scheme for Remote Sensing Image Registration Based on SIFT and Phase Correlation
by Han Yang, Xiaorun Li, Liaoying Zhao and Shuhan Chen
Remote Sens. 2019, 11(15), 1833; https://doi.org/10.3390/rs11151833 - 6 Aug 2019
Cited by 32 | Viewed by 4806
Abstract
Automatic image registration has been widely used in remote sensing applications. However, feature-based registration methods are sometimes inaccurate and unstable for images with large differences in scale, grayscale and texture. In this manuscript, a coarse-to-fine registration scheme is proposed that combines the advantages of feature-based registration and phase correlation-based registration. The scheme consists of four steps. First, a feature-based registration method is adopted for coarse registration, and a geometrical outlier removal method, which uses the geometric similarities of inliers, is applied to improve the accuracy of the coarse registration. Then, the sensed image is modified using the coarse registration result under an affine deformation model. After that, the modified sensed image is registered to the reference image by extended phase correlation. Lastly, the final registration results are calculated by fusing the coarse and fine registrations. The proposed method preserves both the high universality of feature-based registration and the high accuracy of extended phase correlation-based registration. Experimental results on several different remote sensing images, drawn from several published image registration papers, demonstrate the high robustness and accuracy of the proposed method. The evaluation includes root mean square error (RMSE), Laplace mean square error (LMSE) and red–green image registration results. Full article
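The phase correlation at the heart of the fine-registration step recovers a translation from the peak of the normalized cross-power spectrum. A minimal sketch for pure integer shifts follows; the paper's extended phase correlation also handles scale and rotation, which this sketch omits:

```python
import numpy as np

def phase_correlation_shift(reference, sensed):
    """Estimate the integer (row, col) shift that maps `reference` onto
    `sensed`, from the peak of the normalized cross-power spectrum."""
    F1 = np.fft.fft2(reference)
    F2 = np.fft.fft2(sensed)
    cross = F2 * np.conj(F1)
    cross /= np.abs(cross) + 1e-12      # keep phase only (unit magnitude)
    corr = np.fft.ifft2(cross).real     # a delta at the shift for pure shifts
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    shifts = [int(p) for p in peak]
    # Wrap shifts beyond half the image size to negative offsets.
    for axis, size in enumerate(corr.shape):
        if shifts[axis] > size // 2:
            shifts[axis] -= size
    return tuple(shifts)

# Demo: shift a random image by (5, -3) with periodic wrap-around.
rng = np.random.default_rng(2)
img = rng.random((64, 64))
moved = np.roll(img, shift=(5, -3), axis=(0, 1))
estimated = phase_correlation_shift(img, moved)   # -> (5, -3)
```

For a pure circular shift the normalized spectrum is a complex exponential, so its inverse FFT is an exact delta at the shift, which is why the peak is so sharp compared with plain cross-correlation.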
Figure 1. Overall workflow of the proposed method.
Figure 2. Six pairs of original images: (a,b) Image Pair 1; (c,d) Image Pair 2; (e,f) Image Pair 3; (g,h) Image Pair 4; (i,j) Image Pair 5; and (k,l) Image Pair 6.
Figure 3. Six modified sensed images: (a)-(f) the modified sensed images of Image Pairs 1-6, respectively.
Figure 4. Image registration results: (a,b) Image Pair 1; (c,d) Image Pair 2; (e,f) Image Pair 3; (g,h) Image Pair 4; (i,j) Image Pair 5; and (k,l) Image Pair 6. In each pair, the first panel shows the overlapped image before registration and the second shows the difference after registration.
Figure 5. Checkerboard mosaicked image of Image Pair 4: (a) SIFT-RANSAC; (b) SIFT-GSM; (c) SIFT-RANSAC-PC; (d) RIRMI; and (e) the proposed method.
24 pages, 3590 KiB  
Article
Vegetation and Soil Fire Damage Analysis Based on Species Distribution Modeling Trained with Multispectral Satellite Data
by Carmen Quintano, Alfonso Fernández-Manso, Leonor Calvo and Dar A. Roberts
Remote Sens. 2019, 11(15), 1832; https://doi.org/10.3390/rs11151832 - 6 Aug 2019
Cited by 23 | Viewed by 4677
Abstract
Forest managers demand reliable tools to evaluate post-fire vegetation and soil damage. In this study, we quantify wildfire damage to vegetation and soil based on the analysis of burn severity, using multitemporal and multispectral satellite data and species distribution models, particularly maximum entropy (MaxEnt). We studied a mega-wildfire (9000 ha burned) in northwestern Spain, which occurred from 21 to 27 August 2017. Burn severity was measured in the field using the composite burn index (CBI). Burn severity of vegetation and soil layers (CBIveg and CBIsoil) was also differentiated. MaxEnt provided the relative contribution of each pre-fire and post-fire input variable on low, moderate and high burn severity levels, as well as on all severity levels combined (burned area). In addition, it built continuous suitability surfaces from which the burned surface area and burn severity maps were built. The burned area map achieved a high accuracy level (κ = 0.85), but slightly lower accuracy when differentiating the three burn severity classes (κ = 0.81). When the burn severity map was validated using field CBIveg and CBIsoil values we reached lower κ statistic values (0.76 and 0.63, respectively). This study revealed the effectiveness of the proposed multi-temporal MaxEnt-based method to map fire damage accurately in Mediterranean ecosystems, providing key information to forest managers. Full article
(This article belongs to the Section Forest Remote Sensing)
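The κ values quoted above (0.85 for the burned area map, 0.81 for the three severity classes, 0.76 and 0.63 against field CBI) are Cohen's kappa computed from map-versus-reference confusion matrices. A minimal sketch of that computation (the function name and toy matrix are illustrative, not from the paper):

```python
import numpy as np

def cohens_kappa(confusion):
    """Cohen's kappa from a confusion matrix (rows: reference, cols: mapped)."""
    confusion = np.asarray(confusion, dtype=float)
    n = confusion.sum()
    po = np.trace(confusion) / n                              # observed agreement
    pe = (confusion.sum(axis=0) * confusion.sum(axis=1)).sum() / n**2  # chance agreement
    return (po - pe) / (1 - pe)
```

Kappa discounts the agreement expected by chance (pe), which is why it is preferred over raw overall accuracy for validating classified burn maps.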
Show Figures

Figure 1

Figure 1
<p>(<b>a</b>) Location of the study area using the 2 September 2017 Sentinel-2 MultiSpectral Instrument color composition RGB: 11:8a:12; (<b>b</b>) topographic map (m); (<b>c</b>) slope map (°); (<b>d</b>) aspect map (°); (<b>e</b>) annual average temperature map (°C); (<b>f</b>) annual rainfall map (mm/year) and (<b>g</b>) annual radiation (GJ/m<sup>2</sup>/year).</p>
Full article ">Figure 2
<p>Examples of field plots showing the CBI<sub>soil</sub> (brown), CBI<sub>veg</sub> (green) and CBI (orange) values. Upper row: High burn severity level; central row: Moderate burn severity level; lower row: Low burn severity level; left column: Broomlands; central column: Heathlands; right column: Oak woodlands.</p>
Full article ">Figure 3
<p>Flowchart of the proposed method to evaluate fire damage.</p>
Full article ">Figure 4
<p>Some spectra from the definitive post-fire spectral libraries. (<b>a</b>) Char spectral library, (<b>b</b>) green vegetation spectral library; with the most representative woody species; (<b>c</b>) non-photosynthetic vegetation (non irrigated lands) and soil spectral library (bare soil, open mine, urban areas, firebreaks and rock).</p>
Full article ">Figure 5
<p>Covariates of MaxEnt modeling. (<b>a</b>) Pre-fire shade normalized non-photosynthetic vegetation fraction image (pre_NPV); (<b>b</b>) pre-fire shade normalized green vegetation fraction image (pre_GV); (<b>c</b>) pre-fire shade normalized soil fraction image (pre_soil); (<b>d</b>) post-fire shade normalized char fraction image (post_char) and (<b>e</b>) post-fire land surface temperature image (post_LST).</p>
Full article ">Figure 6
<p>MaxEnt continuous outputs for high, moderate and low burn severity target classes and burn severity map. (<b>a</b>) Target class: High burn severity; (<b>b</b>) target class: Moderate burn severity; (<b>c</b>) target class: Low burn severity; (<b>d</b>) total burn severity map.</p>
Full article ">Figure 7
<p>(<b>a</b>) MaxEnt continuous outputs for the burned area target class and (<b>b</b>) burned area map.</p>
Full article ">
26 pages, 26740 KiB  
Article
PolSAR Image Classification via Learned Superpixels and QCNN Integrating Color Features
by Xinzheng Zhang, Jili Xia, Xiaoheng Tan, Xichuan Zhou and Tao Wang
Remote Sens. 2019, 11(15), 1831; https://doi.org/10.3390/rs11151831 - 6 Aug 2019
Cited by 19 | Viewed by 4722
Abstract
Polarimetric synthetic aperture radar (PolSAR) image classification plays an important role in various PolSAR image applications, and many pixel-wise and region-based classification methods have been proposed for PolSAR images. However, most pixel-wise methods cannot model the local spatial relationships of pixels due to the negative effects of speckle noise, and most region-based methods fail to distinguish regions with similar polarimetric features. Considering that color features provide good visual expression and perform well for image interpretation, in this work, based on the PolSAR pseudo-color image over the Pauli decomposition, we propose a supervised PolSAR image classification approach combining learned superpixels and a quaternion convolutional neural network (QCNN). First, the PolSAR RGB pseudo-color image is formed under the Pauli decomposition. Second, we train the QCNN with quaternion PolSAR data converted from the RGB channels to extract deep color features and obtain a pixel-wise classification map. The QCNN treats the color channels as a quaternion matrix, exploiting the relationships among the color channels effectively and avoiding information loss. Third, a pixel affinity network (PAN) is utilized to generate the learned superpixels of the PolSAR pseudo-color image. The learned superpixels allow local information to be exploited in the presence of speckle noise. Finally, we fuse the pixel-wise classification result and the superpixels to obtain the final pixel-wise PolSAR image classification map. Experiments on three real PolSAR data sets show that the proposed approach obtains 96.56%, 95.59%, and 92.55% accuracy for the Flevoland, San Francisco and Oberpfaffenhofen data sets, respectively. Compared with state-of-the-art PolSAR image classification methods, the proposed algorithm achieves competitive classification results. Full article
(This article belongs to the Section Remote Sensing Image Processing)
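The key data-representation step — treating the three Pauli pseudo-color channels as one quaternion per pixel rather than three independent planes — can be sketched as follows. This is a hypothetical helper, not the authors' implementation; a QCNN then convolves these quaternion-valued pixels with quaternion-valued kernels so the channel coupling is preserved through the network:

```python
import numpy as np

def rgb_to_quaternion(rgb):
    """Encode an RGB image of shape (H, W, 3) as a pure-quaternion array
    (H, W, 4): q = 0 + r*i + g*j + b*k. Keeping the channels in one
    quaternion lets later quaternion convolutions mix them jointly
    instead of treating each color plane independently."""
    h, w, _ = rgb.shape
    q = np.zeros((h, w, 4), dtype=np.float32)
    q[..., 1:] = rgb  # real part stays 0; i, j, k carry R, G, B
    return q
```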
Show Figures

Graphical abstract

Graphical abstract
Full article ">Figure 1
<p>The pipeline of the proposed approach. The idea of nearest neighbor (NN) is used to fuse the superpixel map and pixel-wise classification result, which will be described in <a href="#sec2dot3dot2-remotesensing-11-01831" class="html-sec">Section 2.3.2</a> in detail.</p>
Full article ">Figure 2
<p>The framework of the QCNN.</p>
Full article ">Figure 3
<p>The framework of the PAN. The PAN is utilized to predict horizontal and vertical affinities at every pixel. The gray area in the upper right corner shows the standard structure of a ResBlock [<a href="#B49-remotesensing-11-01831" class="html-bibr">49</a>].</p>
Full article ">Figure 4
<p>(<b>a</b>) The Pauli color-code image of Flevoland data set. (<b>b</b>) The corresponding ground truth. (<b>c</b>) The legend of the ground truth.</p>
Full article ">Figure 5
<p>(<b>a</b>) The Pauli color-code image of San Francisco data set. (<b>b</b>) The corresponding ground truth. (<b>c</b>) The legend of the ground truth.</p>
Full article ">Figure 6
<p>(<b>a</b>) The Pauli color-code image of Oberpfaffenhofen data set. (<b>b</b>) The corresponding ground truth. (<b>c</b>) The legend of the ground truth.</p>
Full article ">Figure 7
<p>(<b>a</b>) The selected local region of Flevoland data set. (<b>b</b>) The selected local region of San Francisco data set. (<b>c</b>) The selected local region of Oberpfaffenhofen data set. (<b>d</b>) The ground truth of (<b>a</b>). (<b>e</b>) The ground truth of (<b>b</b>). (<b>f</b>) The ground truth of (<b>c</b>).</p>
Full article ">Figure 8
<p>(<b>a</b>) The color image of Foulum data set which is used to get sampled data sets. (<b>b</b>) The corresponding segmentation ground truth.</p>
Full article ">Figure 9
<p>(<b>a</b>) The achievable segmentation accuracy under different iterations on Flevoland data set. (<b>b</b>) The boundary recall under different iterations on Flevoland data set. The subscript of <math display="inline"><semantics> <mrow> <msub> <mi mathvariant="normal">F</mi> <mi mathvariant="normal">N</mi> </msub> <mo>_</mo> <mi>PAN</mi> </mrow> </semantics></math> represents epochs.</p>
Full article ">Figure 10
<p>The visual superpixel generation maps. (<b>a</b>) The superpixel generation map and partial enlarged details obtained by SLIC on Flevoland data set. (<b>b</b>) The superpixel generation map and partial enlarged details obtained by SEEDs on Flevoland data set. (<b>c</b>) The superpixel generation map and partial enlarged details obtained by PAN on Flevoland data set. (<b>d</b>) The superpixel generation map and partial enlarged details obtained by SLIC on San Francisco data set. (<b>e</b>) The superpixel generation map and partial enlarged details obtained by SEEDs on San Francisco data set. (<b>f</b>) The superpixel generation map and partial enlarged details obtained by PAN on San Francisco data set. (<b>g</b>) The superpixel generation map and partial enlarged details obtained by SLIC on Oberpfaffenhofen data set. (<b>h</b>) The superpixel generation map and partial enlarged details obtained by SEEDs on Oberpfaffenhofen data set. (<b>i</b>) The superpixel generation map and partial enlarged details obtained by PAN on Oberpfaffenhofen data set.</p>
Full article ">Figure 11
<p>The curve of achievable segmentation accuracy and boundary recall. (<b>a</b>) The achievable segmentation accuracy on Flevoland data set. (<b>b</b>) The boundary recall on Flevoland data set. (<b>c</b>) The achievable segmentation accuracy on San Francisco data set. (<b>d</b>) The boundary recall on San Francisco data set. (<b>e</b>) The achievable segmentation accuracy on Oberpfaffenhofen data set. (<b>f</b>) The boundary recall on Oberpfaffenhofen data set.</p>
Full article ">Figure 12
<p>The visual classification results by CNN and QCNN on Flevoland data set. (<b>a</b>) The classification result obtained by CNN. (<b>b</b>) The result map (<b>a</b>) overlaid with the ground truth map. (<b>c</b>) The classification result obtained by QCNN. (<b>d</b>) The result map (<b>c</b>) overlaid with the ground truth map.</p>
Full article ">Figure 13
<p>The classification accuracies by CNN and QCNN on Flevoland data set.</p>
Full article ">Figure 14
<p>The visual classification results by CNN and QCNN on San Francisco data set. (<b>a</b>) The classification results by CNN. (<b>b</b>) The classification result by QCNN.</p>
Full article ">Figure 15
<p>The classification accuracies by CNN and QCNN on San Francisco data set.</p>
Full article ">Figure 16
<p>The visual classification results by CNN and QCNN on Oberpfaffenhofen data set. (<b>a</b>) The classification result obtained by CNN. (<b>b</b>) The result map (<b>a</b>) overlaid with the ground truth map. (<b>c</b>) The classification result obtained by QCNN. (<b>d</b>) The result map (<b>c</b>) overlaid with the ground truth map.</p>
Full article ">Figure 17
<p>The classification accuracies by CNN and QCNN on Oberpfaffenhofen data set.</p>
Full article ">Figure 18
<p>Accuracy curve of Flevoland, San Francisco and Oberpafaffenhofen data sets under different numbers of superpixels.</p>
Full article ">Figure 19
<p>The visual classification results on Flevoland data set. (<b>a</b>) The classification map by QSLIC. (<b>b</b>) The classification map by QSEEDs. (<b>c</b>) The classification map by QPAN. (<b>d</b>) The classification map (<b>a</b>) overlaid with the ground truth map. (<b>e</b>) The classification map (<b>b</b>) overlaid with the ground truth map. (<b>f</b>) The classification map (<b>c</b>) overlaid with the ground truth map.</p>
Full article ">Figure 20
<p>The visual classification results on San Francisco data set. (<b>a</b>) The classification map by QSLIC. (<b>b</b>) The classification map by QSEEDs. (<b>c</b>) The classification map by QPAN.</p>
Full article ">Figure 21
<p>The visual classification results on Oberpfaffenhofen data set. (<b>a</b>) The classification map by QSLIC. (<b>b</b>) The classification map by QSEEDs. (<b>c</b>) The classification map by QPAN. (<b>d</b>) The classification map (<b>a</b>) overlaid with the ground truth map. (<b>e</b>) The classification map (<b>b</b>) overlaid with the ground truth map. (<b>f</b>) The classification map (<b>c</b>) overlaid with the ground truth map.</p>
Full article ">
16 pages, 3278 KiB  
Technical Note
Leveraging Commercial High-Resolution Multispectral Satellite and Multibeam Sonar Data to Estimate Bathymetry: The Case Study of the Caribbean Sea
by Samuel Pike, Dimosthenis Traganos, Dimitris Poursanidis, Jamie Williams, Katie Medcalf, Peter Reinartz and Nektarios Chrysoulakis
Remote Sens. 2019, 11(15), 1830; https://doi.org/10.3390/rs11151830 - 6 Aug 2019
Cited by 27 | Viewed by 6378
Abstract
The global coastal seascape offers a multitude of ecosystem functions and services to natural and human ecosystems. However, the current anthropogenic global warming above pre-industrial levels is driving the degradation of seascape health, with adverse impacts on biodiversity, economies, and societies. Bathymetric knowledge empowers our scientific, financial, and ecological understanding of the associated benefits, processes, and pressures in the coastal seascape. Here we leverage two commercial high-resolution multispectral Pleiades satellite images and two multibeam survey datasets to measure bathymetry in two zones (0–10 m and 10–30 m) in the tropical Anguilla and British Virgin Islands, northeast Caribbean. A methodological framework featuring a combination of an empirical linear transformation, cloud masking, sun-glint correction, and pseudo-invariant features allows spatially independent calibration and testing of our satellite-derived bathymetry approach. The best R2 and RMSE for training and validation vary between 0.44–0.56 and 1.39–1.76 m, respectively, while minimum vertical errors are less than 1 m in the depth ranges of 7.8–10 and 11.6–18.4 m for the two explored zones. Given available field data, the present methodology could provide simple, time-efficient, and accurate spatio-temporal satellite-derived bathymetry intelligence for scientific and commercial tasks, e.g., navigation, coastal habitat mapping and resource management, and reducing natural hazards. Full article
(This article belongs to the Special Issue Satellite Derived Bathymetry)
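The "empirical linear transformation" at the heart of satellite-derived bathymetry is a regression from image radiance to in situ (here, multibeam) depth. One widely used formulation of such a model is Stumpf's log-band-ratio regression; the sketch below assumes blue and green reflectance bands and a synthetic calibration set, and is illustrative rather than the authors' exact model:

```python
import numpy as np

def fit_band_ratio_sdb(blue, green, depth, n=1000.0):
    """Fit a Stumpf-style ratio model: depth ≈ m1 * ln(n*blue)/ln(n*green) + m0.
    `n` is a fixed scaling constant that keeps both logarithms positive."""
    x = np.log(n * blue) / np.log(n * green)
    m1, m0 = np.polyfit(x, depth, 1)  # linear fit of depth against the log ratio
    return m1, m0

def predict_depth(blue, green, m1, m0, n=1000.0):
    """Apply the calibrated ratio model to new pixels."""
    return m1 * np.log(n * blue) / np.log(n * green) + m0
```

Calibrating and validating on spatially disjoint subsets of the multibeam data, as the paper does, guards against the optimistic bias of testing on neighboring soundings.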
Show Figures

Graphical abstract

Graphical abstract
Full article ">Figure 1
<p>Area of interest for the study sites; (<b>a</b>) the Leeward Islands of the Caribbean; (<b>b</b>) the British Virgin Islands, and (<b>c</b>) Anguilla. The bounding boxes indicate the geographic corners of the satellite imagery. The dotted extent represents the boundary of the available in situ data.</p>
Full article ">Figure 2
<p>Location of pseudo-invariant features (right column panels) for (<b>a</b>) Anguilla and (<b>b</b>) the British Virgin Islands.</p>
Full article ">Figure 3
<p>Pre-processed Pleiades imagery and satellite-derived bathymetry (SDB) outputs for Anguilla (<b>a</b>–<b>c</b>) and the British Virgin Islands (<b>d</b>–<b>f</b>). The pre-processed Pleiades images, including cloud and terrestrial masking, sun-glint correction, pseudo-invariant features, and low pass 3 × 3 filter (<b>a</b>,<b>d</b>); the SDB outputs, trained on in situ data at depths of 0–10 m (<b>b</b>,<b>e</b>); the SDB outputs trained on in situ data at depths of 10–30 m (<b>c</b>,<b>f</b>).</p>
Full article ">Figure 4
<p>Validation plots of in situ multibeam echosounders survey data (<span class="html-italic">x</span>-axis) and predicted depth from the modelled satellite-derived bathymetry (<span class="html-italic">y</span>-axis) models in Anguilla (<b>a</b>,<b>b</b>) and the British Virgin Islands (<b>c</b>,<b>d</b>) for: The depth ranges of 0–10 m (<b>a</b>,<b>c</b>) and 10–30 m (<b>b</b>,<b>d</b>).</p>
Full article ">Figure 5
<p>In situ multibeam echosounder (MBES) survey data and vertical error of the satellite-derived bathymetry (SDB) models for Anguilla (<b>a</b>–<b>c</b>) and the British Virgin Islands (<b>d</b>–<b>f</b>). The MBES survey data, resampled to 2 m where necessary (<b>a</b>,<b>d</b>); the vertical accuracy of the SDB models trained to depths of 0–10 m (<b>b</b>,<b>e</b>); the vertical accuracy of the SDB models trained to depths of 10–30 m (<b>c</b>,<b>f</b>). Note that high error values indicate the positive difference between the SDB and the MBES, and therefore both over- and under- estimation.</p>
Full article ">Figure 6
<p>In situ multibeam echosounder (MBES) survey data and absolute vertical error of the satellite-derived bathymetry (SDB) models for Anguilla (<b>a</b>,<b>b</b>) and the British Virgin Islands (<b>c</b>,<b>d</b>).</p>
Full article ">
32 pages, 9404 KiB  
Article
Random Noise Suppression of Magnetic Resonance Sounding Data with Intensive Sampling Sparse Reconstruction and Kernel Regression Estimation
by Xiaokang Yao, Jianmin Zhang, Zhenyang Yu, Fa Zhao and Yong Sun
Remote Sens. 2019, 11(15), 1829; https://doi.org/10.3390/rs11151829 - 5 Aug 2019
Cited by 6 | Viewed by 3773
Abstract
The magnetic resonance sounding (MRS) method is a non-invasive, efficient and advanced geophysical method for groundwater detection. However, the MRS signal received by the coil sensor is extremely susceptible to electromagnetic noise interference. In MRS data processing, random noise suppression of noisy MRS data is an important research aspect. We propose an approach combining intensive sampling sparse reconstruction (ISSR) and kernel regression estimation (KRE) to suppress random noise. The approach is based on variable frequency sampling, numerical integration and statistical signal processing combined with kernel regression estimation. To realize the approach, we propose three specific sparse reconstructions, namely rectangular sparse reconstruction, trapezoidal sparse reconstruction and Simpson sparse reconstruction. To address the distortion of peaks and valleys after sparse reconstruction, we introduce the KRE to process the data produced by the ISSR. Simulation and field experiments demonstrate that the ISSR-KRE approach is a feasible and effective way to suppress random noise. In addition, we find that rectangular sparse reconstruction and trapezoidal sparse reconstruction are superior to Simpson sparse reconstruction in terms of noise suppression, and that sampling frequency is positively correlated with the signal-to-noise improvement ratio (SNIR). In one field experiment, the standard deviation of noisy MRS data was reduced from 1200.80 nV to 570.01 nV by the ISSR-KRE approach. The proposed approach provides theoretical support for random noise suppression and contributes to the development of MRS instruments with low power consumption and high efficiency. In the future, we will integrate the approach into an MRS instrument and attempt to use it to eliminate harmonic noise from power lines. Full article
(This article belongs to the Special Issue Recent Advances in Subsurface Sensing Technologies)
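Rectangular sparse reconstruction amounts to integrating (averaging) each intensively sampled window into a single output sample, which attenuates zero-mean random noise by roughly the square root of the oversampling factor; kernel regression then smooths the residual distortion at peaks and valleys. A minimal sketch under those assumptions — note the paper uses a local quadratic estimator, whereas this shows the simpler Nadaraya–Watson local-constant form:

```python
import numpy as np

def rectangular_sparse_reconstruction(x, factor):
    """Collapse each block of `factor` intensively sampled points into one
    output sample via the rectangular rule (block mean). Averaging a block
    suppresses zero-mean random noise by about sqrt(factor)."""
    n = len(x) // factor * factor          # drop any incomplete trailing block
    return x[:n].reshape(-1, factor).mean(axis=1)

def nadaraya_watson(t, y, t_eval, h):
    """Gaussian-kernel regression estimate of y(t) at t_eval with bandwidth h
    (local-constant smoother applied to the sparsely reconstructed series)."""
    w = np.exp(-0.5 * ((t_eval[:, None] - t[None, :]) / h) ** 2)
    return (w * y).sum(axis=1) / w.sum(axis=1)
```

The positive correlation between sampling frequency and SNIR reported above follows directly from the block-averaging step: a higher oversampling factor means more noisy points averaged per output sample.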
Show Figures

Graphical abstract

Graphical abstract
Full article ">Figure 1
<p>Schematic diagram for detection of magnetic resonance sounding (MRS) signal.</p>
Full article ">Figure 2
<p>Flowchart for implementation of the intensive sampling sparse reconstruction and kernel regression estimation (ISSR-KRE) approach.</p>
Full article ">Figure 3
<p>Schematic diagram of intensive sampling sparse reconstruction.</p>
Full article ">Figure 4
<p>A data fragment of intensive sampling in <math display="inline"><semantics> <mrow> <mo stretchy="false">[</mo> <msub> <mi>t</mi> <mrow> <mi>a</mi> <mi>j</mi> </mrow> </msub> <mo>,</mo> <msub> <mi>t</mi> <mrow> <mi>b</mi> <mi>j</mi> </mrow> </msub> <mo stretchy="false">]</mo> </mrow> </semantics></math>.</p>
Full article ">Figure 5
<p>Simulation case: Comparison of suppressing random noise effect of rectangular sparse reconstruction by different sampling frequencies <math display="inline"><semantics> <mrow> <msub> <mi>f</mi> <mrow> <msub> <mi>s</mi> <mi>H</mi> </msub> </mrow> </msub> </mrow> </semantics></math>. Time-series and spectra of the sampling frequencies 8<math display="inline"><semantics> <mrow> <msub> <mi>f</mi> <mrow> <mi>p</mi> <mi>r</mi> <mi>o</mi> <mi>p</mi> </mrow> </msub> </mrow> </semantics></math>, 16 <math display="inline"><semantics> <mrow> <msub> <mi>f</mi> <mrow> <mi>p</mi> <mi>r</mi> <mi>o</mi> <mi>p</mi> </mrow> </msub> </mrow> </semantics></math>, 32 <math display="inline"><semantics> <mrow> <msub> <mi>f</mi> <mrow> <mi>p</mi> <mi>r</mi> <mi>o</mi> <mi>p</mi> </mrow> </msub> </mrow> </semantics></math>, and 64 <math display="inline"><semantics> <mrow> <msub> <mi>f</mi> <mrow> <mi>p</mi> <mi>r</mi> <mi>o</mi> <mi>p</mi> </mrow> </msub> </mrow> </semantics></math>, respectively.</p>
Full article ">Figure 6
<p>The curves of signal to noise ratio (SNR), signal-to-noise improvement ratio (SNIR), mean squared error (MSE) with increasing sampling frequency by rectangular sparse reconstruction.</p>
Full article ">Figure 7
<p>Comparison of three sparse reconstruction methods (rectangular sparse reconstruction, trapezoidal sparse reconstruction and Simpson sparse reconstruction): (<b>a</b>) SNR comparison, (<b>b</b>) SNIR comparison, (<b>c</b>) MSE comparison.</p>
Figure 7 Cont.">
Full article ">Figure 8
<p>Comparison of rectangular sparse reconstruction, trapezoidal sparse reconstruction and Simpson sparse reconstruction for processing signals. (<b>a</b>) Time domain, (<b>b</b>) frequency domain.</p>
Full article ">Figure 9
<p>Different <math display="inline"><semantics> <mrow> <msub> <mi>f</mi> <mrow> <mi>p</mi> <mi>r</mi> <mi>o</mi> <mi>p</mi> </mrow> </msub> </mrow> </semantics></math> effect on sparse reconstruction. The first row shows the comparison of SNR in different <math display="inline"><semantics> <mrow> <msub> <mi>f</mi> <mrow> <mi>p</mi> <mi>r</mi> <mi>o</mi> <mi>p</mi> </mrow> </msub> </mrow> </semantics></math> cases, respectively. The second row shows the comparison of SNIR and MSE in different <math display="inline"><semantics> <mrow> <msub> <mi>f</mi> <mrow> <mi>p</mi> <mi>r</mi> <mi>o</mi> <mi>p</mi> </mrow> </msub> </mrow> </semantics></math> cases, respectively.</p>
Full article ">Figure 10
<p>Simulation case: Time-series and spectra of 8, 16, 32, 64 stacks at one pulse moment, respectively.</p>
Full article ">Figure 11
<p>The curves of SNR, SNIR and MSE with increasing stacking times.</p>
Full article ">Figure 12
<p>The effect of kernel regression parameters on estimation results. (<b>a</b>) The effect of window size. (<b>b</b>) The effect of the smoothing factor <span class="html-italic">h.</span></p>
Full article ">Figure 13
<p>Waveform comparison of the data processed by the local quadratic estimator (window size 19 and smoothing factor <math display="inline"><semantics> <mrow> <mi>h</mi> <mo>=</mo> <mn>5</mn> <mo>/</mo> <msub> <mi>f</mi> <mrow> <mi>p</mi> <mi>r</mi> <mi>o</mi> <mi>p</mi> </mrow> </msub> </mrow> </semantics></math> ). (<b>a</b>) Comparison in time domain. (<b>b</b>) Comparison in frequency domain. The green lines display the data by the rectangular sparse reconstruction method. The red lines display the ideal signal. The blue lines display the data by the rectangular sparse reconstruction method and the local quadratic kernel regression estimation. In other words, the blue lines display the ISSR-KRE data.</p>
Full article ">Figure 14
<p>Comparison of traditional low frequency sampling and the ISSR method. The gray lines display traditional low frequency sampling recording. The blue lines display the data processed by the ISSR method. (<b>a</b>) Comparison in time domain. (<b>b</b>) Comparison in frequency domain.</p>
Figure 14 Cont.">
Full article ">Figure 15
<p>Field experiment location and the MRS instrument.</p>
Full article ">Figure 16
<p>Comparison of low frequency sampling result and the ISSR method results. The green lines display low frequency sampling recording (its sampling frequency is 8330 Hz) and the standard deviation is 1200.80 nV. The blue lines display the data by the ISSR method (its sampling frequency is 50 kHz, <math display="inline"><semantics> <mrow> <msub> <mi>f</mi> <mrow> <mi>p</mi> <mi>r</mi> <mi>o</mi> <mi>p</mi> </mrow> </msub> </mrow> </semantics></math> = 8333.3 Hz) and the standard deviation is 570.01 nV. (<b>a</b>) Comparison in time domain. (<b>b</b>) Comparison in frequency domain.</p>
Figure 16 Cont.">
Full article ">
19 pages, 5427 KiB  
Article
Spatio–temporal Assessment of Drought in Ethiopia and the Impact of Recent Intense Droughts
by Yuei-An Liou and Getachew Mehabie Mulualem
Remote Sens. 2019, 11(15), 1828; https://doi.org/10.3390/rs11151828 - 5 Aug 2019
Cited by 77 | Viewed by 9978
Abstract
The recent droughts that have occurred in different parts of Ethiopia are generally linked to fluctuations in atmospheric and ocean circulations. Understanding these large-scale phenomena, which play a crucial role in vegetation productivity in Ethiopia, is important. In view of this, several techniques and datasets were analyzed to study the spatio–temporal variability of vegetation in response to a changing climate. In this study, 18 years (2001–2018) of Moderate Resolution Imaging Spectroradiometer (MODIS) Terra/Aqua normalized difference vegetation index (NDVI), land surface temperature (LST), Climate Hazards Group Infrared Precipitation with Stations (CHIRPS) daily precipitation, and the Famine Early Warning Systems Network (FEWS NET) Land Data Assimilation System (FLDAS) soil moisture datasets were processed. Pixel-based Mann–Kendall trend analysis and the Vegetation Condition Index (VCI) were used to assess the drought patterns during the cropping season. Results indicate that the central highlands and northwestern part of Ethiopia, where the land cover is dominated by cropland, experienced decreasing precipitation and NDVI trends. About 52.8% of the pixels showed a decreasing precipitation trend, with the significant decreasing trends concentrated in the central and lowland areas. Also, 41.67% of the pixels showed a decreasing NDVI trend, especially in major parts of the northwestern region of Ethiopia. Based on the trend test and VCI analysis, significant countrywide droughts occurred during the El Niño years of 2009 and 2015. Furthermore, the Pearson correlation coefficient analysis confirms that the low NDVI was mainly attributed to the low precipitation and water availability in the soils. This study provides valuable information for identifying locations at risk of drought and for planning timely relief measures.
Furthermore, this paper presents the results of the first attempt to apply a recently developed index, the Normalized Difference Latent Heat Index (NDLI), to monitor drought conditions. The results show that the NDLI has a high correlation with NDVI (r = 0.96), precipitation (r = 0.81), soil moisture (r = 0.73), and LST (r = −0.67). NDLI successfully captures the historical droughts and shows a notable correlation with the climatic variables. The analysis shows that, using the radiances of the green, red, and short wave infrared (SWIR) bands, a simplified crop monitoring model with satisfactory accuracy and simplicity can be developed. Full article
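Two of the tools used above have compact definitions: the VCI rescales each observation to its position within the pixel's multi-year NDVI envelope (low VCI flags drought stress), and the Mann–Kendall S statistic counts concordant minus discordant pairs in a time series (positive S indicates an increasing monotonic trend). A sketch with illustrative helper names:

```python
import numpy as np

def vegetation_condition_index(ndvi_stack):
    """VCI (%) per pixel from a (years, rows, cols) NDVI stack:
    100 * (NDVI - NDVI_min) / (NDVI_max - NDVI_min), with the min/max
    taken over the multi-year record at each pixel."""
    ndvi_min = ndvi_stack.min(axis=0)
    ndvi_max = ndvi_stack.max(axis=0)
    return 100.0 * (ndvi_stack - ndvi_min) / (ndvi_max - ndvi_min)

def mann_kendall_s(x):
    """Mann–Kendall S statistic: sum of sign(x_j - x_i) over all pairs i < j."""
    x = np.asarray(x, dtype=float)
    pairwise = np.sign(x[None, :] - x[:, None])   # entry [i, j] = sign(x_j - x_i)
    return int(pairwise[np.triu_indices(len(x), 1)].sum())
```

A full pixel-based trend test would additionally compare S against its variance to assess significance; this sketch shows only the trend direction and magnitude.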
Show Figures

Graphical abstract

Graphical abstract
Full article ">Figure 1
<p>Location of the study area: the administrative boundary of Ethiopia, constituting the nine regional states, with a background showing an Advanced Spaceborne Thermal Emission and Reflection Radiometer digital elevation model of 30 m resolution.</p>
Full article ">Figure 2
<p>Long-term seasonal average of rainfall (mm), land surface temperature (LST, °C), normalized difference vegetation index (NDVI), and soil moisture (m<sup>3</sup>m<sup>−3</sup>) for the period from 2001 to 2018.</p>
Full article ">Figure 3
<p>Land cover map of Ethiopia at 20 m spatial resolution during 2016, extracted from the European Space Agency.</p>
Full article ">Figure 4
<p>Standardized seasonal precipitation, LST, NDVI, and soil moisture anomalies from the 2001–2014 climatology averaged over June–September.</p>
Full article ">Figure 5
<p>The spatio–temporal variability of droughts detected by the NDVI-based vegetation condition index for the growing season in Ethiopia for the period 2001 to 2018.</p>
Full article ">Figure 6
<p>Spatial and temporal trends of seasonal precipitation, and NDVI in Ethiopia from 2001 to 2018. Positive slope values indicate an increasing monotonic trend, while negative slope values indicate a decreasing monotonic trend.</p>
Full article ">Figure 7
<p>Spatial and temporal trends of seasonal LST and soil moisture in Ethiopia from 2001 to 2018. Positive slope values indicate an increasing monotonic trend, while negative slope values indicate a decreasing monotonic trend.</p>
Full article ">Figure 8
<p>The monthly mean anomaly time series values of (<b>a</b>) NDVI and soil moisture, (<b>b</b>) precipitation and LST, and (<b>c</b>) NDLI and NDWI for 38E–39E, 9N–10N.</p>
Figure 9
<p>The heat map of Pearson correlation coefficients for NDVI, precipitation, LST, soil moisture, NDLI, NDWI, MEI, and DMI.</p>
16 pages, 6231 KiB  
Article
Remote Sensing of Explosives-Induced Stress in Plants: Hyperspectral Imaging Analysis for Remote Detection of Unexploded Threats
by Paul V. Manley, Vasit Sagan, Felix B. Fritschi and Joel G. Burken
Remote Sens. 2019, 11(15), 1827; https://doi.org/10.3390/rs11151827 - 5 Aug 2019
Cited by 15 | Viewed by 6551
Abstract
Explosives contaminate millions of hectares from various sources (partial detonations, improper storage, and release from production and transport) that can be life-threatening, e.g., landmines and unexploded ordnance. Exposure to and uptake of explosives can also negatively impact plant health, and these factors can be remotely sensed. Stress induction was remotely sensed via a whole-plant hyperspectral imaging system as two genotypes of Zea mays, a drought-susceptible hybrid and a drought-tolerant hybrid, and a forage Sorghum bicolor were grown in a greenhouse with one control group, one group maintained at 60% soil field capacity, and a third exposed to 250 mg kg−1 Royal Demolition Explosive (RDX). Green-Red Vegetation Index (GRVI), Photochemical Reflectance Index (PRI), Modified Red Edge Simple Ratio (MRESR), and Vogelmann Red Edge Index 1 (VREI1) were reduced due to the presence of explosives. Principal component analyses of reflectance indices separated plants exposed to RDX from control and drought plants. Reflectance of Z. mays hybrids was increased by RDX in green and red wavelengths, while reduced in near-infrared wavelengths. Drought Z. mays reflectance was lower in green, red, and NIR regions. S. bicolor grown with RDX reflected more in green, red, and NIR wavelengths. The spectra and their derivatives will be beneficial for developing explosive-specific indices to accurately identify plants in contaminated soil. This study is the first to demonstrate the potential to delineate subsurface explosives over large areas using remote sensing of vegetation with aerial-based hyperspectral systems. Full article
(This article belongs to the Section Environmental Remote Sensing)
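The principal component analyses of reflectance indices described above can be sketched with a small SVD-based helper (a minimal NumPy illustration of the technique, not the authors' exact processing chain; the feature columns named in the comment are assumptions):

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project samples onto the leading principal components.

    X: (n_samples, n_features) matrix of reflectance indices,
    e.g. one column each for GRVI, PRI, MRESR, and VREI1.
    Returns the component scores and the fraction of variance
    explained by each retained axis.
    """
    Xc = X - X.mean(axis=0)                       # center each index
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = s ** 2 / (s ** 2).sum()           # variance explained per axis
    return Xc @ Vt[:n_components].T, explained[:n_components]
```

Group separation, as in Figure 5 of the paper, is then judged from scatter plots of the first two score columns.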
Graphical abstract
Figure 1
<p>Schematic of the 3-m tall custom-built gantry system for sensor movement. The sensor is moved along a defined track by a lead screw. The halogen light source provides full spectrum artificial illumination for the object.</p>
Figure 2
<p>Image processing workflow. The Difference Vegetation Index was used to create a mask on radiometrically corrected images using a threshold value. Once masked, reflectance indices were calculated and averaged only on the parts of images containing vegetation. Reflectance index averages and reflectance spectra were then output for statistical analyses. Indices and spectra were averaged pixel-by-pixel for each image.</p>
Figure 3
<p>Comparison of reflectance indices for all three plant types and groups: (<b>a</b>) Green Ratio Vegetation Index (GRVI) average values. (<b>b</b>) Photochemical Reflectance Index (PRI) average values. n = 8 for all groups except <span class="html-italic">S. bicolor</span> control (n = 7) and AM <span class="html-italic">Z. mays</span> exposed to RDX (n = 7). Error bars represent standard deviations.</p>
Figure 4
<p>Comparison of reflectance indices for all three plant types and groups: (<b>a</b>) Average Anthocyanin Reflectance Index 2 values. (<b>b</b>) Modified Red Edge Simple Ratio (MRESR) average values. (<b>c</b>) Vogelmann Red Edge Index 1 (VREI1) average values. n = 8 for all groups except <span class="html-italic">S. bicolor</span> control (n = 7) and AM <span class="html-italic">Z. mays</span> exposed to RDX (n = 7). Error bars represent standard deviations.</p>
Figure 5
<p>Principal Component Analyses of (<b>a</b>) control, drought, and RDX groups; (<b>b</b>) control and drought groups; (<b>c</b>) control and RDX groups; and (<b>d</b>) drought and RDX groups. Each PCA includes all plant types. Ovals indicate separation between groups. Percentages define the variance explained by Axis 1. n = 8 for all groups except <span class="html-italic">S. bicolor</span> control (n = 7) and AM <span class="html-italic">Z. mays</span> exposed to RDX (n = 7), each of which had one mortality.</p>
Figure 6
<p>Average reflectance spectra of control, drought, and RDX groups for (<b>a</b>) drought-susceptible (AM) <span class="html-italic">Z</span>. <span class="html-italic">mays</span>, (<b>b</b>) drought-tolerant (AMX) <span class="html-italic">Z</span>. <span class="html-italic">mays</span>, and (<b>c</b>) <span class="html-italic">S</span>. <span class="html-italic">bicolor</span> (S). Line portions in red signify statistical difference from controls (α = 0.05). Shaded regions indicate statistical significance between drought and RDX groups (α = 0.05). n = 8 for all groups except <span class="html-italic">S. bicolor</span> control (n = 7) and AM <span class="html-italic">Z. mays</span> exposed to RDX (n = 7).</p>
Figure 7
<p>First derivatives of control, drought, and RDX drought-susceptible <span class="html-italic">Z</span>. <span class="html-italic">mays</span> (AM) groups calculated and averaged from individual plant spectra. n = 8 for all groups except <span class="html-italic">S. bicolor</span> control (n = 7) and AM <span class="html-italic">Z. mays</span> exposed to RDX (n = 7).</p>
13 pages, 3547 KiB  
Letter
Ocean Optical Profiling in South China Sea Using Airborne LiDAR
by Peng Chen and Delu Pan
Remote Sens. 2019, 11(15), 1826; https://doi.org/10.3390/rs11151826 - 5 Aug 2019
Cited by 36 | Viewed by 4739
Abstract
LiDAR is finding an increasing range of applications. However, to date there are no published studies using airborne LiDAR for ocean optical profiling in the South China Sea (SCS). Here, the applicability of airborne LiDAR for optical profiling in the SCS is presented. A total of four airborne LiDAR flight experiments were conducted in autumn 2017 and spring 2018 in the SCS. A hybrid retrieval method is presented, which incorporates a Klett method to obtain the LiDAR attenuation coefficient and a perturbation retrieval method for the volume scattering function at 180°. The correlation coefficient between the LiDAR-derived results and the traditional measurements was 0.7. The mean absolute relative error (MAE) and the normalized root mean square deviation (NRMSD) between the two are both between 10% and 12%. Subsequently, the vertical structure of the LiDAR-retrieved attenuation and backscattering along the airborne LiDAR flight tracks was mapped. In addition, ocean subsurface phytoplankton layers were detected at depths between 10 and 20 m along the flight track in Sanya Bay. Preliminary results demonstrated that our airborne LiDAR has an independent ability to survey and characterize ocean optical structure. Full article
(This article belongs to the Section Ocean Remote Sensing)
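A minimal sketch of the two validation metrics quoted above (MAE and NRMSD). The exact normalizations used by the authors are not given in the abstract, so common definitions are assumed here:

```python
import numpy as np

def mean_abs_rel_error(retrieved, reference):
    """Mean absolute relative error (one common definition):
    mean(|retrieved - reference| / |reference|)."""
    r = np.asarray(retrieved, dtype=float)
    m = np.asarray(reference, dtype=float)
    return float(np.mean(np.abs(r - m) / np.abs(m)))

def nrmsd(retrieved, reference):
    """Root-mean-square deviation normalized by the reference mean
    (other normalizations, e.g. by the data range, also exist)."""
    r = np.asarray(retrieved, dtype=float)
    m = np.asarray(reference, dtype=float)
    return float(np.sqrt(np.mean((r - m) ** 2)) / np.abs(m.mean()))
```

Applied to matched LiDAR-derived and in-situ profiles, values of 0.10–0.12 would correspond to the 10–12% agreement reported.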
Graphical abstract
Figure 1
<p>Diagram of the major optoelectronic components (<b>a</b>), and a picture of the LiDAR (<b>b</b>).</p>
Figure 2
<p>LiDAR flight experiments in the SCS. The colored lines are flight tracks of the airborne LiDAR taken on 23 September (<b>black</b>) and 30 September (<b>blue</b>), 2017, and on 11 March (<b>red</b>) and 12 March (<b>green</b>), 2018.</p>
Figure 3
<p>Flow chart showing the inversion process.</p>
Figure 4
<p>An example of processing the LiDAR data acquired on 30 September, 2017. (<b>a</b>) The profile from the raw LiDAR data, (<b>b</b>) the geometric range corrected LiDAR return in logarithmic form after background-noise subtraction, (<b>c</b>) the profile from the retrieved α, and (<b>d</b>) the profile from the retrieved β.</p>
Figure 5
<p>Comparison of airborne LiDAR-derived results with traditional measurements. (<b>a</b>,<b>c</b>) are the comparison results for S1; (<b>b</b>,<b>d</b>) are the comparison results for S2.</p>
Figure 6
<p>LiDAR-retrieved attenuation and backscattering vertical structure distribution. α ranges from purple (=0.0 m<sup>−1</sup>) to red (=0.6 m<sup>−1</sup>). β ranges from purple (=0.0001 (m.sr)<sup>−1</sup>) to red (=0.005 (m.sr)<sup>−1</sup>).</p>
Figure 7
<p>LiDAR-retrieved b<sub>bp</sub> vertical structure distribution in Sanya Bay water, ranging from purple (=0.0 m<sup>−1</sup>) to red (=0.007 m<sup>−1</sup>).</p>
22 pages, 7446 KiB  
Article
Chlorophyll Concentration Response to the Typhoon Wind-Pump Induced Upper Ocean Processes Considering Air–Sea Heat Exchange
by Yupeng Liu, Danling Tang and Morozov Evgeny
Remote Sens. 2019, 11(15), 1825; https://doi.org/10.3390/rs11151825 - 4 Aug 2019
Cited by 45 | Viewed by 6555
Abstract
The typhoon Wind-Pump-induced upwelling and cold eddies often promote significant growth of phytoplankton after a typhoon. However, the relative importance of eddy-pumping and wind-driven upwelling for the sea surface chlorophyll a concentration (Chl-a) during a typhoon has still not been clearly distinguished. In addition, the air–sea heat flux exchange is closely related to the upper ocean processes, but few studies have discussed its role in the sea surface Chl-a variations under typhoon conditions. Based on cruise data, remote sensing data, and model data, this paper analyzes the contribution of the vertical motion caused by eddy-pumping upwelling and Ekman pumping upwelling to the surface Chl-a, and quantitatively analyzes the influence of air–sea heat exchange on the surface Chl-a after the typhoon Linfa over the northeastern South China Sea (NSCS) in 2009. The results reveal the Wind-Pump impacts on upper ocean processes: (1) the euphotic layer-integrated Chl-a increased after the typhoon, and the increase in surface Chl-a was due not only to the uplift of deeper waters with high Chl-a but also to the growth of phytoplankton; (2) the net heat flux (air–sea heat exchange) played a major role in controlling the upper ocean physical processes by cooling the SST, and indirectly increased the surface Chl-a until two weeks after the typhoon; (3) the typhoon-induced cyclonic eddy, rather than the Ekman pumping and wind-stirring mixing, was the most important physical process in increasing the surface Chl-a after the typhoon; (4) the spatial shift between the surface Chl-a blooms and the typhoon-induced cyclonic eddy could be due to Ekman transport; (5) nutrient uplifting and adequate light were the two major biochemical factors supporting the growth of surface phytoplankton. Full article
(This article belongs to the Special Issue Tropical Cyclones Remote Sensing and Data Assimilation)
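The Ekman pumping velocity (EPV) referred to throughout the figures is conventionally estimated from the wind-stress curl; a sketch under standard assumptions (w_e = curl(τ)/(ρ_w·f); the density, grid spacing, and latitude values below are illustrative, not the paper's):

```python
import numpy as np

RHO_W = 1025.0          # seawater density, kg m^-3 (typical value)
OMEGA = 7.2921e-5       # Earth's rotation rate, s^-1

def ekman_pumping(tau_x, tau_y, dx, dy, lat_deg):
    """Ekman pumping velocity w_e = curl(tau) / (rho_w * f), in m/s.

    tau_x, tau_y : 2-D wind-stress components (N m^-2) on a regular grid
    dx, dy       : grid spacing in metres (east and north directions)
    lat_deg      : latitude in degrees, used for the Coriolis parameter f
    """
    f = 2.0 * OMEGA * np.sin(np.deg2rad(lat_deg))
    dtauy_dx = np.gradient(tau_y, dx, axis=1)   # d(tau_y)/dx
    dtaux_dy = np.gradient(tau_x, dy, axis=0)   # d(tau_x)/dy
    return (dtauy_dx - dtaux_dy) / (RHO_W * f)
```

Positive values indicate upwelling, which is why the EPV maps in Figures 4 and 5 track the typhoon path.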
Graphical abstract
Figure 1
<p>Map of the study area and typhoon path (the blue box represents the study area; the black line indicates the typhoon path; the green, yellow, and red dots represent the tropical depression (td), tropical storm (ts), and typhoon (tp), respectively; the pink points and pink numbers represent the station positions and station names).</p>
Figure 2
<p>Maps of the changes in Chl-a before and after the typhoon Linfa. (<b>a</b>) One week before the typhoon; (<b>b</b>) during the typhoon; (<b>c</b>) one week after the typhoon; (<b>d</b>) two weeks after the typhoon; (<b>e</b>) time series of the area-averaged surface Chl-a within the study area, shown as the red box.</p>
Figure 3
<p>Spatial distribution map of rainfall, sea surface temperature (SST), and sea level height anomaly (SLA) with sea surface geostrophic currents (geo-SSCs) before and after the typhoon (first column: Rainfall (mm); second column: Sea surface temperature (SST, °C); third column: SLA (cm) with geo-SSCs (m/s); (<b>a</b>–<b>d</b>) represent 1 week before, during, 1 week after, and 2 weeks after the typhoon Linfa; the blue box indicates the study area; the dotted line indicates the typhoon path).</p>
Figure 4
<p>Distribution map of the Ekman pumping velocities (EPV, m/s) and wind fields before and after the typhoon (<b>a</b>) one week before typhoon; (<b>b</b>) during typhoon; (<b>c</b>) one week after typhoon; (<b>d</b>) two weeks after typhoon; the blue box indicates the study area, the black dotted line indicates the typhoon path, and the arrows indicate the wind vectors).</p>
Figure 5
<p>Daily distribution maps of the Chl-a (mg/m<sup>3</sup>), EPV (m/s), and SLA with geo-SSCs before and after the typhoon ((<b>a</b>) 10 days before the typhoon; (<b>b</b>) during the typhoon; (<b>c</b>) just after the typhoon; (<b>d</b>) 3 days after the typhoon; (<b>e</b>) 2 weeks after the typhoon; the blue box indicates the study area, the black dotted line indicates the typhoon path, the arrows in the second column indicate the wind vectors, and the arrows in the third column indicate the geo-SSCs).</p>
Figure 6
<p>Hydrological profiles along 18° N from cruise data on 16–17 July, 2009 [(<b>a</b>) temperature profiles (°C); (<b>b</b>) salinity profiles (psu); (<b>c</b>) density profiles (kg/m<sup>3</sup>); the black dashed box represents the study region; the black line with dots represents the MLD; the white arrow shows the position of the maximum surface Chl-a], and graphs of the potential temperature, salinity, and potential density from cruise data [T, S, D represent the potential temperature, salinity, and potential density, respectively. (<b>d1</b>) Profiles during the typhoon (Station 6 at 18° N, 119° E); (<b>d2</b>) profiles after the typhoon (Station 20). (<b>e1</b>) Temperature and salinity and (<b>e2</b>) density profiles of 4 stations within the study area one day before (Station 9) and during (Stations 4, 6, and 7) the typhoon].</p>
Figure 7
<p>Time series of area-averaged values of each factor within the study region before and after the typhoon. (<b>a</b>) Chl-a (mg/m<sup>3</sup>); (<b>b</b>) SLA (m) and 0–75 m integrated water flows minus that of 2 June (m<sup>3</sup>); (<b>c</b>) EPV (m/s); (<b>d</b>) P<sub>w</sub> (N/m<sup>2</sup>); (<b>e</b>) SST (°C); (<b>f</b>) proportions and values of 0–75 m integrated water flows minus that of 2 June (m<sup>3</sup>; red and black lines represent the values of southwestward and northeastward water mass transport, respectively; the blue and red bars represent the proportions of southwestward and northeastward water mass transport, respectively; positive means inflow and divergence); the red dashed box represents the period of the typhoon passing the study region.</p>
Figure 8
<p>Horizontal time section of each parameter in the study area before and after the typhoon. (<b>a</b>) Chl-a (mg/m<sup>3</sup>); (<b>b</b>) SLA (m); (<b>c</b>) EPV (m/s); (<b>d</b>) P<sub>w</sub> (N/m<sup>2</sup>); (<b>e</b>) direction of the EMT (°); red dashed line represents the passage of the typhoon, the black dashed line box represents the position of the surface Chl-a blooms one week after the typhoon, the red box represents the position of the cyclonic eddy one week after the typhoon.</p>
Figure 9
<p>Study-area-averaged heat budget analysis of the changes in sea surface temperature (△SST) before and after the typhoon. (<b>a</b>) The weekly area-averaged △SST due to net heat flux (NHF) and the ocean processes. (<b>b</b>) The daily area-averaged △SST due to NHF and the ocean processes. (<b>c</b>) The proportion of the daily area-averaged heat flux differences (sensible heat flux (SHF), latent heat flux (LHF), longwave radiation flux (LWRF), and shortwave radiation flux (SWRF)) to the total daily area-averaged NHF. The red arrows represent the passing time of the typhoon Linfa. Positive values mean an increase of the SST and negative values mean a decrease of the SST.</p>
Figure 10
<p>Monthly average nitrate distribution profile of WOA13 in June (μmol/L, the pink line with dots represents the MLD 1–2 days after the typhoon, the red dotted line represents the 1 μmol/L of nitrate contour, the black line with dots represent the euphotic depth one week after the typhoon, the white arrow represents the center of the typhoon-induced cyclonic eddy).</p>
Figure 11
<p>Graphical abstract illustrating the Chl-a response to the typhoon "Wind-Pump" impacts on upper ocean conditions and air–sea exchange. The EMT represents the Ekman mass transport, the EPV represents the Ekman pumping velocity, the black dashed box represents the subsurface water from 0–100 m, the blue wavy line indicates the sea surface, and different colors represent different processes as shown in the legend in the northeast corner. The importance ranking of the roles in increasing the Chl-a is shown at the bottom of this figure within the yellow box.</p>
18 pages, 6761 KiB  
Article
Changes in Water Surface Area during 1989–2017 in the Huai River Basin using Landsat Data and Google Earth Engine
by Haoming Xia, Jinyu Zhao, Yaochen Qin, Jia Yang, Yaoping Cui, Hongquan Song, Liqun Ma, Ning Jin and Qingmin Meng
Remote Sens. 2019, 11(15), 1824; https://doi.org/10.3390/rs11151824 - 4 Aug 2019
Cited by 88 | Viewed by 9459
Abstract
The dynamics of surface water play a crucial role in the hydrological cycle and are sensitive to climate change and anthropogenic activities, especially in agricultural zones. As one of the most populous areas among China's river basins, the Huai River Basin has surface water that significantly impacts agricultural plants, ecological balance, and socioeconomic development. However, it is unclear how water areas responded to climate change and anthropogenic water exploitation in the past decades. To understand the changes in water surface areas in the Huai River Basin, this study used the 16,760 available scenes of Landsat TM, ETM+, and OLI images of this region from 1989 to 2017 and processed the data on the Google Earth Engine (GEE) platform. The vegetation index and water index were used to quantify the spatiotemporal variability of the surface water area changes over the years. The major results include: (1) The maximum area, the average area, and the seasonal variation of surface water in the Huai River Basin showed a downward trend over the past 29 years, while the year-long surface water areas showed a slight upward trend; (2) the surface water area was positively correlated with precipitation (p < 0.05), but was negatively correlated with temperature and evapotranspiration; (3) the changes in the total area of water bodies were mainly determined by the 216 larger water bodies (>10 km2). Understanding the variations in water body areas and the controlling factors could support the design and implementation of sustainable water management practices in agricultural, industrial, and domestic usages. Full article
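The frequency-based separation of year-long and seasonally changing water bodies described above can be sketched as follows. This is a minimal NumPy illustration: the 0.75/0.25 thresholds are assumed for the example and are not stated in the abstract:

```python
import numpy as np

def water_frequency(water_masks):
    """Fraction of good observations flagged as water per pixel.

    water_masks: (n_scenes, H, W) array with 1 = water, 0 = land,
    and np.nan for invalid observations (clouds, SLC-off gaps).
    """
    return np.nanmean(water_masks, axis=0)

def classify(frequency, year_long=0.75, seasonal=0.25):
    """Label pixels by water frequency. Thresholds are illustrative:
    >= 75% of good observations -> year-long water (2),
    25-75% -> seasonally changing water (1), otherwise non-water (0)."""
    out = np.zeros_like(frequency, dtype=np.uint8)
    out[frequency >= seasonal] = 1      # seasonal water
    out[frequency >= year_long] = 2     # year-long water
    return out
```

On GEE the same per-pixel logic would run over an `ee.ImageCollection` rather than an in-memory array, which is what makes the 16,760-scene analysis tractable.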
Figure 1
<p>(<b>a</b>) Geographical location of the Huai River Basin; (<b>b</b>) Digital elevation model (DEM); (<b>c</b>) Average precipitation from 1989 to 2017; (<b>d</b>) Average temperature from 1989 to 2017.</p>
Figure 2
<p>The number distribution of Landsat 5, 7, and 8 images in the Huai River Basin from 1989 to 2017: (<b>a</b>) The total number of Landsat observations; (<b>b</b>) the total number of high-quality Landsat images; (<b>c</b>) the total number of Landsat images in each path/row (tiles); (<b>d</b>) the total number of images from different Landsat sensors; (<b>e</b>) cumulative percentage of pixels with good observation counts of 0, 1, 2, 3, 4, [5,10), [10,20), [20,40), [40,80), and [80,160), respectively, during 1989–2017.</p>
Figure 3
<p>A flowchart of the overall route of open surface water mapping using Landsat 5, 7, and 8 images and Google Earth Engine (GEE).</p>
Figure 4
<p>Visually interpreted water and non-water pixels.</p>
Figure 5
<p>The water frequency distribution in the Huai River Basin: Water frequency distribution maps of 2017 (<b>a</b>) and 1989–2017 (<b>b</b>); the distributions of the number of water pixels at different frequency levels with a bin of 0.05 in 2017 (<b>c</b>) and 1984–2015 (<b>d</b>); the distribution of the number of water pixels at different frequency levels with a bin of 0.1 during 1989–2017 (<b>e</b>).</p>
Figure 6
<p>The water area distribution in the Huai River Basin from 1989 to 2017: (<b>a</b>) The maximum water body; (<b>b</b>) the average water body; (<b>c</b>) the year-long water body; (<b>d</b>) seasonally changing water body.</p>
Figure 7
<p>The inter-annual variations of water body numbers across the Huai River Basin from 1989 to 2017: The numbers of maximum water body (<b>a</b>) and year-long water body (<b>b</b>).</p>
Figure 8
<p>The number and area distributions of the maximum surface water body at different size levels during 1989–2017: (<b>a</b>) the number distribution of maximum surface water; (<b>b</b>) the area distribution of maximum surface water.</p>
Figure 9
<p>Changes in annual total precipitation, total evapotranspiration, and the average temperature in the Huai River Basin during 1987–2017.</p>
Figure 10
<p>The distribution and area distribution based on the water body range in 2001 and 2003: (<b>a</b>) The water body number distribution; (<b>b</b>) the water body size distribution.</p>
Figure 11
<p>A comparison between the water map generated in this study (MAX (<b>a2</b>–<b>e2</b>) denotes the maximum water body, and YEAR-LONG (<b>a4</b>–<b>e4</b>) denotes the year-long water body) and the JRC data (JRC-All represents the sum of the annual seasonal surface water and permanent surface water (<b>a1</b>–<b>e1</b>), and JRC-PW represents the permanent surface water of the JRC data (<b>a3</b>–<b>e3</b>)). Lakes (<b>a</b>), rivers (<b>b</b>), ponds (<b>c</b>), mountain waters (<b>d</b>), and urban water bodies (<b>e</b>).</p>
22 pages, 6795 KiB  
Article
Evaluating the Performance of Satellite-Derived Vegetation Indices for Estimating Gross Primary Productivity Using FLUXNET Observations across the Globe
by Xiaojuan Huang, Jingfeng Xiao and Mingguo Ma
Remote Sens. 2019, 11(15), 1823; https://doi.org/10.3390/rs11151823 - 4 Aug 2019
Cited by 76 | Viewed by 7579
Abstract
Satellite-derived vegetation indices (VIs) have been widely used to approximate or estimate gross primary productivity (GPP). However, it remains unclear how the VI-GPP relationship varies with indices, biomes, timescales, and the bidirectional reflectance distribution function (BRDF) effect. We examined the relationship between VIs and GPP for 121 FLUXNET sites across the globe and assessed how the VI-GPP relationship varied among a variety of biomes at both monthly and annual timescales. We used three widely-used VIs: normalized difference vegetation index (NDVI), enhanced vegetation index (EVI), and 2-band EVI (EVI2), as well as a new VI, NIRV, and used surface reflectance both with and without BRDF correction from the moderate resolution imaging spectroradiometer (MODIS) to calculate these indices. The resulting traditional (NDVI, EVI, EVI2, and NIRV) and BRDF-corrected (NDVIBRDF, EVIBRDF, EVI2BRDF, and NIRV, BRDF) VIs were used to examine the VI-GPP relationship. At the monthly scale, all VIs were moderate or strong predictors of GPP, and the BRDF correction improved their performance. EVI2BRDF and NIRV, BRDF had similar performance in capturing the variations in tower GPP as did the MODIS GPP product. The VIs explained lower variance in tower GPP at the annual scale than at the monthly scale. The BRDF correction of surface reflectance did not improve the VI-GPP relationship at the annual scale. The VIs had similar capability in capturing the interannual variability in tower GPP as MODIS GPP. VIs were influenced by temperature and water stresses and were more sensitive to temperature stress than to water stress. VIs in combination with environmental factors predicted GPP better than VIs alone. Our findings can help us better understand how the VI-GPP relationship varies among indices, biomes, and timescales and how the BRDF effect influences the VI-GPP relationship. Full article
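The four indices compared above have standard, widely published formulations in terms of surface reflectance bands; a brief NumPy sketch using those standard definitions (these are the conventional formulas, not code taken from this paper):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index."""
    return (nir - red) / (nir + red)

def evi(nir, red, blue):
    """Enhanced vegetation index with the standard MODIS coefficients
    (G = 2.5, C1 = 6, C2 = 7.5, L = 1)."""
    return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)

def evi2(nir, red):
    """Two-band EVI (no blue band required)."""
    return 2.5 * (nir - red) / (nir + 2.4 * red + 1.0)

def nirv(nir, red):
    """NIRv: NDVI multiplied by NIR reflectance."""
    return ndvi(nir, red) * nir
```

The BRDF-corrected variants in the paper use the same formulas, only computed from nadir BRDF-adjusted reflectance instead of raw surface reflectance.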
Graphical abstract
Figure 1
<p>Location and distribution of the 121 eddy covariance (EC) flux sites used in our study. Descriptions of these FLUXNET sites are provided in <a href="#app1-remotesensing-11-01823" class="html-app">Table S1 in the Supplementary Material</a>. The base map is the MODIS land cover map.</p>
Figure 2
<p>The boxplots of the R<sup>2</sup> values for the relationships between MODIS-derived VIs and tower GPP at the monthly scale across all sites within each biome. The letters on the x axis stand for: A - NDVI, B - EVI, C - EVI2, D - NDVI<sub>BRDF</sub>, E - EVI<sub>BRDF</sub>, F - EVI2<sub>BRDF</sub>, G - NIR<sub>V</sub>, H - NIR<sub>V, BRDF,</sub> and I - MODIS GPP. The biome types are as follows: evergreen needleleaf forest (ENF); evergreen broadleaf forest (EBF); deciduous needleleaf forest (DNF); mixed forest (MF); closed and open shrublands (COSH); woody savannas (WSA); savannas (SAV); grasslands (GRA); wetlands (WET); croplands (CRO).</p>
Figure 3
<p>The bar graphs of the R<sup>2</sup> values for the relationships between the annual averaged VIs and tower GPP for all the sites with at least six years of flux data for each biome. The x axis shows the R<sup>2</sup> values; the y axis lists the sites. The asterisks * and ** stand for significance levels of p &lt; 0.1 and p &lt; 0.05, respectively. The biome types are as follows: evergreen needleleaf forest (ENF); evergreen broadleaf forest (EBF); deciduous needleleaf forest (DNF); mixed forest (MF); closed and open shrublands (COSH); woody savannas (WSA); savannas (SAV); grasslands (GRA); wetlands (WET); croplands (CRO).</p>
Figure 4
<p>The relationships between VIs and tower GPP across all sites for each biome. The R<sup>2</sup> values for the relationships between VIs and tower GPP for each biome are provided in <a href="#remotesensing-11-01823-t001" class="html-table">Table 1</a>. The biome types are as follows: evergreen needleleaf forest (ENF); evergreen broadleaf forest (EBF); deciduous needleleaf forest (DNF); mixed forest (MF); closed and open shrublands (COSH); woody savannas (WSA); savannas (SAV); grasslands (GRA); wetlands (WET); croplands (CRO).</p>
Figure 5
<p>The relationships of VIs and MODIS GPP with tower GPP across ten biomes. The error bars are standard deviations across all sites within each biome. The biome types are as follows: evergreen needleleaf forest (ENF); evergreen broadleaf forest (EBF); deciduous needleleaf forest (DNF); mixed forest (MF); closed and open shrublands (COSH); woody savannas (WSA); savannas (SAV); grasslands (GRA); wetlands (WET); croplands (CRO).</p>
Figure 6
<p>The boxplots of the R<sup>2</sup> values for the relationships of monthly averaged VIs and MODIS GPP with temperature stress (f<sub>Topt</sub>) for each biome with the total of 121 sites. The letters on the x axis stand for: A - NDVI, B - EVI, C - EVI2, D - NDVI<sub>BRDF</sub>, E - EVI<sub>BRDF</sub>, F - EVI2<sub>BRDF</sub>, G - NIR<sub>V</sub>, H - NIR<sub>V, BRDF</sub>, and I - MODIS GPP. The biome types are as follows: evergreen needleleaf forest (ENF); evergreen broadleaf forest (EBF); deciduous needleleaf forest (DNF); mixed forest (MF); closed and open shrublands (COSH); woody savannas (WSA); savannas (SAV); grasslands (GRA); wetlands (WET); croplands (CRO).</p>
Figure 7
<p>The boxplots of the R<sup>2</sup> for the relationships of monthly averaged VIs and MODIS GPP with water stress (f<sub>VPD</sub>) for each biome with the total of 121 sites. The letters on the x axis stand for: A - NDVI, B - EVI, C - EVI2, D - NDVI<sub>BRDF</sub>, E - EVI<sub>BRDF</sub>, F - EVI2<sub>BRDF</sub>, G - NIR<sub>V</sub>, H - NIR<sub>V, BRDF</sub>, and I - MODIS GPP. The biome types are as follows: evergreen needleleaf forest (ENF); evergreen broadleaf forest (EBF); deciduous needleleaf forest (DNF); mixed forest (MF); closed and open shrublands (COSH); woody savannas (WSA); savannas (SAV); grasslands (GRA); wetlands (WET); croplands (CRO).</p>
Figure 8
<p>Daily VIs, temperature, precipitation, temperature stress (f<sub>Topt</sub>) and water stress (f<sub>VPD</sub>) at a grassland site (CH-Oe1) in a normal year (2002) and a drought year (2003).</p>
Figure 9
<p>Daily VIs, temperature, precipitation, temperature stress (f<sub>Topt</sub>) and water stress (f<sub>VPD</sub>) at a deciduous broadleaf forest site (DK-Sor) in a normal year (2014) and a low temperature year (2007).</p>
Figure 10
<p>The plot of the averaged R<sup>2</sup> values for the relationships of the GPP with VIs combined with environmental factors (f<sub>Topt</sub> and f<sub>VPD</sub> and PAR) for each biome. The letters on the x axis stand for: A - NDVI, B - EVI, C - EVI2, D - NDVI<sub>BRDF</sub>, E - EVI<sub>BRDF</sub>, F - EVI2<sub>BRDF</sub>, G - NIR<sub>V</sub>, and H - NIR<sub>V, BRDF</sub>. The biome types are as follows: evergreen needleleaf forest (ENF); evergreen broadleaf forest (EBF); deciduous needleleaf forest (DNF); mixed forest (MF); closed and open shrublands (COSH); woody savannas (WSA); savannas (SAV); grasslands (GRA); wetlands (WET); croplands (CRO).</p>
Figure 11
<p>The seasonal cycles and relationship of tower GPP, two environmental scalars (f<sub>Topt</sub> and f<sub>VPD</sub>), and PAR at the Loobos forest site (NL-Loo, the Netherlands) from 2005 to 2008: (<b>a</b>) environmental scalars (f<sub>Topt</sub> and f<sub>VPD</sub>) and PAR; (<b>b</b>) NIR<sub>V, BRDF</sub> × f<sub>Topt</sub> × f<sub>VPD</sub> × PAR and tower GPP; (<b>c</b>) relationships of tower GPP with NIR<sub>V, BRDF</sub>, f<sub>Topt</sub>, f<sub>VPD</sub>, and PAR as well as NIR<sub>V, BRDF</sub> combined with environmental factors.</p>
Figure 12
<p>The seasonal cycles and relationship of tower GPP, two environmental scalars (f<sub>Topt</sub> and f<sub>VPD</sub>), and PAR at the Daly River Savanna site (AU-Das, Australia) from 2010 to 2014: (<b>a</b>) environmental scalars (f<sub>Topt</sub> and f<sub>VPD</sub>) and PAR; (<b>b</b>) NIR<sub>V, BRDF</sub> × f<sub>Topt</sub> × f<sub>VPD</sub> × PAR and tower GPP; (<b>c</b>) relationships of tower GPP with NIR<sub>V, BRDF</sub>, f<sub>VPD</sub>, and PAR as well as NIR<sub>V, BRDF</sub> combined with environmental factors.</p>
Full article ">
19 pages, 1996 KiB  
Article
Tensor Discriminant Analysis via Compact Feature Representation for Hyperspectral Images Dimensionality Reduction
by Jinliang An, Yuzhen Song, Yuwei Guo, Xiaoxiao Ma and Xiangrong Zhang
Remote Sens. 2019, 11(15), 1822; https://doi.org/10.3390/rs11151822 - 4 Aug 2019
Cited by 6 | Viewed by 3793
Abstract
Dimensionality reduction aims to reduce the spectral dimensionality of hyperspectral images while preserving their desirable intrinsic structure. Tensor analysis, which retains both the spatial and the spectral information of hyperspectral images, has attracted growing attention in hyperspectral image processing. In general, a desirable low-dimensional feature representation should be discriminative and compact. To achieve this, a tensor discriminant analysis model via compact feature representation (TDA-CFR) was proposed in this paper. In TDA-CFR, traditional linear discriminant analysis was extended to tensor space to make the resulting feature representation more informative and discriminative. Furthermore, TDA-CFR redefines the feature representation of each spectral band by employing a tensor low-rank decomposition framework, which leads to a more compact representation. Full article
(This article belongs to the Special Issue Advanced Techniques for Spaceborne Hyperspectral Remote Sensing)
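TDA-CFR extends classical linear discriminant analysis (LDA) to tensor space. As background, here is a minimal matrix-space LDA sketch on toy data; it illustrates the scatter-matrix criterion being generalized, not the paper's tensor formulation:

```python
import numpy as np

def lda_projection(X, y, n_components=1):
    """Classic Fisher LDA: find directions maximizing between-class over
    within-class scatter. X: (n_samples, n_features); y: integer labels."""
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean_all).reshape(-1, 1)
        Sb += len(Xc) * diff @ diff.T
    # Solve the generalized eigenproblem Sb w = lambda Sw w
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(eigvals.real)[::-1]
    return eigvecs[:, order[:n_components]].real

# Two well-separated 2-D classes: the leading direction should separate them
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.1, (50, 2)), rng.normal(3, 0.1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
W = lda_projection(X, y)
proj = X @ W  # projected class means end up far apart
```

TDA-CFR replaces the vector samples here with tensor patches so that spatial neighborhoods are preserved during the projection.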
Show Figures

Figure 1
<p>The flowchart of the proposed TDA-CFR.</p>
Full article ">Figure 2
<p>The compact representation of hyperspectral images.</p>
Full article ">Figure 3
<p>The classification maps on Indian Pines.</p>
Full article ">Figure 4
<p>The classification maps on Pavia University.</p>
Full article ">Figure 5
<p>The classification maps on Salinas.</p>
Full article ">Figure 6
<p>OA with SVM classifier versus the variation of reduced dimensionality.</p>
Full article ">Figure 7
<p>OA with SVM classifier versus the variation of the number of training samples for each class.</p>
Full article ">Figure 8
<p>OA with SVM classifier versus the variation of window size.</p>
Full article ">Figure 9
<p>The effect of parameter <math display="inline"><semantics> <mi>ξ</mi> </semantics></math>.</p>
Full article ">
20 pages, 14351 KiB  
Article
Using Nighttime Light Data and POI Big Data to Detect the Urban Centers of Hangzhou
by Ge Lou, Qiuxiao Chen, Kang He, Yue Zhou and Zhou Shi
Remote Sens. 2019, 11(15), 1821; https://doi.org/10.3390/rs11151821 - 4 Aug 2019
Cited by 53 | Viewed by 7473
Abstract
The worldwide development of multi-center structures in large cities is a prevailing trend. In recent years, China’s large cities have developed from predominantly mono-centric to multi-center urban spatial structures. However, the definition and identification of city centers are complex. Both nighttime light data and point of interest (POI) data are important data sources for urban spatial structure research, but integrated applications of the two are still rare. In this study, visible infrared imaging radiometer suite (NPP-VIIRS) nighttime imagery and POI data were combined to identify the city centers of Hangzhou, China. First, the optimal parameters of multi-resolution segmentation were determined by experiments. The POI density was then calculated with the segmentation results as the statistical unit. High–high clustering units were defined as the main centers by calculating the Anselin Local Moran’s I, and a geographically weighted regression model was used to identify the subcenters according to the square root of the POI density and the distances between the units and the city center. Finally, a comparison experiment was conducted between the proposed method and the relative cut-off_threshold method, and the experimental results were compared with the evaluation report of the master plan. The results showed that the optimal segmentation parameter combination was a shape factor of 0.1 and a compactness factor of 0.5. Two main city centers and ten subcenters were detected. Comparison with the evaluation report of the master plan indicated that the combination of nighttime light data and POI data could identify the urban centers accurately. Combined with the characteristics of the two kinds of data, the spatial structure of the city could be characterized properly. This study provides a new perspective for the study of the spatial structure of polycentric cities. Full article
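The high–high clustering step relies on the Anselin Local Moran's I. A minimal sketch with a toy one-dimensional chain of units follows; the weights matrix and "POI densities" here are illustrative, not the paper's segmentation units:

```python
import numpy as np

def local_morans_i(x, W):
    """Anselin Local Moran's I for each unit.
    x: (n,) attribute values (e.g. POI density per segment);
    W: (n, n) row-standardized spatial weights matrix."""
    z = x - x.mean()
    m2 = (z ** 2).sum() / len(x)
    lag = W @ z  # spatially lagged deviations
    return z * lag / m2

def high_high(x, W, I):
    """Units above the mean whose neighbours are also above the mean
    (candidate 'main center' units in the paper's terminology)."""
    z = x - x.mean()
    return (z > 0) & (W @ z > 0) & (I > 0)

# Toy 1-D chain of 5 units; units 3-4 form a high-value cluster
x = np.array([1.0, 1.0, 1.0, 10.0, 10.0])
n = len(x)
W = np.zeros((n, n))
for i in range(n):
    for j in (i - 1, i + 1):
        if 0 <= j < n:
            W[i, j] = 1.0
W = W / W.sum(axis=1, keepdims=True)  # row-standardize
I = local_morans_i(x, W)
print(high_high(x, W, I))  # only the last two units are high-high
```

In practice a permutation test is used to keep only statistically significant high–high units; that step is omitted here.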
Show Figures

Graphical abstract
Full article ">Figure 1
<p>Location of the study area. The inset map shows the location of Hangzhou in Zhejiang province.</p>
Full article ">Figure 2
<p>Point of interest (POI) data of Hangzhou in May 2018: (<b>a</b>) Spatial distribution of POI, each point stands for one point of interest. (<b>b</b>) Gridded map of POI number with 500 m resolution, representing the number of POI in a square of 25 ha.</p>
Full article ">Figure 3
<p>Workflow of the proposed method. NTL: nighttime light; NPP-VIIRS: visible infrared imaging radiometer suite.</p>
Full article ">Figure 4
<p>NPP-VIIRS nighttime light intensity map of Hangzhou in May 2018 (NPP-VIIRS: Visible Infrared Imaging Radiometer Suite aboard the Suomi-NPP satellite).</p>
Full article ">Figure 5
<p>Number of segments as a function of the segmentation scale factor, using multi-resolution segmentation in eCognition.</p>
Full article ">Figure 6
<p>Segmentation results of nine groups of shape factor and compactness factor combinations (here shows a small region). The yellow lines show the boundaries of segments. The image with red frame means the factor combination we chose to use (shape factor is 0.1 and compactness factor is 0.5).</p>
Full article ">Figure 7
<p>Weighted mean variances of segments for different scale factors (3,4,5,6,7,8).</p>
Full article ">Figure 8
<p>Density of POI counted in segmentation units. The small images on the right show detailed views of the areas of small units and big units respectively.</p>
Full article ">Figure 9
<p>Four cluster types of the results of local spatial autocorrelation analysis for the POI densities in units.</p>
Full article ">Figure 10
<p>Main centers and subcenters detected by the proposed method. The main centers are shown in <b>red,</b> and the subcenters are shown in <b>yellow</b>. The river and lake layer contains West Lake and the Qiantang River.</p>
Full article ">Figure 11
<p>Centers detected by different methods and datasets. All centers detected by threshold are shown in orange. In the results from the Local Moran’s I (LMI) and geographically weighted regression (GWR) methods, the main centers are shown in <b>red</b>, and the subcenters are shown in <b>yellow</b>.</p>
Full article ">Figure 12
<p>Hot spot analysis result of population data.</p>
Full article ">Figure 13
<p>Intersection results of the center areas. The overlapping center areas are shown in <b>green</b> and the differing center areas in <b>gray</b>.</p>
Full article ">Figure 14
<p>Centers proposed in the master plan and detected by our experiment.</p>
Full article ">
21 pages, 7083 KiB  
Article
Impacts of Large-Scale Open-Pit Coal Base on the Landscape Ecological Health of Semi-Arid Grasslands
by Zhenhua Wu, Shaogang Lei, Qingqing Lu and Zhengfu Bian
Remote Sens. 2019, 11(15), 1820; https://doi.org/10.3390/rs11151820 - 4 Aug 2019
Cited by 54 | Viewed by 5863
Abstract
Coal is an important energy resource worldwide, especially in China. Extensive coal exploitation has seriously damaged grasslands and their fragile ecosystems. However, the temporal and spatial patterns of how open-pit coal exploitation affects the Landscape Ecological Health (LEH) of semi-arid grasslands are still unclear. Therefore, the main objective of this paper is to study the impact of a Large-scale Open-pit Coal Base (LOCB) on the LEH of semi-arid grasslands from temporal and spatial perspectives. Taking the Shengli LOCB of the Xilinguole grassland in Inner Mongolia as an example, we demonstrate a conceptual model of LOCB impact on the LEH of semi-arid grasslands and establish a research system called landscape Index-pattern Evolution-Driving force-Spatial statistics (IEDS). A complete process was developed, integrating investigation, monitoring, and evaluation with the analysis of impact patterns. Results indicated that coal mining causes a gradual increase in landscape patches, landscape fragmentation, a gradual decline in landscape connectivity, increasing complexity and irregularity of landscape shape, enhanced landscape heterogeneity and complexity, a gradual decline in landscape stability, a gradual decrease of grassland landscape, and an annual increase of unhealthy grassland landscape. The LEH of the grassland is basically in a state of slight deterioration. Over the past 15 years, the spatial and temporal distribution characteristics of LEH in the study area have been similar. This study provides a scientific reference for ecological disturbance research, environmental protection, landscape planning, and the restoration and renovation of the ecological environment in mining areas. Future research should integrate geological, hydrological, soil, vegetation, microbial, faunal, climatic, and other perspectives to study the impact of mining on landscape ecology in depth. Full article
(This article belongs to the Section Environmental Remote Sensing)
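One of the fragmentation effects reported above, the gradual increase of landscape patches, can be illustrated with a simple patch-counting indicator on a categorical raster. This is a toy sketch, not the landscape-index software used in the paper:

```python
import numpy as np

def count_patches(grid, cls):
    """Number of 4-connected patches of class `cls` in a categorical raster:
    a basic landscape-fragmentation indicator (more patches of the same
    total area means a more fragmented landscape)."""
    visited = np.zeros_like(grid, dtype=bool)
    rows, cols = grid.shape
    n = 0
    for i in range(rows):
        for j in range(cols):
            if grid[i, j] == cls and not visited[i, j]:
                n += 1
                stack = [(i, j)]
                while stack:  # flood-fill one patch
                    r, c = stack.pop()
                    if 0 <= r < rows and 0 <= c < cols \
                            and grid[r, c] == cls and not visited[r, c]:
                        visited[r, c] = True
                        stack += [(r + 1, c), (r - 1, c),
                                  (r, c + 1), (r, c - 1)]
    return n

# Grassland (1) split into two patches by a mining corridor (0)
grid = np.array([[1, 1, 0, 1],
                 [1, 1, 0, 1],
                 [0, 0, 0, 1]])
print(count_patches(grid, 1))  # → 2
```

Real landscape metrics (e.g. in FRAGSTATS-style analyses) combine such patch counts with area, edge, and shape statistics.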
Show Figures

Graphical abstract
Full article ">Figure 1
<p>Location of the Research Area. I: Open-pit Germanium Mine; II: West No. 2 Open-pit Mine; III: West No. 3 Open-pit Mine; IV: No. 1 Open-pit Mine; V: East No. 2 Open-pit Mine.</p>
Full article ">Figure 2
<p>Research system (landscape Index-pattern Evolution-Driving force-Spatial statistics; IEDS) of the impact of Large-scale Open-pit Coal Base (LOCB) on Landscape Ecological Health (LEH) of semi-arid grasslands.</p>
Full article ">Figure 3
<p>Landscape pattern classification maps of the study area in (<b>a</b>) 2002, (<b>b</b>) 2005, (<b>c</b>) 2008, (<b>d</b>) 2011, (<b>e</b>) 2014 and (<b>f</b>) 2017.</p>
Full article ">Figure 4
<p>The area ratio of (<b>a</b>) Open-pit Landscape, (<b>b</b>) Dump Landscape, (<b>c</b>) Mining Construction Land Landscape, (<b>d</b>) Industrial and Storage Land Landscape, (<b>e</b>) Town Construction Land Landscape, (<b>f</b>) Road Network Landscape, (<b>g</b>) Grassland Landscape, (<b>h</b>) Unhealthy Grassland Landscape.</p>
Full article ">Figure 5
<p>The area of grassland occupied by (<b>a</b>) Open-pit Landscape, (<b>b</b>) Dump Landscape, (<b>c</b>) Mining Construction Land Landscape, (<b>d</b>) Town Construction Land Landscape, (<b>e</b>) Industrial and Storage Land Landscape, (<b>f</b>) Road Network Landscape (Note: all values in this figure represent per-period changes rather than cumulative values).</p>
Full article ">Figure 6
<p>Landscape ecological health changes in the study area from 2002 to 2017.</p>
Full article ">Figure 7
<p>Spatial Distribution and Time Variation of empirical orthogonal function 1 (EOF1).</p>
Full article ">Figure 8
<p>The evolution types of grassland landscape ecological health under the disturbance of coal exploitation.</p>
Full article ">Figure 9
<p>Correlation analysis of (<b>a</b>) open-pit mining landscape area and (<b>b</b>) mining landscape area with a total production of raw coal. Correlation analysis of (<b>c</b>) open-pit mining landscape area and (<b>d</b>) mining landscape area with total coal consumption in China over the Years.</p>
Full article ">Figure 10
<p>(<b>a</b>) Conceptual model diagram of Chinese pastoralists managing family pastures under the “Household Contract Responsibility System”. (<b>b</b>) Photographs of fenced (top) and grazing (bottom) regions in the study area.</p>
Full article ">
29 pages, 12156 KiB  
Article
BDS-3 Time Group Delay and Its Effect on Standard Point Positioning
by Peipei Dai, Yulong Ge, Weijin Qin and Xuhai Yang
Remote Sens. 2019, 11(15), 1819; https://doi.org/10.3390/rs11151819 - 3 Aug 2019
Cited by 17 | Viewed by 3685
Abstract
The development of the BeiDou navigation system (BDS) is divided into three phases: the demonstration system (BDS-1), the regional system (BDS-2), and the global BeiDou navigation system (BDS-3). At present, the construction of the BDS-3 constellation network is progressing smoothly. The signal design and functionality of BDS-3 differ from those of BDS-1 and BDS-2. BDS-3 satellites broadcast not only the B1I (1561.098 MHz) and B3I (1268.52 MHz) signals but also the new signals B1C (1575.42 MHz) and B2a (1176.45 MHz). In this work, six tracking stations of the international GNSS monitoring and assessment system (iGMAS) were selected, and 41 consecutive days of observation data were collected. To fully exploit the code observations of BDS-2 and BDS-3, the time group delay (TGD) correction models of BDS-2 and BDS-3 are described in detail. To further verify the efficacy of the broadcast TGD parameters in the broadcast ephemeris, standard point positioning (SPP) with all the signals from BDS-2 and BDS-3, with and without TGD correction, was studied. The experiments showed that, with TGD correction, the B1I SPP accuracy of BDS-2 increased by approximately 50% in both the horizontal and vertical components, and B1I/B3I improved by approximately 70% in the horizontal component and 47.4% in the vertical component. The root mean square (RMS) values of B1I and B1C from BDS-3 with TGD correction improved by approximately 60%–70% in the horizontal component and by approximately 50% in the vertical component. The B2a-based SPP improved by 60.2% and 64.4% in the east and north components, respectively, and by approximately 19.8% in the up component. For the B1I/B3I and B1C/B2a dual-frequency positioning accuracy with TGD correction, the improvement in the horizontal component ranged from 62.1% to 75.0%, and the vertical component improved by approximately 45%. Furthermore, the positioning accuracy of the combined BDS-2 + BDS-3 constellation was markedly higher than that of BDS-2 or BDS-3 alone. Full article
(This article belongs to the Special Issue Global Navigation Satellite Systems for Earth Observing System)
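As background to the TGD correction evaluated above: in the BDS convention the broadcast satellite clock is referenced to the B3I signal, so a single-frequency B1I user corrects the satellite clock with the broadcast TGD1, while dual-frequency users can instead form the ionosphere-free combination. A minimal sketch (the numeric values are illustrative):

```python
C = 299_792_458.0  # speed of light (m/s)

def satellite_clock_b1i(dt_sv_s, tgd1_s):
    """Satellite clock offset for a single-frequency B1I user.
    The BDS broadcast clock is referenced to B3I, so the B1I-vs-B3I
    hardware delay TGD1 (seconds) is removed:
        dt_sv(B1I) = dt_sv - TGD1
    (analogous to the GPS L1 TGD convention)."""
    return dt_sv_s - tgd1_s

def iono_free_b1i_b3i(p_b1i, p_b3i, f1=1561.098e6, f3=1268.52e6):
    """Dual-frequency ionosphere-free combination of B1I/B3I pseudoranges."""
    g = (f1 / f3) ** 2
    return (g * p_b1i - p_b3i) / (g - 1.0)

# A 10 ns TGD corresponds to roughly 3 m of range error if uncorrected
print(C * 10e-9)  # ≈ 3.0 m
```

This order of magnitude (metres) is consistent with the decimetre-to-metre-level SPP improvements reported in the abstract.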
Show Figures

Graphical abstract
Full article ">Figure 1
<p>The distribution of the six selected stations from the international GNSS continuous monitoring and assessment system (iGMAS).</p>
Full article ">Figure 2
<p>Horizontal positioning error scatter plots of B1I SPP with and without time group delay (TGD) correction at the selected stations. In each plot, the horizontal and vertical axes indicate the East (E) and North (N) components errors, respectively (unit: m).</p>
Full article ">Figure 3
<p>Vertical positioning error scatter plots of B1I SPP with and without TGD correction on the selected stations. In each plot, the horizontal and vertical axes indicate the universal time (h) and the Up (U) component error, respectively (unit: m).</p>
Full article ">Figure 4
<p>Positioning error scatter plots of B3I SPP without TGD correction for the BRCH station. In each plot, the horizontal axis indicates the universal time (h), and the vertical axes indicates the E, N and U component errors, respectively (unit: m).</p>
Full article ">Figure 5
<p>Horizontal positioning error scatter plots of B1I/B3I SPP with and without TGD correction at the selected stations. In each plot, the horizontal and vertical axes indicate the E and N component errors, respectively (unit: m).</p>
Full article ">Figure 6
<p>Vertical positioning error scatter plots of B1I/B3I SPP with and without TGD correction at the selected stations. In each plot, the horizontal and vertical axes indicate the universal time (h) and the U component errors, respectively (unit: m).</p>
Full article ">Figure 7
<p>Box-whisker diagrams of the distributions of the three dimensional (3D) positioning errors in the tgd-corr (red) and non-corr (blue) schemes at the BRCH, GUA1, LHA1, WUH1, XIA1 and XIA5 stations for the 41-day period (tgd: tgd-corr, non: non-corr) using the regional BeiDou system (BDS-2) satellites. The box heights and the bars inside the boxes denote the inter-quartile ranges (IQRs) and the medians of the distributions, respectively. The whiskers’ lengths represent the maximum and minimum values of distributions (unit: m). Outliers are identified with plus signs (see the text).</p>
Full article ">Figure 8
<p>Daily RMS values of BDS-2 with different schemes for the BRCH station.</p>
Full article ">Figure 9
<p>Improvements in the E, N, and U components of the “tgd-corr” scheme compared to the “non-corr” scheme at different stations. Note that the mean RMS of 41 days at each station with and without TGD correction is first obtained. The improvement from the “tgd-corr” scheme is then calculated and compared to that from the “non-corr” scheme.</p>
Full article ">Figure 10
<p>Average global PDOP of BDS-2 on day of the year (DOY) 10 2019 with an elevation cut-off angle of 5°.</p>
Full article ">Figure 11
<p>Horizontal positioning error scatter plots of B1I SPP with and without TGD correction for the six selected stations. In each plot, the horizontal and vertical axes indicate the E component error and the N component error, respectively (unit: m).</p>
Full article ">Figure 12
<p>Vertical positioning error scatter plots of B1I SPP with and without TGD correction for the six selected stations. In each plot, the horizontal and vertical axes indicate the universal time (h) and the U component error, respectively (unit: m).</p>
Full article ">Figure 13
<p>Horizontal positioning error scatter plots of B1C SPP with and without TGD correction for the six selected stations. In each plot, the horizontal and vertical axes indicate the E component error and the N component error, respectively (unit: m).</p>
Full article ">Figure 14
<p>Vertical positioning error scatter plots of B1C SPP with and without TGD correction for the six selected stations. In each plot, the horizontal and vertical axes indicate the universal time (h) and the U component error, respectively (unit: m).</p>
Full article ">Figure 15
<p>Horizontal positioning error scatter plots of B2a SPP with and without TGD correction for the six selected stations. In each plot, the horizontal and vertical axes indicate the E component error and the N component error, respectively (unit: m).</p>
Full article ">Figure 16
<p>Vertical positioning error scatter plots of B2a SPP with and without TGD correction for the six selected stations. In each plot, the horizontal and vertical axes indicate the universal time (h) and the U component error, respectively (unit: m).</p>
Full article ">Figure 17
<p>Vertical positioning error scatter plots of B3I SPP without TGD correction for the BRCH station. In each plot, the horizontal axis is the universal time (h) and the vertical axes indicate the errors of the E, N and U components (unit: m).</p>
Full article ">Figure 18
<p>Horizontal positioning error scatter plots of B1I/B3I SPP with and without TGD correction for the six selected stations. In each plot, the horizontal and vertical axes indicate the E component error and the N component error, respectively (unit: m).</p>
Full article ">Figure 19
<p>Vertical positioning error scatter plots of B1I/B3I SPP with and without TGD correction for the six selected stations. In each plot, the horizontal and vertical axes indicate the universal time (h) and the U component error, respectively (unit: m).</p>
Full article ">Figure 20
<p>Horizontal positioning error scatter plots of B1C/B2a SPP with and without TGD correction for the six selected stations. In each plot, the horizontal and vertical axes indicate the E component error and N component error, respectively (unit: m).</p>
Full article ">Figure 21
<p>Vertical positioning error scatter plots of B1C/B2a SPP with and without TGD correction for the six selected stations. In each plot, the horizontal and vertical axes indicate the universal time (h) and the U component error, respectively (unit: m).</p>
Full article ">Figure 22
<p>Box-whisker diagrams of the distributions of the 3D positioning errors in the tgd-corr (red) and non-corr (blue) schemes at the BRCH, GUA1, LHA1, WUH1, XIA1 and XIA5 stations for the 41-day period (tgd: tgd-corr, non: non-corr) using the BDS-3 satellites. The box heights and the bars inside the boxes denote the IQRs and medians of the distributions, respectively. The whiskers’ lengths represent the maximum and minimum values of the distributions (unit: m). Outliers are identified with plus signs (see text).</p>
Full article ">Figure 23
<p>Daily RMSs of old signals from BDS-3 with and without TGD correction for the BRCH station.</p>
Full article ">Figure 24
<p>Daily RMSs of the new signals from BDS-3 with and without TGD correction for the BRCH station.</p>
Full article ">Figure 25
<p>Improvements in the E, N and U components from the “tgd-corr” scheme compared to those from the “non-corr” scheme for different stations. Note that the 41-day mean RMS at each station with and without TGD correction is obtained first. The improvements from the “tgd-corr” scheme is then calculated and compared to those from the “non-corr” scheme.</p>
Full article ">Figure 26
<p>Code residual scatter plots for the BRCH station with/without TGD correction. In each plot, the horizontal and vertical axes indicate the universal time (h) and the code residual, respectively (unit: m).</p>
Full article ">Figure 27
<p>Average global PDOP of BDS-3 on DOY 10 2019, with an elevation cut-off angle of 5°.</p>
Full article ">Figure 28
<p>Horizontal positioning error scatter plots of B1I SPP with TGD correction for the six selected stations. In each plot, the horizontal and vertical axes indicate the E component error and the N component error, respectively (unit: m).</p>
Full article ">Figure 29
<p>Vertical positioning error scatter plots of B1I SPP with TGD correction for the six selected stations. In each plot, the horizontal and vertical axes indicate the universal time (h) and the U component error, respectively (unit: m).</p>
Full article ">Figure 30
<p>Horizontal positioning error scatter plots of B3I SPP with TGD correction for the six selected stations. In each plot, the horizontal and vertical axes indicate the E component error and the N component error, respectively (unit: m).</p>
Full article ">Figure 31
<p>Vertical positioning error scatter plots of B3I SPP with TGD correction for the six selected stations. In each plot, the horizontal and vertical axes indicate the universal time (h) and the U component error, respectively (unit: m).</p>
Full article ">Figure 32
<p>Horizontal positioning error scatter plots of B1I/B3I SPP with TGD correction for the six selected stations. In each plot, the horizontal and vertical axes indicate the E component error and the N component error, respectively (unit: m).</p>
Full article ">Figure 33
<p>Vertical positioning error scatter plots of B1I/B3I SPP with TGD correction for the six selected stations. In each plot, the horizontal and vertical axes indicate the universal time (h) and the U component error, respectively (unit: m).</p>
Full article ">Figure 34
<p>Daily RMSs of BDS-2 + BDS-3 at the BRCH station with TGD correction.</p>
Full article ">
19 pages, 19285 KiB  
Article
Copernicus Imaging Microwave Radiometer (CIMR) Benefits for the Copernicus Level 4 Sea-Surface Salinity Processing Chain
by Daniele Ciani, Rosalia Santoleri, Gian Luigi Liberti, Catherine Prigent, Craig Donlon and Bruno Buongiorno Nardelli
Remote Sens. 2019, 11(15), 1818; https://doi.org/10.3390/rs11151818 - 3 Aug 2019
Cited by 10 | Viewed by 4953
Abstract
We present a study on the potential of the Copernicus Imaging Microwave Radiometer (CIMR) mission for the global monitoring of Sea-Surface Salinity (SSS) using Level-4 (gap-free) analysis processing. Space-based SSS measurements are currently provided by the Soil Moisture and Ocean Salinity (SMOS) and Soil Moisture Active Passive (SMAP) satellites. However, no missions are planned to guarantee continuity of remote SSS measurements in the near future. The CIMR mission is in a preparatory phase, with an expected launch in 2026. CIMR is focused on the provision of global-coverage, high-resolution sea-surface temperature (SST), SSS, and sea-ice concentration observations. In this paper, we evaluate the mission impact within the Copernicus Marine Environment Monitoring Service (CMEMS) SSS processing chain. The CMEMS SSS operational products are based on a combination of in situ and satellite (SMOS) SSS and high-resolution SST information through a multivariate optimal interpolation. We demonstrate the potential of CIMR within the CMEMS SSS operational production after the SMOS era. For this purpose, we implemented an Observing System Simulation Experiment (OSSE) based on the CMEMS MERCATOR global operational model. The MERCATOR SSSs were used to generate synthetic in situ and CIMR SSS and, at the same time, provided a reference gap-free SSS field. Using the optimal interpolation algorithm, we demonstrated that the combined use of in situ and CIMR observations improves the global SSS retrieval compared to processing in which only in situ observations are ingested. The improvements are observed over 60% and 70% of the global ocean surface for the reconstruction of the SSS and of the SSS spatial gradients, respectively. Moreover, the study highlights that the CIMR-based salinity patterns are more accurate both in the open ocean and in coastal areas. We conclude that CIMR can guarantee continuity for accurate monitoring of ocean surface salinity from space. Full article
(This article belongs to the Special Issue Ten Years of Remote Sensing at Barcelona Expert Center)
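The optimal interpolation at the core of the CMEMS chain follows the standard BLUE analysis update. A minimal single-step sketch is shown below, with toy covariances and a single observation; the operational scheme is multivariate and operates on far larger fields:

```python
import numpy as np

def optimal_interpolation(xb, B, y, H, R):
    """One optimal-interpolation (BLUE) analysis step:
        xa = xb + K (y - H xb),   K = B H^T (H B H^T + R)^-1
    xb: background field, B: background error covariance,
    y: observations, H: observation operator, R: observation error covariance."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return xb + K @ (y - H @ xb)

# Three grid points; one accurate observation of the middle point
xb = np.array([35.0, 35.0, 35.0])          # background SSS (psu)
B = np.array([[1.0, 0.5, 0.2],
              [0.5, 1.0, 0.5],
              [0.2, 0.5, 1.0]])            # spatially correlated background errors
H = np.array([[0.0, 1.0, 0.0]])            # observe grid point 1 only
R = np.array([[0.01]])                     # small observation error variance
y = np.array([36.0])
xa = optimal_interpolation(xb, B, y, H, R)
# The observed point is pulled almost to the observation; neighbouring
# points are adjusted through the background error correlations.
```

Spatially correlated background errors are what let a single CIMR swath observation update nearby unobserved grid points in the L4 analysis.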
Show Figures

Graphical abstract
Full article ">Figure 1
<p>Workflow of the observing system simulation experiment.</p>
Full article ">Figure 2
<p><math display="inline"> <semantics> <mi>σ</mi> </semantics> </math>SSS computed according to [<a href="#B19-remotesensing-11-01818" class="html-bibr">19</a>], example from 1 January 2016. The additional information in white are referenced in <a href="#sec3dot2dot1-remotesensing-11-01818" class="html-sec">Section 3.2.1</a>.</p>
Full article ">Figure 3
<p>(<b>a</b>) MERCATOR SSS, 1 January 2016, Gulf Stream area; (<b>b</b>) MERCATOR SSS with addition of white noise according to Equation (<a href="#FD3-remotesensing-11-01818" class="html-disp-formula">3</a>); 1 January 2016, Gulf Stream area.</p>
Full article ">Figure 4
<p>Simulating the CIMR observations from the MERCATOR SSS. (<b>a</b>) expected CIMR coverage; (<b>b</b>) daily mask for land, sea-ice and precipitation from AMSR-2 SST observations (blue and green respectively stand for available and unavailable observations); (<b>c</b>) synthetic CIMR observations obtained combining the information on the CIMR overpasses, the AMSR-2 observations and the noise. All of the figures are mapped onto a regular 1/4<math display="inline"> <semantics> <msup> <mrow/> <mo>∘</mo> </msup> </semantics> </math> grid (the same as the present-day CMEMS L4 SSS) and refer to 1 January 2016.</p>
Full article ">Figure 5
<p>(<b>a</b>) L4 SSS from in situ observations (IL4); (<b>b</b>) extraction of in situ SSS from the MERCATOR SSS, squares and circles, respectively, stand for pseudo and in situ observations; (<b>c</b>) L4 SSS from the combination of in situ and CIMR observations (CIL4); (<b>d</b>) simulated CIMR L3 SSS; (<b>e</b>) MERCATOR SSS (benchmark); (<b>f</b>) MERCATOR SST. All figures refer to 1 January 2016, in the Gulf Stream Area.</p>
Full article ">Figure 6
<p>(<b>a</b>) RMSE between the OI L4 SSS and the MERCATOR outputs. Blue and red, respectively, stand for IL4 and CIL4 reconstructions. The statistics are referred to the 45<math display="inline"> <semantics> <msup> <mrow/> <mo>∘</mo> </msup> </semantics> </math>S to the 45<math display="inline"> <semantics> <msup> <mrow/> <mo>∘</mo> </msup> </semantics> </math>N latitudinal band (Area M); (<b>b</b>) analyses referred to the the 45<math display="inline"> <semantics> <msup> <mrow/> <mo>∘</mo> </msup> </semantics> </math>N to the 90<math display="inline"> <semantics> <msup> <mrow/> <mo>∘</mo> </msup> </semantics> </math>N latitudinal band (Area N); (<b>c</b>) analyses referred to the 90<math display="inline"> <semantics> <msup> <mrow/> <mo>∘</mo> </msup> </semantics> </math>S to the 45<math display="inline"> <semantics> <msup> <mrow/> <mo>∘</mo> </msup> </semantics> </math>S latitudinal band (Area S).</p>
Full article ">Figure 7
<p>(<b>a</b>) <math display="inline"> <semantics> <mo>Δ</mo> </semantics> </math>RMSE based on weekly data, year 2016; (<b>b</b>) density of in situ SSS for the year 2016. The maximum number of in situ observations is ≃140 (in the North Atlantic). The colorbar is saturated to 5 in order to facilitate the visualization of the measurement sites at a global scale.</p>
Full article ">Figure 8
<p>(<b>a</b>) RMSE<math display="inline"> <semantics> <msup> <mrow/> <mrow> <mi>C</mi> <mi>I</mi> <mi>L</mi> <mn>4</mn> </mrow> </msup> </semantics> </math>; (<b>b</b>) RMSE<math display="inline"> <semantics> <msup> <mrow/> <mrow> <mi>I</mi> <mi>L</mi> <mn>4</mn> </mrow> </msup> </semantics> </math>.</p>
Full article ">Figure 9
<p><math display="inline"> <semantics> <mo>Δ</mo> </semantics> </math>RMSE<math display="inline"> <semantics> <msub> <mrow/> <mo>∇</mo> </msub> </semantics> </math> based on weekly data, year 2016.</p>
Full article ">Figure 10
<p>(<b>a</b>) blue line: <math display="inline"> <semantics> <mrow> <mo>〈</mo> <mi>σ</mi> </mrow> </semantics> </math>SSS〉 in the 45<math display="inline"> <semantics> <msup> <mrow/> <mo>∘</mo> </msup> </semantics> </math>S to 45<math display="inline"> <semantics> <msup> <mrow/> <mo>∘</mo> </msup> </semantics> </math>N latitudinal band (Area M). Red line: <math display="inline"> <semantics> <mrow> <mo>〈</mo> <mi>σ</mi> </mrow> </semantics> </math>SSS〉 from 45<math display="inline"> <semantics> <msup> <mrow/> <mo>∘</mo> </msup> </semantics> </math>N to 90<math display="inline"> <semantics> <msup> <mrow/> <mo>∘</mo> </msup> </semantics> </math>N (Area N) and from 45<math display="inline"> <semantics> <msup> <mrow/> <mo>∘</mo> </msup> </semantics> </math>S to 90<math display="inline"> <semantics> <msup> <mrow/> <mo>∘</mo> </msup> </semantics> </math>S (Area S); (<b>b</b>) same analysis for the AMSR-2 derived OWS; (<b>c</b>) same analysis for the AMSR-2 derived SST.</p>
Full article ">Figure 11
<p>Time average of the mean zonal spectra of the MERCATOR SSS (green line), the IL4 (red line) and the CIL4 (blue line). The time average is based on weekly data for the year 2016. The spectra are computed in five different areas of the global ocean, referenced in the top panel of the figure.</p>
Full article ">
16 pages, 1898 KiB  
Article
Deep Residual Squeeze and Excitation Network for Remote Sensing Image Super-Resolution
by Jun Gu, Xian Sun, Yue Zhang, Kun Fu and Lei Wang
Remote Sens. 2019, 11(15), 1817; https://doi.org/10.3390/rs11151817 - 3 Aug 2019
Cited by 79 | Viewed by 10671
Abstract
Recently, deep convolutional neural networks (DCNN) have obtained promising results in single image super-resolution (SISR) of remote sensing images. Due to the high complexity of remote sensing image distributions, most existing methods are not good enough for remote sensing image super-resolution. Enhancing the representation ability of the network is one of the critical factors for improving remote sensing image super-resolution performance. To address this problem, we propose a new SISR algorithm called a Deep Residual Squeeze and Excitation Network (DRSEN). Specifically, we propose a residual squeeze and excitation block (RSEB) as the building block of DRSEN. The RSEB fuses the input with the internal features of the current block, and models the interdependencies and relationships between channels to enhance the representation power. At the same time, we improve the up-sampling module and the global residual pathway to reduce the number of network parameters. Experiments on two public remote sensing datasets (UC Merced and NWPU-RESISC45) show that our DRSEN achieves better accuracy and visual improvements over most state-of-the-art methods. The DRSEN is beneficial for progress in the remote sensing image super-resolution field. Full article
(This article belongs to the Special Issue Image Super-Resolution in Remote Sensing)
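For readers unfamiliar with the channel attention that the RSEB builds on, here is a minimal NumPy sketch of squeeze-and-excitation gating. It is illustrative only: the weights, reduction ratio and tensor shapes are invented, and this is not the authors' DRSEN code.

```python
import numpy as np

def squeeze_excitation(x, w1, w2):
    """Channel attention as in an SE module: squeeze (global average
    pool), excite (two fully connected layers), then rescale channels.
    x: feature map (C, H, W); w1: (C//r, C); w2: (C, C//r)."""
    z = x.mean(axis=(1, 2))                     # squeeze: (C,)
    s = np.maximum(w1 @ z, 0.0)                 # ReLU bottleneck: (C//r,)
    gate = 1.0 / (1.0 + np.exp(-(w2 @ s)))      # sigmoid gate in (0, 1)
    return x * gate[:, None, None]              # channel-wise rescale

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 4, 4))              # 8 channels, 4x4 spatial
w1 = rng.standard_normal((2, 8)) * 0.1          # reduction ratio r = 4
w2 = rng.standard_normal((8, 2)) * 0.1
y = squeeze_excitation(x, w1, w2)
print(y.shape)  # (8, 4, 4)
```

Each channel is rescaled by a learned gate in (0, 1), which is how such a block models interdependencies between channels.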
Show Figures

Figure 1
<p>The network structure comparison of the EDSR and our DRSEN. (<b>a</b>) network architecture of EDSR; (<b>b</b>) network architecture of our deep residual attention network (DRSEN).</p>
Full article ">Figure 2
<p>The comparison of the original Residual Block and our RSEB. (<b>a</b>) residual block structure of EDSR; (<b>b</b>) residual squeeze and excitation block (RSEB) architecture.</p>
Full article ">Figure 3
<p>Squeeze and excitation module architecture.</p>
Full article ">Figure 4
<p>Examples of images in two datasets. The first line is the UC Merced dataset, and the second line is the NWPU-RESISC45 dataset.</p>
Full article ">Figure 5
<p>We investigated three different structures for the local feature fusion module: (<b>a</b>) has no connections in the block, (<b>b</b>) adopts a long-distance skip connection and (<b>c</b>) uses a short path connection.</p>
Full article ">Figure 6
<p>The index value of 10 training results. The results are evaluated with the UC Merced test dataset for <math display="inline"><semantics> <mrow> <mo>×</mo> <mn>2</mn> </mrow> </semantics></math> SR.</p>
Full article ">Figure 7
<p>The number of network parameters versus performance. The results are evaluated with a UC Merced test dataset for <math display="inline"><semantics> <mrow> <mo>×</mo> <mn>2</mn> </mrow> </semantics></math> SR. Our proposed models achieve better performance with relatively fewer parameters.</p>
Full article ">Figure 8
<p>Super-resolution results of “overpass26” (UC Merced) and “roundabout132” (NWPU-RESISC45) with scale factor ×2. The edges of the overpass and the lane line in our results are clearer. We refer to FSRCNN as FSR for short.</p>
Full article ">Figure 9
<p>Super-resolution results of “airplane76” (UC Merced) (a) and “runway512” (NWPU-RESISC45) (b) with scale factor ×3. The texture of the airplane and the lines in the sidewalk are observed in our methods, while others suffer from blurring artifacts. We refer to FSRCNN as FSR for short.</p>
Full article ">Figure 10
<p>Super-resolution results of “denseresident191” (UC Merced) (<b>a</b>) and “railwaystation565” (NWPU-RESISC45) (<b>b</b>) with scale factor ×4. The outline of the car is distinct and the lattices of the building roof are closer to the original image. We refer to FSRCNN as FSR for short.</p>
Full article ">
18 pages, 19991 KiB  
Article
Estimating and Examining the Sensitivity of Different Vegetation Indices to Fractions of Vegetation Cover at Different Scaling Grids for Early Stage Acacia Plantation Forests Using a Fixed-Wing UAS
by Kotaro Iizuka, Tsuyoshi Kato, Sisva Silsigia, Alifia Yuni Soufiningrum and Osamu Kozan
Remote Sens. 2019, 11(15), 1816; https://doi.org/10.3390/rs11151816 - 3 Aug 2019
Cited by 22 | Viewed by 5571
Abstract
Understanding the information on land conditions, and especially green vegetation cover, is important for monitoring ecosystem dynamics. The fraction of vegetation cover (FVC) is a key variable that can be used to observe vegetation cover trends. Conventionally, satellite data are utilized to compute these variables, although frequent cloud coverage in regions such as the tropics can limit the amount of available observations. Unmanned aerial systems (UASs) have become increasingly prominent in recent research; they can remotely sense using the same methods as satellites but at a lower altitude, are not limited by clouds, and deliver much higher resolution. This study utilizes a UAS to determine the emerging trends for FVC estimates at an industrial plantation site in Indonesia, which utilizes fast-growing Acacia trees that can rapidly change the land conditions. First, the UAS was utilized to collect high-resolution RGB imagery and multispectral images of the study area. The data were used to develop general land use/land cover (LULC) information for the site. Multispectral data were converted to various vegetation indices (VIs), and within the determined resolution grids (5, 10, 30 and 60 m), the fraction of each LULC type was analyzed for its correlation with the different VIs. Finally, a simple empirical model was developed to estimate the FVC from the UAS data. The results show correlations between the FVC (acacias) and the different VIs of R2 = 0.66–0.74, 0.76–0.8, 0.84–0.89 and 0.93–0.94 for the 5, 10, 30 and 60 m grid resolutions, respectively. This study indicates that UAS-based FVC estimation can be used for observing fast-growing acacia trees at a fine-scale resolution, which may assist current restoration programs in Indonesia. Full article
(This article belongs to the Special Issue UAVs for Vegetation Monitoring)
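The index-to-fraction step described in the abstract above can be illustrated with a toy NDVI computation and a linear scaling between bare-soil and full-canopy reference values. All numbers here are invented for illustration, not the paper's calibration.

```python
import numpy as np

# Hypothetical per-pixel reflectances (four pixels), not real UAS data.
red = np.array([0.10, 0.08, 0.05, 0.20])
nir = np.array([0.40, 0.45, 0.50, 0.22])

ndvi = (nir - red) / (nir + red)     # a typical vegetation index

# Simple empirical FVC model of the kind fitted per grid size: scale the
# index between bare-soil and full-canopy reference values, clipped to [0, 1].
ndvi_soil, ndvi_veg = 0.05, 0.85
fvc = np.clip((ndvi - ndvi_soil) / (ndvi_veg - ndvi_soil), 0.0, 1.0)
print(np.round(fvc, 2))
```

The last pixel (high red, low NIR) maps to an FVC of zero, i.e., bare ground.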
Show Figures

Graphical abstract
Full article ">Figure 1
<p>Example of temporal differences for fast-growing species. The left image shows the fast-growing <span class="html-italic">Acacia</span> trees in their early stages in August 2018, while the right image shows their rapid growth by October 2018. Even over such a short interval, land conditions can change dramatically.</p>
Full article ">Figure 2
<p>Location of the study site at West Kalimantan.</p>
Full article ">Figure 3
<p>Overall flowchart of the methodology.</p>
Full article ">Figure 4
<p>Small and light-weight global navigation satellite system (GNSS) equipment for post-processing kinematic (PPK).</p>
Full article ">Figure 5
<p>Example of the GNSS signal status from each satellite and the selection of the omitted satellites.</p>
Full article ">Figure 6
<p>Firefly Pro 6 fixed-wing vertical takeoff and landing (VTOL) unmanned aerial system (UAS) carrying the multispectral sensor and the illumination sensor.</p>
Full article ">Figure 7
<p>Ortho imagery of the test site for (<b>a</b>) the true color composite (RGB) image from the digital camera and (<b>b</b>) the false color composite (RGB: NIR, Red and Green) imagery from the multispectral sensor. The resolution is reduced to 0.2 m from the original resolution for visual purposes.</p>
Full article ">Figure 8
<p>Land use/land cover (LULC) map of the compartment area, developed using RGB and multispectral data observed from the fixed-wing UAS using the multilayer-perceptron (MLP) method.</p>
Full article ">Figure 9
<p>Relationship analysis between the different vegetation indices and fraction of LULC types: <span class="html-italic">Acacia</span>, grass/shrub and non-vegetation (bare soil + water) at different resolution scales.</p>
Full article ">Figure 9 Cont.
<p>Relationship analysis between the different vegetation indices and fraction of LULC types: <span class="html-italic">Acacia</span>, grass/shrub and non-vegetation (bare soil + water) at different resolution scales.</p>
Full article ">Figure 9 Cont.
<p>Relationship analysis between the different vegetation indices and fraction of LULC types: <span class="html-italic">Acacia</span>, grass/shrub and non-vegetation (bare soil + water) at different resolution scales.</p>
Full article ">Figure 10
<p>Estimated FVC from the UAS-based trained model for (<b>a</b>) 5 m (<b>b</b>) 10 m (<b>c</b>) 30 m and (<b>d</b>) 60 m. (<b>e</b>) The estimated FVC from Sentinel-2 data using the neural network method. The validation of the sentinel-based FVC is compared with (<b>b</b>).</p>
Full article ">Figure 10 Cont.
<p>Estimated FVC from the UAS-based trained model for (<b>a</b>) 5 m (<b>b</b>) 10 m (<b>c</b>) 30 m and (<b>d</b>) 60 m. (<b>e</b>) The estimated FVC from Sentinel-2 data using the neural network method. The validation of the sentinel-based FVC is compared with (<b>b</b>).</p>
Full article ">
17 pages, 6323 KiB  
Article
Super-Resolution Land Cover Mapping Based on the Convolutional Neural Network
by Yuanxin Jia, Yong Ge, Yuehong Chen, Sanping Li, Gerard B.M. Heuvelink and Feng Ling
Remote Sens. 2019, 11(15), 1815; https://doi.org/10.3390/rs11151815 - 2 Aug 2019
Cited by 40 | Viewed by 6190
Abstract
Super-resolution mapping (SRM) is used to obtain fine-scale land cover maps from coarse remote sensing images. Spatial attraction, geostatistics, and the use of prior geographic information are conventional approaches for deriving fine-scale land cover maps. As the convolutional neural network (CNN) has been shown to be effective in capturing the spatial characteristics of geographic objects and extrapolating calibrated methods to other study areas, it may be a useful approach for overcoming limitations of current SRM methods. In this paper, a new SRM method based on the CNN (SRM<sub>CNN</sub>) is proposed and tested. Specifically, an encoder-decoder CNN is used to model the nonlinear relationship between coarse remote sensing images and fine-scale land cover maps. Two real-image experiments were conducted to analyze the effectiveness of the proposed method. The results demonstrate that the overall accuracy of the proposed SRM<sub>CNN</sub> method was 3% to 5% higher than that of two existing SRM methods. Moreover, the proposed SRM<sub>CNN</sub> method was validated by visualizing output features and analyzing the performance of different geographic objects. Full article
(This article belongs to the Special Issue New Advances on Sub-pixel Processing: Unmixing and Mapping Methods)
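As context for the CNN approach, the "spatial attraction" baseline mentioned in the abstract can be sketched as a toy sub-pixel allocation: each coarse pixel's class fraction fixes how many sub-pixels receive the class, and inverse-distance-weighted pull from neighboring coarse fractions decides where they go. The zoom factor, fractions and weighting below are invented for illustration; this is not the paper's implementation.

```python
import numpy as np

S = 4                                     # zoom factor (coarse -> fine)
frac = np.array([[0.75, 0.25],            # coarse fractions of one class
                 [0.50, 0.00]])

padded = np.pad(frac, 1, mode="edge")     # replicate edges for neighbors
fine = np.zeros((frac.shape[0] * S, frac.shape[1] * S), dtype=int)

for i in range(frac.shape[0]):
    for j in range(frac.shape[1]):
        n = round(frac[i, j] * S * S)     # sub-pixels of the class here
        attr = np.zeros((S, S))           # attraction of each sub-pixel
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                if di == dj == 0:
                    continue
                w = padded[i + 1 + di, j + 1 + dj]
                for a in range(S):
                    for b in range(S):
                        # distance from sub-pixel to the neighbor's centre
                        d = np.hypot(a - (di * S + S / 2),
                                     b - (dj * S + S / 2))
                        attr[a, b] += w / d
        order = np.argsort(attr, axis=None)[::-1][:n]   # most attracted
        block = np.zeros(S * S, dtype=int)
        block[order] = 1
        fine[i * S:(i + 1) * S, j * S:(j + 1) * S] = block.reshape(S, S)

print(fine.sum(), int(round(frac.sum() * S * S)))  # counts match: 24 24
```

The per-cell counts always honor the coarse fractions; only the placement within each cell is decided by the attraction term, which is the limitation that learned (CNN-based) spatial priors aim to improve on.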
Show Figures

Graphical abstract
Full article ">Figure 1
<p>Flowchart of the three stages.</p>
Full article ">Figure 2
<p>The proposed <math display="inline"><semantics> <mrow> <msub> <mrow> <mi>SRM</mi> </mrow> <mrow> <mi>CNN</mi> </mrow> </msub> </mrow> </semantics></math> model.</p>
Full article ">Figure 3
<p>Results of Vahingen Dataset. (<b>a</b>) Coarse image; (<b>b</b>) SASPM result; (<b>c</b>) VBSPM result; (<b>d</b>) <math display="inline"><semantics> <mrow> <msub> <mrow> <mi>SRM</mi> </mrow> <mrow> <mi>CNN</mi> </mrow> </msub> </mrow> </semantics></math> result; and (<b>e</b>) reference map. The second and third row were zoom-in areas from the first row.</p>
Full article ">Figure 4
<p>Results of Potsdam dataset. (<b>a</b>) Coarse image; (<b>b</b>) SASPM result; (<b>c</b>) VBSPM result; (<b>d</b>) <math display="inline"><semantics> <mrow> <msub> <mrow> <mi>SRM</mi> </mrow> <mrow> <mi>CNN</mi> </mrow> </msub> </mrow> </semantics></math> result; and (<b>e</b>) reference. The second and third row were zoom-in areas from the first row.</p>
Full article ">Figure 5
<p>Features visualization. (<b>a</b>) Input coarse remote sensing images; (<b>b</b>) simulated result; (<b>c</b>) reference result; (<b>d</b>) visualization of first output features; (<b>e</b>) visualization of ninth output features; (<b>f</b>) visualization of 18th output features.</p>
Full article ">Figure 6
<p>Examples of the performance of <math display="inline"><semantics> <mrow> <msub> <mrow> <mi>SRM</mi> </mrow> <mrow> <mi>CNN</mi> </mrow> </msub> </mrow> </semantics></math> for different geo-objects: 1, 2, and 3 represent the patch number; and a, b, and c are the input coarse remote sensing images, predicted result, and reference, respectively.</p>
Full article ">
19 pages, 10565 KiB  
Article
Coastal Dune Vegetation Mapping Using a Multispectral Sensor Mounted on an UAS
by Chen Suo, Eugene McGovern and Alan Gilmer
Remote Sens. 2019, 11(15), 1814; https://doi.org/10.3390/rs11151814 - 2 Aug 2019
Cited by 24 | Viewed by 5778
Abstract
Vegetation mapping, identifying the type and distribution of plant species, is important for analysing vegetation dynamics, quantifying spatial patterns of vegetation evolution, analysing the effects of environmental changes and predicting spatial patterns of species diversity. Such analysis can contribute to the development of targeted land management actions that maintain biodiversity and ecological functions. This paper presents a methodology for 3D vegetation mapping of a coastal dune complex using a multispectral camera mounted on an unmanned aerial system, with particular reference to the Buckroney dune complex in Co. Wicklow, Ireland. Unmanned aerial systems (UAS), also known as unmanned aerial vehicles (UAV) or drones, have enabled high-resolution and high-accuracy ground-based data to be gathered quickly and easily on-site. The Sequoia multispectral sensor used in this study has green, red, red edge and near-infrared wavebands, and a regular camera with red, green and blue wavebands (RGB camera), to capture both visible and near-infrared (NIR) imagery of the land surface. The workflow of 3D vegetation mapping of the study site included establishing coordinated ground control points, planning the flight mission and camera parameters, acquiring the imagery, processing the image data and performing feature classification. The data processing outcomes included an orthomosaic model, a 3D surface model and multispectral imagery of the study site, in the Irish Transverse Mercator (ITM) coordinate system. The planimetric resolution of the RGB sensor-based outcomes was 0.024 m, while the multispectral sensor-based outcomes had a planimetric resolution of 0.096 m. High-resolution vegetation mapping was successfully generated from these data processing outcomes. A total of 235 sample areas (1 m × 1 m) were used for the accuracy assessment of the vegetation mapping classification.
Feature classification was conducted using nine different classification strategies to examine the efficiency of multispectral sensor data for vegetation and contiguous land cover mapping. The nine classification strategies included combinations of spectral bands and vegetation indices. Results show classification accuracies, based on the nine different classification strategies, ranging from 52% to 75%. Full article
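Accuracy figures like the 52% to 75% reported above are typically computed from a confusion matrix. The sketch below shows overall accuracy and Cohen's kappa from such a matrix; the 3-class count matrix is hypothetical, not the study's 235-sample assessment.

```python
import numpy as np

def accuracy_and_kappa(cm):
    """Overall accuracy and Cohen's kappa from a confusion matrix
    (rows = reference class, columns = mapped class)."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    po = np.trace(cm) / n                    # observed agreement
    pe = (cm.sum(0) @ cm.sum(1)) / n**2      # chance agreement
    return po, (po - pe) / (1 - pe)

# Hypothetical counts for three land cover classes.
cm = [[50, 5, 5],
      [4, 40, 6],
      [6, 4, 30]]
oa, kappa = accuracy_and_kappa(cm)
print(round(oa, 3), round(kappa, 3))  # 0.8 0.696
```

Kappa discounts the agreement expected by chance, which is why it is commonly reported alongside overall accuracy in classification assessments like this one.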
Show Figures

Graphical abstract
Full article ">Figure 1
<p>Morphology of the Brittas-Buckroney dune complex. Photo taken by the author.</p>
Full article ">Figure 2
<p>Photogrammetry-based 3D construction workflow of unmanned aerial systems (UAS) technology.</p>
Full article ">Figure 3
<p>Study site (<b>a</b>) general location and (<b>b</b>) site details.</p>
Full article ">Figure 4
<p>Plant species at site (<b>a</b>) mosses land; (<b>b</b>) sharp rush (<span class="html-italic">J. acutus</span>); (<b>c</b>) European marram grass (<span class="html-italic">A. arenaria</span>); (<b>d</b>) gorse (<span class="html-italic">U. europaeus</span>); (<b>e</b>) common reed (<span class="html-italic">P. australis</span>); (<b>f</b>) rusty willow (<span class="html-italic">S. cinerea</span> subsp. <span class="html-italic">oleifolia</span>). Photos taken by the author.</p>
Full article ">Figure 4 Cont.
<p>Plant species at site (<b>a</b>) mosses land; (<b>b</b>) sharp rush (<span class="html-italic">J. acutus</span>); (<b>c</b>) European marram grass (<span class="html-italic">A. arenaria</span>); (<b>d</b>) gorse (<span class="html-italic">U. europaeus</span>); (<b>e</b>) common reed (<span class="html-italic">P. australis</span>); (<b>f</b>) rusty willow (<span class="html-italic">S. cinerea</span> subsp. <span class="html-italic">oleifolia</span>). Photos taken by the author.</p>
Full article ">Figure 5
<p>Sequoia multispectral sensor mounted on a DJI Phantom 3 Pro UAS. Photo taken by the author.</p>
Full article ">Figure 6
<p>Ground control points (GCPs) set on site for UAS surveying. Photo taken by the author.</p>
Full article ">Figure 7
<p>The balance card used for radiometric calibration.</p>
Full article ">Figure 8
<p>A sample of the 3D point cloud for the north section of study site.</p>
Full article ">Figure 9
<p>Orthomosaic model of the study site.</p>
Full article ">Figure 10
<p>Digital surface model (DSM) of the study site.</p>
Full article ">Figure 11
<p>NDVI map of the study site.</p>
Full article ">Figure 12
<p>Response of training samples in (<b>a</b>) the blue, green and red wavebands extracted from the RGB camera; (<b>b</b>) the green, red, NIR and red edge wavebands from the multispectral sensor.</p>
Full article ">Figure 13
<p>Vegetation mapping of study site.</p>
Full article ">Figure 14
<p>Classification accuracy based on different strategies.</p>
Full article ">Figure 15
<p>Wavelength and response of discrete and non-discrete spectral bands.</p>
Full article ">
20 pages, 6756 KiB  
Article
Integration of Ground-Based Remote-Sensing and In Situ Multidisciplinary Monitoring Data to Analyze the Eruptive Activity of Stromboli Volcano in 2017–2018
by Flora Giudicepietro, Sonia Calvari, Salvatore Alparone, Francesca Bianco, Alessandro Bonaccorso, Valentina Bruno, Teresa Caputo, Antonio Cristaldi, Luca D’Auria, Walter De Cesare, Bellina Di Lieto, Antonietta M. Esposito, Salvatore Gambino, Salvatore Inguaggiato, Giovanni Macedonio, Marcello Martini, Mario Mattia, Massimo Orazi, Antonio Paonita, Rosario Peluso, Eugenio Privitera, Pierdomenico Romano, Giovanni Scarpato, Anna Tramelli and Fabio Vita
Remote Sens. 2019, 11(15), 1813; https://doi.org/10.3390/rs11151813 - 2 Aug 2019
Cited by 27 | Viewed by 6718
Abstract
After a period of mild eruptive activity, Stromboli entered a reawakening phase between 2017 and 2018, with an increase in eruptive activity starting in May 2017. The alert level of the volcano was raised from “green” (base) to “yellow” (attention) on 7 December 2017, and a small lava flow overflowed the crater rim on 15 December 2017. Between July 2017 and August 2018 the monitoring networks recorded nine major explosions, which are a serious hazard at Stromboli because they affect the summit area, which is crowded with tourists. We studied the 2017–2018 eruptive phase through the analysis of multidisciplinary data comprising thermal video-camera images and seismic, geodetic and geochemical data. We focused on the mechanism of major explosions, analyzing the well-recorded 1 December 2017 major explosion as a case study. We found that the 2017–2018 eruptive phase is consistent with a greater supply of gas-rich magma in the shallow system. Furthermore, through the analysis of the case-study major explosion, we identified precursory phases in the strainmeter and seismic data occurring 77 and 38 s, respectively, before the explosive jet reached the eruptive vent. On the basis of these short-term precursors, we propose an automatic timely alarm system for major explosions at Stromboli volcano. Full article
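The proposed alarm rests on detecting precursors before the jet reaches the vent. A toy sketch of how trigger times convert into warning lead times follows; the signal shapes, thresholds and sampling rate are invented, and only the 77 s and 38 s lead times come from the abstract.

```python
import numpy as np

fs = 1.0                          # 1 Hz sampling (invented)
t = np.arange(0, 200, 1 / fs)
onset = 150.0                     # explosion reaches the vent (s)

# Synthetic signals: a step at each reported precursor time plus noise.
strain = np.where(t >= onset - 77, 1.0, 0.0) + 0.01 * np.sin(t)
seismic = np.where(t >= onset - 38, 1.0, 0.0) + 0.01 * np.sin(2 * t)

def first_trigger(signal, threshold, t):
    """Return the time at which the signal first exceeds the threshold."""
    return t[np.argmax(signal > threshold)]

strain_trig = first_trigger(strain, 0.5, t)
seismic_trig = first_trigger(seismic, 0.5, t)
print(onset - strain_trig, onset - seismic_trig)  # lead times: 77.0 38.0
```

In an operational system the threshold crossing would raise the alert, and the lead time is the margin available to warn people in the summit area.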
Show Figures

Graphical abstract
Full article ">Figure 1
<p>Left: Map of Stromboli island (and its location in southern Italy, inset) with the Stromboli and Ginostra villages, the “Sciara del Fuoco” depression (SdF), and the crater zone (yellow oval). The black triangles indicate the position of the seismic stations; the blue stars indicate the position of the strainmeters; the red circles are the GPS stations; the green squares are the tiltmeters; the blue diamonds are video-cameras, and the magenta triangles are the geochemical stations. Right: The crater zone with the main vent regions (red stars): Northeast (NE), central (C) and southwest (SW).</p>
Full article ">Figure 2
<p>(<b>a</b>) Example of an explosion-quake recorded by STRA station (vertical component); (<b>b</b>) the same signal filtered in the VLP frequency band (0.02–0.2 Hz); (<b>c</b>) rose diagrams of the polarization direction of about 83,000 VLP events recorded in 2012, plotted on the map. The red diamond indicates the average source centroid, obtained from the polarization parameters, located 450 m above sea level; the blue dots are the seismic stations. (<b>d</b>) The automatic locations of a subset of the VLP events recorded in 2012 carried out by the EOLO system (<a href="http://eolo.ov.ingv.it/eolo/" target="_blank">http://eolo.ov.ingv.it/eolo/</a>). The subset consists of 1915 VLP events (in red) recorded by at least six stations (blue dots) with well-located hypocenters. The mean elevation of the VLP sources is about 400 m above sea level. The VLP sources retrieved by [<a href="#B23-remotesensing-11-01813" class="html-bibr">23</a>] are within the cloud of the VLP locations (in red).</p>
Full article ">Figure 3
<p>Variations of several monitoring parameters occurred between 1 April 2017 and 6 June 2018. The major explosions are marked in each graph by black vertical lines. (<b>a</b>) Amplitude of the explosion-quakes. (<b>b</b>) Volcanic tremor amplitude. (<b>c</b>) VLP hourly rate. (<b>d</b>) Explosion counting carried out using the INGV-OE video-camera network. We estimate a 10% error on the explosion counting. (<b>e</b>) GPS baseline length variations between SVIN and SPLB. (<b>f</b>) Tilt recorded at Timpone del Fuoco (TDF). The N275°E component is direct toward the summit area, and a positive signal variation means crater up. (<b>g</b>) Daily rate of landslides. (<b>h</b>) Summit soil CO<sub>2</sub> degassing at STR02 station.</p>
Full article ">Figure 4
<p>The major explosion that occurred on 1 December 2017 at 12:42:30 UTC and the background explosive seismicity of Stromboli volcano due to ordinary explosions (STRA east-west component). Time ranges from 12:00 to 16:00 UTC. The signal amplitude is expressed in counts. The distance between the two lines is equivalent to 2048 counts.</p>
Full article ">Figure 5
<p>VLP seismograms and thermal images of the major explosions recorded by the thermal camera at 400 m elevation on the NE flank of the SdF (<b>a</b>–<b>c</b>,<b>f</b>–<b>h</b>) and by the thermal camera at “Il Pizzo Sopra la Fossa” (890 m a.s.l. and ~250 m from the craters) (<b>d</b>,<b>e</b>). The date and time of each episode is plotted in the format dd-mm-yyyy hh:mm at the top of each graph.</p>
Full article ">Figure 6
<p>Tilt data recorded at the two components of TDF station between 12:15 and 13:15 UTC on 1 December 2017. Tilt sampling is one minute, and each value is the average of 8000 samples.</p>
Full article ">Figure 7
<p>Recordings of the major explosion on 1 December 2017. The start time of the plot is 12:41:00.00 UTC. (<b>a</b>) Infrasonic signal (STRA, see <a href="#remotesensing-11-01813-f001" class="html-fig">Figure 1</a> for the location on the map); (<b>b</b>) raw seismic signal (STRA vertical component); (<b>c</b>) seismic signal filtered in the VLP band (0.02–0.2 Hz); (<b>d</b>) SVO strainmeter (see <a href="#remotesensing-11-01813-f001" class="html-fig">Figure 1</a> for the location on the map) filtered in the band 0.02–0.2 Hz; (<b>e</b>) SVO strainmeter (see <a href="#remotesensing-11-01813-f001" class="html-fig">Figure 1</a> for the location on the map) filtered in the 0.0002–0.02 Hz band. The grey area represents the lapse time between the initial variation detected by the strainmeter (<b>e</b>) and the start of the major explosion marked by the infrasonic signal (<b>a</b>).</p>
Full article ">Figure 8
<p>Seismogram (above) and spectrogram (below) of the 1 December 2017 major explosion (STRA vertical component).</p>
Full article ">Figure 9
<p>Locations of the precursory signal (red circle) and the VLP (red star, 311 m above sea level) of the 1 December 2017 major explosion, compared with the location of a dataset of 1915 VLP events mainly recorded in the first months of 2012. The gray star represents the centroid obtained from the polarization vectors of the VLPs recorded in 2012 (located 450 m above sea level). The gray diamond represents the area of maximum density of the localizations obtained with the analysis of semblances (386 events), about 400 m above sea level. The uncertainty on the precursory signal and VLP locations are about 100 m and 300 m in the position of the source, respectively.</p>
Full article ">Figure 10
<p>De-trended and filtered time series of the North−South component of the SPLB CGPS station. The second dashed black arrow indicates the increasing northward displacement of the station between 2016 and 2018. This displacement towards N (radial direction) is compatible with modest inflation of the volcano edifice.</p>
Full article ">Figure 11
<p>Raw SVO strainmeter (<a href="#remotesensing-11-01813-f001" class="html-fig">Figure 1</a>) signal (upper panel) and band-pass (0.004–0.02 Hz) filtered SVO strainmeter signal (bottom panel). The red ellipses indicate the 1 December 2017, major explosion. The tidal component is evident on the upper unfiltered signal.</p>
Full article ">Figure 12
<p>Outline of a proposal for a timely alarm system for the major explosions at Stromboli, based on strainmeter and seismic data. The vertical orange line marks the trigger of the Ultra Long Period signal, detected by the SVO borehole strainmeter 77 s before the potential impact of the explosion on the crater area, frequented by tourists. The red line marks the trigger of the seismic stations that recorded the precursory signal, about 38 s before the onset of the explosive phenomenon that is highlighted by the infrasound signal trigger (black line).</p>
Full article ">
21 pages, 6667 KiB  
Article
Early Detection of Invasive Exotic Trees Using UAV and Manned Aircraft Multispectral and LiDAR Data
by Jonathan P. Dash, Michael S. Watt, Thomas S. H. Paul, Justin Morgenroth and Grant D. Pearse
Remote Sens. 2019, 11(15), 1812; https://doi.org/10.3390/rs11151812 - 2 Aug 2019
Cited by 65 | Viewed by 6892
Abstract
Exotic conifers can provide significant ecosystem services, but in some environments, they have become invasive and threaten indigenous ecosystems. In New Zealand, this phenomenon is of considerable concern as the area occupied by invasive exotic trees is large and increasing rapidly. Remote sensing methods offer a potential means of identifying and monitoring land infested by these trees, enabling managers to efficiently allocate resources for their control. In this study, we sought to develop methods for remote detection of exotic invasive trees, namely Pinus sylvestris and P. ponderosa. Critically, the study aimed to detect these species prior to the onset of maturity and coning as this is important for preventing further spread. In the study environment in New Zealand’s South Island, these species reach maturity and begin bearing cones at a young age. As such, detection of these smaller individuals requires specialist methods and very high-resolution remote sensing data. We examined the efficacy of classifiers developed using two machine learning algorithms with multispectral and laser scanning data collected from two platforms—manned aircraft and unmanned aerial vehicles (UAV). The study focused on a localized conifer invasion originating from a multi-species pine shelter belt in a grassland environment. This environment provided a useful means of defining the detection thresholds of the methods and technologies employed. An extensive field dataset including over 17,000 trees (height range = 1 cm to 476 cm) was used as an independent validation dataset for the detection methods developed. We found that data from both platforms and using both logistic regression and random forests for classification provided highly accurate (kappa < 0.996 ) detection of invasive conifers. 
Our analysis showed that the data from both UAV and manned aircraft were useful for detecting trees down to 1 m in height, and therefore shorter than 99.3% of the coning individuals in the study dataset. We also explored the relative contribution of the multispectral and airborne laser scanning (ALS) data to the detection of invasive trees by fitting classification models with different combinations of predictors, and found that the most useful models included data from both sensors. However, the combination of ALS and multispectral data did not significantly improve classification accuracy. We believe that this was due to the simple vegetation and terrain structure of the study site, which resulted in uncomplicated separability of invasive conifers from other vegetation. This study provides valuable new knowledge of the efficacy of detecting invasive conifers prior to the onset of coning using high-resolution data from UAV and manned aircraft. This will be an important tool in managing the spread of these invasive plants. Full article
(This article belongs to the Section Environmental Remote Sensing)
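A minimal sketch of the logistic-regression side of such a classifier, trained on synthetic (spectral index, canopy height) features: the data, learning rate and iteration count below are invented and do not reproduce the study's models.

```python
import numpy as np

# Synthetic two-feature training set; all values invented for illustration.
rng = np.random.default_rng(42)
n = 200
X0 = rng.normal([0.2, 0.1], [0.08, 0.10], size=(n, 2))  # non-tree pixels
X1 = rng.normal([0.6, 1.5], [0.08, 0.80], size=(n, 2))  # invasive conifers
X = np.vstack([X0, X1])
y = np.r_[np.zeros(n), np.ones(n)]

# Batch gradient descent on the logistic loss.
Xb = np.hstack([np.ones((2 * n, 1)), X])   # prepend an intercept column
w = np.zeros(3)
for _ in range(2000):
    z = np.clip(Xb @ w, -30.0, 30.0)       # clip to keep exp() stable
    p = 1.0 / (1.0 + np.exp(-z))
    w -= 0.5 * Xb.T @ (p - y) / len(y)

pred = 1.0 / (1.0 + np.exp(-np.clip(Xb @ w, -30.0, 30.0))) > 0.5
acc = (pred.astype(float) == y).mean()
print(round(acc, 3))   # near 1.0 for this well-separated toy data
```

With well-separated classes like these, a linear classifier already does very well, which mirrors the study's finding that combining sensors added little once the classes were easily separable.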
Show Figures

Figure 1
<p>A schematic overview of the methodology used in this research.</p>
Figure 2
<p>(<b>a</b>) The location of the Mackenzie Basin within New Zealand, outlined in red over a shaded relief map, (<b>b</b>) an overview of the site, (<b>c</b>) a close up of the UAV imagery used in the study, (<b>d</b>) a close up of the manned aircraft imagery used in the study, (<b>e</b>) a part of the UAV-LS point cloud, and (<b>f</b>) a part of the ALS point cloud collected from the manned aircraft.</p>
Figure 3
<p>(<b>a</b>) A hexplot showing the spatial distribution and density of invasive conifers in the field dataset, (<b>b</b>) a violin and box plot showing the height distribution of non-coning (n) and coning (y) trees in the field dataset, and (<b>c</b>) the relationship between height and crown width for <span class="html-italic">P. ponderosa</span> and <span class="html-italic">P. sylvestris</span>.</p>
Figure 4
<p>Box and whisker plots of the spectral and structural properties of the training dataset for the manned aircraft (Plane) and UAV datasets.</p>
Figure 5
<p>Kappa values extracted from the cross-validation results during model development. Each datum represents the kappa value from cross-validation for a single model. The model identifiers as shown in <a href="#remotesensing-11-01812-t001" class="html-table">Table 1</a> are shown along the X axis and shape of each datum represents the class of model represented. Please note that models 2 and 11 could only be developed using UAV data as these included the red edge band.</p>
Figure 6
<p>Agreement values for the independent validation dataset for trees within each height class. Each datum shows the mean of the agreement value for both the combined RF and LR models.</p>
36 pages, 10574 KiB  
Article
New Strategies for Time Delay Estimation during System Calibration for UAV-Based GNSS/INS-Assisted Imaging Systems
by Lisa LaForest, Seyyed Meghdad Hasheminasab, Tian Zhou, John Evan Flatt and Ayman Habib
Remote Sens. 2019, 11(15), 1811; https://doi.org/10.3390/rs11151811 - 1 Aug 2019
Cited by 22 | Viewed by 4727
Abstract
The need for accurate 3D spatial information is growing rapidly in many of today’s key industries, such as precision agriculture, emergency management, infrastructure monitoring, and defense. Unmanned aerial vehicles (UAVs) equipped with global navigation satellite systems/inertial navigation systems (GNSS/INS) and consumer-grade digital imaging [...] Read more.
The need for accurate 3D spatial information is growing rapidly in many of today’s key industries, such as precision agriculture, emergency management, infrastructure monitoring, and defense. Unmanned aerial vehicles (UAVs) equipped with global navigation satellite systems/inertial navigation systems (GNSS/INS) and consumer-grade digital imaging sensors are capable of providing accurate 3D spatial information at a relatively low cost. However, with the use of consumer-grade sensors, system calibration is critical for accurate 3D reconstruction. In this study, ‘consumer-grade’ refers to cameras that require system calibration by the user instead of by the manufacturer or in other high-end laboratory settings, as well as relatively low-cost GNSS/INS units. In addition to classical spatial system calibration, many consumer-grade sensors also need temporal calibration for accurate 3D reconstruction. This study examines the accuracy impact of time delay in the synchronization between the GNSS/INS unit and cameras on-board UAV-based mapping systems. After reviewing existing strategies, this study presents two approaches (direct and indirect) to correct for time delay between GNSS/INS recorded event markers and actual time of image exposure. Our results show that both approaches are capable of handling and correcting this time delay, with the direct approach being more rigorous. When a time delay exists and the direct or indirect approach is applied, horizontal accuracy of 1–3 times the ground sampling distance (GSD) can be achieved without the use of any ground control points (GCPs) or adjustment of the original GNSS/INS trajectory information. Full article
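To see why such a time delay matters, consider a simple illustration: if exposures actually occur a delay dt after the recorded GNSS/INS event markers, interpolating the trajectory at the marker time offsets each camera position by roughly the platform velocity times dt. The trajectory, speed, and delay below are invented for illustration; the paper's direct approach estimates the delay rigorously inside the bundle adjustment rather than by this kind of post-hoc interpolation.

```python
# Sketch: position error caused by ignoring a GNSS/INS-to-camera time delay.
# Trajectory, velocity, and delay values are illustrative only.
import numpy as np

t = np.linspace(0.0, 10.0, 501)                   # trajectory timestamps (s)
pos = np.column_stack([5.0 * t,                   # flying east at 5 m/s
                       np.zeros_like(t),
                       60.0 + np.zeros_like(t)])  # constant flying height (m)

def position_at(time_s):
    """Linearly interpolate the GNSS/INS trajectory at a given time."""
    return np.array([np.interp(time_s, t, pos[:, i]) for i in range(3)])

t_marker = 4.0   # event marker recorded by the GNSS/INS unit (s)
dt = 0.05        # unknown synchronization delay (s)

p_wrong = position_at(t_marker)       # exposure position, delay ignored
p_true = position_at(t_marker + dt)   # exposure position, delay accounted for
error = np.linalg.norm(p_true - p_wrong)
print(f"position error from ignoring a {dt*1000:.0f} ms delay: {error:.2f} m")
```

At 5 m/s, a 50 ms delay already shifts each exposure station by a quarter of a meter, which is many multiples of a centimeter-level GSD.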
Graphical abstract
Figure 1
<p>(<b>a</b>) Orthophoto generated while ignoring the time delay during calibration. (<b>b</b>) Orthophoto generated with time delay accounted for during calibration.</p>
Figure 1 Cont.
<p>(<b>a</b>) Orthophoto generated while ignoring the time delay during calibration. (<b>b</b>) Orthophoto generated with time delay accounted for during calibration.</p>
Figure 2
<p>Conceptual basis of bundle block adjustment.</p>
Figure 3
<p>Illustration of collinearity equations.</p>
Figure 4
<p>Establishing an expression for the correct IMU body frame orientation in the presence of time delay.</p>
Figure 5
<p>Illustration of the direct approach for time delay estimation within the bundle block adjustment with system self-calibration.</p>
Figure 6
<p>Illustration of where measurements are taken to acquire the nominal lever arm values.</p>
Figure 7
<p>Processing workflow of the indirect approach process for time delay estimation.</p>
Figure 8
<p>M200-based thermal/RGB system configuration.</p>
Figure 9
<p>DJI M600-based Sony Alpha 7R system configuration.</p>
Figure 10
<p>Trajectory and target locations for FLIR Duo Pro R and Sony Alpha 7R.</p>
Figure 11
<p>(<b>a</b>) XYZ component linear velocity over flight time for the July 25th thermal dataset. (<b>b</b>) <math display="inline"><semantics> <mrow> <mi>ω</mi> <mo>,</mo> <mrow> <mtext> </mtext> <mo>φ</mo> </mrow> <mo>,</mo> <mrow> <mtext> </mtext> <mi>κ</mi> </mrow> </mrow> </semantics></math> component angular velocity over flight time for the July 25th thermal dataset.</p>
Figure 12
<p>Sample corresponding thermal and RGB images from the FLIR Duo Pro R.</p>
Figure 13
<p>Flight area with enhanced representations of checkerboard targets and sample thermal and RGB images of the FLIR camera around the target location.</p>
Figure 14
<p>Sample images captured by the Sony RBG sensor over the calibration test field. (<b>a</b>) 20 m flying height and (<b>b</b>) 40 m flying height.</p>
Figure 15
<p>Sony A7R calibration field.</p>
Figure 16
<p>Distribution of the tie points used for the FLIR thermal sensor in the Sept. 14th collection date for the direct (<b>right</b>) and indirect* (<b>left</b>) approaches (*only 10% of total tie points are plotted).</p>
Figure 17
<p>Orthophoto result while ignoring the time delay using the original trajectory data for FLIR thermal July 25th data collection (red boxes show the location of the check points, <math display="inline"><semantics> <mrow> <mi>O</mi> <mi>r</mi> <mi>i</mi> <mi>g</mi> <mi>i</mi> <mi>n</mi> <mi>a</mi> <mi>l</mi> <mtext> </mtext> <mi>I</mi> <mi>m</mi> <mi>a</mi> <mi>g</mi> <mi>e</mi> <mtext> </mtext> <mi>G</mi> <mi>S</mi> <mi>D</mi> <mo>≈</mo> <mn>0.03</mn> <mi>m</mi> </mrow> </semantics></math>).</p>
Figure 18
<p>Orthophoto result while ignoring the time delay using the refined trajectory data for FLIR thermal July 25th data collection (red boxes show the location of the check points, <math display="inline"><semantics> <mrow> <mi>O</mi> <mi>r</mi> <mi>i</mi> <mi>g</mi> <mi>i</mi> <mi>n</mi> <mi>a</mi> <mi>l</mi> <mtext> </mtext> <mi>I</mi> <mi>m</mi> <mi>a</mi> <mi>g</mi> <mi>e</mi> <mtext> </mtext> <mi>G</mi> <mi>S</mi> <mi>D</mi> <mo>≈</mo> <mn>0.03</mn> <mi>m</mi> </mrow> </semantics></math>).</p>
Figure 19
<p>Orthophoto result from direct approach for FLIR thermal July 25th data collection (red boxes show the location of the check points, <math display="inline"><semantics> <mrow> <mi>O</mi> <mi>r</mi> <mi>i</mi> <mi>g</mi> <mi>i</mi> <mi>n</mi> <mi>a</mi> <mi>l</mi> <mtext> </mtext> <mi>I</mi> <mi>m</mi> <mi>a</mi> <mi>g</mi> <mi>e</mi> <mtext> </mtext> <mi>G</mi> <mi>S</mi> <mi>D</mi> <mo>≈</mo> <mn>0.03</mn> <mi>m</mi> </mrow> </semantics></math>).</p>
Figure 20
<p>Orthophoto result from direct approach for FLIR thermal September 14th data collection (red boxes show the location of the check points, <math display="inline"><semantics> <mrow> <mi>O</mi> <mi>r</mi> <mi>i</mi> <mi>g</mi> <mi>i</mi> <mi>n</mi> <mi>a</mi> <mi>l</mi> <mtext> </mtext> <mi>I</mi> <mi>m</mi> <mi>a</mi> <mi>g</mi> <mi>e</mi> <mtext> </mtext> <mi>G</mi> <mi>S</mi> <mi>D</mi> <mo>≈</mo> <mn>0.03</mn> <mi>m</mi> </mrow> </semantics></math>).</p>
Figure 21
<p>Orthophoto result ignoring the time delay using the original trajectory data for FLIR RGB July 25th data collection (red boxes show the location of the check points—only two visible, <math display="inline"><semantics> <mrow> <mi>O</mi> <mi>r</mi> <mi>i</mi> <mi>g</mi> <mi>i</mi> <mi>n</mi> <mi>a</mi> <mi>l</mi> <mtext> </mtext> <mi>I</mi> <mi>m</mi> <mi>a</mi> <mi>g</mi> <mi>e</mi> <mtext> </mtext> <mi>G</mi> <mi>S</mi> <mi>D</mi> <mo>≈</mo> <mn>0.01</mn> <mi>m</mi> </mrow> </semantics></math>).</p>
Figure 22
<p>Orthophoto result ignoring the time delay using the refined trajectory data for FLIR RGB July 25th data collection (red boxes show the location of the check points, <math display="inline"><semantics> <mrow> <mi>O</mi> <mi>r</mi> <mi>i</mi> <mi>g</mi> <mi>i</mi> <mi>n</mi> <mi>a</mi> <mi>l</mi> <mtext> </mtext> <mi>I</mi> <mi>m</mi> <mi>a</mi> <mi>g</mi> <mi>e</mi> <mtext> </mtext> <mi>G</mi> <mi>S</mi> <mi>D</mi> <mo>≈</mo> <mn>0.01</mn> <mi>m</mi> </mrow> </semantics></math>).</p>
Figure 23
<p>Orthophoto result from direct approach for FLIR RGB July 25th data collection (red boxes show the location of the check points, <math display="inline"><semantics> <mrow> <mi>O</mi> <mi>r</mi> <mi>i</mi> <mi>g</mi> <mi>i</mi> <mi>n</mi> <mi>a</mi> <mi>l</mi> <mtext> </mtext> <mi>I</mi> <mi>m</mi> <mi>a</mi> <mi>g</mi> <mi>e</mi> <mtext> </mtext> <mi>G</mi> <mi>S</mi> <mi>D</mi> <mo>≈</mo> <mn>0.01</mn> <mi>m</mi> </mrow> </semantics></math>).</p>
Figure 24
<p>Orthophoto result from direct approach for FLIR RGB September 14th data collection (red boxes show the location of the check points, <math display="inline"><semantics> <mrow> <mi>O</mi> <mi>r</mi> <mi>i</mi> <mi>g</mi> <mi>i</mi> <mi>n</mi> <mi>a</mi> <mi>l</mi> <mtext> </mtext> <mi>G</mi> <mi>S</mi> <mi>D</mi> <mo>≈</mo> <mn>0.01</mn> <mi>m</mi> </mrow> </semantics></math>).</p>
Figure 25
<p>Orthophoto result from direct approach for Sony May 6th data collection (red boxes show the location of the check points, <math display="inline"><semantics> <mrow> <mtext> </mtext> <mi>G</mi> <mi>S</mi> <mi>D</mi> <mo>≈</mo> <mn>0.0056</mn> <mi>m</mi> </mrow> </semantics></math>).</p>
26 pages, 13271 KiB  
Article
A Study of Vertical Structures and Microphysical Characteristics of Different Convective Cloud–Precipitation Types Using Ka-Band Millimeter Wave Radar Measurements
by Jiafeng Zheng, Peiwen Zhang, Liping Liu, Yanxia Liu and Yuzhang Che
Remote Sens. 2019, 11(15), 1810; https://doi.org/10.3390/rs11151810 - 1 Aug 2019
Cited by 6 | Viewed by 5488
Abstract
Millimeter wave cloud radar (MMCR) is one of the primary instruments employed to observe cloud–precipitation. With appropriate data processing, measurements of the Doppler spectra, spectral moments, and retrievals can be used to study the physical processes of cloud–precipitation. This study mainly analyzed the [...] Read more.
Millimeter wave cloud radar (MMCR) is one of the primary instruments employed to observe cloud–precipitation. With appropriate data processing, measurements of the Doppler spectra, spectral moments, and retrievals can be used to study the physical processes of cloud–precipitation. This study analyzed the vertical structures and microphysical characteristics of different kinds of convective cloud–precipitation in South China during the pre-flood season using a vertically pointing Ka-band MMCR. Four kinds of convection, namely, multi-cell, isolated-cell, convective–stratiform mixed, and warm-cell convection, are discussed herein. The results show that the multi-cell and convective–stratiform mixed convections had similar vertical structures and experienced nearly the same microphysical processes in terms of particle phase change, particle size distribution, hydrometeor growth, and breakup. A forward pattern was proposed to characterize the vertical structure and provide radar spectra models reflecting the different microphysical and dynamic features and variations in different parts of the cloud body. Vertical air motion played a key role in the microphysical processes of the isolated- and warm-cell convections and strongly affected the ground rainfall properties. Stronger, thicker, and slanted updrafts caused heavier showers with stronger rain rates and groups of larger raindrops. The microphysical parameters of the warm-cell cloud–precipitation were retrieved from the radar data and compared with ground measurements from a disdrometer. The comparisons indicated that the radar retrievals were basically reliable; however, attenuation of the radar signal introduced some bias, especially in the particle number concentration. Note that differences in the sensitivity and detectable height of the two instruments also contributed to the deviations between the two measurements. Full article
(This article belongs to the Special Issue Radar Meteorology)
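The spectral moments the study works with (mean Doppler velocity, spectrum width, skewness, kurtosis) are power-weighted moments of the Doppler spectrum. The sketch below computes them on a synthetic Gaussian spectrum; it illustrates the standard moment definitions, not the MMCR's actual processing chain.

```python
# Sketch: power-weighted moments of a Doppler spectrum (power vs. velocity).
# The Gaussian spectrum below (mean 2 m/s, width 1 m/s) is synthetic.
import numpy as np

v = np.linspace(-8.0, 8.0, 256)                  # Doppler velocity bins (m/s)
power = np.exp(-0.5 * ((v - 2.0) / 1.0) ** 2)    # synthetic spectral power

w = power / power.sum()                          # power-weighted density
mean_v = np.sum(w * v)                           # mean Doppler velocity
width = np.sqrt(np.sum(w * (v - mean_v) ** 2))   # spectrum width
skew = np.sum(w * (v - mean_v) ** 3) / width**3  # spectral skewness
kurt = np.sum(w * (v - mean_v) ** 4) / width**4  # spectral kurtosis

print(round(mean_v, 2), round(width, 2), round(skew, 2), round(kurt, 2))
```

For a symmetric Gaussian spectrum the skewness is near zero and the kurtosis near three; departures from these values are what flag mixed-phase particles, updraft contamination, or multi-modal drop populations in real spectra.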
Graphical abstract
Figure 1
<p>The flow chart of data processing and microphysical parameter retrieval for MMCR and Parsivel disdrometer.</p>
Figure 2
<p>Weather charts on 21–23 April 2016: (<b>a</b>–<b>c</b>) at 700 hPa and (<b>d</b>–<b>f</b>) on the surface.</p>
Figure 3
<p>S-band radar combination reflectivity (CR) images of different convections between 21–23 April. (<b>a</b>–<b>i</b>) represent nine different moments. The circle represents the location of the Heyuan S-band radar, the cross represents that of the Longmen site.</p>
Figure 4
<p>Overall time–height cross-sections of the cloud–precipitation that happened over the Longmen site on 21–23 April 2016: (<b>a</b>) S-band radar reflectivity, (<b>b</b>) MMCR reflectivity, and (<b>c</b>) rain rate measured by Parsivel on the ground.</p>
Figure 5
<p>Time–height cross-sections of the MMCR measurements and Parsivel-measured rain rate and DSD for a nonlinear multi-cell convection that occurred over the site between 10:20–12:00 BJT on 21 April. (<b>a</b>–<b>g</b>) are radar reflectivity, mean Doppler velocity, spectrum width, vertical air velocity, linear depolarization ratio, spectral skewness, and spectral kurtosis; (<b>h</b>,<b>i</b>) are ground rain rate and drop size distribution.</p>
Figure 6
<p>Profiles of the averaged radar measurements and mean Doppler spectra at different heights observed between 10:20–12:00 BJT on 21 April. (<b>a</b>–<b>h</b>) are reflectivity, mean Doppler velocity, spectrum width, linear depolarization ratio, vertical air velocity, spectral skewness, spectral kurtosis, and Doppler spectra.</p>
Figure 7
<p>Forward pattern of the Ka-band MMCR Doppler spectra for seven different height ranges in the multi-cell convection.</p>
Figure 8
<p>Time–height cross-sections of the MMCR measurements and the Parsivel-measured rain rate and DSD on the ground for a squall line convection that occurred over the site between 08:00–12:20 BJT on 22 April. (<b>a</b>–<b>g</b>) are radar reflectivity, mean Doppler velocity, spectrum width, vertical air velocity, linear depolarization ratio, spectral skewness, and spectral kurtosis; (<b>h</b>,<b>i</b>) are ground rain rate and drop size distribution.</p>
Figure 9
<p>Time–height cross-sections of the MMCR measurements and the Parsivel-measured rain rate and DSD for a series of isolated cells between 14:34–17:10 BJT on 21 April. (<b>a</b>–<b>g</b>) are radar reflectivity, mean Doppler velocity, spectrum width, vertical air velocity, linear depolarization ratio, spectral skewness, and spectral kurtosis; (<b>h</b>,<b>i</b>) are ground rain rate and drop size distribution.</p>
Figure 10
<p>Time–height cross-sections of the MMCR measurements and the ground rain rate and DSD for a period of CSMCs obtained from 20:00 BJT on 22 April to 02:00 BJT on 23 April. (<b>a</b>–<b>g</b>) are radar reflectivity, particle terminal velocity, vertical air velocity, spectrum width, linear depolarization ratio, spectral skewness, and spectral kurtosis; (<b>h</b>,<b>i</b>) are ground rain rate and drop size distribution.</p>
Figure 11
<p>Profiles of the averaged radar measurements and mean Doppler spectra at different heights observed from 20:45 to 21:50 BJT on 22 April and 23:45 to 01:08 BJT on 23 April. (<b>a</b>–<b>i</b>) are reflectivity, particle terminal velocity, vertical air velocity, spectrum width, linear depolarization ratio, spectral skewness, spectral kurtosis, and Doppler spectra.</p>
Figure 12
<p>Time–height cross-sections of the MMCR measurements observed from 05:42 to 06:37 BJT on 22 April. (<b>a</b>–<b>f</b>) are reflectivity, particle terminal velocity, vertical air velocity, spectrum width, spectral skewness, and spectral kurtosis.</p>
Figure 13
<p>Time–height cross-sections of the MMCR-retrieved microphysical parameters and the ground rain rate and DSD observed from 05:42 to 06:37 BJT on 22 April. (<b>a</b>–<b>d</b>) are radar-derived liquid water content, rain rate, particle mean diameter, total number concentration; (<b>e</b>,<b>f</b>) are ground rain rate and drop size distribution.</p>
Figure 14
<p>Time series of (<b>a</b>) reflectivity, (<b>b</b>) rain rate, (<b>c</b>) particle mean diameter, and (<b>d</b>) total number concentration measured by the MMCR (at the first available range gate, 150 m) and Parsivel (on the ground).</p>
24 pages, 23283 KiB  
Article
Estimating Leaf Area Index with a New Vegetation Index Considering the Influence of Rice Panicles
by Jiaoyang He, Ni Zhang, Xi Su, Jingshan Lu, Xia Yao, Tao Cheng, Yan Zhu, Weixing Cao and Yongchao Tian
Remote Sens. 2019, 11(15), 1809; https://doi.org/10.3390/rs11151809 - 1 Aug 2019
Cited by 42 | Viewed by 6151
Abstract
The emergence of the rice panicle substantially changes the spectral reflectance of the rice canopy and, as a result, decreases the accuracy of leaf area index (LAI) estimates derived from vegetation indices (VIs). From a four-year field experiment using different rice varieties, nitrogen (N) [...] Read more.
The emergence of the rice panicle substantially changes the spectral reflectance of the rice canopy and, as a result, decreases the accuracy of leaf area index (LAI) estimates derived from vegetation indices (VIs). From a four-year field experiment using different rice varieties, nitrogen (N) rates, and planting densities, the spectral reflectance characteristics of panicles and the changes in canopy reflectance after panicle removal were investigated. A rice “panicle line”, a graphical relationship between the red-edge and near-infrared bands, was constructed from the near-infrared and red-edge spectral reflectance of rice panicles. Subsequently, a panicle-adjusted renormalized difference vegetation index (PRDVI), based on the “panicle line” and the renormalized difference vegetation index (RDVI), was developed to reduce the effects of rice panicles and background. The results showed that the effects of rice panicles on canopy reflectance were concentrated in the visible and near-infrared regions. The red band (670 nm) was the most affected by panicles, while the red-edge bands (720–740 nm) were less affected. In addition, a combination of near-infrared and red-edge bands best predicted LAI, with the difference vegetation index DI (976, 733) performing the best, although with relatively low estimation accuracy (R2 = 0.60, RMSE = 1.41 m2/m2). Based on these findings, the PRDVI was developed by correcting the near-infrared band in the RDVI with a panicle adjustment factor (θ) obtained from the “panicle line”, and by substituting the less-affected red-edge band for the red band. Verification data from an unmanned aerial vehicle (UAV) showed that the PRDVI minimized the panicle and background influence and was more sensitive to LAI (R2 = 0.77; RMSE = 1.01 m2/m2) than other VIs during the post-heading stage. Moreover, of all the assessed VIs, the PRDVI yielded the highest R2 (0.71) over the entire growth period, with an RMSE of 1.31 m2/m2. These results suggest that the PRDVI is an efficient and suitable LAI estimation index. Full article
(This article belongs to the Section Remote Sensing in Agriculture and Vegetation)
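For reference, the RDVI underlying the proposed index is (NIR − Red)/√(NIR + Red). The abstract describes the PRDVI as the RDVI with the red band replaced by a red-edge band and the NIR band corrected by a panicle adjustment factor θ (0.13 for the combined varieties, per the Figure 11 caption). The exact form of that correction is defined in the paper; the multiplicative adjustment in `prdvi_sketch` below is a hypothetical illustration only, and the reflectance values are invented.

```python
# Sketch: the standard RDVI, plus a hypothetical panicle-adjusted variant
# following the abstract's description (red-edge substituted for red, NIR
# scaled by a panicle adjustment factor theta). The adjustment form is an
# assumption, not the paper's formulation.
import math

def rdvi(nir, red):
    """Renormalized difference vegetation index: (NIR - Red) / sqrt(NIR + Red)."""
    return (nir - red) / math.sqrt(nir + red)

def prdvi_sketch(nir, red_edge, theta):
    """Hypothetical panicle-adjusted RDVI: shrink NIR by (1 - theta),
    then use the red-edge band in place of red."""
    nir_adj = nir * (1.0 - theta)
    return (nir_adj - red_edge) / math.sqrt(nir_adj + red_edge)

nir, red, red_edge = 0.45, 0.05, 0.20   # illustrative canopy reflectances
print(round(rdvi(nir, red), 3))
print(round(prdvi_sketch(nir, red_edge, theta=0.13), 3))
```

Because red-edge reflectance is higher than red reflectance over vegetation, the red-edge substitution necessarily lowers the index values; the point of the adjustment is reduced sensitivity to panicles, not comparability of magnitudes with the RDVI.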
Figure 1
<p>Views of rice canopy before and after panicle removal. (<b>a</b>) is the original rice canopy, and (<b>b</b>) is the rice canopy without panicles.</p>
Figure 2
<p>Measurement of panicle reflectance.</p>
Figure 3
<p>Influence of panicles on the reflectance of rice canopy. (<b>a</b>) is the variations in the spectral signature of rice canopies before (R<sub>original canopy</sub>) and after (R<sub>canopy without panicles</sub>) the removal of panicles, and (<b>b</b>) is the relative variation rates (%) and the differences between R<sub>original canopy</sub> and R<sub>canopy without panicles</sub> for different varieties (V3 = WYJ24, V4 = EY728) and transplanting densities (D2 = 22.22 plants m<sup>−2</sup>, D3 = 16.66 plants m<sup>−2</sup>, D4 = 13.33 plants m<sup>−2</sup>). The sampling date was 96 days after transplanting (2015). Significant noise in the reflectance (caused by the effects of atmospheric moisture) was removed in the wavelength ranges 1351–1380, 1781–1970, and 2351–2500 nm. The color-shaded areas in (a) and (b) represent 720–740 (green), 450–730 (gray), and 730–1350 nm (yellow).</p>
Figure 3 Cont.
<p>Influence of panicles on the reflectance of rice canopy. (<b>a</b>) is the variations in the spectral signature of rice canopies before (R<sub>original canopy</sub>) and after (R<sub>canopy without panicles</sub>) the removal of panicles, and (<b>b</b>) is the relative variation rates (%) and the differences between R<sub>original canopy</sub> and R<sub>canopy without panicles</sub> for different varieties (V3 = WYJ24, V4 = EY728) and transplanting densities (D2 = 22.22 plants m<sup>−2</sup>, D3 = 16.66 plants m<sup>−2</sup>, D4 = 13.33 plants m<sup>−2</sup>). The sampling date was 96 days after transplanting (2015). Significant noise in the reflectance (caused by the effects of atmospheric moisture) was removed in the wavelength ranges 1351–1380, 1781–1970, and 2351–2500 nm. The color-shaded areas in (a) and (b) represent 720–740 (green), 450–730 (gray), and 730–1350 nm (yellow).</p>
Figure 4
<p>Variation in leaf area index (LAI) during the entire growth period of rice. Vertical dotted lines in the graph represent the initial heading period. The initial heading dates in 2013, 2014, 2015, and 2016 were 77, 74, 76, and 75 days after transplanting, respectively.</p>
Figure 5
<p>Rice canopy under different treatments. <b>a</b>, <b>b</b>, <b>c</b>, and <b>d</b> represent V3D4, V3D2, V4D4, and V4D2, respectively.</p>
Figure 6
<p>Plot of near-infrared band (NIR) reflectance versus red-edge reflectance (<b>a</b>) and the panicle line developed in this study (<b>b</b>). The dotted lines in (<b>a</b>) indicate the maximum values of the red-edge and near-infrared reflectance of the panicle-removed canopy.</p>
Figure 7
<p>Relative variation rate (R<sub>v</sub>, %) of LAI estimation accuracy (the coefficient of determination, R<sup>2</sup>) before (data set 5) and after panicle removal (data set 6). A positive R<sub>v</sub> value indicates that R<sup>2</sup> increased after panicle removal, whereas a negative value indicates a decrease.</p>
Figure 8
<p>Relative variation rates (R<sub>v</sub>, %) of different vegetation indices in response to the removal of panicles from rice canopies with different treatments. R<sub>v</sub> is the percentage of the vegetation index after panicle removal minus the vegetation index of the original canopy over the absolute value of the vegetation index of the original canopy. A positive R<sub>v</sub> value indicates an increase in the vegetation index after panicle removal, whereas a negative value indicates a decrease. Erect treatment = erect-type variety (V3), 16.66 plants m<sup>−2</sup>, 300 kg N ha<sup>−1</sup>; Spread treatment = spread-type variety (V4), 16.66 plants m<sup>−2</sup>, 300 kg N ha<sup>−1</sup>; Low N treatment = spread-type variety (V4), 22.22 plants m<sup>−2</sup>, 100 kg N ha<sup>-1</sup>; High N treatment = spread-type variety (V4), 22.22 plants m<sup>−2</sup>, 300 kg N ha<sup>−1</sup>; High plant density = erect-type variety (V3), 22.22 plants m<sup>−2</sup>, 300 kg N ha<sup>−1</sup>; Low plant density = erect-type variety (V3), 13.33 plants m<sup>−2</sup>, 300 kg N ha<sup>−1</sup>.</p>
Figure 9
<p>Correlations between DI, SAVI, RDVI, CI, and WDRVI, and rice canopy LAI that were calculated using R<sub>original canopy</sub> (data set 5) and R<sub>canopy without panicles</sub> (data set 6).</p>
Figure 10
<p>Correlations between the DI, SAVI, RDVI, CI, and WDRVI and rice canopy LAI that were calculated using the reflectance of rice canopy in the pre-heading stages (data set 1), post-heading stages (data set 2), and all stages (data set 3).</p>
Figure 11
<p>Relationships between LAI and the new vegetation index (Panicle-Adjusted Renormalized Difference Vegetation Index (PRDVI)) (using data set 2). For V3, V4, and V3 + V4, the panicle adjustment coefficients (θ) were 0.08, 0.16, and 0.13, respectively.</p>
Figure 12
<p>Relationships between LAI and different vegetation indices (using data sets 10 and 11). The panicle adjustment coefficient (θ) for the PRDVI was 0.13.</p>
14 pages, 3074 KiB  
Letter
Are There Sufficient Landsat Observations for Retrospective and Continuous Monitoring of Land Cover Changes in China?
by Yan Zhou, Jinwei Dong, Jiyuan Liu, Graciela Metternicht, Wei Shen, Nanshan You, Guosong Zhao and Xiangming Xiao
Remote Sens. 2019, 11(15), 1808; https://doi.org/10.3390/rs11151808 - 1 Aug 2019
Cited by 27 | Viewed by 5542
Abstract
Unprecedented human-induced land cover changes occurred in China after the Reform and Opening-up in 1978, coinciding with the era of the Landsat satellite series. However, it is still unknown whether Landsat data can effectively support retrospective analysis of land cover changes in China over [...] Read more.
Unprecedented human-induced land cover changes occurred in China after the Reform and Opening-up in 1978, coinciding with the era of the Landsat satellite series. However, it is still unknown whether Landsat data can effectively support retrospective analysis of land cover changes in China over the past four decades. Here, for the first time, we conduct a systematic investigation of the availability of Landsat data in China, targeting its application for retrospective and continuous monitoring of land cover changes. The latter is significant for assessing the impacts of land cover changes and the consequences of past land policy and management interventions. The total and valid observations (excluding clouds, cloud shadows, and terrain shadows) from Landsat 5/7/8 from 1984 to 2017 were quantified at the pixel scale on the cloud computing platform Google Earth Engine (GEE). The results show a higher intensity of Landsat observation in the northern part of China than in the southern part. The study provides an overall picture of Landsat observations suitable for satellite-based annual land cover monitoring over the entire country. We find that two sub-regions of China (i.e., Northeast China-Inner Mongolia-Northwest China, and the North China Plain) have sufficient valid observations for retrospective analysis of land cover over 30 years (1987–2017) at an annual interval, whereas the Middle-Lower Yangtze Plain (MLYP) and Xinjiang (XJ) have sufficient observations for annual analyses for the periods 1989–2017 and 2004–2017, respectively. Retrospective analysis of land cover is possible only at a two-year interval in South China (SC) for the years 1988–2017, in XJ for the period 1992–2003, and on the Tibetan Plateau (TP) during 2004–2017. For the latter geographic regions, land cover dynamics can be analyzed only at a three-year interval prior to 2004. Our retrospective analysis suggests that Landsat-based analysis of land cover dynamics at an annual interval for the whole country is not feasible; instead, national monitoring at two- or three-year intervals could be achievable. This study provides a preliminary assessment of data availability, targeting future continuous land cover monitoring in China, and the code is released to the public to facilitate similar data inventories in other regions of the world. Full article
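The per-pixel availability test described above (at least one valid observation in every epoch of a given interval, for at least 95% of pixels) reduces to simple Boolean reductions over an observation stack. Below is a toy sketch with a synthetic validity mask standing in for the cloud/shadow-masked Landsat record that the study assembled on GEE; sizes and probabilities are invented.

```python
# Sketch: count valid (cloud/shadow-free) observations per pixel and compute
# the fraction of pixels with >= 1 valid observation in every epoch of a given
# interval. The random validity mask is a stand-in for real Landsat QA masks.
import numpy as np

rng = np.random.default_rng(7)
years, h, w = 6, 4, 4
# valid[i, y, x] is True where year i has a cloud-free observation of pixel (y, x)
valid = rng.random((years, h, w)) > 0.3

total_valid = valid.sum(axis=0)   # valid observation count per pixel

def coverage(valid_stack, interval):
    """Fraction of pixels with >= 1 valid observation in every epoch of
    `interval` consecutive years (trailing years beyond a full epoch dropped)."""
    n = valid_stack.shape[0] // interval
    epochs = valid_stack[:n * interval].reshape(n, interval, *valid_stack.shape[1:])
    ok = epochs.any(axis=1).all(axis=0)   # covered in every epoch
    return ok.mean()

print(coverage(valid, 1), coverage(valid, 2))
```

Lengthening the interval can only raise coverage, which is exactly why the study falls back from annual to two- and three-year monitoring intervals in cloud-prone regions such as South China and the Tibetan Plateau.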
Graphical abstract
Figure 1
<p>Spatial distributions of the total L5/7/8 images across the entirety of China from 1984 to 2017. NIN: Northeast China–Inner Mongolia–Northwest China; NCP: North China Plain; MLYP: Middle–Lower Yangtze Plain; SC: South China; XJ: Xinjiang; TP: Tibetan Plateau</p>
Figure 2
<p>Monthly average number of Landsat observations. Monthly average number of total (<b>a</b>) and valid (<b>b</b>) observations of L5/7/8 in China from 1984 to 2017. Monthly average number of total (<b>c</b>) and valid (<b>d</b>) observations of L5 (1984–2011), L7 (1999–2017), and L8 (2013–2017) in China from 1984 to 2017, respectively.</p>
Figure 3
<p>Spatial distributions of total (<b>a</b>) and valid (<b>b</b>) Landsat observations across the entirety of China from L5/7/8 from 1984 to 2017. Insets in (<b>a</b>) and (<b>b</b>) show average numbers of total and valid observations as function of geographic latitude; (<b>c</b>) and (<b>d</b>) show histograms of Landsat pixels with different total and valid observation numbers, respectively, from 1984 to 2017.</p>
Figure 4
<p>Percentages of Landsat pixels with at least one valid observation during the OIATs (see <a href="#remotesensing-11-01808-t001" class="html-table">Table 1</a>) for the regions of NIN, NCP, and MLYP at one-year interval; SC at one- and two-year intervals; and XJ and TP with one-, two-, three-, and four-year intervals from 1984 to 2017. Black dotted horizontal lines represent 95%, while dotted vertical lines and shaded bars show start years or epochs when the percentages of Landsat pixels with at least one valid observation were consistently greater than or equal to 95%.</p>
Figure 5
<p>Epoch numbers that meet the demand for at least one valid observation during the corresponding OIAT (see <a href="#remotesensing-11-01808-t001" class="html-table">Table 1</a>) for individual Landsat pixels in China during 1984–2017; at one- and two-year intervals (<b>a</b>,<b>b</b>) during 1985–2017, and 1986–2017 with three- and four-year interval (<b>c</b>,<b>d</b>), respectively.</p>