Fast Fusion of Sentinel-2 and Sentinel-3 Time Series over Rangelands
Figure 1. The test areas are shown in white squares. The rangeland true-color image and NDVI data (top) were acquired by Sentinel-2 on 6 October 2021, and the cropland images (bottom) date from 13 December 2022. The cropland area is surrounded by grasslands along Lake Guiers. The rangeland area is situated a few kilometers northeast of the town of Dahra. The Dahra field site (represented by a small white dot in the top-right corner of the rangeland area) includes a hemispherical NDVI sensor that we use to evaluate the reliability of our method.
Figure 2. Example showing Sentinel-2 and Sentinel-3 NDVI data around the wet seasons (months 7–12) of 2019, 2020, and 2021 in Senegal. Atmospheric effects lead to underestimations in the Sentinel-2 and Sentinel-3 data, which are especially apparent around September in 2020 and 2021. The timing of vegetation growth varies from July to August. Higher cloud cover during the wet season leads to fewer acquisitions, with an especially long period without Sentinel-2 data in 2020.
Figure 3. The fusion principle: the EFAST creates synthetic high-resolution data through a simple transformation of Sentinel-2 and Sentinel-3 images, as follows: S2(t*) + S3(t) - S3(t*).
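The fusion principle above can be sketched in a few lines of code. This is a minimal illustration with synthetic NumPy arrays, not the authors' implementation; the function and variable names are assumptions, and in practice the Sentinel-3 images would first be resampled to the Sentinel-2 grid (here all arrays share one shape for simplicity):

```python
import numpy as np

def efast_fuse(s2_tstar, s3_t, s3_tstar):
    """Synthetic high-resolution image at time t: the high-resolution
    anchor S2(t*) shifted by the low-resolution change S3(t) - S3(t*)."""
    return s2_tstar + (s3_t - s3_tstar)

# Toy example: coarse-scale NDVI rises by 0.2 everywhere between t* and t,
# so the fused image keeps the fine spatial detail of the Sentinel-2
# anchor while following the coarse-scale temporal change.
s2_anchor = np.array([[0.30, 0.50],
                      [0.20, 0.40]])   # Sentinel-2 at t*
s3_now    = np.full((2, 2), 0.45)      # Sentinel-3 at t
s3_anchor = np.full((2, 2), 0.25)      # Sentinel-3 at t*

fused = efast_fuse(s2_anchor, s3_now, s3_anchor)
```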
Figure 4. Validation was performed on all cloud-free Sentinel-2 images acquired during the wet season (black points); the rest of the cloud-free Sentinel-2 acquisitions (gray points) and all the Sentinel-3 observations were used for interpolation.
Figure 5. STARFM, EFAST, and the Whittaker filter compared to in situ data at the Dahra field site.
Figure 6. Example of a Sentinel-2 image used for validation, dating from 17 September 2019 (a), and the corresponding prediction by the EFAST (b). The absolute difference between these two images is one of the 12 terms (one for each validation image) of the mean absolute error map (Figure 7c).
Figure 7. Mean absolute error maps of the reconstructed NDVI profiles using the Whittaker filter (a), STARFM (b), and EFAST (c). The depicted dots represent specific points for which the corresponding time series are illustrated in Figure 8.
Figure 8. The predicted time series of the three interpolation methods (the Whittaker filter, STARFM, and EFAST) for the three points of the rangeland area (Figure 7). Black points represent Sentinel-2 validation points, and gray points represent the Sentinel-2 data used for interpolation. Orange areas correspond to the time frames in which the reconstruction of the NDVI profile is assessed.
Figure 9. The mean absolute error using the Whittaker filter (a), STARFM (b), and EFAST (c). The dots correspond to the points for which time series are displayed in Figure 10.
Figure 10. The predicted time series of the three interpolation methods (the Whittaker filter, STARFM, and EFAST) for three points in the cropland area (Figure 9). Black points represent Sentinel-2 validation points, and gray points represent the Sentinel-2 data used for interpolation. Orange areas correspond to time frames in which the reconstruction of the NDVI profile is assessed.
Figure 11. Pearson correlation coefficient between the Sentinel-2 and Sentinel-3 time series. Small-scale features stand out as having a low correlation because of Sentinel-3's limited spatial resolution. Conversely, large crop parcels and homogeneous grasslands present a high correlation. The white box corresponds to the area in Figure 9.
Figure A1. Maximum time, in days, without Sentinel-2 data over the African continent between August 2021 and January 2023. Extracted using Google Earth Engine. The brighter stripes correspond to areas of overlap between two orbits.
Figure A2. Smoothing parameters s and D. (a) Distance-to-clouds score as a function of the distance (in km) to the closest masked cloud: the score equals 0.5 two kilometers from the cloud mask and reaches 1 from D = 4 km. (b) Impact of the temporal smoothing parameter s (in days) on the temporal weights exp[-(t - t*)^2 / (2 s^2)], displayed as bars, when there is one cloud-free Sentinel-2 acquisition every five days. Lines represent Gaussian distributions for s = 10 and 30 days.
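The two weighting schemes described in Figure A2 can be sketched as follows. The Gaussian temporal weight matches the formula in the caption; the linear ramp for the distance-to-clouds score is an assumption, chosen only because it is consistent with the stated values of 0.5 at 2 km and 1 from D = 4 km:

```python
import numpy as np

def temporal_weight(t, t_star, s):
    """Gaussian temporal weight exp(-(t - t*)^2 / (2 s^2)),
    with t, t_star, and s in days."""
    return np.exp(-((t - t_star) ** 2) / (2.0 * s ** 2))

def distance_to_clouds_score(d_km, D=4.0):
    """Assumed linear ramp: 0 at the cloud mask, 1 from D km away."""
    return np.clip(d_km / D, 0.0, 1.0)

# Acquisition dates every 5 days around a prediction date t* = 0:
# a larger s spreads the weight over more acquisitions.
t = np.arange(-20, 21, 5)
w10 = temporal_weight(t, 0.0, s=10.0)   # narrow window
w30 = temporal_weight(t, 0.0, s=30.0)   # wide window, flatter weights
```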
Abstract
1. Introduction
2. Materials and Methods
2.1. Study Area
2.2. Sentinel-2 Processing
2.3. Sentinel-3 Processing
2.4. Fusion Principle
2.5. Spatial and Temporal Smoothing
2.6. Validation Strategy
- The Whittaker filter, which smooths and interpolates time series while being resilient to missing data, making it a commonly used method in remote sensing [26,27]. The Whittaker filter includes a smoothing parameter, which we set to 400 days² = (20 days)²; this appears consistent with the EFAST smoothing parameter (s = 20 days). This method only uses Sentinel-2 data, so the comparison of the EFAST and the Whittaker filter aims to demonstrate the value of adding Sentinel-3 to the equation.
- The STARFM, a spatio-temporal fusion algorithm [14], with the following parameters: four classes and a window size of 31 pixels. We use Mileva's open-source Python implementation from 2018 [28] so that its speed can be compared with our approach in the same environment. We use the single-pair version of the STARFM and choose the closest cloud-free Sentinel-2 image as input data. Comparing the performance of our method with that of the STARFM allows us to verify whether the EFAST's gain in computational efficiency over the STARFM comes at a cost in accuracy and, if so, to quantify that cost.
- A comparison using in situ data at the Dahra field site (experiment 1). The interpolated time series are produced using all Sentinel-2 observations that do not contain clouds within a radius of 1 km from the site (to avoid undetected clouds and cloud shadows). For the STARFM and EFAST, we also use the entire smoothed Sentinel-3 time series. The Sentinel-2 input data and the predictions are displayed at the position of the Dahra field site (as the average value over a 3-by-3-pixel box to account for misalignment between the Sentinel-2 resolution cell and the multispectral sensor). We compare these time series to in situ data obtained over four years, from 2019 through the end of 2022.
- Across the two study areas highlighted in Figure 1 (experiment 2), to assess performance on a larger scale and at a high resolution, we use the Sentinel-2 data itself for validation. We keep the Sentinel-2 images acquired in July, August, or September for validation (Figure 4), leading to temporal gaps of three months. Discarding three months' worth of data emulates plausible conditions in these semi-arid ecosystems (Figure A1). To avoid contaminating the errors with clouds and cloud shadows, we only consider Sentinel-2 images that are cloud-free over the extent of the study area. The absolute differences between the Sentinel-2 images kept for validation (12 images for the rangeland area and 17 for the cropland area) and the corresponding predictions are aggregated and displayed as error maps.
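The Whittaker filter used as the first baseline above can be sketched in a few lines. This is a minimal dense NumPy version for illustration only (production code typically uses sparse matrices); the weight vector is what makes the filter resilient to missing data, since cloud-masked observations are simply given zero weight and interpolated:

```python
import numpy as np

def whittaker_smooth(y, weights, lam, order=2):
    """Minimise sum_i w_i (y_i - z_i)^2 + lam * sum (Delta^order z)^2.

    y       : observed series (NaN allowed where weights are 0)
    weights : 1 for valid observations, 0 for missing ones
    lam     : smoothing parameter, e.g. 400 days^2 = (20 days)^2
    """
    y = np.where(weights > 0, y, 0.0)         # masked values are ignored
    n = len(y)
    D = np.diff(np.eye(n), order, axis=0)     # difference operator
    A = np.diag(weights) + lam * D.T @ D      # normal equations
    return np.linalg.solve(A, weights * y)

# A linear NDVI ramp with one missing observation is recovered exactly,
# because a straight line has zero second differences (zero penalty).
y = np.array([0.1, 0.2, np.nan, 0.4, 0.5])
w = np.array([1.0, 1.0, 0.0, 1.0, 1.0])
z = whittaker_smooth(y, w, lam=400.0)
```

With all weights equal to one the filter only smooths; zero weights turn it into an interpolator, which is how cloud-gapped Sentinel-2 series are handled.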
3. Results
3.1. Field Site Evaluation
3.2. Reconstruction of the Wet Season
3.2.1. Rangeland Area
3.2.2. Cropland Area
- The spatial averaging of the STARFM makes use of the lower part of the study area, where the Sentinel-3 pixels are more homogeneous and mainly composed of grass.
- The temporal averaging of the EFAST gives a lower weight to individual cloud-free pixels, corrupting the phenological signal even in periods of low cloud cover. This is particularly apparent in Figure 10(5), where the vegetation growth of the irrigated croplands around December in 2020, 2021, and 2022 leads to a predicted bi-seasonality of the grass.
3.3. Computation Time
4. Discussion
4.1. Efficiency over Large Scales
4.2. Consequences for Rangeland Monitoring
4.3. Limitations over Heterogeneous Areas
4.4. Land-Cover Change
4.5. Smoothing Parameters
4.6. Sentinel-3’s Temporal Profile
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A
Appendix B
References
- Skidmore, A.K.; Coops, N.C.; Neinavaz, E.; Ali, A.; Schaepman, M.E.; Paganini, M.; Kissling, W.D.; Vihervaara, P.; Darvishzadeh, R.; Feilhauer, H.; et al. Priority list of biodiversity metrics to observe from space. Nat. Ecol. Evol. 2021, 5, 896–906. [Google Scholar] [CrossRef] [PubMed]
- Senf, C. Seeing the System from Above: The Use and Potential of Remote Sensing for Studying Ecosystem Dynamics. Ecosystems 2022, 25, 1719–1737. [Google Scholar] [CrossRef]
- Hansen, M.C.; Potapov, P.V.; Moore, R.; Hancher, M.; Turubanova, S.A.; Tyukavina, A.; Thau, D.; Stehman, S.V.; Goetz, S.J.; Loveland, T.R.; et al. High-Resolution Global Maps of 21st-Century Forest Cover Change. Science 2013, 342, 850–853. [Google Scholar] [CrossRef] [PubMed]
- Tucker, C.; Brandt, M.; Hiernaux, P.; Kariryaa, A.; Rasmussen, K.; Small, J.; Igel, C.; Reiner, F.; Melocik, K.; Meyer, J.; et al. Sub-continental-scale carbon stocks of individual trees in African drylands. Nature 2023, 615, 80–86. [Google Scholar] [CrossRef] [PubMed]
- Misra, G.; Cawkwell, F.; Wingler, A. Status of Phenological Research Using Sentinel-2 Data: A Review. Remote Sens. 2020, 12, 2760. [Google Scholar] [CrossRef]
- Cleland, E.E.; Chuine, I.; Menzel, A.; Mooney, H.A.; Schwartz, M.D. Shifting plant phenology in response to global change. Trends Ecol. Evol. 2007, 22, 357–365. [Google Scholar] [CrossRef] [PubMed]
- Zurita-Milla, R.; Gomez-Chova, L.; Guanter, L.; Clevers, J.G.P.W.; Camps-Valls, G. Multitemporal Unmixing of Medium-Spatial-Resolution Satellite Images: A Case Study Using MERIS Images for Land-Cover Mapping. IEEE Trans. Geosci. Remote Sens. 2011, 49, 4308–4317. [Google Scholar] [CrossRef]
- Amorós-López, J.; Gómez-Chova, L.; Alonso, L.; Guanter, L.; Zurita-Milla, R.; Moreno, J.; Camps-Valls, G. Multitemporal fusion of Landsat/TM and ENVISAT/MERIS for crop monitoring. Int. J. Appl. Earth Obs. Geoinf. 2013, 23, 132–141. [Google Scholar] [CrossRef]
- Gevaert, C.M.; García-Haro, F.J. A comparison of STARFM and an unmixing-based algorithm for Landsat and MODIS data fusion. Remote Sens. Environ. 2015, 156, 34–44. [Google Scholar] [CrossRef]
- Liu, X.; Deng, C.; Wang, S.; Huang, G.-B.; Zhao, B.; Lauren, P. Fast and Accurate Spatiotemporal Fusion Based Upon Extreme Learning Machine. IEEE Geosci. Remote Sens. Lett. 2016, 13, 2039–2043. [Google Scholar] [CrossRef]
- Song, H.; Liu, Q.; Wang, G.; Hang, R.; Huang, B. Spatiotemporal Satellite Image Fusion Using Deep Convolutional Neural Networks. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2018, 11, 821–829. [Google Scholar] [CrossRef]
- Cai, J.; Huang, B.; Fung, T. Progressive spatiotemporal image fusion with deep neural networks. Int. J. Appl. Earth Obs. Geoinf. 2022, 108, 102745. [Google Scholar] [CrossRef]
- Wang, Z.; Ma, Y.; Zhang, Y. Review of pixel-level remote sensing image fusion based on deep learning. Inf. Fusion 2023, 90, 36–58. [Google Scholar] [CrossRef]
- Gao, F.; Masek, J.; Schwaller, M.; Hall, F. On the blending of the Landsat and MODIS surface reflectance: Predicting daily Landsat surface reflectance. IEEE Trans. Geosci. Remote Sens. 2006, 44, 2207–2218. [Google Scholar] [CrossRef]
- Jia, K.; Liang, S.; Zhang, L.; Wei, X.; Yao, Y.; Xie, X. Forest cover classification using Landsat ETM+ data and time series MODIS NDVI data. Int. J. Appl. Earth Obs. Geoinf. 2014, 33, 32–38. [Google Scholar] [CrossRef]
- Gao, F.; Anderson, M.C.; Zhang, X.; Yang, Z.; Alfieri, J.G.; Kustas, W.P.; Mueller, R.; Johnson, D.M.; Prueger, J.H. Toward mapping crop progress at field scales through fusion of Landsat and MODIS imagery. Remote Sens. Environ. 2017, 188, 9–25. [Google Scholar] [CrossRef]
- Olexa, E.M.; Lawrence, R.L. Performance and effects of land cover type on synthetic surface reflectance data and NDVI estimates for assessment and monitoring of semi-arid rangeland. Int. J. Appl. Earth Obs. Geoinf. 2014, 30, 30–41. [Google Scholar] [CrossRef]
- Zhu, X.; Chen, J.; Gao, F.; Chen, X.; Masek, J.G. An enhanced spatial and temporal adaptive reflectance fusion model for complex heterogeneous regions. Remote Sens. Environ. 2010, 114, 2610–2623. [Google Scholar] [CrossRef]
- Wang, Q.; Atkinson, P.M. Spatio-temporal fusion for daily Sentinel-2 images. Remote Sens. Environ. 2018, 204, 31–42. [Google Scholar] [CrossRef]
- Guan, Q.; Peng, X. High-performance Spatio-temporal Fusion Models for Remote Sensing Images with Graphics Processing Units. AGU Fall Meet. Abstr. 2018, 2018, IN41D-0866. [Google Scholar]
- Gao, H.; Zhu, X.; Guan, Q.; Yang, X.; Yao, Y.; Zeng, W.; Peng, X. cuFSDAF: An Enhanced Flexible Spatiotemporal Data Fusion Algorithm Parallelized Using Graphics Processing Units. IEEE Trans. Geosci. Remote Sens. 2022, 60, 4403016. [Google Scholar] [CrossRef]
- Xie, D.; Gao, F.; Sun, L.; Anderson, M. Improving Spatial-Temporal Data Fusion by Choosing Optimal Input Image Pairs. Remote Sens. 2018, 10, 1142. [Google Scholar] [CrossRef]
- Tagesson, T.; Fensholt, R.; Guiro, I.; Rasmussen, M.O.; Huber, S.; Mbow, C.; Garcia, M.; Horion, S.; Sandholt, I.; Holm-Rasmussen, B.; et al. Ecosystem properties of semiarid savanna grassland in West Africa and its relationship with environmental variability. Glob. Change Biol. 2015, 21, 250–264. [Google Scholar] [CrossRef] [PubMed]
- Zhu, X.; Cai, F.; Tian, J.; Williams, T.K.-A. Spatiotemporal Fusion of Multisource Remote Sensing Data: Literature Survey, Taxonomy, Principles, Applications, and Future Directions. Remote Sens. 2018, 10, 527. [Google Scholar] [CrossRef]
- Griffiths, P.; van der Linden, S.; Kuemmerle, T.; Hostert, P. A Pixel-Based Landsat Compositing Algorithm for Large Area Land Cover Mapping. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2013, 6, 2088–2101. [Google Scholar] [CrossRef]
- Eilers, P.H.C.; Pesendorfer, V.; Bonifacio, R. Automatic smoothing of remote sensing data. In Proceedings of the 2017 9th International Workshop on the Analysis of Multitemporal Remote Sensing Images (MultiTemp), Brugge, Belgium, 27–29 June 2017; pp. 1–3. [Google Scholar]
- Atkinson, P.M.; Jeganathan, C.; Dash, J.; Atzberger, C. Inter-comparison of four models for smoothing satellite sensor time-series data to estimate vegetation phenology. Remote Sens. Environ. 2012, 123, 400–417. [Google Scholar] [CrossRef]
- Mileva, N.; Mecklenburg, S.; Gascon, F. New tool for spatio-temporal image fusion in remote sensing: A case study approach using Sentinel-2 and Sentinel-3 data. In Proceedings of the Image and Signal Processing for Remote Sensing XXIV, Berlin, Germany, 10–13 September 2018; p. 20. [Google Scholar]
- Higgins, S.I.; Delgado-Cartay, M.D.; February, E.C.; Combrink, H.J. Is there a temporal niche separation in the leaf phenology of savanna trees and grasses? J. Biogeogr. 2011, 38, 2165–2175. [Google Scholar] [CrossRef]
- Scholes, R.J.; Walker, B.H. An African Savanna: Synthesis of the Nylsvley Study; Cambridge Studies in Applied Ecology and Resource Management; Cambridge University Press: Cambridge, UK, 1993; ISBN 978-0-521-61210-4. [Google Scholar]
- Rao, Y.; Zhu, X.; Chen, J.; Wang, J. An Improved Method for Producing High Spatial-Resolution NDVI Time Series Datasets with Multi-Temporal MODIS NDVI Data and Landsat TM/ETM+ Images. Remote Sens. 2015, 7, 7865–7891. [Google Scholar] [CrossRef]
- Hilker, T.; Wulder, M.A.; Coops, N.C.; Linke, J.; McDermid, G.; Masek, J.G.; Gao, F.; White, J.C. A new data fusion model for high spatial- and temporal-resolution mapping of forest disturbance based on Landsat and MODIS. Remote Sens. Environ. 2009, 113, 1613–1627. [Google Scholar] [CrossRef]
- Frasso, G.; Eilers, P.H. L- and V-Curves for Optimal Smoothing. Stat. Model. 2015, 15, 91–111. [Google Scholar] [CrossRef]
Mean absolute error of the reconstructed NDVI profiles:

Area | Whittaker | STARFM | EFAST
---|---|---|---
Rangeland | 0.172 | 0.042 | 0.044
Cropland | 0.075 | 0.040 | 0.042
Computation time:

Whittaker | STARFM | EFAST
---|---|---
0.1 * | 85 | 0.6
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Senty, P.; Guzinski, R.; Grogan, K.; Buitenwerf, R.; Ardö, J.; Eklundh, L.; Koukos, A.; Tagesson, T.; Munk, M. Fast Fusion of Sentinel-2 and Sentinel-3 Time Series over Rangelands. Remote Sens. 2024, 16, 1833. https://doi.org/10.3390/rs16111833