Automated Cloud Shadow Detection from Satellite Orthoimages with Uncorrected Cloud Relief Displacements
Figure 1. A workflow of the cloud shadow detection method.
Figure 2. Illustrations of the locations where cloud shadows are projected: (a) a case where cloud shadows are projected according to the height of the clouds; (b) the positions of clouds and cloud shadows depicted in the image.
Figure 3. An example of the positions of clouds and cloud shadows in satellite images: (a) before orthorectification; (b) after orthorectification.
Figure 4. Illustrations depicting cloud relief displacement: (a) a case of a vertical image; (b) a case of a high-oblique image.
Figure 5. A calculation method for the direction vector from a cloud to its cloud shadow in a 3-dimensional coordinate system.
Figure 6. Explanation of the search range of cloud shadows based on cloud height.
Figure 7. Calculation method for cloud object movement in image coordinates using ground coordinates.
Figure 8. Examples explaining the redesign of the spectral threshold in Equation (12): (a) explanation of the NIR-RED ratio values for shadows projected on vegetation and water; (b) the reason for changing the NIR threshold.
Figure 9. A decision tree diagram applied in Equations (9)–(12).
Figure 10. An example of noise removal: (a) before noise removal; (b) after noise removal.
Figure 11. Satellite images and reference data used in the experiment: (a) Scene-1 image; (b) Scene-2 image; (c) Scene-3 image (white pixels and black pixels denote clouds and cloud shadows, respectively).
Figure 12. Verification data for checking the azimuth angle from cloud to cloud shadow collected in Scene-1: (C1–C10) represent verification data 1 to 10 (C for case) from Scene-1 (the orange arrows represent $\vec{T}$).
Figure 13. The result of comparing the yellow direction vector, which considers only the sun geometry, and the green direction vector, which considers both the sensor and the sun geometry, in Scene-1: (C6) the result of case 6 in Scene-1; (C7) the result of case 7 in Scene-1 (white pixels and black pixels denote clouds and cloud shadows, respectively).
Figure 14. Verification data for checking the azimuth angle from cloud to cloud shadow collected in Scene-2: (C1–C10) represent verification data 1 to 10 (C for case) from Scene-2 (the orange arrows represent $\vec{T}$).
Figure 15. The result of comparing the yellow direction vector, which considers only the sun geometry, and the green direction vector, which considers both the sensor and the sun geometry, in Scene-2: (C1) the result of case 1 in Scene-2; (C5) the result of case 5 in Scene-2 (white pixels and black pixels denote clouds and cloud shadows, respectively).
Figure 16. Verification data for checking the azimuth angle from cloud to cloud shadow collected in Scene-3: (C1–C10) represent verification data 1 to 10 (C for case) from Scene-3 (the orange arrows represent $\vec{T}$).
Figure 17. The result of comparing the yellow direction vector, which considers only the sun geometry, and the green direction vector, which considers both the sensor and the sun geometry, in Scene-3: (C5) the result of case 5 in Scene-3; (C6) the result of case 6 in Scene-3 (white pixels and black pixels denote clouds and cloud shadows, respectively).
Figure 18. Intermediate process of cloud shadow detection for searching the PCSR in Scene-3: (a) the NIR-R-G composite image; (b–e) the process of shadow detection based on distance using $\vec{C_1}$; (f–i) the same process using $\vec{C_2}$ (white pixels and black pixels denote clouds and cloud shadows, respectively).
Figure 19. Post-processing for cloud shadow detection: (a) the reference data; (b) cloud shadow detection before post-processing; (c) cloud shadow detection after post-processing (white pixels and black pixels denote clouds and cloud shadows, respectively).
Figure 20. Cloud shadow detection results from Scene-1 images (white pixels and black pixels denote clouds and cloud shadows, respectively): (a–c) show, in sequence, the reference clouds and cloud shadows, the detection results using $\vec{C_1}$, and the detection results using $\vec{C_2}$ from an enlarged image.
Figure 21. Cloud shadow detection results from Scene-2 images (same layout as Figure 20).
Figure 22. Cloud shadow detection results from Scene-3 images (same layout as Figure 20).
Abstract
1. Introduction
- By applying both solar and sensor geometry, the difference from the true direction angle was significantly reduced compared to using solar geometry alone, leading to more precise cloud shadow detection in orthoimages, particularly in high-oblique imagery (a geometric sketch of this combination follows this list).
- The experiments laid a foundation for further improvement and for application to complex scenarios involving the interaction between sun and sensor geometry.
- The approach showed potential for extension to high-resolution satellite imagery and UAV imagery, especially for detecting shadows of objects with uncorrected relief displacements, such as high-rise buildings.
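The sun-and-sensor direction can be understood as the difference of two ground offsets that both scale linearly with cloud height: the shadow cast along the anti-solar azimuth, and the cloud's own relief displacement along the anti-viewing azimuth in the orthoimage. The Python sketch below illustrates this geometry; it is not the authors' implementation, and the function name and sign conventions (azimuths clockwise from north, viewing azimuth taken from the ground point toward the sensor) are assumptions. With the scene metadata listed in the datasets table it reproduces the $\vec{C_1}$ and $\vec{C_2}$ azimuths reported in the results.

```python
import math

def shadow_azimuth_deg(sun_az, sun_zen, view_az=None, view_zen=None):
    """Azimuth (degrees, clockwise from north) of the vector pointing from a
    cloud object in the orthoimage toward its shadow.

    Sun-only case (C1): the shadow lies along the anti-solar azimuth.
    Sun-and-sensor case (C2): the cloud itself is shifted by its uncorrected
    relief displacement, so the cloud-to-shadow vector is the difference of
    the two height-scaled ground offsets. Both offsets grow linearly with
    cloud height, so the direction does not depend on the (unknown) height;
    a unit height is used below.
    """
    # Ground offset of the shadow from the cloud's true footprint.
    d_shadow = math.tan(math.radians(sun_zen))
    az_shadow = math.radians(sun_az + 180.0)
    shadow_e, shadow_n = d_shadow * math.sin(az_shadow), d_shadow * math.cos(az_shadow)

    # Ground offset of the cloud's image due to uncorrected relief displacement
    # (assumed to point away from the sensor, i.e., along viewing azimuth + 180 deg).
    cloud_e = cloud_n = 0.0
    if view_az is not None and view_zen is not None:
        d_cloud = math.tan(math.radians(view_zen))
        az_cloud = math.radians(view_az + 180.0)
        cloud_e, cloud_n = d_cloud * math.sin(az_cloud), d_cloud * math.cos(az_cloud)

    east, north = shadow_e - cloud_e, shadow_n - cloud_n
    return math.degrees(math.atan2(east, north)) % 360.0

# Scene-1 metadata: sun azimuth/zenith 159.4/39.6, viewing azimuth/zenith 281.3/16.3 (deg)
print(round(shadow_azimuth_deg(159.4, 39.6), 1))               # 339.4 -> C1
print(round(shadow_azimuth_deg(159.4, 39.6, 281.3, 16.3), 1))  # 325.2 -> C2
```

Running the same sketch with the Scene-2 and Scene-3 metadata gives roughly 339.0° and 349.8°, matching the tabulated $\vec{C_2}$ values to within rounding.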
2. Methods
2.1. Summary of Cloud Shadow Detection Process
2.2. Data Preprocessing
2.3. Estimation of Search Range for Cloud Shadow Detection
2.3.1. Calculation of Cloud Shadow Direction
2.3.2. Calculation of Cloud Shadow Range
2.4. Cloud Shadow Detection
3. Experiments
3.1. Datasets
3.2. Verification Method
- The estimation of the cloud shadow direction angle based solely on sun illumination ($\vec{C_1}$).
- The estimation of the cloud shadow direction angle based on both sun illumination and the sensor viewing direction ($\vec{C_2}$); a sketch of how these estimates can be compared against measured true directions follows this list.
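The following minimal sketch shows one way such a comparison can be carried out: measure the true azimuth from a cloud centroid to its shadow centroid and take signed differences against the two estimates. The centroid coordinates and helper names are purely illustrative; in the paper the true angles are taken from manually collected cloud and cloud shadow pairs.

```python
import math

def azimuth_deg(cloud_en, shadow_en):
    """Azimuth (degrees, clockwise from north) from a cloud centroid to its
    shadow centroid, both given as (easting, northing) ground coordinates."""
    d_east = shadow_en[0] - cloud_en[0]
    d_north = shadow_en[1] - cloud_en[1]
    return math.degrees(math.atan2(d_east, d_north)) % 360.0

def signed_diff_deg(a, b):
    """Signed angular difference a - b wrapped into (-180, 180] degrees."""
    return (a - b + 180.0) % 360.0 - 180.0

# Hypothetical verification pair (coordinates are illustrative only).
t_true = azimuth_deg(cloud_en=(5000.0, 8000.0), shadow_en=(4700.0, 8700.0))
for label, estimate in (("C1 (sun only)", 339.4), ("C2 (sun and sensor)", 325.2)):
    print(label, round(signed_diff_deg(estimate, t_true), 1))
```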
3.3. Performance Evaluation
4. Results
4.1. Verification of Cloud Shadow Direction: Comparing $\vec{C_1}$ (Sun-Only) and $\vec{C_2}$ (Sun and Sensor-Based) with True Directions
4.2. Comparative Analysis of $\vec{C_1}$ and $\vec{C_2}$ in the Proposed Method
4.3. Evaluation of Cloud Shadow Detection Accuracy with $\vec{C_1}$ and $\vec{C_2}$ in the Proposed Method
5. Discussion and Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
Category | Scene-1 | Scene-2 | Scene-3 |
---|---|---|---|
Acquisition date | 17 September 2018 | 27 September 2018 | 22 September 2018 |
Cloud cover (%) | 1.33 | 0.58 | 12.66 |
Sun azimuth/zenith angle (°) | 159.4/39.6 | 155.6/44.0 | 151.4/42.6 |
Viewing azimuth/zenith angle (°) | 281.3/16.3 | 99.8/3.8 | 98.8/17.1 |
Used bands | Red (555.0 nm), near-infrared (710.0 nm) | Red (555.0 nm), near-infrared (710.0 nm) | Red (555.0 nm), near-infrared (710.0 nm) |
Product level | L1B | L1B | L1B |
Spatial resolution | 5 m | 5 m | 5 m |
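As a consistency check (an observation about the tabulated values, not a formula quoted from the paper), the sun-only azimuth $\vec{C_1}$ in the tables below equals the anti-solar azimuth derived from the metadata above, $\theta_{\vec{C_1}} = (\theta_{\mathrm{sun}} + 180^\circ) \bmod 360^\circ$: $159.4^\circ + 180^\circ = 339.4^\circ$ for Scene-1, $155.6^\circ + 180^\circ = 335.6^\circ$ for Scene-2, and $151.4^\circ + 180^\circ = 331.4^\circ$ (tabulated as $331.3^\circ$) for Scene-3.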
Direction angle verification for Scene-1 (scene-wide estimates: $\vec{C_1}$ = 339.4°, $\vec{C_2}$ = 325.2°):

Case | True azimuth $\vec{T}$ | $\vec{C_1}-\vec{T}$ | $\vec{C_2}-\vec{T}$ |
---|---|---|---|
Case 1 | 322.8° | 16.6° | 2.4° |
Case 2 | 324.4° | 15.0° | 0.8° |
Case 3 | 323.8° | 15.6° | 1.4° |
Case 4 | 321.3° | 18.1° | 3.9° |
Case 5 | 321.8° | 17.6° | 3.4° |
Case 6 | 323.0° | 16.4° | 2.2° |
Case 7 | 324.5° | 14.9° | 0.7° |
Case 8 | 322.3° | 17.1° | 2.9° |
Case 9 | 316.2° | 23.2° | 9.0° |
Case 10 | 319.3° | 20.1° | 5.9° |
Absolute mean | - | 17.4° | 3.2° |
Direction angle verification for Scene-2 (scene-wide estimates: $\vec{C_1}$ = 335.6°, $\vec{C_2}$ = 339.1°):

Case | True azimuth $\vec{T}$ | $\vec{C_1}-\vec{T}$ | $\vec{C_2}-\vec{T}$ |
---|---|---|---|
Case 1 | 337.2° | −1.6° | 1.9° |
Case 2 | 336.5° | −0.9° | 2.6° |
Case 3 | 335.1° | 0.5° | 4.0° |
Case 4 | 334.5° | 1.1° | 4.6° |
Case 5 | 337.9° | −2.3° | 1.2° |
Case 6 | 337.7° | −2.1° | 1.4° |
Case 7 | 337.8° | −2.1° | 1.3° |
Case 8 | 338.3° | −2.7° | 0.8° |
Case 9 | 335.1° | 0.5° | 4.0° |
Case 10 | 337.3° | −1.7° | 1.8° |
Absolute mean | - | 1.5° | 2.3° |
Direction angle verification for Scene-3 (scene-wide estimates: $\vec{C_1}$ = 331.3°, $\vec{C_2}$ = 349.8°):

Case | True azimuth $\vec{T}$ | $\vec{C_1}-\vec{T}$ | $\vec{C_2}-\vec{T}$ |
---|---|---|---|
Case 1 | 355.1° | −23.8° | −5.3° |
Case 2 | 351.8° | −20.5° | −2.0° |
Case 3 | 354.0° | −22.7° | −4.2° |
Case 4 | 354.1° | −22.8° | −4.3° |
Case 5 | 346.0° | −14.7° | 3.8° |
Case 6 | 351.4° | −20.1° | −1.6° |
Case 7 | 356.7° | −25.4° | −6.9° |
Case 8 | 355.6° | −24.3° | −5.8° |
Case 9 | 352.4° | −21.1° | −2.6° |
Case 10 | 349.3° | −18.0° | 0.5° |
Absolute mean | - | 21.3° | 3.7° |
Accuracy of cloud shadow detection in each scene using $\vec{C_1}$ and $\vec{C_2}$:

Metric | Scene-1, $\vec{C_1}$ | Scene-1, $\vec{C_2}$ | Scene-2, $\vec{C_1}$ | Scene-2, $\vec{C_2}$ | Scene-3, $\vec{C_1}$ | Scene-3, $\vec{C_2}$ |
---|---|---|---|---|---|---|
Precision | 0.79 | 0.87 | 0.80 | 0.82 | 0.93 | 0.96 |
Recall | 0.32 | 0.59 | 0.45 | 0.45 | 0.33 | 0.62 |
F1 score | 0.46 | 0.70 | 0.58 | 0.58 | 0.49 | 0.76 |
Detection ratio (%) | 86.2 | 97.5 | 99.6 | 98.3 | 80.8 | 94.0 |
Total cloud objects | 1332 | 1332 | 300 | 300 | 2011 | 2011 |
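As a reading aid, the sketch below shows how the pixel-wise precision, recall, and F1 score in the table can be computed from a detected shadow mask and a reference mask; the function and array names are illustrative and not taken from the paper, and the detection ratio, which is counted per cloud object rather than per pixel, is not reproduced here.

```python
import numpy as np

def pixel_metrics(detected, reference):
    """Precision, recall, and F1 for boolean cloud shadow masks."""
    tp = np.logical_and(detected, reference).sum()
    fp = np.logical_and(detected, ~reference).sum()
    fn = np.logical_and(~detected, reference).sum()
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Tiny illustrative masks (not real scene data).
ref = np.array([[1, 1, 0], [0, 1, 0], [0, 0, 0]], dtype=bool)
det = np.array([[1, 0, 0], [0, 1, 1], [0, 0, 0]], dtype=bool)
print(pixel_metrics(det, ref))  # (0.666..., 0.666..., 0.666...)
```

For example, Scene-1 with $\vec{C_2}$ gives F1 ≈ 2·0.87·0.59/(0.87+0.59) ≈ 0.70, consistent with the table.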
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).