A Classified Adversarial Network for Multi-Spectral Remote Sensing Image Change Detection
"> Figure 1
<p>Flowchart of the classified adversarial network (CAN)-based method for remote sensing image change detection. First, the initial change detection result is obtained using a change vector analysis (CVA)-based method. Then, the reliable, labeled data can be selected according the initial result. Adding noise into the labeled data is regarded as fake data. Labeled data and fake data are used to train the CAN, and the discriminator is used to judge whether the output of the classifier is reliable. Finally, bitemporal multi-spectral remote sensing images are fed into the classifier when it is trained well to obtain the final change map (CM).</p> "> Figure 2
<p>Flowchart of the CVA-based method for generating the initial change map.</p> "> Figure 3
<p>(<b>a</b>) The change map obtained with the CVA-based method in the Yandu Village data set. (<b>b</b>) The reference image of the Yandu Village data set.</p> "> Figure 4
<p>(<b>a</b>) The change map obtained with the CVA-based method in the Minfeng data set. (<b>b</b>) The reference image of the Minfeng data set.</p> "> Figure 5
<p>(<b>a</b>) The change map obtained with the CVA-based method in the Hongqi Canal data set. (<b>b</b>) The reference image of the Hongqi Canal data set.</p> "> Figure 6
<p>The impact of the threshold on the accuracy of the selected samples.</p> "> Figure 7
<p>Examples of the training samples selected under different thresholds in the Yandu Village data set. The white region is the selected changed pixels, the black region is the selected unchanged pixels, and the gray region is the not-selected pixels. (<b>a</b>) The threshold is 1. (<b>b</b>) The threshold is 0.9. (<b>c</b>) The threshold is 0.8. (<b>d</b>) The reference image of the Yandu Village data set.</p> "> Figure 8
<p>The behavior of the change map over the evolution of the training process. (<b>a</b>) The output of G in the first iteration (FP:44389; FN:4893; OE:49282; KC:0.3617; <math display="inline"><semantics> <mrow> <mi>F</mi> <mn>1</mn> </mrow> </semantics></math>:0.4355). (<b>b</b>) The output of G in the tenth iteration (FP:35905; FN:4945; OE:40850; KC:0.4167; <math display="inline"><semantics> <mrow> <mi>F</mi> <mn>1</mn> </mrow> </semantics></math>:0.4814). (<b>c</b>) The output of G in the twentieth iteration (FP:22824; FN:6182; OE:29006; KC:0.4998; <math display="inline"><semantics> <mrow> <mi>F</mi> <mn>1</mn> </mrow> </semantics></math>:0.5499). (<b>d</b>) The output of G in the final iteration (FP:9772; FN:7381; OE:17153; KC:0.6272; <math display="inline"><semantics> <mrow> <mi>F</mi> <mn>1</mn> </mrow> </semantics></math>:0.6583).</p> "> Figure 9
<p>The losses of the Generator and Discriminator during training. (<b>a</b>) The loss of G. (<b>b</b>) The loss of D.</p> "> Figure 10
<p>The Yandu Village data set. (<b>a</b>) Image acquired on 19 September 2012. (<b>b</b>) Image acquired on 10 February 2015. (<b>c</b>) Reference image.</p> "> Figure 11
<p>The Minfeng data set. (<b>a</b>) Image acquired on 9 December 2013. (<b>b</b>) Image acquired on 16 October 2015. (<b>c</b>) Reference image.</p> "> Figure 12
<p>The Hongqi Canal data set. (<b>a</b>) Image acquired on 9 December 2013. (<b>b</b>) Image acquired on 16 October 2015. (<b>c</b>) Reference image.</p> "> Figure 13
<p>The impact of <math display="inline"><semantics> <mi>ω</mi> </semantics></math> on the change detection results. (<b>a</b>) The impact of <math display="inline"><semantics> <mi>ω</mi> </semantics></math> on KC. (<b>b</b>) The impact of <math display="inline"><semantics> <mi>ω</mi> </semantics></math> on <math display="inline"><semantics> <msub> <mi>F</mi> <mn>1</mn> </msub> </semantics></math>.</p> "> Figure 14
<p>The impact of <math display="inline"><semantics> <mi>λ</mi> </semantics></math> on the change detection results. (<b>a</b>) The impact of <math display="inline"><semantics> <mi>λ</mi> </semantics></math> on KC. (<b>b</b>) The impact of <math display="inline"><semantics> <mi>λ</mi> </semantics></math> on <math display="inline"><semantics> <mrow> <mi>O</mi> <mi>E</mi> </mrow> </semantics></math>.</p> "> Figure 15
<p>CMs for the Yandu Village data set produced by various methods. (<b>a</b>) CVA. (<b>b</b>) Principal component analysis (PCA). (<b>c</b>) Deep neural network (DNN). (<b>d</b>) Generative adversarial network (GAN)-based method (GAND). (<b>e</b>) Iterative reweighted multivariate change detection (IR-MAD)+GAN. (<b>f</b>) CAN.</p> "> Figure 16
<p>CMs for the Minfeng data set produced by various methods. (<b>a</b>) CVA. (<b>b</b>) PCA. (<b>c</b>) DNN. (<b>d</b>) GAND. (<b>e</b>) IR-MAD+GAN. (<b>f</b>) CAN.</p> "> Figure 17
<p>CMs for the Hongqi data set produced by various methods. (<b>a</b>) CVA. (<b>b</b>) PCA. (<b>c</b>) DNN. (<b>d</b>) GAND. (<b>e</b>) IR-MAD+GAN. (<b>f</b>) CAN.</p> ">
Abstract
1. Introduction
2. Methodology
2.1. Generative Adversarial Networks
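For background, the original two-player GAN objective of Goodfellow et al. (listed in the references) is reproduced below for orientation only; the CAN's own objectives, referred to as Equations (3) and (4) in Algorithm 1, are classified-adversarial variants that are not reproduced in this excerpt.

```latex
\min_G \max_D \; V(D, G) =
  \mathbb{E}_{x \sim p_{\mathrm{data}}(x)}\big[\log D(x)\big]
  + \mathbb{E}_{z \sim p_z(z)}\big[\log\big(1 - D(G(z))\big)\big]
```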
2.2. Pre-Classification
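A minimal sketch of the CVA-based pre-classification summarized in Figure 2 and in steps 1–2 of Algorithm 1 follows. The array shapes and the use of scikit-image's `threshold_otsu` are assumptions for illustration, not the authors' implementation.

```python
# Sketch of CVA pre-classification: spectral change-vector magnitude + Otsu threshold.
# Assumption: the bitemporal images are co-registered numpy arrays of shape (H, W, bands).
import numpy as np
from skimage.filters import threshold_otsu

def cva_difference_image(img_t1: np.ndarray, img_t2: np.ndarray) -> np.ndarray:
    """Per-pixel magnitude of the spectral change vector between the two dates."""
    diff = img_t2.astype(np.float64) - img_t1.astype(np.float64)
    return np.sqrt((diff ** 2).sum(axis=-1))        # difference image (DI), shape (H, W)

def initial_change_map(di: np.ndarray) -> np.ndarray:
    """Binarize the DI with Otsu's threshold: 1 = changed, 0 = unchanged."""
    return (di > threshold_otsu(di)).astype(np.uint8)
```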
2.3. Training Sample Selection
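Figures 6 and 7 illustrate how a reliability threshold controls which pixels of the initial change map become labeled training samples. The sketch below is a hypothetical reading of that step: it assumes a per-pixel confidence score in [0, 1] and keeps only pixels whose confidence reaches the threshold; the authors' exact selection rule may differ.

```python
# Hypothetical threshold-based sample selection (cf. Figures 6 and 7).
# "confidence" is an assumed per-pixel reliability score in [0, 1] for the
# initial label; only sufficiently reliable pixels are kept as training samples.
import numpy as np

def select_samples(confidence: np.ndarray, initial_cm: np.ndarray, threshold: float = 0.9):
    """Return a boolean mask of selected pixels and their labels from the initial CM."""
    selected = confidence >= threshold   # gray pixels in Figure 7 fall below the threshold
    labels = initial_cm[selected]        # 1 = changed (white), 0 = unchanged (black)
    return selected, labels
```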
2.4. Network Establishment
2.5. Network Training
Algorithm 1. The procedure of the CAN.

Input: A pair of bitemporal multi-spectral images.
Output: The final change map (CM).
1. Obtain the difference image (DI) by change vector analysis (CVA).
2. Divide the pixels of the difference image into changed and unchanged classes with the Otsu method to obtain the initial change map.
3. Select training samples from the initial change map with the sample selection algorithm.
4. Randomly initialize the parameters of G and D.
5. Fix network G and update the parameters of D by optimizing Equation (3).
6. Fix network D and update the parameters of G by optimizing Equation (4).
7. Alternate steps 5 and 6 until Equation (3) converges.
Return: The final classification result (changed or unchanged) for each pixel.
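A minimal PyTorch-style sketch of the alternating updates in steps 4–7 of Algorithm 1 is given below. The network architectures, the Gaussian noise used to build fake data (as described in the Figure 1 caption), and the binary cross-entropy losses are illustrative stand-ins for the paper's Equations (3) and (4), which are not reproduced in this excerpt.

```python
import torch
import torch.nn as nn

def mlp(in_dim, out_dim):
    # Placeholder architecture; the paper's actual G and D networks are not specified here.
    return nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, out_dim))

def train_can(x, y, epochs=100, noise_std=0.1, lr=1e-3):
    """x: (N, d) float tensor of features for the selected reliable pixels;
       y: (N,) tensor of labels from the initial change map (1 = changed, 0 = unchanged)."""
    in_dim = x.shape[1]
    G = mlp(in_dim, 1)        # classifier ("generator"): outputs a change logit per pixel
    D = mlp(in_dim + 1, 1)    # discriminator: judges (features, label) pairs
    opt_g = torch.optim.Adam(G.parameters(), lr=lr)
    opt_d = torch.optim.Adam(D.parameters(), lr=lr)
    bce = nn.BCEWithLogitsLoss()
    y = y.float().unsqueeze(1)

    for _ in range(epochs):
        # Step 5: fix G, update D. Real pairs come from the labeled data; fake pairs
        # couple noisy inputs with the classifier's own (detached) predictions.
        fake_x = x + noise_std * torch.randn_like(x)
        with torch.no_grad():
            fake_y = torch.sigmoid(G(fake_x))
        d_real = D(torch.cat([x, y], dim=1))
        d_fake = D(torch.cat([fake_x, fake_y], dim=1))
        loss_d = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
        opt_d.zero_grad(); loss_d.backward(); opt_d.step()

        # Step 6: fix D, update G. G should classify the labeled data correctly and
        # produce outputs on fake data that D accepts as reliable.
        logits = G(x)
        fake_pred = torch.sigmoid(G(fake_x))
        d_fake = D(torch.cat([fake_x, fake_pred], dim=1))
        loss_g = bce(logits, y) + bce(d_fake, torch.ones_like(d_fake))
        opt_g.zero_grad(); loss_g.backward(); opt_g.step()

    return G  # apply G to every pixel's features and threshold at 0.5 for the final CM
```

After training, thresholding sigmoid(G(features)) at 0.5 for every pixel would yield the final change map; the feature construction itself (e.g., patch vectors around each pixel) is also an assumption here.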
3. Experimental Study
3.1. Data Sets Description
3.2. Parameter Setting
3.2.1. Effects of Parameter ω
3.2.2. Effects of Parameter λ
3.2.3. Results on the Yandu Village Data Set
3.2.4. Results on the Minfeng Data Set
3.2.5. Results on the Hongqi Canal Data Set
4. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- Jin, S.; Yang, L.; Zhu, Z.; Homer, C. A land cover change detection and classification protocol for updating Alaska NLCD 2001 to 2011. Remote Sens. Environ. 2017, 195, 44–55.
- Lyu, H.; Lu, H.; Mou, L. Learning a transferable change rule from a recurrent neural network for land cover change detection. Remote Sens. 2016, 8, 506.
- Polykretis, C.; Grillakis, M.G.; Alexakis, D.D. Exploring the impact of various spectral indices on land cover change detection using change vector analysis: A case study of Crete Island, Greece. Remote Sens. 2020, 12, 319.
- Zhao, S.; Wang, Q.; Li, Y.; Liu, S.; Wang, Z.; Zhu, L.; Wang, Z. An overview of satellite remote sensing technology used in China’s environmental protection. Earth Sci. Inform. 2017, 10, 137–148.
- Sofina, N.; Ehlers, M. Building change detection using high resolution remotely sensed data and GIS. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2016, 9, 3430–3438.
- López-Fandiño, J.; Heras, D.B.; Argüello, F.; Dalla Mura, M. GPU framework for change detection in multitemporal hyperspectral images. Int. J. Parallel Program. 2019, 47, 272–292.
- Aminikhanghahi, S.; Cook, D.J. A survey of methods for time series change point detection. Knowl. Inf. Syst. 2017, 51, 339–367.
- Tan, K.; Zhang, Y.; Wang, X.; Chen, Y. Object-based change detection using multiple classifiers and multi-scale uncertainty analysis. Remote Sens. 2019, 11, 359.
- Kerekes, A.; Alexe, M. Evaluating Urban Sprawl and Land-Use Change Using Remote Sensing, GIS Techniques and Historical Maps. Case Study: The City of Dej, Romania. Analele Univ. Din Oradea Ser. Geogr. 2019, 29, 52–63.
- Liu, S.; Marinelli, D.; Bruzzone, L.; Bovolo, F. A review of change detection in multitemporal hyperspectral images: Current techniques, applications, and challenges. IEEE Geosci. Remote Sens. Mag. 2019, 7, 140–158.
- Tewkesbury, A.P.; Comber, A.J.; Tate, N.J.; Lamb, A.; Fisher, P.F. A critical synthesis of remotely sensed optical image change detection techniques. Remote Sens. Environ. 2015, 160, 1–14.
- Scheffler, D.; Hollstein, A.; Diedrich, H.; Segl, K.; Hostert, P. AROSICS: An automated and robust open-source image co-registration software for multi-sensor satellite data. Remote Sens. 2017, 9, 676.
- Cao, X.; Ji, Y.; Wang, L.; Ji, B.; Jiao, L.; Han, J. SAR image change detection based on deep denoising and CNN. IET Image Process. 2019, 13, 1509–1515.
- Saha, S.; Bovolo, F.; Bruzzone, L. Unsupervised deep change vector analysis for multiple-change detection in VHR images. IEEE Trans. Geosci. Remote Sens. 2019, 57, 3677–3693.
- Dharani, M.; Sreenivasulu, G. Land use and land cover change detection by using principal component analysis and morphological operations in remote sensing applications. Int. J. Comput. Appl. 2019, 1–10.
- Lou, X.; Jia, Z.; Yang, J.; Kasabov, N. Change detection in SAR images based on the ROF model semi-implicit denoising method. Sensors 2019, 19, 1179.
- Ma, W.; Yang, H.; Wu, Y.; Xiong, Y.; Hu, T.; Jiao, L.; Hou, B. Change Detection Based on Multi-Grained Cascade Forest and Multi-Scale Fusion for SAR Images. Remote Sens. 2019, 11, 142.
- Li, X.; Yuan, Z.; Wang, Q. Unsupervised Deep Noise Modeling for Hyperspectral Image Change Detection. Remote Sens. 2019, 11, 258.
- Chen, H.; Jiao, L.; Liang, M.; Liu, F.; Yang, S.; Hou, B. Fast unsupervised deep fusion network for change detection of multitemporal SAR images. Neurocomputing 2019, 332, 56–70.
- Yetgin, Z. Unsupervised change detection of satellite images using local gradual descent. IEEE Trans. Geosci. Remote Sens. 2011, 50, 1919–1929.
- Ma, W.; Wu, Y.; Gong, M.; Xiong, Y.; Yang, H.; Hu, T. Change detection in SAR images based on matrix factorisation and a Bayes classifier. Int. J. Remote Sens. 2019, 40, 1066–1091.
- Krinidis, S.; Chatzis, V. A robust fuzzy local information C-means clustering algorithm. IEEE Trans. Image Process. 2010, 19, 1328–1337.
- Ghosh, A.; Mishra, N.S.; Ghosh, S. Fuzzy clustering algorithms for unsupervised change detection in remote sensing images. Inf. Sci. 2011, 181, 699–715.
- Lv, Z.; Liu, T.; Shi, C.; Benediktsson, J.A.; Du, H. Novel land cover change detection method based on K-means clustering and adaptive majority voting using bitemporal remote sensing images. IEEE Access 2019, 7, 34425–34437.
- Di Nucci, D.; Palomba, F.; Oliveto, R.; De Lucia, A. Dynamic selection of classifiers in bug prediction: An adaptive method. IEEE Trans. Emerg. Top. Comput. Intell. 2017, 1, 202–212.
- Lv, P.; Zhong, Y.; Zhao, J.; Jiao, H.; Zhang, L. Change detection based on a multifeature probabilistic ensemble conditional random field model for high spatial resolution remote sensing imagery. IEEE Geosci. Remote Sens. Lett. 2016, 13, 1965–1969.
- Liu, Q.; Liu, L.; Wang, Y. Unsupervised change detection for multispectral remote sensing images using random walks. Remote Sens. 2017, 9, 438.
- Wan, L.; Zhang, T.; You, H. Multi-sensor remote sensing image change detection based on sorted histograms. Int. J. Remote Sens. 2018, 39, 3753–3775.
- Chen, H.; Wu, C.; Du, B.; Zhang, L. Deep Siamese Multi-scale Convolutional Network for Change Detection in Multi-temporal VHR Images. In Proceedings of the International Workshop on the Analysis of Multitemporal Remote Sensing Images (MultiTemp), Shanghai, China, 5–7 August 2019; pp. 1–4.
- Li, Y.; Gong, M.; Jiao, L.; Li, L.; Stolkin, R. Change-detection map learning using matching pursuit. IEEE Trans. Geosci. Remote Sens. 2015, 53, 4712–4723.
- Ma, W.; Xiong, Y.; Wu, Y.; Yang, H.; Zhang, X.; Jiao, L. Change Detection in Remote Sensing Images Based on Image Mapping and a Deep Capsule Network. Remote Sens. 2019, 11, 626.
- Buslaev, A.; Seferbekov, S.S.; Iglovikov, V.; Shvets, A. Fully Convolutional Network for Automatic Road Extraction From Satellite Imagery. In Proceedings of the 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), Salt Lake City, UT, USA, 18–22 June 2018; pp. 207–210.
- Wang, Q.; Liu, S.; Chanussot, J.; Li, X. Scene classification with recurrent attention of VHR remote sensing images. IEEE Trans. Geosci. Remote Sens. 2018, 57, 1155–1167.
- Liu, X.; Liu, Q.; Wang, Y. Remote sensing image fusion based on two-stream fusion network. Inf. Fusion 2020, 55, 1–15.
- Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet classification with deep convolutional neural networks. In Proceedings of the Advances in Neural Information Processing Systems, Lake Tahoe, NV, USA, 3–6 December 2012; pp. 1097–1105.
- Simonyan, K.; Zisserman, A. Very deep convolutional networks for large-scale image recognition. arXiv 2014, arXiv:1409.1556.
- Szegedy, C.; Liu, W.; Jia, Y.; Sermanet, P.; Reed, S.; Anguelov, D.; Erhan, D.; Vanhoucke, V.; Rabinovich, A. Going deeper with convolutions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015; pp. 1–9.
- Xu, H.; Wang, Y.; Guan, H.; Shi, T.; Hu, X. Detecting Ecological Changes with a Remote Sensing Based Ecological Index (RSEI) Produced Time Series and Change Vector Analysis. Remote Sens. 2019, 11, 2345.
- Qahtan, A.A.; Alharbi, B.; Wang, S.; Zhang, X. A PCA-based change detection framework for multidimensional data streams: Change detection in multidimensional data streams. In Proceedings of the 21st ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, Sydney, Australia, 10–13 August 2015; pp. 935–944.
- Gong, M.; Zhao, J.; Liu, J.; Miao, Q.; Jiao, L. Change detection in synthetic aperture radar images based on deep neural networks. IEEE Trans. Neural Netw. Learn. Syst. 2015, 27, 125–138.
- Gong, M.; Zhan, T.; Zhang, P.; Miao, Q. Superpixel-based difference representation learning for change detection in multispectral remote sensing images. IEEE Trans. Geosci. Remote Sens. 2017, 55, 2658–2673.
- Lin, Y.; Li, S.; Fang, L.; Ghamisi, P. Multispectral Change Detection With Bilinear Convolutional Neural Networks. IEEE Geosci. Remote Sens. Lett. 2019.
- Liu, Y.; Pang, C.; Zhan, Z.; Zhang, X.; Yang, X. Building Change Detection for Remote Sensing Images Using a Dual Task Constrained Deep Siamese Convolutional Network Model. arXiv 2019, arXiv:1909.07726.
- Zhang, X.; Liu, G.; Zhang, C.; Atkinson, P.M.; Tan, X.; Jian, X.; Zhou, X.; Li, Y. Two-phase object-based deep learning for multi-temporal SAR image change detection. Remote Sens. 2020, 12, 548.
- Zhang, W.; Lu, X. The spectral-spatial joint learning for change detection in multispectral imagery. Remote Sens. 2019, 11, 240.
- Samadi, F.; Akbarizadeh, G.; Kaabi, H. Change detection in SAR images using deep belief network: A new training approach based on morphological images. IET Image Process. 2019, 13, 2255–2264.
- Isola, P.; Zhu, J.Y.; Zhou, T.; Efros, A.A. Image-to-image translation with conditional adversarial networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 1125–1134.
- Yi, Z.; Zhang, H.; Tan, P.; Gong, M. DualGAN: Unsupervised dual learning for image-to-image translation. In Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy, 22–29 October 2017; pp. 2849–2857.
- Gong, M.; Yang, Y.; Zhan, T.; Niu, X.; Li, S. A generative discriminatory classified network for change detection in multispectral imagery. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2019, 12, 321–333.
- Goodfellow, I.; Pouget-Abadie, J.; Mirza, M.; Xu, B.; Warde-Farley, D.; Ozair, S.; Courville, A.; Bengio, Y. Generative adversarial nets. In Proceedings of the Advances in Neural Information Processing Systems, Montreal, QC, Canada, 8–13 December 2014; pp. 2672–2680.
- Otsu, N. A threshold selection method from gray-level histograms. IEEE Trans. Syst. Man Cybern. 1979, 9, 62–66.
- Deng, J.; Wang, K.; Deng, Y.; Qi, G. PCA-based land-use change detection and analysis using multitemporal and multisensor satellite data. Int. J. Remote Sens. 2008, 29, 4823–4838.
- Gong, M.; Niu, X.; Zhang, P.; Li, Z. Generative adversarial networks for change detection in multispectral imagery. IEEE Geosci. Remote Sens. Lett. 2017, 14, 2310–2314.
- Rosenfield, G.H.; Fitzpatrick-Lins, K. A coefficient of agreement as a measure of thematic classification accuracy. Photogramm. Eng. Remote Sens. 1986, 52, 223–227.
| Data Set | Minfeng | Yandu Village | Hongqi Canal |
|---|---|---|---|
| Image Size | | | |
| Spatial Resolution (m) | 2.0 | 0.5 | 2.0 |
| Satellite | GF-1 | WorldView-2 | GF-1 |
| Percentage of selected training samples | 31.2% | 58.8% | 44.7% |
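The results tables that follow report false positives (FP), false negatives (FN), overall error (OE), the kappa coefficient (KC), and F1. The sketch below uses the standard confusion-matrix definitions (with kappa in the sense of Rosenfield and Fitzpatrick-Lins, cited in the references); we assume these match the paper's exact conventions.

```python
# Standard binary change-detection metrics from confusion-matrix counts.
def change_detection_metrics(tp: int, tn: int, fp: int, fn: int) -> dict:
    n = tp + tn + fp + fn
    oe = fp + fn                                                        # overall error
    f1 = 2 * tp / (2 * tp + fp + fn)                                    # F1 on the changed class
    pra = (tp + tn) / n                                                 # observed agreement
    pre = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / (n * n)     # chance agreement
    kc = (pra - pre) / (1 - pre)                                        # kappa coefficient
    return {"OE": oe, "KC": kc, "F1": f1}
```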
| Method | FP | FN | OE | KC | F1 |
|---|---|---|---|---|---|
| CVA | 5967 | 3169 | 9136 | 0.6205 | 0.6839 |
| PCA | 5290 | 3132 | 8422 | 0.6358 | 0.6940 |
| DNN | 1242 | 3501 | 4743 | 0.7693 | 0.8011 |
| GAND | 338 | 4381 | 4719 | 0.7538 | 0.7845 |
| IR-MAD+GAN | 595 | 4091 | 4686 | 0.7621 | 0.7927 |
| CAN | 672 | 3706 | 4378 | 0.7813 | 0.8102 |
| Method | FP | FN | OE | KC | F1 |
|---|---|---|---|---|---|
| CVA | 56,199 | 7403 | 63,602 | 0.2519 | 0.3416 |
| PCA | 36,172 | 4395 | 40,567 | 0.3943 | 0.4559 |
| DNN | 30,971 | 4744 | 35,715 | 0.4531 | 0.5111 |
| GAND | 5287 | 8728 | 14,015 | 0.6151 | 0.6390 |
| IR-MAD+GAN | 22,727 | 7299 | 30,026 | 0.4729 | 0.5252 |
| CAN | 9772 | 7381 | 17,153 | 0.6272 | 0.6583 |
| Method | FP | FN | OE | KC | F1 |
|---|---|---|---|---|---|
| CVA | 19,141 | 15,888 | 35,029 | 0.3418 | 0.4082 |
| PCA | 7906 | 10,422 | 18,328 | 0.6092 | 0.6434 |
| DNN | 7142 | 6546 | 13,688 | 0.7231 | 0.7489 |
| GAND | 3236 | 12,100 | 15,336 | 0.7252 | 0.7539 |
| IR-MAD+GAN | 11,443 | 4472 | 14,915 | 0.7285 | 0.7466 |
| CAN | 1472 | 10,030 | 11,502 | 0.7366 | 0.7572 |
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Wu, Y.; Bai, Z.; Miao, Q.; Ma, W.; Yang, Y.; Gong, M. A Classified Adversarial Network for Multi-Spectral Remote Sensing Image Change Detection. Remote Sens. 2020, 12, 2098. https://doi.org/10.3390/rs12132098