Correlation Matrix-Based Fusion of Hyperspectral and Multispectral Images
"> Figure 1
<p>Correlation matrix assumption.</p> "> Figure 2
<p>The false color images formed by bands 30, 20, and 10 of the fusion results (imgb2, <math display="inline"><semantics><mrow><mi>s</mi><mo>=</mo><mn>32</mn></mrow></semantics></math>). The yellow box region displays the image of the red box region magnified by two times. The blue box is the error image in the magnified portion at band 20. HR-HSI (<math display="inline"><semantics><mrow><mn>512</mn><mo>×</mo><mn>512</mn><mo>×</mo><mn>31</mn></mrow></semantics></math>) is obtained by a commercial hyperspectral camera (Nuance FX, CRI Inc.).</p> "> Figure 3
<p>The false color images formed by bands 11, 21, and 31 of the fusion results (imgh7, <math display="inline"><semantics><mrow><mi>s</mi><mo>=</mo><mn>32</mn></mrow></semantics></math>). The yellow box region displays the image of the red box region magnified by two times. The blue box is the error image in the magnified portion at band 11. HR-HSI (<math display="inline"><semantics><mrow><mn>512</mn><mo>×</mo><mn>512</mn><mo>×</mo><mn>31</mn></mrow></semantics></math>) is obtained by a commercial hyperspectral camera (Nuance FX, CRI Inc.).</p> "> Figure 4
<p>The false color images formed by bands 61, 36, 10 of the fusion results (<math display="inline"><semantics><mrow><mi>s</mi><mo>=</mo><mn>32</mn></mrow></semantics></math>). The yellow box region displays the image of the red box region magnified by two times. The blue box is the error image in the magnified portion at band 11. HR-HSI (<math display="inline"><semantics><mrow><mn>256</mn><mo>×</mo><mn>256</mn><mo>×</mo><mn>93</mn></mrow></semantics></math>) was obtained by ROSIS (Reflective Optics Spectrographic Imaging System) on 2003.</p> "> Figure 5
<p>The false color images formed by bands 80, 60, and 29 of the fusion results on the real dataset. The yellow box region displays the image of the red box region magnified by three times. HSI (<math display="inline"><semantics><mrow><mn>100</mn><mo>×</mo><mn>100</mn><mo>×</mo><mn>89</mn></mrow></semantics></math>) is obtained by Hyperion sensor on the Earth Observation 1 satellite, MSI (<math display="inline"><semantics><mrow><mn>300</mn><mo>×</mo><mn>300</mn><mo>×</mo><mn>4</mn></mrow></semantics></math>) is obtained by the Sentinel-2A satellite.</p> "> Figure 6
<p>The PSNR varies from parameter <math display="inline"><semantics><mi>ρ</mi></semantics></math>. (<b>a</b>,<b>b</b>) are for Harvard dataset. (<b>c</b>) is for Pavia University dataset.</p> "> Figure 7
<p>The PSNR varies from different downsampling factors (4, 8, 16, 32, 64). (<b>a</b>,<b>b</b>) are for Harvard dataset. (<b>c</b>) is for Pavia University dataset.</p> ">
Abstract
1. Introduction
- We propose a correlation matrix assumption that establishes the correlation between the spectral information of the hyperspectral image (HSI) and the spatial information of the multispectral image (MSI). Through this correlation matrix, we avoid complex computation, greatly simplify the fusion process, and reduce the fusion time.
- The correlation matrix assumption also offers a new direction for future fusion work: efficient fusion reduces to solving for a matrix that satisfies the assumed correlation.
- Based on the generative relationship among the HSI, the MSI, and the HR-HSI (the standard observation model is sketched after this list), we derive a method for solving the correlation matrix and build a new fusion model on it. We obtain an initial fusion result through simple matrix operations on the HSI, the MSI, and the correlation matrix, and then refine this preliminary result by solving a Sylvester equation. The entire fusion process is more straightforward than common fusion methods, requiring neither complex operations nor parameter tuning.
- Experimental results on two simulated datasets and one real dataset show that the proposed method outperforms current state-of-the-art fusion methods: it produces high-quality fusion results while simplifying the fusion process and substantially reducing the fusion time.
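For context, the generative relationship invoked in the third contribution is, throughout the HSI-MSI fusion literature this work builds on, the standard observation model below. The notation is the conventional one and is an assumption of this sketch, not necessarily the paper's own symbols:

```latex
% Standard HSI-MSI observation model (conventional notation, assumed):
%   X   : the HR-HSI to be recovered (L bands, N pixels)
%   Y_h : the observed HSI, spatially degraded by a blur B and downsampling S
%   Y_m : the observed MSI, spectrally degraded by the SRF R
\mathbf{Y}_h = \mathbf{X}\,\mathbf{B}\,\mathbf{S}, \qquad
\mathbf{Y}_m = \mathbf{R}\,\mathbf{X}.
```

Fusion then amounts to recovering the HR-HSI X from the observed pair (Y_h, Y_m).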
2. Related Work
3. Fusion Model
Algorithm 1 The proposed method.
Require: the observed HSI, the observed MSI, the SRF, the PSF, and the downsampling factor.
Obtain the correlation matrix from Equation (7).
Compute the generalized inverse of the correlation matrix.
Reconstruct the first-stage HR-HSI using Equation (9).
Substitute the first-stage result into Equation (10) to obtain the three matrices of the Sylvester equation in Equation (12).
Eigen-decompose the first coefficient matrix.
Eigen-decompose the second coefficient matrix.
for i = 1 to L do
    Solve the decoupled equation for band i.
end for
Assemble the refined HR-HSI from the band-wise solutions.
return the refined HR-HSI.
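To make the numerical skeleton concrete, here is a minimal, runnable sketch (Python/NumPy, not the authors' code) of the two primitives Algorithm 1 relies on: Penrose's generalized inverse for the first-stage reconstruction and a Bartels-Stewart Sylvester solve for the refinement. The factorization X ≈ Y_h C, the regularization weight mu, and the coefficient matrices A, B, Q below are illustrative assumptions; Equations (7)-(12) in the paper define the exact operators.

```python
import numpy as np
from scipy.linalg import pinv, solve_sylvester

rng = np.random.default_rng(0)
L, l, N, n = 31, 3, 256, 16        # HSI bands, MSI bands, HR pixels, LR pixels

Y_h = rng.standard_normal((L, n))  # observed HSI (spectrally rich, spatially coarse)
Y_m = rng.standard_normal((l, N))  # observed MSI (spatially rich, spectrally coarse)
R = rng.standard_normal((l, L))    # spectral response function (SRF)

# Stage 1: one dimensionally consistent reading of the correlation matrix.
# Assume the HR-HSI factorizes as X ~ Y_h @ C; then Y_m = R @ X = (R @ Y_h) @ C,
# so C follows from a Moore-Penrose generalized inverse (cf. Equations (7), (9)).
C = pinv(R @ Y_h) @ Y_m            # correlation matrix, shape (n, N)
X1 = Y_h @ C                       # first-stage HR-HSI, shape (L, N)

# Stage 2: refine X1 through a Sylvester equation A X + X B = Q (cf. Equation
# (10)); solve_sylvester uses the Bartels-Stewart algorithm, the same band-wise
# decoupling that the eigen-decomposition loop in Algorithm 1 performs.
mu = 1e-2                          # illustrative regularization weight
A = R.T @ R + mu * np.eye(L)       # spectral-fidelity coefficient (L x L)
B = mu * np.eye(N)                 # placeholder spatial coefficient (N x N)
Q = R.T @ Y_m + 2 * mu * X1        # data term anchored on the stage-1 result
X = solve_sylvester(A, B, Q)       # refined HR-HSI, shape (L, N)
print(X.shape)                     # (31, 256)
```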
4. Experiments
4.1. Dataset
4.2. Experimental Settings and Quantitative Metrics
5. Results and Analysis
5.1. Results on Simulated Dataset
5.2. Results on Real Dataset
5.3. Parameter Discussion
5.4. Comparison between CMF and CMF+
5.5. Computational Cost
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Appendix A
References
1. Sethy, P.K.; Pandey, C.; Sahu, Y.K.; Behera, S.K. Hyperspectral imagery applications for precision agriculture-a systemic survey. Multimed. Tools Appl. 2022, 81, 3005–3038.
2. Agilandeeswari, L.; Prabukumar, M.; Radhesyam, V.; Phaneendra, K.L.B.; Farhan, A. Crop classification for agricultural applications in hyperspectral remote sensing images. Appl. Sci. 2022, 12, 1670.
3. Wang, B.; Sun, J.; Xia, L.; Liu, J.; Wang, Z.; Li, P.; Guo, Y.; Sun, X. The applications of hyperspectral imaging technology for agricultural products quality analysis: A review. Food Rev. Int. 2021, 39, 1043–1062.
4. Lu, B.; Dao, P.D.; Liu, J.; He, Y.; Shang, J. Recent advances of hyperspectral imaging technology and applications in agriculture. Remote Sens. 2020, 12, 2659.
5. Ma, C.; Liu, X.; Li, S.; Li, C.; Zhang, R. Accuracy evaluation of hyperspectral inversion of environmental parameters of loess profile. Environ. Earth Sci. 2023, 82, 251.
6. Stuart, M.B.; Davies, M.; Hobbs, M.J.; Pering, T.D.; McGonigle, A.J.; Willmott, J.R. High-resolution hyperspectral imaging using low-cost components: Application within environmental monitoring scenarios. Sensors 2022, 22, 4652.
7. Cui, Y.; Zhang, B.; Yang, W.; Wang, Z.; Li, Y.; Yi, X.; Tang, Y. End-to-end visual target tracking in multi-robot systems based on deep convolutional neural network. In Proceedings of the IEEE International Conference on Computer Vision Workshops, Venice, Italy, 22–29 October 2017; pp. 1113–1121.
8. Cui, Y.; Zhang, B.; Yang, W.; Yi, X.; Tang, Y. Deep CNN-based visual target tracking system relying on monocular image sensing. In Proceedings of the 2018 International Joint Conference on Neural Networks (IJCNN), Rio de Janeiro, Brazil, 8–13 July 2018; pp. 1–8.
9. Hirai, A.; Tonooka, H. Mineral discrimination by combination of multispectral image and surrounding hyperspectral image. J. Appl. Remote Sens. 2019, 13, 024517.
10. He, J.; Barton, I. Hyperspectral remote sensing for detecting geotechnical problems at Ray mine. Eng. Geol. 2021, 292, 106261.
11. Vignesh, K.M.; Kiran, Y. Comparative analysis of mineral mapping for hyperspectral and multispectral imagery. Arab. J. Geosci. 2020, 13, 160.
12. Yuan, J.; Wang, S.; Wu, C.; Xu, Y. Fine-grained classification of urban functional zones and landscape pattern analysis using hyperspectral satellite imagery: A case study of Wuhan. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2022, 15, 3972–3991.
13. Zhang, C.; Sargent, I.; Pan, X.; Li, H.; Gardiner, A.; Hare, J.; Atkinson, P.M. An object-based convolutional neural network (OCNN) for urban land use classification. Remote Sens. Environ. 2018, 216, 57–70.
14. Weng, Q. A remote sensing–GIS evaluation of urban expansion and its impact on surface temperature in the Zhujiang Delta, China. Int. J. Remote Sens. 2001, 22, 1999–2014.
15. Sun, W.; Du, Q. Graph-regularized fast and robust principal component analysis for hyperspectral band selection. IEEE Trans. Geosci. Remote Sens. 2018, 56, 3185–3195.
16. Akgun, T.; Altunbasak, Y.; Mersereau, R.M. Super-resolution reconstruction of hyperspectral images. IEEE Trans. Image Process. 2005, 14, 1860–1875.
17. Hong, D.; Liu, W.; Su, J.; Pan, Z.; Wang, G. A novel hierarchical approach for multispectral palmprint recognition. Neurocomputing 2015, 151, 511–521.
18. Yokoya, N.; Grohnfeldt, C.; Chanussot, J. Hyperspectral and multispectral data fusion: A comparative review of the recent literature. IEEE Geosci. Remote Sens. Mag. 2017, 5, 29–56.
19. Hong, D.; Yokoya, N.; Chanussot, J.; Zhu, X.X. CoSpace: Common subspace learning from hyperspectral-multispectral correspondences. IEEE Trans. Geosci. Remote Sens. 2019, 57, 4349–4359.
20. Ferraris, V.; Dobigeon, N.; Wei, Q.; Chabert, M. Robust fusion of multiband images with different spatial and spectral resolutions for change detection. IEEE Trans. Comput. Imaging 2017, 3, 175–186.
21. Qu, Y.; Qi, H.; Ayhan, B.; Kwan, C.; Kidd, R. Does multispectral/hyperspectral pansharpening improve the performance of anomaly detection? In Proceedings of the 2017 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Fort Worth, TX, USA, 23–28 July 2017; pp. 6130–6133.
22. Gómez-Chova, L.; Tuia, D.; Moser, G.; Camps-Valls, G. Multimodal classification of remote sensing images: A review and future directions. Proc. IEEE 2015, 103, 1560–1584.
23. Dian, R.; Li, S. Hyperspectral image super-resolution via subspace-based low tensor multi-rank regularization. IEEE Trans. Image Process. 2019, 28, 5135–5146.
24. Wei, Q.; Dobigeon, N.; Tourneret, J.Y. Bayesian fusion of hyperspectral and multispectral images. In Proceedings of the 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Florence, Italy, 4–9 May 2014; pp. 3176–3180.
25. Sui, L.; Li, L.; Li, J.; Chen, N.; Jiao, Y. Fusion of hyperspectral and multispectral images based on a Bayesian nonparametric approach. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2019, 12, 1205–1218.
26. Li, X.; Zhang, Y.; Ge, Z.; Cao, G.; Shi, H.; Fu, P. Adaptive nonnegative sparse representation for hyperspectral image super-resolution. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 4267–4283.
27. Dong, W.; Fu, F.; Shi, G.; Cao, X.; Wu, J.; Li, G.; Li, X. Hyperspectral image super-resolution via non-negative structured sparse representation. IEEE Trans. Image Process. 2016, 25, 2337–2352.
28. Huang, B.; Song, H.; Cui, H.; Peng, J.; Xu, Z. Spatial and spectral image fusion using sparse matrix factorization. IEEE Trans. Geosci. Remote Sens. 2013, 52, 1693–1704.
29. Li, S.; Dian, R.; Fang, L.; Bioucas-Dias, J.M. Fusing hyperspectral and multispectral images via coupled sparse tensor factorization. IEEE Trans. Image Process. 2018, 27, 4118–4130.
30. Dian, R.; Li, S.; Fang, L. Learning a low tensor-train rank representation for hyperspectral image super-resolution. IEEE Trans. Neural Netw. Learn. Syst. 2019, 30, 2672–2683.
31. Yao, J.; Hong, D.; Chanussot, J.; Meng, D.; Zhu, X.; Xu, Z. Cross-attention in coupled unmixing nets for unsupervised hyperspectral super-resolution. In Proceedings of the Computer Vision–ECCV 2020: 16th European Conference, Glasgow, UK, 23–28 August 2020; pp. 208–224.
32. Dian, R.; Li, S.; Kang, X. Regularizing hyperspectral and multispectral image fusion by CNN denoiser. IEEE Trans. Neural Netw. Learn. Syst. 2020, 32, 1124–1135.
33. Liu, J.; Wu, Z.; Xiao, L.; Wu, X.J. Model inspired autoencoder for unsupervised hyperspectral image super-resolution. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–12.
34. Zhuang, L.; Bioucas-Dias, J.M. Fast hyperspectral image denoising and inpainting based on low-rank and sparse representations. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2018, 11, 730–742.
35. Penrose, R. A generalized inverse for matrices. Math. Proc. Camb. Philos. Soc. 1955, 51, 406–413.
36. Bartels, R.H.; Stewart, G.W. Solution of the matrix equation AX + XB = C [F4]. Commun. ACM 1972, 15, 820–826.
37. Chakrabarti, A.; Zickler, T. Statistics of real-world hyperspectral images. In Proceedings of CVPR 2011, Colorado Springs, CO, USA, 20–25 June 2011; pp. 193–200.
38. Dell’Acqua, F.; Gamba, P.; Ferrari, A.; Palmason, J.A.; Benediktsson, J.A.; Árnason, K. Exploiting spectral and spatial information in hyperspectral urban data with high resolution. IEEE Geosci. Remote Sens. Lett. 2004, 1, 322–326.
39. Wei, Q.; Bioucas-Dias, J.; Dobigeon, N.; Tourneret, J.Y. Hyperspectral and multispectral image fusion based on a sparse representation. IEEE Trans. Geosci. Remote Sens. 2015, 53, 3658–3668.
40. Simoes, M.; Bioucas-Dias, J.; Almeida, L.B.; Chanussot, J. A convex formulation for hyperspectral image superresolution via subspace-based regularization. IEEE Trans. Geosci. Remote Sens. 2014, 53, 3373–3388.
41. Long, J.; Peng, Y.; Li, J.; Zhang, L.; Xu, Y. Hyperspectral image super-resolution via subspace-based fast low tensor multi-rank regularization. Infrared Phys. Technol. 2021, 116, 103631.
42. Zhang, L.; Wei, W.; Bai, C.; Gao, Y.; Zhang, Y. Exploiting clustering manifold structure for hyperspectral imagery super-resolution. IEEE Trans. Image Process. 2018, 27, 5969–5982.
43. Wald, L. Quality of high resolution synthesised images: Is there a simple criterion? In Proceedings of the Third International Conference Fusion of Earth Data: Merging Point Measurements, Raster Maps and Remotely Sensed Images, Sophia Antipolis, France, 26–28 January 2000; pp. 99–103.
44. Yuhas, R.H.; Goetz, A.F.; Boardman, J.W. Discrimination among semi-arid landscape endmembers using the spectral angle mapper (SAM) algorithm. In Summaries of the Third Annual JPL Airborne Geoscience Workshop, Volume 1: AVIRIS Workshop, Pasadena, CA, USA, 1–5 June 1992.
45. Wang, Z.; Bovik, A.C. A universal image quality index. IEEE Signal Process. Lett. 2002, 9, 81–84.
| Dataset | SRF | PSF |
|---|---|---|
| Harvard | Nikon D700 [27] | Gaussian blur (standard deviation 2) |
| Pavia University | IKONOS [39] | Gaussian blur (standard deviation 2) |
| Sentinel-2 and Hyperion | Estimated by the HySure method [40] | Estimated by the HySure method [40] |
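The degradations in the table above follow the usual simulation protocol: the reference HR-HSI is blurred band-wise with the stated Gaussian PSF and downsampled to form the HSI, and projected through the SRF to form the MSI. A minimal sketch follows; the random SRF is a stand-in for the actual Nikon D700 [27] or IKONOS [39] responses, and the helper name simulate_pair is hypothetical.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def simulate_pair(hr_hsi: np.ndarray, srf: np.ndarray, s: int = 32):
    """hr_hsi: (H, W, L) reference cube; srf: (l, L) spectral response."""
    # Spatial degradation: band-wise Gaussian blur (sigma = 2 per Table 1),
    # followed by uniform downsampling with factor s.
    blurred = gaussian_filter(hr_hsi, sigma=(2, 2, 0))
    hsi = blurred[::s, ::s, :]
    # Spectral degradation: project each pixel's spectrum through the SRF.
    msi = hr_hsi @ srf.T
    return hsi, msi

hr = np.random.rand(512, 512, 31)                    # e.g., a Harvard cube
srf = np.random.rand(3, 31)
srf /= srf.sum(axis=1, keepdims=True)                # normalized band responses
hsi, msi = simulate_pair(hr, srf, s=32)
print(hsi.shape, msi.shape)                          # (16, 16, 31) (512, 512, 3)
```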
| Method | PSNR | SAM | ERGAS | UIQI | PSNR | SAM | ERGAS | UIQI |
|---|---|---|---|---|---|---|---|---|
| Best Values | +∞ | 0 | 0 | 1 | +∞ | 0 | 0 | 1 |
| CMF | 42.503 | 1.940 | 1.150 | 0.972 | 42.422 | 1.954 | 0.578 | 0.972 |
| CMF+ | 42.648 | 1.888 | 1.148 | 0.972 | 42.454 | 1.942 | 0.578 | 0.972 |
| LTMR | 43.519 | 1.827 | 1.054 | 0.972 | 39.957 | 3.247 | 0.648 | 0.957 |
| FLTMR | 43.782 | 1.775 | 1.042 | 0.973 | 38.792 | 4.003 | 0.705 | 0.946 |
| CSTF | 42.865 | 2.175 | 0.960 | 0.961 | 42.285 | 2.218 | 0.556 | 0.966 |
| NSSR | 28.997 | 3.891 | 1.841 | 0.905 | 28.431 | 3.428 | 0.896 | 0.926 |
| CMS | 43.350 | 1.589 | 1.032 | 0.970 | 42.015 | 2.223 | 0.624 | 0.965 |
| CNN-Fus | 42.036 | 1.980 | 1.168 | 0.970 | 41.948 | 2.005 | 0.588 | 0.969 |
| MIAE | 42.127 | 1.844 | 2.632 | 0.971 | 40.566 | 2.082 | 1.230 | 0.970 |
| CUCaNet | 41.447 | 1.725 | 1.500 | 0.972 | 39.929 | 2.033 | 1.530 | 0.970 |
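For reference, the Best Values row reflects the standard conventions of these indices: PSNR is unbounded above, SAM and ERGAS vanish for a perfect reconstruction, and UIQI peaks at 1. A hedged sketch of their common band-wise definitions follows (ERGAS as in Wald [43], SAM as in Yuhas et al. [44], UIQI as in Wang and Bovik [45], here in its global rather than sliding-window form); the authors' exact implementation may differ in the peak value or the averaging order.

```python
import numpy as np

def psnr(ref, est, peak=1.0):
    # Mean PSNR over bands; higher is better (best value: +inf).
    mse = np.mean((ref - est) ** 2, axis=(0, 1))
    return float(np.mean(10 * np.log10(peak ** 2 / mse)))

def sam(ref, est):
    # Mean spectral angle in degrees; lower is better (best value: 0).
    num = np.sum(ref * est, axis=2)
    den = np.linalg.norm(ref, axis=2) * np.linalg.norm(est, axis=2)
    return float(np.degrees(np.mean(np.arccos(np.clip(num / den, -1, 1)))))

def ergas(ref, est, s):
    # Relative dimensionless global error; lower is better (best value: 0).
    rmse = np.sqrt(np.mean((ref - est) ** 2, axis=(0, 1)))
    mean = np.mean(ref, axis=(0, 1))
    return float(100.0 / s * np.sqrt(np.mean((rmse / mean) ** 2)))

def uiqi(ref, est):
    # Universal image quality index, averaged over bands (best value: 1).
    scores = []
    for b in range(ref.shape[2]):
        x, y = ref[..., b].ravel(), est[..., b].ravel()
        cov = np.cov(x, y)
        scores.append(4 * cov[0, 1] * x.mean() * y.mean()
                      / ((cov[0, 0] + cov[1, 1]) * (x.mean() ** 2 + y.mean() ** 2)))
    return float(np.mean(scores))
```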
| Method | PSNR | SAM | ERGAS | UIQI | PSNR | SAM | ERGAS | UIQI |
|---|---|---|---|---|---|---|---|---|
| Best Values | +∞ | 0 | 0 | 1 | +∞ | 0 | 0 | 1 |
| CMF | 53.026 | 1.751 | 0.983 | 0.445 | 53.021 | 1.752 | 0.492 | 0.445 |
| CMF+ | 53.047 | 1.744 | 0.982 | 0.447 | 53.027 | 1.750 | 0.492 | 0.445 |
| LTMR | 47.810 | 2.325 | 1.365 | 0.418 | 47.285 | 2.576 | 0.753 | 0.412 |
| FLTMR | 47.369 | 2.392 | 1.416 | 0.414 | 46.215 | 3.116 | 0.849 | 0.390 |
| CSTF | 52.799 | 1.775 | 0.989 | 0.414 | 52.672 | 1.807 | 0.502 | 0.419 |
| NSSR | 46.085 | 3.251 | 1.361 | 0.344 | 52.610 | 1.835 | 0.508 | 0.447 |
| CMS | 50.395 | 1.747 | 1.151 | 0.440 | 30.576 | 26.807 | 5.145 | 0.102 |
| CNN-Fus | 52.585 | 1.801 | 1.003 | 0.407 | 52.578 | 1.804 | 0.501 | 0.406 |
| MIAE | 52.963 | 1.763 | 0.493 | 0.443 | 52.932 | 1.771 | 0.490 | 0.445 |
| CUCaNet | 51.066 | 1.807 | 1.037 | 0.442 | 49.063 | 1.985 | 0.589 | 0.430 |
| Method | PSNR | SAM | ERGAS | UIQI | PSNR | SAM | ERGAS | UIQI |
|---|---|---|---|---|---|---|---|---|
| Best Values | +∞ | 0 | 0 | 1 | +∞ | 0 | 0 | 1 |
| CMF | 43.258 | 1.997 | 0.281 | 0.989 | 43.219 | 1.998 | 0.141 | 0.989 |
| CMF+ | 43.416 | 2.060 | 0.275 | 0.989 | 43.253 | 2.042 | 0.140 | 0.989 |
| LTMR | 33.655 | 6.802 | 0.958 | 0.935 | 31.292 | 9.074 | 0.656 | 0.908 |
| FLTMR | 32.108 | 8.047 | 1.161 | 0.908 | 30.340 | 10.145 | 0.733 | 0.892 |
| CSTF | 42.613 | 2.121 | 0.303 | 0.987 | 41.566 | 2.414 | 0.179 | 0.986 |
| NSSR | 26.883 | 5.591 | 1.705 | 0.901 | 26.554 | 4.156 | 0.859 | 0.923 |
| CMS | 36.507 | 5.168 | 0.746 | 0.946 | 34.313 | 6.555 | 0.492 | 0.930 |
| CNN-Fus | 41.517 | 2.477 | 0.342 | 0.987 | 41.447 | 2.476 | 0.173 | 0.987 |
| MIAE | 34.605 | 4.039 | 0.742 | 0.971 | 39.919 | 2.459 | 0.199 | 0.986 |
| CUCaNet | 43.582 | 1.534 | 0.256 | 0.993 | 42.064 | 1.956 | 0.153 | 0.991 |
| Method | imgb2 | imgh7 | Pavia | imgb2 | imgh7 | Pavia | Real | Average |
|---|---|---|---|---|---|---|---|---|
| CMF | 0.053 | 0.053 | 0.517 | 0.067 | 0.059 | 0.048 | 0.082 | 0.126 |
| CMF+ | 1.227 | 1.228 | 0.903 | 1.154 | 1.174 | 0.862 | 1.808 | 1.194 |
| LTMR | 207.873 | 221.029 | 53.998 | 204.521 | 231.528 | 55.572 | 117.764 | 156.041 |
| FLTMR | 52.842 | 58.508 | 12.966 | 44.570 | 51.512 | 16.891 | 26.266 | 37.651 |
| CSTF | 130.619 | 129.765 | 102.492 | 105.265 | 135.655 | 125.971 | 233.009 | 137.539 |
| NSSR | 125.331 | 124.088 | 51.889 | 98.379 | 128.970 | 59.890 | 166.427 | 107.853 |
| CMS | 679.978 | 758.924 | 253.499 | 436.439 | 559.609 | 560.760 | 1100.200 | 621.344 |
| CNN-Fus | 167.128 | 171.829 | 63.745 | 132.762 | 167.446 | 73.035 | 26.529 | 114.639 |
| MIAE | 406.732 | 383.646 | 174.546 | 1678.538 | 1565.110 | 976.243 | 295.694 | 782.930 |
| CUCaNet | 5880 | 6469 | 5139 | 5767 | 5826 | 5097 | 5027 | 5600.714 |