A New Look at Image Fusion Methods from a Bayesian Perspective
"> Figure 1
<p>Color compositions of (<b>a</b>) the original QuickBird MS image (2.4 m) at 600 × 600 pixels combined from red, green and blue bands and (<b>b</b>) the original WorldView-2 MS image (2.0 m) at 800 × 800 pixels combined from green, red and red edge bands. The degraded Pan images for (<b>c</b>) QuickBird at 2.4 m and (<b>d</b>) WorldView-2 at 2.0 m.</p> "> Figure 2
<p>Relative spectral radiance response of the WorldView-2 instrument. The bandwidths are Coastal: 400–450 nm, Blue: 450–510 nm, Green: 510–580 nm, Yellow: 585–625 nm, Red: 630–690 nm, Red Edge: 705–745 nm, NIR1: 770–895 nm and NIR2: 860–1040 nm. (Image credit: Digital Globe.)</p> "> Figure 3
<p>True color display of the fused results from different strategies for the QuickBird dataset at the reduced scale. (<b>a</b>): Observed at a 2.4-m spatial resolution; (<b>b</b>): EXP; (<b>c</b>): GS; (<b>d</b>): GS-spectrally consistent; (<b>e</b>): GLP-ECB; (<b>f</b>): GLP-ECB-spectrally consistent; (<b>g</b>): GLP-M3; (<b>h</b>): GLP-M3-spectrally consistent; (<b>i</b>): GLP with <span class="html-italic">s</span> = 0.75; (<b>j</b>): Spectrally consistent GLP with <span class="html-italic">s</span> = 0.75.</p> "> Figure 3 Cont.
<p>True color display of the fused results from different strategies for the QuickBird dataset at the reduced scale. (<b>a</b>): Observed at a 2.4-m spatial resolution; (<b>b</b>): EXP; (<b>c</b>): GS; (<b>d</b>): GS-spectrally consistent; (<b>e</b>): GLP-ECB; (<b>f</b>): GLP-ECB-spectrally consistent; (<b>g</b>): GLP-M3; (<b>h</b>): GLP-M3-spectrally consistent; (<b>i</b>): GLP with <span class="html-italic">s</span> = 0.75; (<b>j</b>): Spectrally consistent GLP with <span class="html-italic">s</span> = 0.75.</p> "> Figure 4
<p>A subset, marked in the left white box in <a href="#remotesensing-07-06828-f001" class="html-fig">Figure 1</a>, of different fusion results at the reduced scale of the WorldView-2 group 1 dataset. Bands 3, 5 and 6 are combined for RGB display. (<b>a</b>): Observed at a 2.0-m spatial resolution; (<b>b</b>): EXP; (<b>c</b>): GS; (<b>d</b>): GS-spectrally consistent; (<b>e</b>): GLP-ECB; (<b>f</b>): GLP-ECB-spatially consistent; (<b>g</b>): GLP-M3; (<b>h</b>): GLP-M3-spectrally consistent;(<b>i</b>): GLP with <span class="html-italic">s</span> = 0.75; (<b>j</b>): Spectrally consistent GLP with <span class="html-italic">s</span> = 0.75.</p> "> Figure 4 Cont.
<p>A subset, marked in the left white box in <a href="#remotesensing-07-06828-f001" class="html-fig">Figure 1</a>, of different fusion results at the reduced scale of the WorldView-2 group 1 dataset. Bands 3, 5 and 6 are combined for RGB display. (<b>a</b>): Observed at a 2.0-m spatial resolution; (<b>b</b>): EXP; (<b>c</b>): GS; (<b>d</b>): GS-spectrally consistent; (<b>e</b>): GLP-ECB; (<b>f</b>): GLP-ECB-spatially consistent; (<b>g</b>): GLP-M3; (<b>h</b>): GLP-M3-spectrally consistent;(<b>i</b>): GLP with <span class="html-italic">s</span> = 0.75; (<b>j</b>): Spectrally consistent GLP with <span class="html-italic">s</span> = 0.75.</p> "> Figure 5
<p>A subset, marked in the right white box in <a href="#remotesensing-07-06828-f001" class="html-fig">Figure 1</a>, of different fusion results at the reduced scale of the WorldView-2 group 2 dataset. Bands 1, 7 and 8 are combined for RGB display. (<b>a</b>): Observed at a 2.0-m spatial resolution; (<b>b</b>): EXP; (<b>c</b>): GS; (<b>d</b>): GS-spectrally consistent; (<b>e</b>): GLP-ECB; (<b>f</b>): GLP-ECB-spatially consistent; (<b>g</b>): GLP-M3; (<b>h</b>): GLP-M3-spectrally consistent;(<b>i</b>): GLP with <span class="html-italic">s</span> = 0.75; (<b>j</b>): Spectrally consistent GLP with <span class="html-italic">s</span> = 0.75.</p> "> Figure 5 Cont.
<p>A subset, marked in the right white box in <a href="#remotesensing-07-06828-f001" class="html-fig">Figure 1</a>, of different fusion results at the reduced scale of the WorldView-2 group 2 dataset. Bands 1, 7 and 8 are combined for RGB display. (<b>a</b>): Observed at a 2.0-m spatial resolution; (<b>b</b>): EXP; (<b>c</b>): GS; (<b>d</b>): GS-spectrally consistent; (<b>e</b>): GLP-ECB; (<b>f</b>): GLP-ECB-spatially consistent; (<b>g</b>): GLP-M3; (<b>h</b>): GLP-M3-spectrally consistent;(<b>i</b>): GLP with <span class="html-italic">s</span> = 0.75; (<b>j</b>): Spectrally consistent GLP with <span class="html-italic">s</span> = 0.75.</p> "> Figure 6
<p>ERGAS, SAM (<b>a</b>) and Q4 (<b>b</b>) as functions of the <span class="html-italic">s</span> value in the GLP-based methods for all three datasets, including WorldView-2 (WV) and QuickBird (QB).</p> ">
Abstract
1. Introduction
1.1. Image Fusion from a Bayesian Perspective
1.2. Spectral Consistency, Spectral Preservation and Spatial Injection Models: Modeling Step
1.3. Maximum A Posteriori (MAP) Estimation: Inversion Step
1.4. Extended General Image Fusion (EGIF) Framework
1.5. CS Methods from a Bayesian Perspective
CS | MRA | ω/ωn
---|---|---
GIHS and IHS | GLP | iQ
GS (s = 1) | GLP-M3 (s = 0.5) | (Y = X for MRA)
Brovey | HPM | z̃n/ỹn (ỹn = x̃ for MRA)
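The table above pairs classical CS methods with their MRA counterparts through the injection gain ω/ωn. As an illustration only (not code from the paper), the following minimal sketch shows generic CS-style detail injection, where the choice of per-band gain reproduces GIHS-like behaviour (unit gain) versus Brovey/HPM-like behaviour (intensity-modulated gain); the `cs_fuse` helper and its plain band-average intensity are simplifying assumptions:

```python
import numpy as np

def cs_fuse(ms_up, pan, mode="gihs"):
    """Generic component-substitution pansharpening sketch.

    ms_up : (bands, H, W) MS image up-sampled to the Pan grid
    pan   : (H, W) panchromatic image
    mode  : 'gihs' injects the detail with unit gain (omega_n = 1);
            'brovey' modulates the detail by z_n / intensity (HPM-like).
    """
    # Synthetic intensity: plain band average (GIHS); a regression-based
    # intensity would give GS-like behaviour instead.
    intensity = ms_up.mean(axis=0)
    detail = pan - intensity                  # high-frequency component to inject
    if mode == "gihs":
        gains = np.ones_like(ms_up)           # omega_n = 1 for every band
    elif mode == "brovey":
        gains = ms_up / (intensity + 1e-12)   # band-dependent modulation
    else:
        raise ValueError(mode)
    return ms_up + gains * detail
```

For an MRA method the same structure applies, with the detail taken as the difference between the Pan image and its low-pass (e.g., GLP) version rather than a synthetic intensity.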
1.6. MRA Methods from a Bayesian Perspective
2. Relationships among Spectral Preservation, Spectral Consistency and Spatial Injection in Pansharpening
2.1. Relationship between the Multi-Resolution Analysis (MRA) and Component Substitution (CS) Methods
2.2. Degree of Spatial Enhancement
2.3. Are Spatial Injection and Spectral Preservation Competitive?
2.4. Are the Up-Sampled Images Spectrally Consistent?
2.5. How to Obtain a Spectrally Consistent Solution
Algorithm 1. Conjugate gradient search algorithm to avoid large matrix formation.

    Z(0) = ẑ                                  (initialize the solution Z(0) with the result of a traditional detail-injection method, ẑ)
    r(0) = b − AZ(0)                          (r has the same dimensions as Z; here r(0) = b − AZ(0) = HᵀCn⁻¹(z − HZ(0)), where z − HZ(0) is the error of the initial solution against the LR image z and Hᵀ expands this error into the HR dimension)
    p(0) = r(0)                               (p is the conjugate direction along which the solution is adjusted)
    k = 0
    repeat
        α(k) = [rᵀ(k)r(k)] / [pᵀ(k)Ap(k)]     (α is the optimal step length for adjusting Z(k))
        Z(k+1) = Z(k) + α(k)p(k)              (adjust the solution to be more spectrally consistent)
        r(k+1) = r(k) − α(k)Ap(k)
        if r(k+1) is sufficiently small (e.g., average element value below 1E−10 in our work) or k is sufficiently large (e.g., 5 in our work), exit the loop
        γ(k) = [rᵀ(k+1)r(k+1)] / [rᵀ(k)r(k)]  (γ and the next two statements update the direction p to guarantee conjugacy)
        p(k+1) = r(k+1) + γ(k)p(k)
        k = k + 1
    end repeat
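Algorithm 1 only ever applies A to a vector, so it can be implemented matrix-free. Below is a sketch under that assumption, with `apply_A` standing in for the operator product (e.g., HᵀCn⁻¹H) supplied by the caller rather than formed explicitly; the stopping rules mirror the ones quoted in the listing, with the residual test moved to the top of the loop:

```python
import numpy as np

def conjugate_gradient(apply_A, b, z0, max_iter=5, tol=1e-10):
    """Matrix-free conjugate gradient in the spirit of Algorithm 1.

    apply_A : callable returning A @ v without forming A explicitly
    b       : right-hand side vector
    z0      : initial solution (e.g., a traditional detail-injection result)
    """
    z = z0.copy()
    r = b - apply_A(z)                 # residual of the initial solution
    p = r.copy()                       # first conjugate search direction
    for _ in range(max_iter):
        if np.abs(r).mean() < tol:     # average element sufficiently small
            break
        Ap = apply_A(p)
        alpha = (r @ r) / (p @ Ap)     # optimal step length along p
        z = z + alpha * p              # move the solution along p
        r_new = r - alpha * Ap         # cheap residual update (no extra A apply)
        gamma = (r_new @ r_new) / (r @ r)
        p = r_new + gamma * p          # next direction, kept A-conjugate
        r = r_new
    return z
```

Because each iteration costs one operator application, five iterations on an image-sized Z stay far cheaper than building A (an HR-pixels × HR-pixels matrix) would be.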
2.6. Spectral Preservation Is Complementary to Spatial Injection
3. Experimental Confirmation
3.1. Experimental Data
3.2. Validation Strategies
3.3. Experimental Results
Category | No-Sharpening | CS | MRA-Local | MRA-Global | |||||||
---|---|---|---|---|---|---|---|---|---|---|---|
Method | EXP | GS | GS-S | ECB | ECB-S | M3, s = 0.5 | M3-S | s = 0.75 | s = 0.75, S | s = 0.9 | s = 0.9, S |
ERGAS | 5.132 | 4.690 | 3.515 | 5.122 | 4.303 | 3.942 | 3.312 | 4.790 | 4.086 | 5.699 | 4.815 |
Q4 | 0.579 | 0.851 | 0.890 | 0.822 | 0.860 | 0.853 | 0.894 | 0.825 | 0.861 | 0.793 | 0.831 |
SAM (°) | 4.946 | 5.511 | 4.449 | 6.029 | 4.993 | 5.064 | 4.123 | 5.513 | 4.491 | 6.104 | 4.840
SNR | 14.56 | 15.92 | 17.98 | 15.01 | 16.45 | 16.93 | 18.44 | 15.32 | 16.68 | 13.97 | 15.36 |
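Reduced-scale scores such as those above follow from the standard definitions of ERGAS and SAM. A hedged sketch, assuming band-first arrays and a Pan/MS resolution ratio of 4 (as for QuickBird); these helpers are illustrative, not the exact evaluation code used in the paper:

```python
import numpy as np

def ergas(fused, reference, ratio=4.0):
    """ERGAS: relative dimensionless global error (lower is better).

    fused, reference : (bands, H, W) arrays on the same grid
    ratio            : spatial resolution ratio between Pan and MS
    """
    bands = reference.shape[0]
    acc = 0.0
    for n in range(bands):
        rmse = np.sqrt(np.mean((fused[n] - reference[n]) ** 2))
        acc += (rmse / reference[n].mean()) ** 2    # RMSE relative to band mean
    return 100.0 / ratio * np.sqrt(acc / bands)

def sam_degrees(fused, reference):
    """Mean spectral angle (degrees) between per-pixel spectra."""
    f = fused.reshape(fused.shape[0], -1)
    r = reference.reshape(reference.shape[0], -1)
    dot = (f * r).sum(axis=0)
    denom = np.linalg.norm(f, axis=0) * np.linalg.norm(r, axis=0) + 1e-12
    angles = np.arccos(np.clip(dot / denom, -1.0, 1.0))
    return np.degrees(angles.mean())
```

Both metrics are zero for a perfect reconstruction, which is why the spectrally consistent ("-S") variants, designed to honor the LR observation, score markedly lower.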
Method | EXP | GS | GS-S | ECB | ECB-S | M3 | M3-S | s = 0.75 | s = 0.75, S | s = 0.9 | s = 0.9, S |
---|---|---|---|---|---|---|---|---|---|---|---|
ERGAS | 1.166 | 1.881 | 0.402 | 1.033 | 0.804 | 0.919 | 0.357 | 1.016 | 0.539 | 1.149 | 0.701 |
Q4 | 0.983 | 0.986 | 0.999 | 0.990 | 0.999 | 0.991 | 0.999 | 0.989 | 0.999 | 0.986 | 0.998 |
SAM (°) | 1.098 | 1.330 | 0.235 | 1.140 | 0.282 | 1.118 | 0.214 | 1.148 | 0.256 | 1.188 | 0.314
SNR | 27.26 | 23.93 | 36.79 | 28.65 | 31.89 | 29.46 | 37.50 | 28.61 | 34.21 | 27.65 | 32.23 |
Method | EXP | GS | GS-S | ECB | ECB-S | M3 | M3-S | s = 0.75 | s = 0.75, S | s = 0.9 | s = 0.9, S |
---|---|---|---|---|---|---|---|---|---|---|---|
Dλ | 0.037 | 0.117 | 0.034 | 0.081 | 0.057 | 0.070 | 0.030 | 0.119 | 0.055 | 0.149 | 0.078 |
Ds | 0.193 | 0.174 | 0.023 | 0.045 | 0.021 | 0.070 | 0.040 | 0.139 | 0.024 | 0.162 | 0.042 |
QNR | 0.777 | 0.729 | 0.944 | 0.878 | 0.922 | 0.864 | 0.931 | 0.758 | 0.922 | 0.713 | 0.883 |
Time | 0.02 | 0.28 | 6.30 | 3.10 | 9.12 | 0.27 | 6.29 | 0.29 | 6.42 | 0.28 | 6.44 |
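The QNR row follows deterministically from the Dλ and Ds rows: QNR = (1 − Dλ)^α (1 − Ds)^β with α = β = 1, so, for example, the EXP column gives (1 − 0.037)(1 − 0.193) ≈ 0.777. A one-line sketch:

```python
def qnr(d_lambda, d_s, alpha=1.0, beta=1.0):
    """Quality with No Reference: combines spectral distortion D_lambda and
    spatial distortion D_s; 1 means no detected distortion."""
    return (1.0 - d_lambda) ** alpha * (1.0 - d_s) ** beta
```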
Method | EXP | GS | GS-S | ECB | ECB-S | M3 | M3-S | s = 0.75 | s = 0.75, S | s = 0.9 | s = 0.9, S |
---|---|---|---|---|---|---|---|---|---|---|---|
ERGAS | 0.815 | 2.490 | 0.303 | 1.170 | 0.620 | 0.845 | 0.232 | 1.232 | 0.349 | 1.552 | 0.429 |
Q4 | 0.989 | 0.966 | 0.998 | 0.983 | 0.992 | 0.988 | 0.999 | 0.977 | 0.998 | 0.967 | 0.996 |
SAM (°) | 0.880 | 1.537 | 0.375 | 1.036 | 0.652 | 0.867 | 0.310 | 0.986 | 0.339 | 1.112 | 0.365
Method | EXP | GS | GS-S | ECB | ECB-S | M3 | M3-S | s = 0.75 | s = 0.75, S | s = 0.9 | s = 0.9, S |
---|---|---|---|---|---|---|---|---|---|---|---|
ERGAS | 9.340 | 5.573 | 4.368 | 4.093 | 3.846 | 4.225 | 3.939 | 4.033 | 3.728 | 3.994 | 3.671 |
Q4 | 0.542 | 0.875 | 0.906 | 0.923 | 0.931 | 0.912 | 0.924 | 0.923 | 0.933 | 0.926 | 0.936 |
SAM (°) | 6.106 | 6.184 | 5.400 | 5.820 | 5.290 | 6.019 | 5.317 | 5.950 | 5.282 | 5.935 | 5.275
SNR | 9.92 | 14.92 | 16.52 | 17.13 | 17.64 | 16.83 | 17.42 | 17.24 | 17.90 | 17.33 | 18.03 |
Method | EXP | GS | GS-S | ECB | ECB-S | M3 | M3-S | s = 0.75 | s = 0.75, S | s = 0.9 | s = 0.9, S |
---|---|---|---|---|---|---|---|---|---|---|---|
ERGAS | 1.908 | 2.439 | 0.168 | 0.646 | 0.190 | 0.644 | 0.136 | 0.663 | 0.139 | 0.685 | 0.143 |
Q4 | 0.977 | 0.982 | 1.000 | 0.997 | 1.000 | 0.997 | 1.000 | 0.997 | 1.000 | 0.997 | 1.000 |
SAM (°) | 1.154 | 1.333 | 0.205 | 1.009 | 0.284 | 1.070 | 0.183 | 1.064 | 0.183 | 1.067 | 0.184
SNR | 23.13 | 21.65 | 44.21 | 32.75 | 43.16 | 32.73 | 46.02 | 32.50 | 45.79 | 32.23 | 45.55 |
Method | EXP | GS | GS-S | ECB | ECB-S | M3 | M3-S | s = 0.75 | s = 0.75, S | s = 0.9 | s = 0.9, S |
---|---|---|---|---|---|---|---|---|---|---|---|
Dλ | 0.046 | 0.013 | 0.015 | 0.021 | 0.012 | 0.030 | 0.013 | 0.035 | 0.018 | 0.036 | 0.020 |
Ds | 0.197 | 0.023 | 0.024 | 0.010 | 0.023 | 0.015 | 0.019 | 0.018 | 0.009 | 0.019 | 0.007 |
QNR | 0.766 | 0.964 | 0.962 | 0.969 | 0.965 | 0.955 | 0.969 | 0.948 | 0.974 | 0.945 | 0.974 |
Time | 0.03 | 0.50 | 11.20 | 5.64 | 16.25 | 0.48 | 11.22 | 0.49 | 11.21 | 0.49 | 11.31 |
Method | EXP | GS | GS-S | ECB | ECB-S | M3 | M3-S | s = 0.75 | s = 0.75, S | s = 0.9 | s = 0.9, S |
---|---|---|---|---|---|---|---|---|---|---|---|
ERGAS | 1.453 | 2.894 | 0.732 | 1.191 | 0.934 | 1.104 | 0.642 | 1.191 | 0.776 | 1.230 | 0.838 |
Q4 | 0.990 | 0.965 | 0.997 | 0.993 | 0.996 | 0.994 | 0.998 | 0.993 | 0.997 | 0.993 | 0.997 |
SAM (°) | 0.927 | 1.433 | 0.420 | 0.860 | 0.523 | 0.925 | 0.353 | 0.931 | 0.417 | 0.936 | 0.448
Method | EXP | GS | GS-S | ECB | ECB-S | M3 | M3-S | s = 0.75 | s = 0.75, S | s = 0.9 | s = 0.9, S |
---|---|---|---|---|---|---|---|---|---|---|---|
ERGAS | 8.098 | 11.01 | 5.404 | 6.289 | 5.474 | 6.129 | 5.423 | 5.988 | 5.130 | 7.206 | 5.676 |
Q4 | 0.516 | 0.705 | 0.849 | 0.803 | 0.845 | 0.751 | 0.820 | 0.810 | 0.858 | 0.801 | 0.852 |
SAM (°) | 8.769 | 9.850 | 6.581 | 7.383 | 6.600 | 7.729 | 6.735 | 7.206 | 6.383 | 8.428 | 6.934
SNR | 12.48 | 12.80 | 16.04 | 16.18 | 16.98 | 16.11 | 16.93 | 16.36 | 17.36 | 15.50 | 16.88 |
Method | EXP | GS | GS-S | ECB | ECB-S | M3 | M3-S | s = 0.75 | s = 0.75, S | s = 0.9 | s = 0.9, S |
---|---|---|---|---|---|---|---|---|---|---|---|
ERGAS | 1.632 | 7.631 | 0.353 | 1.368 | 0.412 | 1.202 | 0.220 | 1.333 | 0.262 | 1.793 | 0.480 |
Q4 | 0.974 | 0.826 | 0.999 | 0.985 | 0.999 | 0.987 | 1.000 | 0.986 | 0.999 | 0.976 | 0.998 |
SAM (°) | 1.726 | 6.976 | 0.388 | 1.587 | 0.487 | 1.484 | 0.263 | 1.547 | 0.283 | 1.989 | 0.480
SNR | 25.97 | 17.64 | 40.60 | 30.16 | 38.13 | 30.65 | 42.69 | 29.95 | 41.36 | 28.40 | 38.15 |
Method | EXP | GS | GS-S | ECB | ECB-S | M3 | M3-S | s = 0.75 | s = 0.75, S | s = 0.9 | s = 0.9, S |
---|---|---|---|---|---|---|---|---|---|---|---|
Dλ | 0.024 | 0.065 | 0.023 | 0.060 | 0.044 | 0.072 | 0.045 | 0.078 | 0.050 | 0.072 | 0.048 |
Ds | 0.151 | 0.217 | 0.077 | 0.049 | 0.042 | 0.039 | 0.026 | 0.068 | 0.021 | 0.067 | 0.013 |
QNR | 0.828 | 0.732 | 0.902 | 0.894 | 0.915 | 0.892 | 0.931 | 0.860 | 0.930 | 0.866 | 0.939 |
Time | 0.03 | 0.50 | 11.29 | 5.67 | 16.32 | 0.52 | 11.15 | 0.49 | 11.10 | 0.50 | 11.26 |
Method | EXP | GS | GS-S | ECB | ECB-S | M3 | M3-S | s = 0.75 | s = 0.75, S | s = 0.9 | s = 0.9, S |
---|---|---|---|---|---|---|---|---|---|---|---|
ERGAS | 1.286 | 8.393 | 0.375 | 1.369 | 0.812 | 1.087 | 0.267 | 1.345 | 0.371 | 1.719 | 0.477 |
Q4 | 0.989 | 0.804 | 0.999 | 0.988 | 0.995 | 0.992 | 1.000 | 0.989 | 0.999 | 0.982 | 0.999 |
SAM (°) | 1.365 | 6.644 | 0.381 | 1.364 | 0.738 | 1.202 | 0.217 | 1.336 | 0.271 | 1.612 | 0.348
3.3.1. Degree of Spatial Enhancement
3.3.2. Are the Up-Sampled Images Spectrally Consistent?
3.3.3. Spectral Preservation Is Complementary to Spatial Injection
3.4. Discussions
Dataset | Parameter | Band 1 | Band 2 | Band 3 | Band 4 | Average
---|---|---|---|---|---|---
QuickBird (σe = 429) | ωq: GS | 0.589 | 1.086 | 0.940 | 1.384 | 1.00
QuickBird (σe = 429) | sq: ECB | 0.565 | 0.536 | 0.575 | 0.715 | 0.597
WorldView-2, group 1 (σe = 154) | ωq: GS | 0.948 | 1.183 | 0.849 | 1.020 | 1.00
WorldView-2, group 1 (σe = 154) | sq: ECB | 0.621 | 0.595 | 0.583 | 0.655 | 0.613
WorldView-2, group 2 (σe = 4558) | ωq: GS | 0.237 | 0.318 | 1.806 | 1.639 | 1.00
WorldView-2, group 2 (σe = 4558) | sq: ECB | 0.662 | 0.649 | 0.689 | 0.705 | 0.676
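Per-band GS weights ωq like those in the table relate the Pan image to the MS bands; a common way to estimate them (used by GS-adaptive variants) is least-squares regression of Pan on the up-sampled MS bands. A minimal sketch, where `regression_weights` is a hypothetical helper rather than the paper's exact estimator:

```python
import numpy as np

def regression_weights(ms_up, pan):
    """Estimate per-band injection weights by least-squares regression of
    the Pan image on the up-sampled MS bands (GS-adaptive style).

    ms_up : (bands, H, W) up-sampled MS image
    pan   : (H, W) panchromatic image on the same grid
    """
    X = ms_up.reshape(ms_up.shape[0], -1).T        # pixels x bands design matrix
    y = pan.ravel()
    # Append a constant column so an offset does not bias the band weights.
    design = np.column_stack([X, np.ones(len(y))])
    w, *_ = np.linalg.lstsq(design, y, rcond=None)
    return w[:-1]                                  # drop the intercept term
```

When Pan is exactly a linear mix of the bands, the regression recovers the mixing coefficients; on real data the residual reflects the spectral mismatch between the Pan response and the MS bands.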
3.5. Computational Complexity
4. Conclusions
Acknowledgments
Author Contributions
Appendix
A. The Relationship between this Bayesian Solution and Previous Solutions
B. The Meaning of Spatial Independence
Conflicts of Interest
© 2015 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/4.0/).
Zhang, H.K.; Huang, B. A New Look at Image Fusion Methods from a Bayesian Perspective. Remote Sens. 2015, 7, 6828–6861. https://doi.org/10.3390/rs70606828