CN109003256B - Multi-focus image fusion quality evaluation method based on joint sparse representation - Google Patents
- Publication number: CN109003256B (application CN109003256A)
- Application number: CN201810608310.XA
- Authority
- CN
- China
- Prior art keywords
- image
- fusion
- residual
- sparse
- sparse representation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30168—Image quality inspection
Abstract
The invention provides a multi-focus image fusion quality evaluation method based on joint sparse representation, which mainly comprises the following steps: performing joint sparse decomposition on the source images and the fused image by using a joint sparse representation model to obtain the residual coefficient matrices of the source images; calculating the residual coefficient of each atom from the source-image residual coefficient matrices; calculating the residue of each atom from the source-image residual coefficient matrices; and, in accordance with the goal of multi-focus image fusion, namely combining as much salient information from the source images as possible, evaluating the multi-focus image fusion quality using the atom residue and the atom residual coefficients. The method applies the sparse representation model to multi-focus image fusion quality evaluation; it provides a way to compare image differences at the level of atoms via joint sparse representation; and it quantitatively analyzes the residual behaviour of atoms of varying significance, giving the fusion quality evaluation a clearer physical meaning.
Description
Technical Field
The invention belongs to the technical field of image applications such as digital image processing and machine vision, and particularly relates to a multi-focus image fusion quality evaluation method based on joint sparse representation.
Background
Multi-focus image fusion is a branch of image fusion technology, which is a widely used computational imaging technique. Multi-focus image fusion effectively overcomes the limitation that a camera lens can focus only at a single depth of field. By combining several images of the same scene, each focused at a different depth, into one image, objects at different depths can all be imaged clearly. Fig. 1 illustrates multi-focus image fusion: in image (a) the left side is in focus, in image (b) the right side is in focus, and the fusion result (c) integrates the sharp parts of the two source images so that the whole scene is in focus. As shown in Fig. 1, the purpose of multi-focus image fusion is to combine as much salient information (in-focus, sharp content) from the source images as possible into the fusion result, yielding an "everywhere in focus" image.
Fusion quality evaluation is the other important aspect of image fusion applications; the two main concerns are how to select a proper fusion algorithm and how to evaluate the fusion quality. The main principles for evaluating a fusion algorithm are:
(1) the fusion image should contain useful information in each source image as much as possible;
(2) artificial false information should not be introduced into the fusion image;
(3) the method should remain reliable and stable under possible spatial misregistration in the source images;
(4) the method should have a certain noise-resistance capability.
The fusion quality evaluation specifically comprises two methods of subjective evaluation and objective evaluation.
Subjective evaluation is performed by experienced observers who inspect different fusion results and give an assessment of the fusion quality. Since subjective evaluation is easily disturbed and limited by various subjective factors, no-reference objective evaluation methods are widely used in practice. Existing objective image fusion quality indexes are based on simple low-level image features (gradients, edges and the like) and evaluate fusion quality by measuring how well those features transfer from the source images to the fusion result; the idea of common sparsity has not yet been applied to multi-focus image fusion quality evaluation.
Disclosure of Invention
In view of the above, the present invention is directed to a multi-focus image fusion quality evaluation method based on joint sparse representation, which analyzes the residue and the residual coefficients of atoms by means of joint sparse representation and uses them to evaluate multi-focus image fusion quality.
In order to achieve the purpose, the technical scheme of the invention is realized as follows:
a multi-focus image fusion quality evaluation method based on joint sparse representation specifically comprises the following steps:
(1) performing joint sparse decomposition on the source image and the fusion image by using a joint sparse representation model to obtain a residual coefficient matrix of the source image;
(2) calculating the residual coefficient of each atom by using the residual coefficient matrix of the source image;
(3) calculating the residue of atoms by using a source image residue coefficient matrix;
(4) and evaluating the quality of multi-focus image fusion by using the atomic residual degree and the atomic residual coefficient.
Further, the method for fusing images in step (1) specifically includes
(a1) Constructing a sparse representation over-complete atom library dictionary;
(a2) dividing a source image into small blocks by adopting a sliding window method, taking each small block as an independent vector signal, and performing sparse decomposition on the independent vector signal by using an over-complete dictionary;
(a3) combining the sparse decomposition coefficients of the corresponding positions of the source images according to a fusion rule;
(a4) and reconstructing a final fusion image by the combined sparse decomposition coefficient and the overcomplete dictionary.
Further, in the step (1), the fused image and each source image are subjected to joint sparse decomposition by using formulas (4), (5) and (6):

$$\begin{bmatrix} y_1 \\ y_2 \end{bmatrix} = \mathbf{D} \begin{bmatrix} \alpha_C \\ \alpha_{U_1} \\ \alpha_{U_2} \end{bmatrix} \quad (4)$$

In the above formula, $\alpha_C$ represents the common sparse coefficients, $\alpha_U$ the difference sparse coefficients, and $\mathbf{D}$ the joint sparse dictionary; when two images are jointly decomposed:

$$\mathbf{D} = \begin{bmatrix} D & D & 0 \\ D & 0 & D \end{bmatrix} \quad (5)$$

In the above formula, D is an overcomplete dictionary and "0" is an all-zero matrix of the same size as D. Then, for two images, the joint sparse decomposition of the fused image F and the source image A can be expressed as:

$$\begin{bmatrix} Y_F \\ Y_A \end{bmatrix} = \begin{bmatrix} D & D & 0 \\ D & 0 & D \end{bmatrix} \begin{bmatrix} Y^{C}_{FA} \\ Y^{U}_{FA} \\ Y^{U}_{AF} \end{bmatrix} \quad (6)$$

In the above formula, $Y_F$ and $Y_A$ are the block-vectorized representations of the fused image F and the source image A, $Y^{C}_{FA}$ is the sparse representation coefficient matrix of the information common to the two images, $Y^{U}_{FA}$ is the sparse representation coefficient matrix of the information unique to the fused image F relative to the source image A, and $Y^{U}_{AF}$ is the sparse representation coefficient matrix of the information unique to the source image A relative to the fused image F. Formula (6) is solved with the OMP algorithm; the difference information of the source image relative to the fused image is regarded as unfused residual information, and its sparse representation coefficients are called the residual coefficient matrix.
Further, in the step (2), the residual coefficient matrix is used to calculate the cumulative residual coefficients of all atoms in the overcomplete dictionary used in the current fusion result.
Further, in the step (3), a ratio of a sum of absolute values of the atomic residue coefficients to a sum of residue coefficients of all atoms is used as the atomic residue.
Further, the feature fusion capability of the fusion algorithm is evaluated by utilizing the product of the atom residual coefficient and the atom residual degree.
Compared with the prior art, the multi-focus image fusion quality evaluation method based on joint sparse representation has the following advantages:
(1) the method applies the sparse representation model to the quality evaluation of the multi-focus image fusion;
(2) the method provides a method for comparing image differences from an atomic angle by using joint sparse representation;
(3) the method quantitatively analyzes the residual conditions of atoms with various degrees of significance, so that the fusion quality evaluation has more definite physical significance.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate an embodiment of the invention and, together with the description, serve to explain the invention and not to limit the invention. In the drawings:
FIG. 1 is a schematic diagram of multi-focus image fusion according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of an image fusion process based on sparse representation according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating an overcomplete dictionary according to an embodiment of the present invention;
FIG. 4 is a diagram of 10 atom images with the lowest residue and 10 atom images with the highest residue in a dictionary according to an embodiment of the present invention;
FIG. 5 is a diagram of discrete cosine transforms of the 5 lowest residual atoms and the 5 highest residual atoms in a dictionary according to an embodiment of the present invention;
FIG. 6 is a flow chart of a method of the present invention.
Detailed Description
It should be noted that the embodiments and features of the embodiments may be combined with each other without conflict.
The present invention will be described in detail below with reference to the embodiments with reference to the attached drawings.
The invention discloses a multi-focus image fusion quality evaluation method based on joint sparse representation. The core idea of the invention is to apply the concept of 'signal common sparsity' in the compressive sensing theory to the multi-focus image fusion quality evaluation, and the method mainly comprises the following steps:
1. Apply a Joint Sparse Representation (JSR) model to perform joint sparse decomposition of the source images and the fused image, obtaining a sparse representation of the common information and sparse representations of the respective difference information; the difference information of a source image is regarded as unfused residual information, and its sparse representation coefficients are called the residual coefficient matrix;
2. Calculate the residue of all atoms from the residual coefficient matrices; the residue indicates how difficult an atom is to fuse, is determined by the spatial characteristics of the atom, and is essentially independent of the fusion algorithm;
3. Calculate, from the residual coefficient matrices, the (cumulative) residual coefficients in the current fusion result of all atoms of the overcomplete dictionary used;
4. In accordance with the goal of multi-focus image fusion, namely combining as much salient information from the source images as possible, evaluate the multi-focus image fusion quality using the atom residue and the atom residual coefficients.
In the invention, sparse representation is a recent signal analysis method derived from compressed sensing theory. It represents a signal as a combination of "atoms" from an overcomplete (redundant) dictionary. The number of atoms in an overcomplete dictionary exceeds the dimensionality of the signal, which creates redundancy. Because of this overcompleteness, a signal has many (approximate) representations on the dictionary; the representation with the fewest non-zero coefficients (the sparsest) is the simplest and is called the sparse representation. Sparse representation can effectively reduce the data volume of a signal while reflecting image features more effectively, and it has been widely used in various image processing applications in recent years.
Given a matrix $D = \{d_1, d_2, \ldots, d_m\} \in \mathbb{R}^{n \times m}$, the matrix D is called a dictionary and each of its columns $d_1, d_2, \ldots, d_m$ is called an atom. Sparse representation theory states that any n-dimensional signal $x \in \mathbb{R}^n$ can be expressed as a linear combination of a few atoms of the dictionary:

$$x = D\alpha = \sum_{i=1}^{m} \alpha_i d_i$$

where $\alpha = (\alpha_1, \alpha_2, \ldots, \alpha_m)^T$ are the sparse representation coefficients. When the dictionary dimensions satisfy $n < m$, D is called an overcomplete (redundant) dictionary, and the above equation is underdetermined with infinitely many solutions. To make the representation of the signal unique, sparse representation theory introduces a sparsity constraint, i.e., the sparsest solution is sought among the infinitely many solutions. The sparsity of a signal is usually measured by the number of non-zero elements in the coefficient vector, called the $L_0$ norm. With the sparsity constraint, the sparse solution can be obtained from the following $L_0$-norm optimization model:

$$\min_{\alpha} \|\alpha\|_0 \quad \text{s.t.} \quad x = D\alpha$$

Solving the minimum-$L_0$-norm problem is NP-hard. When the signal is sufficiently sparse, the problem can be converted into an $L_1$-norm problem:

$$\min_{\alpha} \|\alpha\|_1 \quad \text{s.t.} \quad \|x - D\alpha\|_2 \le \epsilon$$

In the above formula, $\epsilon$ is the allowable error and $\|\alpha\|_1$ is the sum of the absolute values of the sparse coefficients.
Current sparse representation solvers fall into greedy strategies, convex relaxation strategies and non-convex relaxation strategies. The greedy approach was first proposed by Mallat et al., and Matching Pursuit (MP) is the basis of many improved algorithms. The Orthogonal Matching Pursuit (OMP) algorithm, an improvement on MP, introduces a least-squares step to compute the signal approximation and is the most widely adopted at present. If the $L_0$-norm optimization is relaxed to $L_1$-norm optimization, solving for the sparse coefficients becomes a convex optimization problem; a representative algorithm is Basis Pursuit (BP). Non-convex strategies approximate sparsity with a non-convex function, mainly using $L_p$ (0 < p < 1) norm approximation.
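For concreteness, here is a minimal sketch of OMP-based sparse coding; it is illustrative only (the random dictionary, its sizes and the 3-sparse test signal are assumptions, not part of the disclosure) and uses scikit-learn's OrthogonalMatchingPursuit:

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

# Assumed setup: a random 64 x 256 overcomplete dictionary with unit-norm atoms
# and a synthetic 3-sparse signal x = D @ alpha_true.
rng = np.random.default_rng(0)
D = rng.standard_normal((64, 256))
D /= np.linalg.norm(D, axis=0)                 # unit-norm columns (atoms)
alpha_true = np.zeros(256)
alpha_true[[3, 57, 200]] = [1.5, -2.0, 0.7]
x = D @ alpha_true

# OMP greedily selects atoms and refits the selection by least squares at each step.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=3, fit_intercept=False)
omp.fit(D, x)                                  # columns of D play the role of atoms
alpha = omp.coef_                              # recovered sparse representation coefficients
print(np.flatnonzero(alpha))                   # indices of the three recovered atoms
```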
At present, overcomplete sparse representation theory is widely adopted in research on image denoising, image compression, image recognition, image super-resolution and the like. The reason is that overcompleteness makes the dictionary atoms more diverse, while sparsity lets the sparse representation select precisely the atoms most relevant to the signal being processed, enhancing the adaptivity of the signal processing method.
There are generally two methods of constructing overcomplete dictionaries:
(1) formed by expanding the basis functions of a multi-scale analysis method, such as the wavelet transform, the discrete cosine transform or the contourlet transform; combinations of transform bases are also possible;
(2) obtained through sample learning: the processing object, or some class of images, is learned and the extracted features form the dictionary. Compared with a fixed-basis dictionary, a learned dictionary has better representation performance; typical dictionary learning algorithms are K-SVD, MOD and PCA (a learning sketch is given below).
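As an illustration of route (2), the sketch below learns a patch dictionary from sample data. The text names K-SVD, MOD and PCA, but scikit-learn ships no K-SVD, so MiniBatchDictionaryLearning is substituted here as an assumption, with a synthetic training image standing in for real samples:

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.feature_extraction.image import extract_patches_2d

# Assumed stand-in for K-SVD: scikit-learn's online dictionary learner.
rng = np.random.default_rng(0)
train_img = rng.random((64, 64))                    # placeholder training image
patches = extract_patches_2d(train_img, (8, 8))     # all sliding 8x8 samples
X = patches.reshape(len(patches), -1)               # one 64-d row per patch
X -= X.mean(axis=1, keepdims=True)                  # remove per-patch DC, a common convention

learner = MiniBatchDictionaryLearning(n_components=256, transform_algorithm='omp',
                                      transform_n_nonzero_coefs=5, random_state=0)
learner.fit(X)
D = learner.components_.T                           # 64 x 256 dictionary, atoms as columns
```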
An image fusion method based on sparse representation comprises the following steps:
(1) constructing a sparse representation over-complete atom library dictionary;
(2) dividing a source image into small blocks (usually 8 x 8) by adopting a sliding window method, taking each small block as an independent vector signal, and performing sparse decomposition on the independent vector signal by using an over-complete dictionary;
(3) combining the sparse decomposition coefficients of the corresponding positions of the source images according to a fusion rule;
(4) and reconstructing a final fusion image by the combined sparse decomposition coefficient and the overcomplete dictionary.
FIG. 2 shows the image fusion process based on sparse representation; its three main steps are dictionary generation, sparse representation coefficient solving, and sparse representation coefficient fusion. A code sketch of the whole loop follows.
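The sketch below makes the four steps concrete. The max-L1 rule used for combining coefficients in step (3) is one common convention, and the helper name sr_fuse is ours; both are assumptions rather than part of the patent:

```python
import numpy as np
from sklearn.decomposition import sparse_encode
from sklearn.feature_extraction.image import (extract_patches_2d,
                                              reconstruct_from_patches_2d)

def sr_fuse(img_a, img_b, D, k=5):
    """Hypothetical SR fusion: patches -> OMP codes -> max-L1 rule -> rebuild."""
    # Steps (1)-(2): slide an 8x8 window and sparse-decompose every patch on D (64 x L).
    pa = extract_patches_2d(img_a, (8, 8)).reshape(-1, 64)
    pb = extract_patches_2d(img_b, (8, 8)).reshape(-1, 64)
    ca = sparse_encode(pa, D.T, algorithm='omp', n_nonzero_coefs=k)
    cb = sparse_encode(pb, D.T, algorithm='omp', n_nonzero_coefs=k)
    # Step (3): per patch position, keep the coefficient vector with the larger
    # L1 norm (the patch judged more salient) -- an assumed but widely used rule.
    keep_a = np.abs(ca).sum(axis=1) >= np.abs(cb).sum(axis=1)
    cf = np.where(keep_a[:, None], ca, cb)
    # Step (4): reconstruct the fused patches and average overlaps back into an image.
    pf = (cf @ D.T).reshape(-1, 8, 8)
    return reconstruct_from_patches_2d(pf, img_a.shape)
```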
In the present invention, the principle on which the joint sparse representation is based is: when the sensors with different attributes acquire the information of the same signal source, each sensing signal comprises two parts of information: common information and unique (difference) information, which may be sparsely represented using the same overcomplete dictionary. For image fusion, after different modality source images of the same scene are subjected to joint sparse representation, the sparse representation coefficient of each image can be divided into two parts: sparse coefficient of common part and sparse coefficient of difference part:
in the above formula, alphacRepresenting a common sparse coefficient, αURepresenting the difference sparse coefficients, D represents the joint sparse dictionary (when the two images are jointly decomposed):
in the above formula, D is an overcomplete dictionary, and "0" is a matrix of all 0's of the same size as D.
Then the joint sparse decomposition of the two images F, A may be expressed as:
in the above formula, YFAnd YAIs F, A (e.g., divide the image into 8 x 8 patches, convert each patch to 64 x 1 column vectors, and arrange the column vectors of all patches in the image laterally to form YFAnd YA),A sparse representation coefficient matrix representing information common to both images,the unique information representing the F image relative to a sparsely represents a coefficient matrix,the specific information representing the A image relative to the F sparsely represents a coefficient matrix.
In the invention, the atom residual degree refers to the residual (unabsorbed) degree of atoms in the source image after fusion, and atoms with high residual degree have poor fusion effect.
Atoms represent the basic shapes of which an image is composed. Fig. 3 shows a dictionary containing 256 atoms, each of size 8 x 8 (converted to 64 x 1 column vectors during sparse decomposition). As can be seen from Fig. 3, the atoms differ considerably. Some contain rapid changes and therefore carry higher significance (more information), while others represent background, are relatively smooth, and carry less. When atoms are combined into the fusion result, their mutual superposition inevitably subjects the highly significant atoms to larger interference and deformation, whereas low-significance atoms have relatively stronger shape-preserving capability and lose relatively little information. Thus, atoms of high significance have higher residue (poorer fusion effect), while atoms of low significance have lower residue (better fusion effect).
In order to obtain the residue of the atoms, the fusion result F and each source image (A, B) are subjected to joint sparse decomposition using formulas (4), (5) and (6), yielding the difference-information sparse representation coefficient matrices $Y^{U}_{AF}$ and $Y^{U}_{BF}$ of the two source images relative to the fusion result. Each coefficient in these two matrices is the residual coefficient of an atom left unfused in a source image; a residual coefficient of 0 means the atom has been completely absorbed into the fusion result.
Experimental analysis shows that, for a given atom of the dictionary, the ratio between the sum of the absolute values of its residual coefficients in $Y^{U}_{AF}$, $Y^{U}_{BF}$ and the sum of the residual coefficients of all atoms (the sum of the absolute values of all the coefficients) is basically fixed: it is largely unaffected by the fusion algorithm and is determined mainly by the spatial characteristics of the atom. The fusion algorithm mainly influences the magnitude of the residual coefficients. Based on this, we use the ratio of the sum of the absolute values of an atom's residual coefficients to the sum of the residual coefficients of all atoms as the atom residue; the specific calculation formula is given in the detailed description. Because high-significance atoms fuse poorly, and multi-focus image fusion quality evaluation targets the combination of "in-focus, sharp" information, the fusion effect of high-residue atoms is emphasized in the comparison. A specific approach is to express the focusing effect of each atom as atom residue x atom residual coefficient. Since high-significance atoms have larger residue, this amounts to amplifying their residual coefficients, so that the fusion effect of high-residue atoms is weighted most heavily.
Fig. 4 shows the 10 atom images with the lowest residue and the 10 atom images with the highest residue in the dictionary of Fig. 3. It can be seen that low-residue atoms show little spatial variation and monotonous content, while high-residue atoms contain richer detail variation.
Fig. 5 shows the discrete cosine transforms of the 5 lowest-residue and the 5 highest-residue atoms in the dictionary of Fig. 3. The figure shows the directly proportional relationship between the degree of spatial variation, the frequency-domain content and the atom residue.
According to joint sparse representation theory, the source images participating in fusion are different observations of the same scene, each containing common information and unique information. Since the fusion result is composed from the source images, the fused image can also be regarded as an observation signal homologous to the source images. Suppose the two source images participating in fusion are SA and SB and the fusion result is F. After joint sparse decomposition of (SA, SB):

$$S_A = S_C + S_{AU} = DY_C + DY_{AU}, \qquad S_B = S_C + S_{BU} = DY_C + DY_{BU}$$

where $S_C = DY_C$ is the common information between SA and SB, D is the dictionary used, $Y_C$ is its sparse representation coefficient matrix, $S_{AU}$ and $S_{BU}$ are the respective unique parts with sparse representations $DY_{AU}$ and $DY_{BU}$, and $Y_{AU}$, $Y_{BU}$ are the corresponding sparse coefficient matrices. Ideally the fusion result F contains all the information in SA and SB, in which case joint sparse decomposition of (SA, F) and (SB, F) would give $S_{AU} = S_{BU} = 0$. In practice, however, owing to the limited information-extraction capability of the fusion algorithm and the mutual interference when information is superimposed, F cannot contain all the information in SA and SB simultaneously, so $S_{AU}$ and $S_{BU}$ are not 0. In this case $Y_{AU}$ and $Y_{BU}$ contain the coefficients of the unfused parts of the atoms in the source images; the linear combination of these "residual" atoms constitutes the difference between the fusion result and the source images, and the magnitude of the residual atom coefficients therefore characterizes the magnitude of the corresponding difference. The present method evaluates the feature fusion capability of a fusion algorithm by analyzing $Y_{AU}$ and $Y_{BU}$.
The specific calculation procedure of the evaluation method is given below. Suppose the two source images participating in fusion are SA and SB and the fusion result is F. Two source images are taken as an example; the case of more source images is analogous.
1. Select an overcomplete dictionary $D \in \mathbb{R}^{64 \times L}$ with $L \gg 64$ (empirically, L = 256 or 512). The dictionary may be generated using K-SVD or another dictionary learning algorithm; the learning samples may be randomly extracted from the source images SA and SB, or selected from other image libraries;
2. Divide SA, SB and F (M rows, N columns) into 8 x 8 blocks, with adjacent blocks overlapping and a sliding step of 1; convert every block into a 64 x 1 column vector in column-first order; each image yields K = (M-8+1) x (N-8+1) such column vectors;
3. Arrange the column vectors obtained from F, SA and SB side by side into 64 x K two-dimensional matrices, denoted $Y_F$, $Y_{SA}$, $Y_{SB}$; jointly sparse-decompose $Y_F$ with $Y_{SA}$ and with $Y_{SB}$:

$$\begin{bmatrix} Y_F \\ Y_{SA} \end{bmatrix} = \begin{bmatrix} D & D & 0 \\ D & 0 & D \end{bmatrix} \begin{bmatrix} Y^{C}_{FA} \\ Y^{U}_{FA} \\ Y^{U}_{AF} \end{bmatrix} \quad (1)$$

$$\begin{bmatrix} Y_F \\ Y_{SB} \end{bmatrix} = \begin{bmatrix} D & D & 0 \\ D & 0 & D \end{bmatrix} \begin{bmatrix} Y^{C}_{FB} \\ Y^{U}_{FB} \\ Y^{U}_{BF} \end{bmatrix} \quad (2)$$

In formula (1), $Y^{C}_{FA}$ is the sparse representation coefficient matrix of the information common to SA and F, $Y^{U}_{FA}$ that of the difference information of F relative to SA, and $Y^{U}_{AF}$ that of the residual information of SA relative to F. In formula (2), $Y^{C}_{FB}$ is the sparse representation coefficient matrix of the information common to SB and F, $Y^{U}_{FB}$ that of the difference information of F relative to SB, and $Y^{U}_{BF}$ that of the residual information of SB relative to F. These coefficient matrices are of size L x K.
Formulas (1) and (2) are solved with the OMP (orthogonal matching pursuit) algorithm, yielding $Y^{C}_{FA}$, $Y^{U}_{AF}$, $Y^{U}_{FA}$ and $Y^{C}_{FB}$, $Y^{U}_{BF}$, $Y^{U}_{FB}$.
4. From the result of step 3, calculate the (cumulative) residual coefficient of atom i over all K image blocks:

$$C(i) = \sum_{k=1}^{K} \left( \left| Y^{U}_{AF}(i,k) \right| + \left| Y^{U}_{BF}(i,k) \right| \right)$$

5. Calculate the residue of atom i as the ratio of its cumulative residual coefficient to that of all atoms:

$$r(i) = \frac{C(i)}{\sum_{j=1}^{L} C(j)}$$
6. Evaluate the multi-focus image fusion quality with the index $Q_{JSR}$, built from the product of atom residue and atom residual coefficient:

$$Q_{JSR} = \sum_{i=1}^{L} r(i)\,C(i)$$

a smaller $Q_{JSR}$ means that less salient information remains unfused.
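Putting steps 4-6 into code, the sketch below computes the residual coefficients, the residues and the final score from the residual coefficient matrices $Y^{U}_{AF}$ and $Y^{U}_{BF}$ of formulas (1) and (2). The closed forms in the granted text are images, so the normalization and the residue-weighted sum here are our reading of the product rule, not a verbatim transcription:

```python
import numpy as np

def q_jsr(Y_AF_u, Y_BF_u):
    """Assumed reading of steps 4-6: cumulative residual coefficient C(i),
    residue r(i) = C(i) / sum_j C(j), and Q_JSR as the residue-weighted sum
    (the 'atom residue x atom residual coefficient' product in the text)."""
    C = np.abs(Y_AF_u).sum(axis=1) + np.abs(Y_BF_u).sum(axis=1)  # step 4, one value per atom
    r = C / C.sum()                                              # step 5, atom residue
    return float((r * C).sum())                                  # step 6, smaller = better fusion
```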
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.
Claims (6)
1. A multi-focus image fusion quality evaluation method based on joint sparse representation is characterized by comprising the following steps: the method specifically comprises the following steps:
(1) performing joint sparse decomposition on the source image and the fusion image by using a joint sparse representation model to obtain a residual coefficient matrix of the source image, wherein the residual coefficient matrix is the sparse representation coefficient matrix of the source image obtained by performing joint sparse decomposition on the source image and the fusion image;
(2) calculating the residual coefficient of each atom by using the residual coefficient matrix of the source image, wherein a residual coefficient is an array element of the residual coefficient matrix, i.e., the sparse representation coefficient, in a certain image block, of an atom of the overcomplete dictionary used for the joint sparse decomposition;
(3) calculating the residue of atoms by using a source image residue coefficient matrix, wherein the residue is the ratio of the sum of the absolute values of the residue coefficients of a certain atom in the overcomplete dictionary in all image blocks to the sum of the residue coefficients of all atoms;
(4) and evaluating the quality of multi-focus image fusion by using the atomic residual degree and the atomic residual coefficient.
2. The multi-focus image fusion quality evaluation method based on joint sparse representation according to claim 1, wherein: the method for fusing images in the step (1) specifically comprises
(a1) Constructing a sparse representation over-complete atom library dictionary;
(a2) dividing a source image into small blocks by adopting a sliding window method, taking each small block as an independent vector signal, and performing sparse decomposition on the independent vector signal by using an over-complete dictionary;
(a3) combining the sparse decomposition coefficients of the corresponding positions of the source images according to a fusion rule;
(a4) and reconstructing a final fusion image by the combined sparse decomposition coefficient and the overcomplete dictionary.
3. The multi-focus image fusion quality evaluation method based on joint sparse representation according to claim 2, wherein: in the step (1), the fused image and each source image are subjected to joint sparse decomposition by using formulas (4), (5) and (6):

$$\begin{bmatrix} y_1 \\ y_2 \end{bmatrix} = \mathbf{D} \begin{bmatrix} \alpha_C \\ \alpha_{U_1} \\ \alpha_{U_2} \end{bmatrix} \quad (4)$$

In the above formula, $\alpha_C$ represents the common sparse coefficients, $\alpha_U$ the difference sparse coefficients, and $\mathbf{D}$ the joint sparse dictionary; when two images are jointly decomposed:

$$\mathbf{D} = \begin{bmatrix} D & D & 0 \\ D & 0 & D \end{bmatrix} \quad (5)$$

In the above formula, D is an overcomplete dictionary and "0" is an all-zero matrix of the same size as D;
then, for two images, the joint sparse decomposition of the fused image F and the source image A can be expressed as:

$$\begin{bmatrix} Y_F \\ Y_A \end{bmatrix} = \begin{bmatrix} D & D & 0 \\ D & 0 & D \end{bmatrix} \begin{bmatrix} Y^{C}_{FA} \\ Y^{U}_{FA} \\ Y^{U}_{AF} \end{bmatrix} \quad (6)$$

In the above formula, $Y_F$ and $Y_A$ are the block-vectorized representations of the fused image F and the source image A, $Y^{C}_{FA}$ is the sparse representation coefficient matrix of the information common to the two images, $Y^{U}_{FA}$ is the sparse representation coefficient matrix of the information unique to the fused image F relative to the source image A, and $Y^{U}_{AF}$ is the sparse representation coefficient matrix of the information unique to the source image A relative to the fused image F; formula (6) is solved with the OMP algorithm, the difference information of the source image relative to the fused image is regarded as unfused residual information, and its sparse representation coefficients are called the residual coefficient matrix.
4. The multi-focus image fusion quality evaluation method based on joint sparse representation according to claim 3, wherein: in the step (2), the residual coefficient matrix is used for calculating the accumulated residual coefficients of all atoms in the overcomplete dictionary in the current fusion result.
5. The multi-focus image fusion quality evaluation method based on joint sparse representation according to claim 4, wherein: in the step (3), the ratio of the sum of the absolute values of the residual coefficients of an atom to the sum of the residual coefficients of all atoms is used as the atom residue.
6. The multi-focus image fusion quality evaluation method based on joint sparse representation according to claim 5, wherein: and evaluating the feature fusion capability of the fusion algorithm by utilizing the product of the atom residual coefficient and the atom residual degree.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201810608310.XA | 2018-06-13 | 2018-06-13 | Multi-focus image fusion quality evaluation method based on joint sparse representation |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109003256A CN109003256A (en) | 2018-12-14 |
CN109003256B true CN109003256B (en) | 2022-03-04 |
Family
ID=64600761
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810608310.XA Expired - Fee Related CN109003256B (en) | 2018-06-13 | 2018-06-13 | Multi-focus image fusion quality evaluation method based on joint sparse representation |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109003256B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109934794B (en) * | 2019-02-20 | 2020-10-27 | 常熟理工学院 | Multi-focus image fusion method based on significant sparse representation and neighborhood information |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102855616A (en) * | 2012-08-14 | 2013-01-02 | 西北工业大学 | Image fusion method based on multi-scale dictionary learning |
CN107292316A (en) * | 2017-05-31 | 2017-10-24 | 昆明理工大学 | A kind of method of the improving image definition based on rarefaction representation |
CN107341786A (en) * | 2017-06-20 | 2017-11-10 | 西北工业大学 | The infrared and visible light image fusion method that wavelet transformation represents with joint sparse |
CN108038852A (en) * | 2017-12-14 | 2018-05-15 | 天津师范大学 | A kind of Quality Measures for Image Fusion represented based on joint sparse |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9152881B2 (en) * | 2012-09-13 | 2015-10-06 | Los Alamos National Security, Llc | Image fusion using sparse overcomplete feature dictionaries |
CN105913413B (en) * | 2016-03-31 | 2019-02-22 | 宁波大学 | A kind of color image quality method for objectively evaluating based on online manifold learning |
- 2018-06-13: Application CN201810608310.XA filed in China; granted as CN109003256B, now not active (Expired - Fee Related)
Non-Patent Citations (2)
| Title |
|---|
| Xiuxia Ji, "An Improved Image Fusion Method of Infrared Image and SAR Image Based on Contourlet and Sparse Representation", 2015-11-23, pages 1-4 * |
| Wu Di et al., "Recent advances in image dehazing", Acta Automatica Sinica (《自动化学报》), 2015-02-28, Vol. 41, No. 2, pages 221-239 * |
Also Published As
Publication number | Publication date |
---|---|
CN109003256A (en) | 2018-12-14 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | GR01 | Patent grant | |
| | CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20220304 |