Disclosure of Invention
In view of the above-described deficiencies in the prior art, the present invention provides a multi-modal medical image registration method that fuses gradient information and generalized entropy similarity. The invention achieves registration of multi-modal medical images by combining gradient information with an information-theoretic similarity measure.
In order to solve the technical problems, the technical scheme adopted by the invention is as follows:
a multi-modal medical image registration method fusing gradient information and generalized entropy similarity comprises the following steps:
S1, constructing a generalized entropy similarity measure.
S1.1, defining the Arimoto entropy A_α;
where X is a discrete random variable, α is a parameter controlling the non-extensivity of the Arimoto entropy, M is the number of elements of the discrete random variable X, i indexes the elements of the probability distribution P of X, and p_i is the i-th element of the probability distribution P.
By L'Hôpital's rule, the limit of the Arimoto entropy as α → 1 equals the Shannon entropy, so the Shannon entropy can be regarded as a special case of the Arimoto entropy.
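For reference, the Arimoto entropy can be written in the standard form consistent with the definitions above; the patent's own formula is not reproduced here, so the following should be read as an assumed reconstruction:

```latex
\[
A_\alpha(X) = \frac{\alpha}{\alpha-1}\left[\,1-\Bigl(\sum_{i=1}^{M} p_i^{\alpha}\Bigr)^{1/\alpha}\right],
\qquad \alpha > 0,\ \alpha \neq 1 .
\]
```

Under this form, lim_{α→1} A_α(X) = −Σ_i p_i ln p_i, which is the Shannon entropy.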
S1.2, constructing the Jensen-Arimoto divergence (JAD).
In the formula, A_α(·) is the Arimoto entropy, and ω_i represents a weighting factor with ω_i ≥ 0 and Σω_i = 1. Since the limit of the Arimoto entropy as α → 1 is the Shannon entropy, the limit of JAD as α → 1 is the Jensen-Shannon divergence.
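As an assumed sketch consistent with the description above (Arimoto entropy A_α, weights ω_i), the Jensen-Arimoto divergence of probability distributions P_1, …, P_n can be written in the usual Jensen-type form:

```latex
\[
JA_\alpha(P_1,\dots,P_n) \;=\;
A_\alpha\!\Bigl(\sum_{i=1}^{n}\omega_i P_i\Bigr)\;-\;\sum_{i=1}^{n}\omega_i\,A_\alpha(P_i),
\qquad \omega_i \ge 0,\ \ \sum_{i=1}^{n}\omega_i = 1 .
\]
```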
And S1.3, constructing generalized entropy similarity measure.
where R(x) is the reference image; F is the floating image; F(T_μ(x)) is the transformed floating image; T represents a spatial transformation; x is a pixel point of the reference image R; T_μ(x) represents the transformed corresponding coordinate point; f_j represents a gray level of the transformed floating image and r_i a gray level of the reference image R(x); p(f_j|r_i) represents the conditional probability distribution of the transformed floating image given the reference image; p(r_i) represents the probability distribution of the reference image; p(f_j) represents the probability distribution of the floating image under the spatial transformation T; p(r_i, f_j) represents the joint probability distribution of the floating image and the reference image under the spatial transformation T.
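Under the same Jensen-type form, with the conditional distributions p(·|r_i) as arguments and the reference-image probabilities p(r_i) as weights, the generalized entropy similarity measure can be sketched as follows (an assumed reconstruction from the definitions above, not the patent's verbatim formula):

```latex
\[
JA_\alpha\bigl(F(T_\mu(x)),R(x)\bigr) \;=\;
A_\alpha\!\Bigl(\sum_i p(r_i)\,p(\cdot\mid r_i)\Bigr)\;-\;\sum_i p(r_i)\,A_\alpha\bigl(p(\cdot\mid r_i)\bigr)
\;=\; A_\alpha\bigl(p(f)\bigr)\;-\;\sum_i p(r_i)\,A_\alpha\bigl(p(\cdot\mid r_i)\bigr),
\]
```

since Σ_i p(r_i) p(f_j|r_i) = p(f_j).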
S2, constructing a gradient measure of the image to be registered.
And S2.1, obtaining the gradient direction of the image to be registered.
S2.1.1, calculating the gradient ∇R(x, σ) of the reference image R;
where ∇_h R(x, σ) represents the gradient of the reference image in the horizontal direction; ∇_v R(x, σ) represents the gradient of the reference image in the vertical direction; * represents convolution; G_x'(σ) denotes the first derivative in the x direction of the Gaussian kernel with scale σ; and G_y'(σ) denotes the first derivative in the y direction of the Gaussian kernel with scale σ.
S2.1.2, calculating the gradient ∇F(T_μ(x), σ) of the floating image F(T_μ(x)) under the spatial transformation T;
where T represents the spatial transformation; T_μ(x) represents the transformed corresponding coordinate point; μ is the spatial transformation parameter; ∇_h F(T_μ(x), σ) represents the gradient of the transformed floating image in the horizontal direction; and ∇_v F(T_μ(x), σ) represents the gradient of the transformed floating image in the vertical direction.
S2.1.3, calculating the included angle between the gradient ∇R(x, σ) of the reference image and the gradient ∇F(T_μ(x), σ) of the floating image under the transformation T;
where |∇R(x, σ)| represents the modulus of the gradient ∇R(x, σ), and |∇F(T_μ(x), σ)| represents the modulus of the gradient ∇F(T_μ(x), σ).
S2.1.4, calculating the direction of the image gradient to be registered.
S2.2, obtaining the modulus of the gradient of the images to be registered.
S2.2.1, calculating the modulus |∇R(x, σ)| of the reference image gradient;
S2.2.2, calculating the modulus |∇F(T_μ(x), σ)| of the gradient of the floating image under the transformation T;
S2.2.3, obtaining the modulus of the gradient of the images to be registered.
where σ denotes the scale of the Gaussian kernel and μ is the spatial transformation parameter.
S2.3, calculating the gradient measure of the image to be registered;
In the formula, ω indicates the direction information of the gradient and M the modulus of the gradient; ω can be regarded as the weight of the gradient modulus, and the gradient measure is the weighted sum over the pixels in the overlapping portion of the reference image R(x) and the transformed floating image F(T_μ(x)).
S3, constructing a similarity measure of the images to be registered.
S(R(x), F(T_μ(x))) = G(R(x), F(T_μ(x))) · JA_α(F(T_μ(x)), R(x))   (12);
where G(R(x), F(T_μ(x))) is the gradient measure between the floating image under the spatial transformation T_μ and the reference image, representing the gradient information between the images to be registered; JA_α(F(T_μ(x)), R(x)) represents the generalized entropy similarity measure between the floating image under the spatial transformation T_μ and the reference image.
S4, constructing a registration model of the multi-modal image;
Substituting the similarity measure of the images to be registered into formula (13) yields the final registration function.
S5, solving the registration model with a direction acceleration method to obtain the optimal spatial transformation parameters;
S5.1, initialization: setting an initial point, a maximum number of iterations, an allowable error, and n linearly independent directions;
S5.2, basic search: performing a one-dimensional search along each of the n directions in turn;
S5.3, accelerated search: selecting an acceleration direction and checking whether the allowable error is satisfied; if it is, the iteration terminates and the optimal solution is output; otherwise, jumping to S5.4 to continue the search;
S5.4, adjusting the search directions to form a new set of search directions, and returning to S5.2 until the iteration terminates and the optimal spatial transformation parameters are output.
The method solves the registration problem of multi-modal medical images by using the gradient and gray-level information of the images to be registered: it combines the gradient information of the images with the generalized entropy similarity measure to measure the similarity between the reference image and the floating image; it constructs a multi-modal medical image registration model from the spatial transformation of the images together with the gradient information and the generalized entropy similarity measure; and it solves the registration model with the direction acceleration method, searching for the spatial transformation relating the images to be registered and achieving accurate registration of the medical images.
The invention has the advantages that:
(1) The invention comprehensively considers the gradient information and the gray-level distribution characteristics of medical images of different modalities, and effectively achieves accurate registration of multi-modal medical images.
(2) The method constructs a generalized entropy similarity measure based on a generalized information entropy, which has more general properties; the Shannon-entropy-based measure is recovered as the special case α → 1.
(3) The method does not rely on the gray-level difference between the images; instead, it uses the joint probability distribution of the gray values of the two images to compute an information-theoretic similarity between the images to be registered. It is therefore applicable both to medical images of the same modality and to medical images of different modalities with large gray-level differences, which widens the application range of the method.
(4) The invention does not require preprocessing such as feature extraction or segmentation of the medical images to be registered, and has strong adaptability and robustness.
Detailed Description
As shown in fig. 3, a multi-modality medical image registration method fusing gradient information and generalized entropy similarity includes the following steps:
S1, constructing a generalized entropy similarity measure.
S1.1, defining the Arimoto entropy A_α;
where X is a discrete random variable, α is a parameter controlling the non-extensivity of the Arimoto entropy, M is the number of elements of the discrete random variable X, i indexes the elements of the probability distribution P of X, and p_i is the i-th element of the probability distribution P.
By L'Hôpital's rule, the limit of the Arimoto entropy as α → 1 equals the Shannon entropy, so the Shannon entropy can be regarded as a special case of the Arimoto entropy.
S1.2, constructing the Jensen-Arimoto divergence (JAD).
In the formula, A_α(·) is the Arimoto entropy, and ω_i represents a weighting factor with ω_i ≥ 0 and Σω_i = 1. Since the limit of the Arimoto entropy as α → 1 is the Shannon entropy, the limit of JAD as α → 1 is the Jensen-Shannon divergence.
And S1.3, constructing generalized entropy similarity measure.
where R(x) is the reference image; F is the floating image; F(T_μ(x)) is the transformed floating image; T represents a spatial transformation; x is a pixel point of the reference image R; T_μ(x) represents the transformed corresponding coordinate point; f_j represents a gray level of the transformed floating image and r_i a gray level of the reference image R(x); p(f_j|r_i) represents the conditional probability distribution of the transformed floating image given the reference image; p(r_i) represents the probability distribution of the reference image; p(f_j) represents the probability distribution of the floating image under the spatial transformation T; p(r_i, f_j) represents the joint probability distribution of the floating image and the reference image under the spatial transformation T.
S2, constructing a gradient measure of the image to be registered.
And S2.1, obtaining the gradient direction of the image to be registered.
S2.1.1, calculating the gradient ∇R(x, σ) of the reference image R;
where ∇_h R(x, σ) represents the gradient of the reference image in the horizontal direction; ∇_v R(x, σ) represents the gradient of the reference image in the vertical direction; * represents convolution; G_x'(σ) denotes the first derivative in the x direction of the Gaussian kernel with scale σ; and G_y'(σ) denotes the first derivative in the y direction of the Gaussian kernel with scale σ.
S2.1.2, calculating the gradient ∇F(T_μ(x), σ) of the floating image F(T_μ(x)) under the spatial transformation T;
where T represents the spatial transformation; T_μ(x) represents the transformed corresponding coordinate point; μ is the spatial transformation parameter; ∇_h F(T_μ(x), σ) represents the gradient of the transformed floating image in the horizontal direction; and ∇_v F(T_μ(x), σ) represents the gradient of the transformed floating image in the vertical direction.
S2.1.3, calculating the included angle between the gradient ∇R(x, σ) of the reference image and the gradient ∇F(T_μ(x), σ) of the transformed floating image;
where |∇R(x, σ)| represents the modulus of the gradient ∇R(x, σ), and |∇F(T_μ(x), σ)| represents the modulus of the gradient ∇F(T_μ(x), σ).
S2.1.4, calculating the direction of the image gradient to be registered.
S2.2, obtaining the modulus of the gradient of the images to be registered.
S2.2.1, calculating the modulus |∇R(x, σ)| of the reference image gradient;
S2.2.2, calculating the modulus |∇F(T_μ(x), σ)| of the gradient of the floating image under the transformation T;
S2.2.3, obtaining the modulus of the gradient of the images to be registered.
where σ denotes the scale of the Gaussian kernel and μ is the spatial transformation parameter.
S2.3, calculating the gradient measure of the image to be registered;
In the formula, ω indicates the direction information of the gradient and M the modulus of the gradient; ω can be regarded as the weight of the gradient modulus, and the gradient measure is the weighted sum over the pixels in the overlapping portion of the reference image R(x) and the transformed floating image F(T_μ(x)).
S3, constructing a similarity measure of the images to be registered.
S(R(x), F(T_μ(x))) = G(R(x), F(T_μ(x))) · JA_α(F(T_μ(x)), R(x))   (12);
where G(R(x), F(T_μ(x))) is the gradient measure between the floating image under the spatial transformation T_μ and the reference image, representing the gradient information between the images to be registered; JA_α(F(T_μ(x)), R(x)) represents the generalized entropy similarity measure between the floating image under the spatial transformation T_μ and the reference image.
S4, constructing a registration model of the multi-modal images;
Substituting the similarity measure of the images to be registered into formula (13) yields the final registration function.
S5, solving the registration model with a direction acceleration method to obtain the optimal spatial transformation parameters;
S5.1, initialization: setting an initial point, a maximum number of iterations, an allowable error, and n linearly independent directions;
S5.2, basic search: performing a one-dimensional search along each of the n directions in turn;
S5.3, accelerated search: selecting an acceleration direction and checking whether the allowable error is satisfied; if it is, the iteration terminates and the optimal solution is output; otherwise, jumping to S5.4 to continue the search;
S5.4, adjusting the search directions to form a new set of search directions, and returning to S5.2 until the iteration terminates and the optimal spatial transformation parameters are output.
The idea of the invention will be described in further detail below with reference to the accompanying drawings and comparative examples.
The invention provides a multi-modal medical image registration method based on gradient information and generalized entropy similarity. First, the properties of the Arimoto entropy are studied and a generalized entropy similarity measure is constructed, which estimates the joint probability distribution of two images and measures the similarity between them. Second, the gradients of the reference image and the floating image are calculated to obtain the modulus and direction information of the image gradients. Then, a registration model of the medical images is established from the spatial transformation of the images together with the gradient information and the generalized entropy similarity. Finally, the model is solved by an optimization algorithm, searching for the optimal transformation between the images to be registered and achieving accurate registration of the medical images. The flow of the registration algorithm is shown in fig. 3, and specifically includes:
Firstly: construction of the generalized entropy similarity measure
Step 1. definition of generalized entropy
Assume a discrete random variable X with probability distribution P = (p_1, p_2, …, p_M); then the Arimoto entropy of X is defined as:
By L'Hôpital's rule, the limit of the Arimoto entropy as α → 1 equals the Shannon entropy, so the Shannon entropy can be regarded as a special case of the Arimoto entropy.
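A sketch of this limit, assuming the standard form A_α(X) = (α/(α−1))[1 − (Σ_i p_i^α)^{1/α}] and writing N(α) = (Σ_i p_i^α)^{1/α}: both the numerator and the denominator of A_α vanish at α = 1, and L'Hôpital's rule gives

```latex
\[
\lim_{\alpha\to 1} A_\alpha(X)
= \lim_{\alpha\to 1}\frac{\alpha\,[\,1-N(\alpha)\,]}{\alpha-1}
= -\,N'(1)
= -\sum_{i=1}^{M} p_i \ln p_i ,
\]
```

because N'(1) = Σ_i p_i ln p_i; the right-hand side is exactly the Shannon entropy.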
Step 2, defining Jensen-Arimoto divergence
Suppose X = (x_1, x_2, …, x_M) is a random variable with probability distribution P = (p_1, p_2, …, p_M); then the Jensen-Arimoto divergence (JAD) is defined as:
In the formula, A_α(·) is the Arimoto entropy, and ω_i represents a weighting factor with ω_i ≥ 0 and Σω_i = 1. Since the limit of the Arimoto entropy as α → 1 is the Shannon entropy, the limit of JAD as α → 1 is the Jensen-Shannon divergence.
Step 3, constructing generalized entropy similarity measure
For the reference image R, the floating image F and the spatial transformation T, x is a pixel point of the reference image and T_μ(x) is the corresponding transformed coordinate point. Let f = (f_1, f_2, …, f_M) and r = (r_1, r_2, …, r_M) represent the gray levels of F(T_μ(x)) and R(x), respectively.
In formula (2), let p_i = p_i(F(T_μ(x))|R(x)), i = 1, 2, …, M, which represents the conditional probability distribution of the transformed floating image given the reference image; for simplicity it is rewritten as p_i = p(F = f_j | R = r_i) = p(f_j|r_i). The probability distribution of the reference image is then used as the weighting factor, i.e. ω_i = p(R(x)) = p(R = r_i) = p(r_i). Substituting p_i and ω_i into the definition of the JAD gives the generalized entropy similarity measure,
where JAD is used to measure the similarity between the reference image R(x) and the floating image F(T_μ(x)).
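A minimal numerical sketch of this construction, assuming the standard Arimoto form and the Jensen-type divergence discussed above: the joint gray-level histogram estimates p(r_i, f_j), the conditional distributions p(f_j|r_i) are weighted by p(r_i), and the divergence is evaluated between the mixture and the individual conditionals. The function names and the use of NumPy are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def arimoto_entropy(p, alpha):
    # Arimoto entropy of a probability vector p (alpha > 0, alpha != 1), standard form.
    p = p[p > 0]
    return alpha / (alpha - 1.0) * (1.0 - np.sum(p ** alpha) ** (1.0 / alpha))

def jensen_arimoto_similarity(ref, flt, alpha=1.5, bins=64):
    # Joint histogram of the reference image and the transformed floating image.
    joint, _, _ = np.histogram2d(ref.ravel(), flt.ravel(), bins=bins)
    joint /= joint.sum()                      # estimate of p(r_i, f_j)
    p_r = joint.sum(axis=1)                   # p(r_i), used as the weights w_i
    # Conditional distributions p(f_j | r_i), one row per reference gray level.
    cond = np.divide(joint, p_r[:, None], out=np.zeros_like(joint), where=p_r[:, None] > 0)
    mixture = p_r @ cond                      # sum_i p(r_i) * p(f_j | r_i) = p(f_j)
    weighted = sum(w * arimoto_entropy(row, alpha) for w, row in zip(p_r, cond) if w > 0)
    # Jensen-type divergence: A_alpha(mixture) - sum_i w_i * A_alpha(conditional_i).
    return arimoto_entropy(mixture, alpha) - weighted
```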
Secondly: computation of the gradient information of the images to be registered
The method specifically comprises the following steps:
step 1, calculating the horizontal direction and vertical direction gradients of an image to be registered
Given a reference image R, a floating image F and a spatial transformation T, x is an arbitrary coordinate point of the reference image, T_μ(x) is the corresponding transformed point, and μ is the spatial transformation parameter. The gradient of the reference image R is computed as in formula (4), and the gradient of the transformed floating image F(T_μ(x)) as in formula (5). The subscripts h and v in formulas (4) and (5) represent the horizontal and vertical directions, respectively; * denotes the convolution operation; G'(σ) represents the first derivative of the Gaussian kernel with scale σ.
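A brief sketch of the gradient computation in formulas (4) and (5), realized here (as an assumption) with SciPy's gaussian_filter using a first-order derivative along one axis, which is equivalent to convolving with the first derivative of a Gaussian kernel of scale σ:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def gaussian_gradient(image, sigma=1.0):
    # Convolve the image with the first derivatives of a Gaussian kernel of
    # scale sigma to obtain the horizontal and vertical gradient components.
    image = image.astype(float)
    grad_h = gaussian_filter(image, sigma, order=(0, 1))  # derivative along columns (horizontal)
    grad_v = gaussian_filter(image, sigma, order=(1, 0))  # derivative along rows (vertical)
    return grad_h, grad_v

# The gradient modulus at each pixel then follows as np.hypot(grad_h, grad_v).
```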
Step 2, estimating direction information of gradient
Calculating the included angle between the gradient vectors by the formulas (4) and (5),
In the formula, ∇R(x, σ) and ∇F(T_μ(x), σ) represent the gradient vectors at point x at the scale σ, and |·| is the modulus of a gradient.
For multi-modal images, different imaging techniques give the same tissue or organ different gray values, so the gradient vectors of the two images may point in different directions. However, the anatomical structures described by medical images of different modalities are consistent, so the gradient directions at corresponding points of the images to be registered are the same or opposite, and the included angle between the gradient vectors should be close to 0 or tend to π.
Therefore, a weighting function is used to estimate the direction information,
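One common choice of weighting function of the included angle θ between the gradient vectors, with value 1 at θ = 0 or π and value 0 at θ = π/2, is the following; this is an assumed example rather than the patent's exact formula:

```latex
\[
w(\theta) = \frac{\cos(2\theta) + 1}{2}.
\]
```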
when the included angle is equal to 0 or pi, the maximum value 1 is obtained by the above formula; when the angle is equal to 0.5 pi, the value of the above formula is 0.
Step 3, obtaining the modulus information of the gradient
For multi-modal medical image registration, only the parts of the two images containing stronger gradient information are of interest, so the moduli |∇R(x, σ)| and |∇F(T_μ(x), σ)| are calculated. To retain the stronger gradient information shared by the images to be registered, it suffices to maximize the minimum of the two moduli, and the modulus information of the gradient is then estimated accordingly, where σ denotes the scale of the Gaussian kernel and μ is the spatial transformation parameter.
Step 4, fusion of the gradient modulus and direction information
In order to make full use of the gradient information of the images during registration, the direction and the modulus of the gradient are combined,
where ω and M represent the direction and modulus information of the gradient, respectively; ω serves as the weight of the gradient modulus, and the measure is the weighted sum over the pixels in the overlapping portion of R(x) and F(T_μ(x)). When the two images are perfectly aligned, formula (11) attains its maximum value.
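A compact sketch of the fused gradient measure, reusing the gaussian_gradient helper sketched earlier: the included angle between the two gradient vectors is passed through an angle weight of the kind discussed in step 2, multiplied by the smaller of the two gradient moduli, and summed over the overlapping pixels. This is an illustrative reconstruction of formula (11) under those assumptions, not the patent's own code.

```python
import numpy as np

def gradient_measure(ref, flt_warped, sigma=1.0, eps=1e-12):
    # Gradient measure G between the reference image and the transformed
    # floating image: angle weight times the minimum of the two gradient
    # moduli, summed over the overlapping pixels.
    rh, rv = gaussian_gradient(ref, sigma)          # reference image gradient
    fh, fv = gaussian_gradient(flt_warped, sigma)   # gradient of F(T_mu(x))
    mod_r = np.hypot(rh, rv)
    mod_f = np.hypot(fh, fv)
    cos_theta = (rh * fh + rv * fv) / (mod_r * mod_f + eps)
    theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))
    weight = (np.cos(2.0 * theta) + 1.0) / 2.0      # 1 at theta = 0 or pi, 0 at pi/2
    return float(np.sum(weight * np.minimum(mod_r, mod_f)))
```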
Thirdly: establishment of the multi-modal image registration model
The method specifically comprises the following steps:
step 1. construction of similarity measure
The method combines the generalized entropy similarity and the gradient information of the image to be registered to construct the final similarity measure,
S(R(x), F(T_μ(x))) = G(R(x), F(T_μ(x))) · JA_α(F(T_μ(x)), R(x))   (12);
In the formula, G and JA_α respectively represent the gradient measure and the generalized entropy similarity measure between the floating image under the spatial transformation T_μ and the reference image.
Step 2. establishment of registration framework
Given a reference image R, a floating image F and a spatial transformation T, x is an arbitrary coordinate point of the reference image, T_μ(x) is the corresponding transformed point, and μ is the spatial transformation parameter; the registration of R and F can then be regarded as the following optimization problem:
where S represents the similarity measure, and μ is the optimal spatial transformation parameter at which the similarity attains its maximum, i.e. when R(x) and F(T_μ(x)) are fully registered.
Substituting the similarity measure S into formula (13) gives
The purpose of image registration is to solve the above optimization problem and, through the obtained optimal transformation parameters, to achieve accurate registration of the reference image and the floating image.
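Combining formulas (12) and (13), the registration model can be written as the following optimization problem; this is a reconstruction consistent with the surrounding text rather than the patent's formula (14) verbatim:

```latex
\[
\mu^{*} \;=\; \arg\max_{\mu}\; S\bigl(R(x),F(T_\mu(x))\bigr)
\;=\; \arg\max_{\mu}\; G\bigl(R(x),F(T_\mu(x))\bigr)\,JA_\alpha\bigl(F(T_\mu(x)),R(x)\bigr).
\]
```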
Fourthly: solution of the registration model
Since the registration function shown in formula (14) is composed of two terms, a gradient measure and a generalized entropy similarity measure, it is difficult to determine its derivative. Therefore, the registration function is optimized with the Powell algorithm, also called the direction acceleration method, a search method that accelerates convergence by exploiting conjugate directions. The algorithm does not require the derivative of the registration function and is an effective direct search method. It iterates repeatedly within the search space, performing a one-dimensional search along each dimension until convergence.
For three-dimensional medical image registration, if a rigid body transformation is selected as a space transformation model, the dimension of a search space is 6, and the search space comprises three translation amounts and three rotation angles.
The optimization algorithm comprises the following specific steps:
In the first step, set an initial point x^(0), n linearly independent directions d^(0), d^(1), …, d^(n-1), and an allowable error ε > 0; let k = 0.
In the second step, let y^(0) = x^(k) and perform one-dimensional searches along the n directions in turn: for j = 1, 2, …, n, find λ^(j-1) such that f(y^(j-1) + λ^(j-1) d^(j-1)) = min_λ f(y^(j-1) + λ d^(j-1)), and set y^(j) = y^(j-1) + λ^(j-1) d^(j-1).
In the third step, take the acceleration direction d^(n) = y^(n) − y^(0). If ||d^(n)|| < ε, the iteration terminates and y^(n) is an approximate optimal solution of the image registration; otherwise, starting from the point y^(n), perform a one-dimensional search along d^(n) to obtain λ^(n) such that f(y^(n) + λ^(n) d^(n)) = min_λ f(y^(n) + λ d^(n)), set x^(k+1) = y^(n) + λ^(n) d^(n), and go to the fourth step.
In the fourth step, remove d^(0) from the original n directions d^(0), d^(1), …, d^(n-1) and append d^(n) to form the new set of search directions; let k = k + 1 and return to the second step.
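The derivative-free optimization can be sketched with SciPy's Powell implementation. The warp helper (resampling the floating image under the transformation parameters μ), the three-parameter 2-D rigid initialization, and the reuse of gradient_measure and jensen_arimoto_similarity from the earlier sketches are illustrative assumptions rather than the patent's implementation; the tolerance 0.01 and the iteration limit 100 follow the experimental settings reported below.

```python
import numpy as np
from scipy.optimize import minimize

def register(ref, flt, warp, alpha=1.5, bins=64, sigma=1.0):
    # warp(flt, mu) is a hypothetical helper that resamples the floating image
    # under the spatial transformation with parameters mu (here a 2-D rigid
    # transform: two translations and one rotation angle).
    def negative_similarity(mu):
        flt_warped = warp(flt, mu)
        g = gradient_measure(ref, flt_warped, sigma)
        ja = jensen_arimoto_similarity(ref, flt_warped, alpha, bins)
        return -(g * ja)                     # Powell minimizes, so negate S = G * JA_alpha

    mu0 = np.zeros(3)                        # initial transformation parameters
    result = minimize(negative_similarity, mu0, method="Powell",
                      options={"xtol": 0.01, "maxiter": 100})
    return result.x                          # optimal spatial transformation parameters
```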
The following experiments were performed using clinically common medical images, including CT images, PET images, and three types of MR images. The image sizes were CT: 512 × 512, MR: 256 × 256, and PET: 128 × 128.
CT and PET images were selected as reference images, as shown in fig. 1a and fig. 1b; three types of MR images were selected as floating images, as shown in figs. 2a-2c.
The images shown in fig. 1 and fig. 2 are registered using the method based on gradient information and generalized entropy similarity. In the experiment, the parameter α in the generalized entropy similarity is set to 1.5, the number of histogram bins for estimating the joint probability distribution is 64, the allowed precision in the Powell optimization algorithm is 0.01, and the maximum number of iterations is 100. The grids of the CT and the MR T1, MR T2 and MR PD images before registration are shown in figs. 4a-4c, and the grids after registration with the present invention are shown in figs. 5a-5c. Similarly, the results for the PET image and the three types of MR images before registration are shown in figs. 6a-6c, and the results after registration in figs. 7a-7c. As can be seen from figs. 4-7, the invention obtains accurate registration results when registering medical images of different modalities.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.