
WO2014119412A1 - Medical image processing device, and medical image capture device - Google Patents

Medical image processing device, and medical image capture device

Info

Publication number
WO2014119412A1
WO2014119412A1 (PCT/JP2014/050944)
Authority
WO
WIPO (PCT)
Prior art keywords
interest
dimensional
filter
medical image
region
Prior art date
Application number
PCT/JP2014/050944
Other languages
French (fr)
Japanese (ja)
Inventor
正和 岡部
Original Assignee
株式会社 日立メディコ
Priority date
Filing date
Publication date
Application filed by 株式会社 日立メディコ (Hitachi Medical Corporation)
Priority to JP2014559632A (JPWO2014119412A1)
Publication of WO2014119412A1

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/20 Image enhancement or restoration using local operators
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10072 Tomographic images
    • G06T2207/10081 Computed x-ray tomography [CT]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30101 Blood vessel; Artery; Vein; Vascular

Definitions

  • the present invention relates to a medical image processing apparatus and a medical image imaging apparatus, and more particularly to removal of noise components in a three-dimensional reconstructed image.
  • In Patent Document 1, an X-ray diagnostic apparatus is described that applies two-dimensional adaptive filter processing to two-dimensional X-ray images, selectively reducing noise components without lowering the contrast of linear or edge-shaped shadows.
  • However, when one tries to apply this processing to a three-dimensional image, Patent Document 1 does not consider three-dimensional structures, so an image subjected to appropriate noise removal processing could not be obtained.
  • the present invention has been made in view of the above problems, and an object of the present invention is to provide a technique for acquiring an image obtained by performing appropriate noise removal processing on a three-dimensional image.
  • The present invention sets a three-dimensional region of interest centered on a pixel of interest on a medical image including a three-dimensional reconstructed image of a subject, detects the traveling direction of a structure contained in the three-dimensional region of interest based on the distribution of pixel values in that region, sets a three-dimensional spatial filter whose shape follows the traveling direction of the structure, and performs noise removal processing on the three-dimensional region of interest using that filter.
  • The medical image processing apparatus includes a region-of-interest setting unit that sets a three-dimensional region of interest centered on a pixel of interest on a medical image including a three-dimensional reconstructed image of a subject, a structure detection unit that detects the traveling direction of a structure contained in the three-dimensional region of interest based on the distribution of pixel values in that region, and a filter processing unit that sets a three-dimensional spatial filter having a shape along the traveling direction of the structure and performs noise removal processing on the three-dimensional reconstructed image using that filter.
  • the structure detection unit can detect at least one traveling direction of the linear structure or the planar structure included in the three-dimensional region of interest.
  • Here, a planar structure is a structure that spreads in a plane, and its traveling direction means the direction in which the plane spreads.
  • The filter processing unit can generate the three-dimensional spatial filter by combining, at a predetermined intensity ratio, a line filter that performs a convolution operation in the axial direction parallel to the traveling direction of the linear structure and a surface filter that performs a convolution operation in the plane direction parallel to the traveling direction of the planar structure, as sketched below.
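As an illustration of such a combination, a minimal Python/NumPy sketch is given below; it is not the patent's implementation, and the Gaussian falloff, the kernel size, and the normalization are assumptions introduced for the example.

```python
import numpy as np

def line_kernel(direction, size=5, sigma=0.8):
    """Weights concentrated along the axis through the center parallel to `direction`."""
    d = np.asarray(direction, float)
    d /= np.linalg.norm(d)
    r = size // 2
    z, y, x = np.mgrid[-r:r+1, -r:r+1, -r:r+1]
    p = np.stack([z, y, x], axis=-1).astype(float)
    # distance of each voxel from the line through the origin with direction d
    dist = np.linalg.norm(p - (p @ d)[..., None] * d, axis=-1)
    w = np.exp(-dist**2 / (2 * sigma**2))
    return w / w.sum()

def surface_kernel(normal, size=5, sigma=0.8):
    """Weights concentrated on the plane through the center orthogonal to `normal`."""
    n = np.asarray(normal, float)
    n /= np.linalg.norm(n)
    r = size // 2
    z, y, x = np.mgrid[-r:r+1, -r:r+1, -r:r+1]
    p = np.stack([z, y, x], axis=-1).astype(float)
    dist = np.abs(p @ n)                  # distance from the plane
    w = np.exp(-dist**2 / (2 * sigma**2))
    return w / w.sum()

def combined_kernel(direction, normal, ratio=0.5, size=5):
    """Blend line and surface filters at a given intensity ratio (0 = surface only, 1 = line only)."""
    k = ratio * line_kernel(direction, size) + (1 - ratio) * surface_kernel(normal, size)
    return k / k.sum()
```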
  • The structure detection unit may detect the traveling direction of at least one of the linear structure and the planar structure based on a covariance that uses the product of the deviation of the coordinate values of each pixel contained in the three-dimensional region of interest and that pixel's value.
  • In that case, the structure detection unit may generate a 3 × 3 square matrix from the covariance, calculate its three eigenvalues and the eigenvector for each eigenvalue, and use them to detect at least one of the traveling direction of the linear structure and the traveling direction of the planar structure.
  • The structure detection unit may calculate the direction of the eigenvector corresponding to the eigenvalue with the largest absolute value among the three eigenvalues as the traveling direction of the linear structure contained in the three-dimensional region of interest.
  • It may also calculate the plane orthogonal to the eigenvector corresponding to the eigenvalue with the smallest absolute value as the traveling direction of the planar structure contained in the three-dimensional region of interest.
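A sketch of this eigen-decomposition step in Python/NumPy, assuming the 3 × 3 covariance matrix has already been computed; the sorting convention and the library calls are my own choices, not taken from the patent.

```python
import numpy as np

def structure_directions(cov):
    """Eigen-decompose a 3x3 covariance matrix and return (eigvals, line_direction, plane_normal).

    line_direction: eigenvector of the eigenvalue with the largest absolute value,
                    taken as the traveling direction of a linear structure.
    plane_normal:   eigenvector of the eigenvalue with the smallest absolute value;
                    the plane orthogonal to it is the spreading direction of a
                    planar structure.
    """
    eigvals, eigvecs = np.linalg.eigh(cov)      # symmetric matrix -> real eigenvalues
    order = np.argsort(np.abs(eigvals))[::-1]   # sort by absolute value, descending
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    line_direction = eigvecs[:, 0]
    plane_normal = eigvecs[:, 2]
    return eigvals, line_direction, plane_normal
```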
  • The structure detection unit may also use the three eigenvalues and eigenvectors of the 3 × 3 covariance matrix to calculate at least one of the contrast of the linear structure with respect to the background region of the three-dimensional region of interest and the contrast of the planar structure with respect to that background region, and the filter processing unit may increase or decrease the weighting factors by which the pixel values in the three-dimensional spatial filter are multiplied according to the contrast of the linear structure and the contrast of the planar structure.
  • The structure detection unit may calculate, as the contrast of the linear structure contained in the three-dimensional region of interest, the ratio between the eigenvalue with the largest absolute value among the three eigenvalues and the square root of the product of the second largest and the smallest eigenvalues.
  • Likewise, the structure detection unit may calculate, as the contrast of the planar structure in the three-dimensional region of interest, the ratio between the square root of the product of the eigenvalue with the largest absolute value and the second largest eigenvalue, and the eigenvalue with the smallest absolute value. Both contrasts are sketched in code below.
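Written in code, the two contrasts might look as follows; the small epsilon guard against division by zero is an assumption, while the ratios themselves follow the descriptions of equations (13) and (14).

```python
import numpy as np

def structure_contrasts(eigvals, eps=1e-12):
    """C1: line structure contrast, C2: plane structure contrast.

    eigvals must be sorted so that |l1| >= |l2| >= |l3|.
    C1 = |l1| / sqrt(|l2 * l3|),  C2 = sqrt(|l1 * l2|) / |l3|.
    """
    l1, l2, l3 = np.abs(eigvals)
    c1 = l1 / (np.sqrt(l2 * l3) + eps)
    c2 = np.sqrt(l1 * l2) / (l3 + eps)
    return c1, c2
```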
  • the filter processing unit may generate a surface filter having directivity based on the eigenvalue having the largest absolute value and the eigenvalue having the second largest absolute value among the three eigenvalues.
  • The structure detection unit may obtain the traveling direction for the three-dimensional region of interest by weighted averaging with the traveling directions calculated from regions of interest located around it.
  • Likewise, the contrast of the linear structure or of the planar structure calculated from the three-dimensional region of interest may be obtained by weighted averaging with the corresponding contrasts calculated from surrounding regions of interest.
  • The medical imaging apparatus includes an imaging unit that images a subject and generates image data, an image reconstruction unit that performs a reconstruction operation based on the image data to generate a three-dimensional reconstructed image of the subject, a region-of-interest setting unit that sets a three-dimensional region of interest centered on a pixel of interest on a medical image including the three-dimensional reconstructed image, a structure detection unit that detects the traveling direction of a structure contained in the three-dimensional region of interest based on the distribution of pixel values in that region, a filter processing unit that sets a three-dimensional spatial filter having a shape along the traveling direction of the structure and performs noise removal processing on the three-dimensional reconstructed image using that filter, and a display control unit that performs control for displaying the three-dimensional reconstructed image after the noise removal processing.
  • The imaging unit includes an X-ray source that generates X-rays and an X-ray flat panel detector in which detection elements that detect the X-rays are arranged in a two-dimensional array, and the region-of-interest setting unit may set the three-dimensional region of interest to have the same shape as the reconstruction field formed by rotating the image receiving surface of the detection elements through 360°.
  • FIG. 1 is a functional block diagram of the medical image processing apparatus according to the present embodiment.
  • FIG. 2 is a flowchart showing a process flow of the medical image processing apparatus according to the first embodiment.
  • FIG. 3 is a flowchart showing details of the process in step S3 of FIG.
  • FIG. 4 is an explanatory diagram showing the contents of processing of the medical image processing apparatus according to the first embodiment.
  • The medical image processing apparatus 1 shown in FIG. 1 includes a control calculation unit 20 that performs noise removal processing on a three-dimensional reconstructed image by setting a three-dimensional spatial filter, an information input device 30 consisting of a mouse, keyboard, trackball, or the like for inputting various parameters and instructions, and a display device 40 for displaying images.
  • The control calculation unit 20 includes a reconstructed image acquisition unit 21 that acquires the three-dimensional reconstructed image to be subjected to noise removal, a region-of-interest setting unit 22 that sets a three-dimensional region of interest (hereinafter "3D region of interest") centered on a pixel of interest on that image, a structure detection unit 23 that detects the traveling direction of a structure contained in the 3D region of interest, a filter processing unit 24 that sets a three-dimensional spatial filter based on the detected traveling direction and performs noise removal processing, and a display control unit 25 that controls display of the three-dimensional reconstructed image after filtering.
  • The reconstructed image acquisition unit 21 may be configured as an image reconstruction unit that acquires image data from the imaging unit 10, which images a subject and generates image data, and performs a reconstruction operation on that data to generate a three-dimensional reconstructed image, or as an image reading unit that reads or receives the 3D reconstructed image to be denoised from an image storage unit (or image database) 11 that stores previously generated 3D reconstructed images.
  • Step S1 The reconstructed image acquisition unit 21 reads or generates a three-dimensional reconstructed image that is a target of noise removal processing (S1).
  • Step S2: The region-of-interest setting unit 22 sets a three-dimensional region of interest centered on a pixel of interest on the three-dimensional reconstructed image of the subject read or generated in step S1 (S2).
  • In the following, the coordinates of the pixel of interest are written (i, j, k) and its pixel value v(i, j, k).
  • A cubic three-dimensional region of interest of 5 × 5 × 5 pixels centered on the pixel of interest is set here, but the number of pixels and the shape of the three-dimensional region of interest are not limited to these (see the code sketch after this block).
  • For example, when the apparatus is mounted on an X-ray CT apparatus, the three-dimensional region of interest may be given the same shape as the reconstruction field obtained by rotating the X-ray image receiving surface of the detection elements of the X-ray detector through 360°.
  • If the X-ray image receiving surface is rectangular, the three-dimensional region of interest may be cylindrical, the shape swept out by rotating that rectangle.
  • If an image intensifier with a circular X-ray receiving surface is used as the X-ray detector, the three-dimensional reconstruction region is spherical, so the three-dimensional region of interest may also be made spherical.
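A minimal sketch of extracting the cubic 5 × 5 × 5 region of interest around a pixel of interest; the edge-padding boundary handling is an assumption, since the text does not specify how border pixels are treated.

```python
import numpy as np

def extract_roi(volume, center, size=5):
    """Return a size^3 cubic region of interest centered on `center` = (i, j, k)."""
    r = size // 2
    padded = np.pad(volume, r, mode='edge')      # replicate edge voxels at the border
    i, j, k = (c + r for c in center)            # shift indices into the padded volume
    return padded[i - r:i + r + 1, j - r:j + r + 1, k - r:k + r + 1]
```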
  • Step S3 The structure detection unit 23 detects the traveling direction of the structure in the three-dimensional region of interest based on the distribution of pixel values in the three-dimensional region of interest (S3).
  • In this embodiment, an example is described in which the distribution of pixel values in the three-dimensional region of interest is analyzed using the eigenvalues and eigenvectors of a covariance matrix.
  • Step S3-1: The structure detection unit 23 computes the covariance of the three-dimensional region of interest, weighting the deviation (moment) of each pixel's coordinate values by its pixel value, i.e., using the product of the coordinate deviation and the pixel value of each pixel in the region, and generates a 3 × 3 covariance matrix (S3-1).
  • The 3 × 3 covariance matrix is shown in equation (1); a code sketch follows.
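Since equation (1) itself does not survive extraction here, the following NumPy sketch shows one way to form a coordinate covariance weighted by pixel value; the exact normalization and the clipping of negative values are assumptions and may differ from the patent's equation.

```python
import numpy as np

def weighted_covariance(roi):
    """3x3 covariance of voxel coordinates in `roi`, weighted by pixel value."""
    z, y, x = np.mgrid[0:roi.shape[0], 0:roi.shape[1], 0:roi.shape[2]]
    coords = np.stack([z.ravel(), y.ravel(), x.ravel()], axis=0).astype(float)
    w = roi.ravel().astype(float)
    w = np.clip(w, 0, None)                      # assume non-negative weights
    mean = (coords * w).sum(axis=1) / w.sum()    # pixel-value-weighted centroid
    dev = coords - mean[:, None]                 # coordinate deviations (moments)
    cov = (w * dev) @ dev.T / w.sum()            # weighted 3x3 covariance matrix
    return cov
```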
  • Step S3-2: The structure detection unit 23 solves the eigenequation of the 3 × 3 covariance matrix and obtains the eigenvalues and the eigenvector corresponding to each eigenvalue (S3-2). First, the eigenequation shown in equation (2) is obtained from the covariance matrix (1)′.
  • In the discriminant of the cubic equation (equation (5)), a, b, c, and d are defined as in the proviso of equation (4), and the discriminant is positive (zero in special cases); therefore the quantities inside the two cube roots of the solution formula (equation (4)) are conjugate complex numbers (the equal value R in the special case), and the three eigenvalues obtained by solving the eigenequation (2) are all real.
  • The three eigenvectors of equation (7) are mutually orthogonal.
  • Using an orthogonal matrix U and its inverse U⁻¹, obtained by transposing U, the covariance matrix (1) is diagonalized (equation (10)).
  • The orthogonal matrix U is a rotation matrix in three-dimensional space with rotation angle θ about the axis given by the vector (l, m, n).
  • The Euler parameters e0, e1, e2, e3 of equation (11), the rotation axis (l, m, n), and the rotation angle θ are related by equation (12).
  • The three eigenvalues λ (= α, β, γ) are arranged in order of absolute value.
  • The eigenvector for eigenvalue α is the unit vector in the X-axis direction rotated by the orthogonal matrix U, and the eigenvector for eigenvalue β is the unit vector in the Y-axis direction rotated by the orthogonal matrix U.
  • the structure detection unit 23 detects the traveling direction of the structure using the three eigenvalues and eigenvectors obtained in step S3-2.
  • the structure here includes a linear structure and a planar structure.
  • Examples of the linear structure include a blood vessel, a catheter, and a guide wire that guides a catheter.
  • Examples of the planar structure include bone, an organ surface, and a soft-tissue boundary surface.
  • the planar structure means a boundary surface between organs and tissues and does not necessarily indicate a flat structure.
  • Here, the presence or absence of a linear structure is determined first (S3-3) and then the presence or absence of a planar structure; this order may be reversed, determining the presence or absence of a planar structure first and that of a linear structure afterwards.
  • Steps S3-4, S3-5 The presence / absence of the planar structure is determined (S3-4, S3-5).
  • Step S3-6 It is determined that both the linear structure and the planar structure are included (S3-6).
  • Step S3-7 The structure detection unit 23 determines that only the linear structure is included, and similarly calculates the traveling direction of the linear structure (S3-7).
  • Step S3-8 The structure detection unit 23 determines that only the planar structure is included, and calculates the traveling direction of the planar structure as described above (S3-8).
  • Step S3-9 The structure detection unit 23 determines that no structure is included in the three-dimensional region of interest (S3-9).
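The four-way branch of steps S3-3 to S3-9 can be sketched as below, assuming (as suggested by the later embodiments) that the line structure contrast C1 is compared with the first threshold TH1 and the plane structure contrast C2 with the second threshold TH2.

```python
def classify_structure(c1, c2, th1, th2):
    """Decide which filter to use for one region of interest.

    th1: threshold for detecting a linear structure (compared with C1)
    th2: threshold for detecting a planar structure (compared with C2)
    """
    has_line = c1 >= th1
    has_plane = c2 >= th2
    if has_line and has_plane:
        return "line+surface"      # step S3-6: apply both filters
    if has_line:
        return "line"              # step S3-7: line filter only
    if has_plane:
        return "surface"           # step S3-8: surface filter only
    return "smoothing"             # step S3-9: isotropic smoothing filter
```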
  • Step S4: The filter processing unit 24 generates a three-dimensional spatial filter having a shape along the traveling direction of the structure (S4). The content of the filtering in this step is explained with reference to FIG. 4. Assume that three-dimensional regions of interest 41a, 41b, 41c, and 41d are set in the three-dimensional reconstructed image 40 before filtering in FIG. 4: the 3D region of interest 41a contains only the linear structure 42a, the 3D region of interest 41b contains both a linear structure and the planar structure 42b, the 3D region of interest 41c contains only the planar structure 42c, and the 3D region of interest 41d contains no structure.
  • When the processing of step S3 is applied to the 3D regions of interest 41a, 41b, 41c, and 41d, the result is step S3-7 for 41a, step S3-6 for 41b, step S3-8 for 41c, and step S3-9 for 41d.
  • Accordingly, the filter processing unit 24 sets a line filter for the 3D region of interest 41a, a line filter and a surface filter for the 3D region of interest 41b, a surface filter for the 3D region of interest 41c, and a three-dimensional smoothing filter that does not take the shape of a structure into account for the 3D region of interest 41d.
  • The 3D spatial filter is a function that determines the weighting coefficient assigned to each pixel when the pixel value of the pixel of interest is computed by convolving the pixel values of the pixels contained in the 3D region of interest; the line filter determines its weighting factors so that the convolution is performed in the axial direction parallel to the traveling direction of the linear structure, and the surface filter determines its weighting factors so that the convolution is performed in the plane direction parallel to the traveling direction of the planar structure.
  • Specifically, the line filter is a function in which a relatively large weighting factor w1 (w1 > 0) is assigned to the pixel values of the pixels that lie in the direction parallel to the traveling direction of the linear structure through the pixel of interest, and a relatively small weighting factor w2 (w1 > w2, or w2 = 0) is assigned to the other pixels in the region of interest.
  • Likewise, the surface filter is a function in which a relatively large weighting factor is assigned to the pixel values of the pixels located in the traveling direction (the direction in which the plane spreads) of the planar structure around the pixel of interest, and a relatively small weighting factor is assigned to the other pixels.
  • The filter processing unit 24 performs the convolution operation by multiplying each pixel contained in the 3D regions of interest 41a, 41b, 41c, and 41d by the weighting factor of the three-dimensional spatial filter that has been set, and the resulting value is taken as the filtered pixel value of the pixel of interest P.
  • If it is determined that the region contains both a linear structure and a planar structure, as in the region of interest 41b in FIG. 4 (as is the case for many tissues), both a line filter and a surface filter are applied; in this case one of the two can be given priority by assigning them different strengths depending on the inspection target. The setting of the filter strength is described in a later embodiment.
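Combining the hypothetical helpers sketched earlier, the per-pixel filtering of step S4 might look as follows; the isotropic box kernel used as the smoothing filter for the no-structure case and the default intensity ratio are assumptions.

```python
import numpy as np

def filter_pixel(volume, center, th1, th2, ratio=0.5, size=5):
    """Filtered value of the pixel of interest at `center`, using the sketches above."""
    roi = extract_roi(volume, center, size)
    eigvals, line_dir, plane_n = structure_directions(weighted_covariance(roi))
    c1, c2 = structure_contrasts(eigvals)
    kind = classify_structure(c1, c2, th1, th2)
    if kind == "line":
        k = line_kernel(line_dir, size)
    elif kind == "surface":
        k = surface_kernel(plane_n, size)
    elif kind == "line+surface":
        k = combined_kernel(line_dir, plane_n, ratio, size)
    else:
        k = np.full((size, size, size), 1.0 / size**3)   # plain smoothing
    return float((k * roi).sum())                        # convolution at the center voxel
```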
  • Step S5 The filter processing unit 24 determines whether or not the processing from steps S2 to S4 has been completed for all the pixels of the three-dimensional reconstructed image. If the determination is affirmative, the process proceeds to step S6. If the determination is negative, the process returns to step S2, and the subsequent processing is repeated (S5). That is, the processing from step S2 to step S4 is executed while scanning the three-dimensional region of interest over the entire region of the three-dimensional reconstructed image.
  • Step S6 The display control unit 25 displays the filtered three-dimensional reconstructed image on the screen of the display device 40 (S6).
  • As described above, in this embodiment, performing noise removal with a three-dimensional spatial filter whose shape follows the traveling direction of the structure in the three-dimensional region of interest selectively reduces only the noise in the background region (the part of the three-dimensional region of interest excluding the structure). The noise component of the three-dimensional reconstructed image can therefore be reduced selectively without lowering the contrast of linear and planar structures; in other words, the ability to extract linear and planar structures is relatively enhanced.
  • In the above, the filtered reconstructed image is displayed once all pixels have been processed in step S5; alternatively, after all pixels have been processed, the process may return to step S1 and the filtering may be applied to the three-dimensional reconstructed image multiple times. In that case too, since noise is removed selectively from the background region, the ability to extract structures can be improved even when the filtering is performed several times.
  • The second embodiment is an embodiment in which the weighting factor w1 in the three-dimensional spatial filter is varied.
  • The structure detection unit 23 calculates the ratio of the eigenvalue with the largest absolute value among the three eigenvalues of the covariance matrix of the three-dimensional region of interest to the square root of the product of the second largest and the smallest eigenvalues, as the contrast C1 of the linear structure in the 3D region of interest (hereinafter "line structure contrast") (see equation (13)).
  • Similarly, the structure detection unit 23 calculates the ratio of the square root of the product of the eigenvalue with the largest absolute value and the second largest eigenvalue to the eigenvalue with the smallest absolute value, as the contrast C2 of the planar structure in the 3D region of interest (hereinafter "plane structure contrast") (see equation (14)).
  • The filter processing unit 24 sets the weighting factors according to these contrasts: for example, when the line structure contrast C1 is large, the weighting factor w1 is set to a value relatively close to 1 and the weighting factor w2 to a value relatively close to 0, or to 0; the same applies to the plane structure contrast C2.
  • In this way, the pixel value of the pixel of interest after the convolution is computed with a larger contribution from the pixels that constitute the structure, so noise can be removed while suppressing any loss of sharpness of the structure.
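One possible mapping from contrast to weighting factors is sketched below; only the monotone behavior, w1 approaching 1 and w2 approaching 0 as the contrast grows, comes from the text, while the linear ramp and its range are assumptions.

```python
import numpy as np

def contrast_weights(contrast, c_min=1.0, c_max=4.0):
    """Map a structure contrast to (w1, w2).

    As the contrast grows from c_min to c_max, w1 rises toward 1 and w2
    falls toward 0, so pixels on the structure dominate the convolution.
    """
    t = np.clip((contrast - c_min) / (c_max - c_min), 0.0, 1.0)
    w1 = t                      # weight for pixels along the structure
    w2 = 1.0 - t                # weight for the remaining (background) pixels
    return w1, w2
```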
  • The surface filter may be given directivity based on the eigenvalue with the largest absolute value and the eigenvalue with the second largest absolute value: the pixel values of pixels located along the eigenvector for the eigenvalue with the largest absolute value may be multiplied by a weighting factor w11 (w11 > 0), while the pixel values of pixels located along the eigenvector for the eigenvalue with the second largest absolute value may be multiplied by a weighting factor w12 (0 < w12 < w11).
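A coarse sketch of such a directional surface filter is given below; only the ordering w11 > w12 > 0 and the association of w11 and w12 with the first and second eigenvector directions come from the text, while the axis tolerance and the zero weight elsewhere are assumptions.

```python
import numpy as np

def directional_surface_kernel(e1, e2, w11, w12, size=5, tol=0.5):
    """Surface filter that weights voxels near the e1 axis by w11 and voxels
    near the e2 axis by w12 (with w11 > w12 > 0); other voxels get no weight."""
    e1 = np.asarray(e1, float)
    e2 = np.asarray(e2, float)
    r = size // 2
    z, y, x = np.mgrid[-r:r+1, -r:r+1, -r:r+1]
    p = np.stack([z, y, x], axis=-1).astype(float)
    u, v = p @ e1, p @ e2                                        # in-plane coordinates
    n = np.linalg.norm(p - u[..., None] * e1 - v[..., None] * e2, axis=-1)
    in_plane = n <= tol                                          # voxels close to the plane
    k = np.zeros((size, size, size))
    k[in_plane & (np.abs(v) <= tol)] = w11                       # along the e1 axis
    k[in_plane & (np.abs(u) <= tol) & (np.abs(v) > tol)] = w12   # along the e2 axis
    k[r, r, r] = w11                                             # the pixel of interest itself
    return k / k.sum()
```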
  • (2-3) The weighting factor w1 may also be varied according to the distance from the pixel of interest.
  • For example, the weighting factor w1 applied to the pixel values of pixels close to the pixel of interest may be made relatively large.
  • In this way, the pixel value of the pixel of interest after the convolution is computed with a larger contribution from pixels closer to it within the three-dimensional spatial filter, so blurring of the image caused by the noise removal processing can be suppressed.
  • the mode shown in (2-3) can be applied alone or in combination with any of (2-1) and (2-2).
  • As described above, according to the second embodiment, the weighting factors applied to the pixels within the three-dimensional spatial filter can be varied according to the contrast with respect to the background region, the directivity of the surface filter, and the distance from the pixel of interest, so noise removal can be performed without causing blur.
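A distance-dependent falloff as in (2-3) can be layered on top of any of the kernels sketched above; the Gaussian form and its width are assumptions.

```python
import numpy as np

def apply_distance_weight(kernel, sigma=1.5):
    """Multiply a cubic 3-D kernel by a Gaussian falloff so that voxels closer to
    the pixel of interest (the kernel center) contribute more to the convolution."""
    size = kernel.shape[0]
    r = size // 2
    z, y, x = np.mgrid[-r:r+1, -r:r+1, -r:r+1]
    dist2 = (z**2 + y**2 + x**2).astype(float)
    k = kernel * np.exp(-dist2 / (2 * sigma**2))
    return k / k.sum()
```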
  • the third embodiment is an embodiment in which the strength of a line filter and a surface filter is set using an operation GUI.
  • FIG. 5 is an explanatory diagram illustrating an example of an operation screen for setting the strengths of the line filter and the surface filter.
  • the operation screen 50 in FIG. 5 is a screen for setting the strength of the line filter and the surface filter for each imaging region and imaging condition.
  • An "add" button 55 is provided, and in FIG. 5 the "abdomen" tab 53 is selected.
  • For each imaging condition, a "standard reconstructed image" column 61 and a "DSA reconstructed image" column 62 are provided.
  • The slider 62a sets the first coefficient by which the line structure contrast C1 of the three-dimensional spatial filter used for the standard reconstructed image is multiplied, and the slider 63a sets the second coefficient by which the plane structure contrast C2 of that filter is multiplied.
  • When the first coefficient is increased, the line structure contrast C1 also increases, and the weighting factor w1 applied to pixels located along the traveling direction of the linear structure around the pixel of interest becomes larger; as a result, the strength of the line filter can be increased.
  • In a DSA reconstructed image there is a demand to highlight only the contrast-enhanced blood vessels, so it is desirable to render the structure (the contrast-enhanced vessels) more clearly relative to the non-structure regions than in the standard reconstructed image. Since a contrast-enhanced vessel is closer to a linear structure, it is desirable to set the weighting for the linear structure larger than that for the planar structure. Which of the linear or planar structure should be emphasized may also differ with the imaging region: in general, the linear structure may be emphasized more strongly in the extremities and the planar structure more strongly in the chest or abdomen. In this way, the intensity ratio of the line filter and the surface filter can be set, and noise removal can be performed according to the shape of the part required for diagnosis.
  • the fourth embodiment is an embodiment in which the structure detection threshold (the first threshold TH 1 and the second threshold TH 2 described above ) is set using the operation GUI.
  • the present embodiment will be described with reference to FIG.
  • FIG. 6 is an explanatory diagram illustrating an example of an operation screen for setting a structure detection threshold.
  • the value of the first threshold TH 1 can be increased or decreased by moving the slider 72 up and down, and the value of the second threshold TH 2 can be increased or decreased by moving the slider 73 up and down.
  • For example, the operator can set the first threshold TH1 relatively high and the second threshold TH2 relatively low to generate a three-dimensional reconstructed image in which planar shapes are drawn more finely.
  • In the embodiments described above, the traveling direction and the contrast are obtained for each three-dimensional region of interest individually; instead, they may be obtained as a weighted average with the traveling directions and contrasts calculated from the surrounding three-dimensional regions of interest.
  • For a three-dimensional region of interest containing a linear structure (for example, the region containing the linear structure 42a in FIG. 4), the traveling direction and contrast may be obtained as a weighted average, centered on that region, with those of the neighboring three-dimensional regions of interest adjacent along the traveling direction of the linear structure (the vertical direction in FIG. 4).
  • Likewise, for a three-dimensional region of interest containing a planar structure (for example, the region containing the planar structure 42c in FIG. 4), the traveling direction and contrast may be obtained as a weighted average, centered on that region, with those of the neighboring three-dimensional regions of interest adjacent along the traveling direction of the planar structure (the left-right and depth directions in FIG. 4). In this way, changes in the traveling direction and contrast between adjacent regions of interest can be smoothed.
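The weighted averaging with neighboring regions of interest might be sketched as follows; the choice of weights and the renormalization of averaged direction vectors are assumptions.

```python
import numpy as np

def smooth_with_neighbors(center_value, neighbor_values, center_weight=0.5):
    """Weighted average of a direction vector or a contrast with its neighbors.

    center_value: value for the region of interest itself (scalar or 3-vector)
    neighbor_values: values from the adjacent regions along the structure's
                     traveling direction
    """
    center_value = np.asarray(center_value, float)
    neighbors = np.asarray(neighbor_values, float)
    w_n = (1.0 - center_weight) / len(neighbors)
    out = center_weight * center_value + w_n * neighbors.sum(axis=0)
    if out.ndim == 1 and out.shape[0] == 3:          # a direction: renormalize
        out /= np.linalg.norm(out)
    return out
```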
  • a C-arm type X-ray CT apparatus will be described as an example of a medical imaging apparatus.
  • the present invention can be applied to any medical imaging apparatus that generates a three-dimensional reconstructed image regardless of the type.
  • FIG. 7 is a functional block diagram of a cone beam X-ray CT apparatus (C-arm system) to which the present invention is applied.
  • The cone beam X-ray CT apparatus 200 in FIG. 7 includes an imaging unit 10 that irradiates the subject 2 with X-rays and captures projection data 111 of the subject 2, a control calculation unit 20 that controls each component of the imaging unit 10, and a display device 40 for displaying images.
  • The imaging unit 10 includes a bed 17, an X-ray source 11 that irradiates the subject 2 lying on the bed 17 with X-rays, a two-dimensional X-ray detector 12 installed opposite the X-ray source 11 that detects the X-rays transmitted through the subject 2 and outputs the projection data 111, a C-arm 13 that mechanically connects the X-ray source 11 and the two-dimensional X-ray detector 12, a C-arm holder 14 that holds the C-arm 13, a ceiling support 15 that attaches the C-arm holder 14 to the ceiling, a ceiling rail 16 that supports the ceiling support 15 so that it can move in two horizontal directions (front-rear and left-right in the illustrated state), and an injector 18 for injecting contrast medium into the subject 2.
  • The X-ray source 11 includes an X-ray tube 11t that generates X-rays and a collimator 11c that limits the X-rays emitted from the X-ray tube 11t to a cone, quadrangular pyramid, or polygonal pyramid shape.
  • As the two-dimensional X-ray detector 12, for example, a flat panel detector (hereinafter "FPD") using TFT elements is used.
  • Instead of an FPD, a two-dimensional X-ray detector composed of a combination of an X-ray image intensifier that converts the X-ray transmission image into a visible-light image, an optical lens that focuses the output image of the image intensifier, and a CCD television camera or the like that captures the formed optical image may be used.
  • The imaging field of view of the two-dimensional X-ray detector 12 may have any shape, such as a circle or a rectangle.
  • When the subject 2 is imaged, the C-arm 13 rotates about the rotation center axis 4, and X-ray imaging is performed at each predetermined projection angle while the arm rotates and moves along a circular orbit lying substantially in a single plane.
  • The control calculation unit 20 includes an imaging unit control unit 100 that controls the imaging unit 10, an image collection unit 110 that collects and stores the projection data 111 output from the imaging unit 10, an image reconstruction unit 21a that reconstructs a three-dimensional reconstructed image 211 from the collected projection data 111, a region-of-interest setting unit 22 that sets a three-dimensional region of interest centered on a pixel of interest on the three-dimensional reconstructed image 211 generated by the image reconstruction unit 21a, a structure detection unit 23 that detects the traveling direction of the structure contained in the three-dimensional region of interest, a filter processing unit 24 that sets a three-dimensional spatial filter based on the detected traveling direction and performs noise removal processing, and a display control unit 25 that controls display of the filtered three-dimensional reconstructed image 241 on the display device 40.
  • The information input device 30 accepts input of the structure detection thresholds 301 (the first threshold TH1 and the second threshold TH2 described above) used when the structure detection unit 23 detects structures in the three-dimensional region of interest, and of the contrast coefficients 302 (the first coefficient multiplied by the line structure contrast C1 and the second coefficient multiplied by the plane structure contrast C2) used to set the enhancement ratio between the line filter and the surface filter.
  • The structure detection thresholds 301 are passed to the structure detection unit 23, and the contrast coefficients 302 are passed to the filter processing unit 24.
  • The processing performed by the region-of-interest setting unit 22, the structure detection unit 23, the filter processing unit 24, and the display control unit 25 is the same as that described for the control calculation unit 20 shown in FIG. 1 in the first and second embodiments, so duplicate description is omitted here.
  • The imaging unit control unit 100 includes an imaging system rotation control unit 101 that controls the rotational movement of the C-arm 13 about the rotation center axis 4, an imaging system position control unit 102 that controls the position of the C-arm 13 on the ceiling rail 16 so as to control its position in two dimensions relative to the subject 2, an X-ray irradiation control unit 103 that controls on/off switching of the tube current flowing through the X-ray tube 11t, an injector control unit 104 that controls the injection amount and injection timing of the contrast medium injected into the subject 2 by the injector 18, a bed control unit 105 that adjusts the position of the subject 2 by controlling the position of the bed 17, and a detection system control unit 107 that controls the capture of the projection data 111 by the two-dimensional X-ray detector 12.
  • According to this embodiment, a three-dimensional reconstructed image 211 is generated from the projection data 111 obtained by the imaging unit 10, and noise removal processing can be applied to it with a three-dimensional spatial filter.
  • It is therefore possible to provide a medical imaging apparatus that obtains a three-dimensional reconstructed image in which noise has been reduced without lowering the contrast of linear and planar structures.
  • 1: medical image processing device; 10: imaging unit; 20: control calculation unit; 30: information input device; 40: display device; 200: cone beam X-ray CT apparatus (medical image imaging apparatus)

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Image Processing (AREA)

Abstract

The purpose of the present invention is to provide a technology which can perform noise elimination on a three-dimensional image by using an isotropic three-dimensional adaptive filter. With respect to a medical image including a three-dimensional reconstructed image (40) of a subject, the present invention sets three-dimensional regions of interest (41a-41d) centred on target pixels, and detects the travelling directions of structures (42a-42c) present in the three-dimensional regions of interest on the basis of the distribution of pixel values included in the three-dimensional regions of interest. The present invention sets three-dimensional spatial filters, the shapes of which are aligned in the travelling directions of the structures, or if there is no structure, sets a three-dimensional smoothing filter which does not take structural shapes into account, and performs noise elimination processing on the three-dimensional areas of interest by using the three-dimensional spatial filters.

Description

Medical image processing apparatus and medical image imaging apparatus
 The present invention relates to a medical image processing apparatus and a medical image imaging apparatus, and more particularly to the removal of noise components in a three-dimensional reconstructed image.
 In Patent Document 1, an X-ray diagnostic apparatus is described that applies two-dimensional adaptive filter processing to two-dimensional X-ray images, selectively reducing noise components without lowering the contrast of linear or edge-shaped shadows.
Japanese Patent No. 4112762
 However, the adaptive filter processing described above is two-dimensional; when one tries to apply it to a three-dimensional image, Patent Document 1 does not consider three-dimensional structures, so an image subjected to appropriate noise removal processing could not be obtained.
 The present invention has been made in view of the above problem, and its object is to provide a technique for acquiring an image on which appropriate noise removal processing has been performed, even for a three-dimensional image.
 The present invention sets a three-dimensional region of interest centered on a pixel of interest on a medical image including a three-dimensional reconstructed image of a subject, detects the traveling direction of a structure contained in the three-dimensional region of interest based on the distribution of pixel values in that region, sets a three-dimensional spatial filter whose shape follows the traveling direction of the structure, and performs noise removal processing on the three-dimensional region of interest using that filter.
 According to the present invention, it is possible to provide a technique for acquiring a three-dimensional image on which appropriate noise removal processing has been performed.
FIG. 1: Functional block diagram of a medical image processing apparatus to which the present invention is applied.
FIG. 2: Flowchart showing the processing flow of the medical image processing apparatus according to the first embodiment.
FIG. 3: Flowchart showing details of step S3 in FIG. 2.
FIG. 4: Explanatory diagram showing the processing of the medical image processing apparatus according to the first embodiment.
FIG. 5: Explanatory diagram showing an example of an operation screen for setting the line structure contrast and the plane structure contrast.
FIG. 6: Explanatory diagram showing an example of an operation screen for setting the threshold TH1 for detecting a linear structure and the threshold TH2 for detecting a planar structure.
FIG. 7: Functional block diagram of a cone beam X-ray CT apparatus (C-arm system) to which the present invention is applied.
 The medical image processing apparatus according to this embodiment includes a region-of-interest setting unit that sets a three-dimensional region of interest centered on a pixel of interest on a medical image including a three-dimensional reconstructed image of a subject, a structure detection unit that detects the traveling direction of a structure contained in the three-dimensional region of interest based on the distribution of pixel values in that region, and a filter processing unit that sets a three-dimensional spatial filter having a shape along the traveling direction of the structure and performs noise removal processing on the three-dimensional reconstructed image using that filter.
 The structure detection unit can detect the traveling direction of at least one of a linear structure and a planar structure contained in the three-dimensional region of interest. Here, a planar structure is a structure that spreads in a plane, and its traveling direction means the direction in which the plane spreads.
 The filter processing unit can generate the three-dimensional spatial filter by combining, at a predetermined intensity ratio, a line filter that performs a convolution operation in the axial direction parallel to the traveling direction of the linear structure and a surface filter that performs a convolution operation in the plane direction parallel to the traveling direction of the planar structure.
 The structure detection unit may detect the traveling direction of at least one of the linear structure and the planar structure based on a covariance that uses the product of the deviation of the coordinate values of each pixel contained in the three-dimensional region of interest and that pixel's value. In that case, the structure detection unit may generate a 3 × 3 square matrix from the covariance, calculate its three eigenvalues and the eigenvector for each eigenvalue, and use them to detect at least one of the traveling direction of the linear structure and the traveling direction of the planar structure.
 The structure detection unit may calculate the direction of the eigenvector corresponding to the eigenvalue with the largest absolute value among the three eigenvalues as the traveling direction of the linear structure contained in the three-dimensional region of interest, and may calculate the plane orthogonal to the eigenvector corresponding to the eigenvalue with the smallest absolute value as the traveling direction of the planar structure contained in the three-dimensional region of interest.
 The structure detection unit may also use the three eigenvalues and eigenvectors of the covariance matrix to calculate at least one of the contrast of the linear structure with respect to the background region of the three-dimensional region of interest and the contrast of the planar structure with respect to that background region, and the filter processing unit may increase or decrease the weighting factors by which the pixel values in the three-dimensional spatial filter are multiplied according to these contrasts.
 The structure detection unit may calculate, as the contrast of the linear structure contained in the three-dimensional region of interest, the ratio between the eigenvalue with the largest absolute value among the three eigenvalues and the square root of the product of the second largest and the smallest eigenvalues, and may calculate, as the contrast of the planar structure, the ratio between the square root of the product of the eigenvalue with the largest absolute value and the second largest eigenvalue, and the eigenvalue with the smallest absolute value.
 Furthermore, the filter processing unit may generate a surface filter having directivity based on the eigenvalue with the largest absolute value and the eigenvalue with the second largest absolute value among the three eigenvalues.
 The structure detection unit may obtain the traveling direction for the three-dimensional region of interest by weighted averaging with the traveling directions calculated from regions of interest located around it, and may likewise obtain the contrast of the linear structure or of the planar structure calculated from the three-dimensional region of interest by weighted averaging with the corresponding contrasts calculated from surrounding regions of interest.
 The medical imaging apparatus according to this embodiment includes an imaging unit that images a subject and generates image data, an image reconstruction unit that performs a reconstruction operation based on the image data to generate a three-dimensional reconstructed image of the subject, a region-of-interest setting unit that sets a three-dimensional region of interest centered on a pixel of interest on a medical image including the three-dimensional reconstructed image, a structure detection unit that detects the traveling direction of a structure contained in the three-dimensional region of interest based on the distribution of pixel values in that region, a filter processing unit that sets a three-dimensional spatial filter having a shape along the traveling direction of the structure and performs noise removal processing on the three-dimensional reconstructed image using that filter, and a display control unit that performs control for displaying the three-dimensional reconstructed image after the noise removal processing.
 The imaging unit includes an X-ray source that generates X-rays and an X-ray flat panel detector in which detection elements that detect the X-rays are arranged in a two-dimensional array, and the region-of-interest setting unit may set the three-dimensional region of interest to have the same shape as the reconstruction field formed by rotating the image receiving surface of the detection elements through 360°. Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.
 <First embodiment>
 The first embodiment detects the traveling direction of a structure within a three-dimensional region of interest centered on a pixel of interest on a three-dimensional reconstructed image, sets a three-dimensional spatial filter along that traveling direction, and performs noise removal processing. The first embodiment is described below with reference to FIGS. 1 to 4. FIG. 1 is a functional block diagram of the medical image processing apparatus according to this embodiment. FIG. 2 is a flowchart showing the processing flow of the medical image processing apparatus according to the first embodiment. FIG. 3 is a flowchart showing the details of the processing in step S3 of FIG. 2. FIG. 4 is an explanatory diagram showing the content of the processing of the medical image processing apparatus according to the first embodiment.
 The medical image processing apparatus 1 shown in FIG. 1 includes a control calculation unit 20 that performs noise removal processing on a three-dimensional reconstructed image by setting a three-dimensional spatial filter, an information input device 30 consisting of a mouse, keyboard, trackball, or the like for inputting various parameters and instructions, and a display device 40 for displaying images. The control calculation unit 20 includes a reconstructed image acquisition unit 21 that acquires the three-dimensional reconstructed image to be subjected to noise removal, a region-of-interest setting unit 22 that sets a three-dimensional region of interest (hereinafter "3D region of interest") centered on a pixel of interest on that image, a structure detection unit 23 that detects the traveling direction of a structure contained in the 3D region of interest, a filter processing unit 24 that sets a three-dimensional spatial filter based on the detected traveling direction and performs noise removal processing, and a display control unit 25 that controls display of the three-dimensional reconstructed image after filtering. The reconstructed image acquisition unit 21 may be configured as an image reconstruction unit that acquires image data from the imaging unit 10, which images a subject and generates image data, and performs a reconstruction operation on that data to generate a three-dimensional reconstructed image, or as an image reading unit that reads or receives the 3D reconstructed image to be denoised from an image storage unit (or image database) 11 that stores previously generated 3D reconstructed images.
 Next, the processing performed by the medical image processing apparatus 1 is described following the steps of FIGS. 2 and 3.
 (Step S1)
 The reconstructed image acquisition unit 21 reads or generates the three-dimensional reconstructed image to be subjected to noise removal processing (S1).
 (Step S2)
 The region-of-interest setting unit 22 sets a three-dimensional region of interest centered on a pixel of interest on the three-dimensional reconstructed image of the subject read or generated in step S1 (S2). In the following, the coordinates of the pixel of interest are written (i, j, k) and its pixel value v(i, j, k). A cubic three-dimensional region of interest of 5 × 5 × 5 pixels centered on the pixel of interest is set here, but the number of pixels and the shape of the region are not limited to these. For example, when the medical image processing apparatus 1 according to this embodiment is mounted on an X-ray CT apparatus, the three-dimensional region of interest may be given the same shape as the reconstruction field obtained by rotating the X-ray image receiving surface of the detection elements of the X-ray detector through 360°. If the X-ray image receiving surface is rectangular, the three-dimensional region of interest may be cylindrical, the shape swept out by rotating that rectangle. If an image intensifier with a circular X-ray receiving surface is used as the X-ray detector, the three-dimensional reconstruction region is spherical, so the three-dimensional region of interest may also be made spherical.
(Step S3)
The structure detection unit 23 detects the traveling direction of the structure in the three-dimensional region of interest based on the distribution of pixel values within the region (S3). In this embodiment, a processing example is described in which the distribution of pixel values in the three-dimensional region of interest is characterized using the eigenvalues and eigenvectors of a covariance matrix. The processing of this step is explained in the order of the steps in FIG. 3.
(Step S3-1)
Using the product of the deviation (moment) of the coordinate values of each pixel contained in the three-dimensional region of interest and that pixel's value, the structure detection unit 23 computes the covariance of the three-dimensional region of interest with the coordinate deviations weighted by the pixel values, and generates a 3 × 3 covariance matrix (S3-1). The 3 × 3 covariance matrix is given by equation (1) below.
[Equation (1): the 3 × 3 covariance matrix of the pixel-value-weighted coordinate deviations — shown as an image in the original]
In the following, the covariance matrix (1) is written in the abbreviated form of equation (1)'.
[Equation (1)': abbreviated notation for the covariance matrix (1) — shown as an image in the original]
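A minimal sketch of step S3-1, assuming the covariance matrix is formed from the pixel-value-weighted deviations of the voxel coordinates from the ROI center; the exact normalization of equation (1) is not reproduced here, and the function and variable names are illustrative.

```python
import numpy as np

def weighted_covariance(roi):
    """3x3 covariance matrix of voxel coordinates inside the ROI, with each
    coordinate deviation weighted by the voxel's pixel value (step S3-1)."""
    size = roi.shape[0]
    center = (size - 1) / 2.0
    # Coordinate grids and their deviations from the ROI center
    x, y, z = np.meshgrid(*(np.arange(size),) * 3, indexing="ij")
    dev = np.stack([x - center, y - center, z - center], axis=-1)  # (s, s, s, 3)
    w = roi / (roi.sum() + 1e-12)                                  # normalized weights
    cov = np.einsum("ijk,ijka,ijkb->ab", w, dev, dev)              # 3x3 matrix
    return cov
```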
(Step S3-2)
The structure detection unit 23 solves the eigen equation of the 3 × 3 covariance matrix and obtains the eigenvalues and the eigenvector corresponding to each eigenvalue (S3-2). First, the eigen equation shown in equation (2) below is obtained from the covariance matrix (1)'.
[Equation (2): eigen equation of the covariance matrix — shown as an image in the original]
Expanding the eigen equation (2) yields a cubic equation in the three eigenvalues λ = (α, β, γ) (see equation (3) below).
[Equation (3): cubic characteristic equation a·λ³ + b·λ² + c·λ + d = 0 — shown as an image in the original]
The solution λ = (α, β, γ) of the cubic equation a·λ³ + b·λ² + c·λ + d = 0 is given by equation (4) below, where ω denotes a cube root of 1.
[Equation (4): closed-form solution of the cubic equation, with a, b, c, d defined in its proviso — shown as an image in the original]
In the expression for λ in equation (4), the quantity under the inner square root, (R² − Q³), multiplied by (−2²·3³), is the discriminant D of the cubic equation defined by equation (5) below.
[Equation (5): discriminant D of the cubic equation — shown as an image in the original]
In the discriminant of the cubic equation (equation (5)), a, b, c, and d are defined as in the proviso of equation (4), and the discriminant is positive (zero in special cases). Consequently, the quantities under the two cube roots in the solution formula (see equation (4)) are complex conjugates (the equal value R in the special case), and the three eigenvalues λ (= α, β, γ) obtained by solving the eigen equation (2) are all real regardless of the values of the covariance matrix. The three eigenvalues λ (= α, β, γ) can be written as in equation (6) below.
[Equation (6): the three real eigenvalues α, β, γ — shown as an image in the original]
Next, the eigenvectors corresponding to the three eigenvalues α, β, and γ are obtained. Let the eigenvectors be written as in equation (7).
[Equation (7): the three eigenvectors of the covariance matrix — shown as an image in the original]
The three eigenvectors of equation (7) satisfy equation (8).
[Equation (8): eigenvector relations for the covariance matrix — shown as an image in the original]
Constructing the 3 × 3 orthogonal matrix U of equation (9), whose columns are the three eigenvectors of equation (7), gives
[Equation (9): orthogonal matrix U whose columns are the eigenvectors — shown as an image in the original]
The covariance matrix (1) is diagonalized using the orthogonal matrix U and the inverse matrix U⁻¹ obtained by transposing U (equation (10)).
[Equation (10): diagonalization of the covariance matrix by U — shown as an image in the original]
The orthogonal matrix U of equation (9) can be expressed by equation (11) using four parameters e0, e1, e2, e3 called Euler parameters (where e0² + e1² + e2² + e3² = 1).
[Equation (11): orthogonal matrix U expressed in Euler parameters — shown as an image in the original]
The orthogonal matrix U is a rotation matrix describing a rotation by angle φ about the vector (l, m, n) in three-dimensional space. The Euler parameters e0, e1, e2, e3 of equation (11), the rotation axis (l, m, n), and the rotation angle φ are related by equation (12).
[Equation (12): relation between the Euler parameters, the rotation axis (l, m, n), and the rotation angle φ — shown as an image in the original]
The three eigenvectors of equation (7) are mutually orthogonal. In the following, the three eigenvalues λ = (α, β, γ) are assumed to be ordered so that their absolute values satisfy |α| > |β| > |γ|. From equation (10), the eigenvector for the eigenvalue α is the unit vector in the X-axis direction rotated by the orthogonal matrix U, the eigenvector for the eigenvalue β is the unit vector in the Y-axis direction rotated by U, and the eigenvector for the eigenvalue γ is the unit vector in the Z-axis direction rotated by U.
Through the above computation, the three eigenvalues and eigenvectors are obtained from the covariance matrix (1), and the traveling direction of the structure in the three-dimensional region of interest set in step S2 can be determined.
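The patent derives the eigenvalues via the closed-form solution of the cubic characteristic equation (equations (2) through (6)); in practice, the same eigenvalues and eigenvectors of the symmetric 3 × 3 covariance matrix can also be obtained with a library eigen-solver. A sketch, using the ordering |α| > |β| > |γ| of the text:

```python
import numpy as np

def sorted_eigen(cov):
    """Eigenvalues/eigenvectors of the symmetric 3x3 covariance matrix,
    sorted so that |alpha| >= |beta| >= |gamma| (columns of vecs match)."""
    vals, vecs = np.linalg.eigh(cov)          # real eigenvalues, orthonormal vectors
    order = np.argsort(np.abs(vals))[::-1]    # descending absolute value
    return vals[order], vecs[:, order]
```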
(Step S3-3)
The structure detection unit 23 detects the traveling direction of the structure using the three eigenvalues and eigenvectors obtained in step S3-2. "Structure" here includes linear structures and planar structures. Linear structures include, for example, blood vessels, catheters, and the guide wires used to guide catheters; planar structures include bone, organ surfaces, and boundary surfaces of soft tissue. Note that a planar structure here means a boundary surface between organs or tissues and does not necessarily refer to a flat structure. In this embodiment, the presence or absence of a linear structure is judged first (S3-3) and the presence or absence of a planar structure is judged next, but this order may be reversed, judging the planar structure first and the linear structure afterwards.
To judge whether a linear structure is present, the structure detection unit 23 compares the largest and the second-largest of the absolute values of the three eigenvalues λ (= α, β, γ) and judges whether their ratio is at least a first threshold TH1. The smaller the first threshold TH1, the more linear structures are recognized; the larger TH1, the more only clearly imaged linear structures are recognized. If the judgment is affirmative, the process proceeds to step S3-4; if negative, to step S3-5.
(Steps S3-4, S3-5)
The presence or absence of a planar structure is judged (S3-4, S3-5). The structure detection unit 23 judges whether the ratio of the largest to the smallest of the absolute values of the three eigenvalues λ (= α, β, γ) and the ratio of the second-largest to the smallest are both at least a second threshold TH2. The smaller the second threshold TH2, the more planar structures are recognized; the larger TH2, the more only clearly imaged planar structures are recognized. If the judgment is affirmative, the process proceeds to step S3-6 (to S3-8 in the case of S3-5); if negative, to step S3-7 (to S3-9 in the case of S3-5).
(Step S3-6)
It is judged that both a linear structure and a planar structure are contained (S3-6). The structure detection unit 23 calculates the direction of the eigenvector corresponding to the eigenvalue with the largest absolute value among the three eigenvalues λ (= α, β, γ) as the traveling direction of the linear structure in the three-dimensional region of interest. Furthermore, the structure detection unit 23 detects the plane orthogonal to the eigenvector corresponding to the eigenvalue with the smallest absolute value among the three eigenvalues as the traveling direction of the planar structure in the three-dimensional region of interest.
(Step S3-7)
The structure detection unit 23 judges that only a linear structure is contained and, as above, calculates the traveling direction of the linear structure (S3-7).
(Step S3-8)
The structure detection unit 23 judges that only a planar structure is contained and, as above, calculates the traveling direction of the planar structure (S3-8).
(Step S3-9)
The structure detection unit 23 judges that the three-dimensional region of interest contains no structure (S3-9).
(Step S4)
The filter processing unit 24 generates a three-dimensional spatial filter whose shape follows the traveling direction of the structure (S4). The content of the filter processing in this step is explained with reference to FIG. 4. Assume that three-dimensional regions of interest 41a, 41b, 41c, and 41d are set in the three-dimensional reconstructed image 40 before filter processing in FIG. 4. The region 41a contains only a linear structure 42a, the region 41b contains both a linear structure and a planar structure 42b, and the region 41c contains only a planar structure 42c. The region 41d contains no structure. When the processing of step S3 is applied to the regions 41a, 41b, 41c, and 41d, the judgment result of step S3-7 is obtained for region 41a, that of step S3-6 for region 41b, that of step S3-8 for region 41c, and that of step S3-9 for region 41d.
Accordingly, the filter processing unit 24 sets a line filter for the region 41a, a line filter and a surface filter for the region 41b, a surface filter for the region 41c, and, for the region 41d, a three-dimensional smoothing filter that does not take the shape of any structure into account.
The three-dimensional spatial filter is a function that determines the weighting coefficient assigned to each pixel when the pixel value of the pixel of interest is determined by a convolution operation over the pixel values contained in the three-dimensional region of interest. In the line filter, the weighting coefficients are determined so that the convolution is performed along the axial direction parallel to the traveling direction of the linear structure; in the surface filter, they are determined so that the convolution is performed along the plane parallel to the traveling direction of the planar structure. Specifically, the line filter is a function that assigns a relatively large weighting coefficient w1 (w1 > 0) to the pixel values of the pixels in the region of interest that lie along the direction parallel to the traveling direction of the linear structure through the pixel of interest, and a relatively small weighting coefficient w2 (w1 > w2, or w2 = 0) to the other pixels. The surface filter is a function that assigns a relatively large weighting coefficient to the pixel values of the pixels located, around the pixel of interest, along the traveling direction of the planar structure (the direction in which the surface extends), and a relatively small weighting coefficient to the other pixels.
Next, the filter processing unit 24 multiplies each pixel contained in the three-dimensional regions of interest 41a, 41b, 41c, and 41d by the weighting coefficient of the respective three-dimensional spatial filter and performs the convolution operation. The resulting value is calculated as the filtered pixel value of the pixel of interest P. When a region is judged to contain both a linear structure and a planar structure, as in the region of interest 41b in FIG. 4 (which is the case for many tissues), both the line filter and the surface filter are applied. In this case, the strengths of the line filter and the surface filter can be made different so that one of them is given priority according to the target of the examination. The setting of filter strength is described in another embodiment.
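One way to realize the line filter described above is to build a 3D kernel whose weights are large only for voxels lying close to the line through the ROI center along the detected direction, and to take the weighted average as the filtered value of the pixel of interest. The sketch below is an assumption-laden illustration: the kernel size, the band width around the line, and the concrete values of w1/w2 are not specified this concretely in the patent.

```python
import numpy as np

def line_filter_kernel(direction, size=5, w1=1.0, w2=0.0):
    """3D kernel that weights voxels near the line through the kernel center
    along `direction` with w1 and all other voxels with w2."""
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)
    c = (size - 1) / 2.0
    x, y, z = np.meshgrid(*(np.arange(size),) * 3, indexing="ij")
    p = np.stack([x - c, y - c, z - c], axis=-1)          # offsets from center
    along = p @ d                                         # projection onto the line
    dist = np.linalg.norm(p - along[..., None] * d, axis=-1)  # distance from the line
    kernel = np.where(dist < 0.7, w1, w2)                 # "on the line" band
    return kernel / (kernel.sum() + 1e-12)

def apply_kernel(roi, kernel):
    """Convolution at the ROI center: weighted sum = filtered pixel value."""
    return float((roi * kernel).sum())
```

A surface filter can be built the same way by weighting voxels close to the plane orthogonal to the eigenvector of the smallest-magnitude eigenvalue instead of voxels close to a line.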
(Step S5)
The filter processing unit 24 judges whether the processing of steps S2 to S4 has been completed for all pixels of the three-dimensional reconstructed image. If so, the process proceeds to step S6; if not, it returns to step S2 and the subsequent processing is repeated (S5). In other words, the processing of steps S2 to S4 is executed while the three-dimensional region of interest is scanned over the entire three-dimensional reconstructed image.
(Step S6)
The display control unit 25 displays the filtered three-dimensional reconstructed image on the screen of the display device 40 (S6).
According to this embodiment, performing noise removal with a three-dimensional spatial filter whose shape follows the traveling direction of the structure in the three-dimensional region of interest selectively reduces only the noise in the background region (the part of the three-dimensional region of interest excluding the structure). Therefore, the noise components of the three-dimensional reconstructed image can be reduced selectively without lowering the contrast of linear and planar structures. In other words, the ability to extract linear structures and planar structures is relatively enhanced.
In the flowchart of FIG. 2, the filtered reconstructed image is displayed once the processing of all pixels has finished in step S5; however, after all pixels have been processed, the flow may return to step S1 and the filter processing may be applied to the same three-dimensional reconstructed image multiple times. In this case as well, because this embodiment removes noise selectively from the background region, the ability to extract structures can be enhanced even when the filter processing is performed multiple times.
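Putting the steps together, the overall flow of FIG. 2 (scan the region of interest over every voxel, detect the structure, filter, and optionally repeat) could look like the sketch below. It reuses the hypothetical helpers from the earlier sketches, is not optimized, and is not the patent's implementation; the fallback to a plain smoothing kernel for the "plane" and "none" cases is a simplification (the full method would use a surface filter for planar structures).

```python
import numpy as np

def denoise_volume(volume, passes=1, roi_size=5):
    """Steps S2-S5 applied to every voxel, optionally repeated (multiple passes).
    Assumes extract_roi, weighted_covariance, sorted_eigen, classify_structure,
    line_filter_kernel, and apply_kernel from the previous sketches are in scope."""
    out = volume.astype(float)
    for _ in range(passes):
        src, out = out, np.empty_like(out)
        for idx in np.ndindex(src.shape):
            roi = extract_roi(src, idx, roi_size)                    # step S2
            vals, vecs = sorted_eigen(weighted_covariance(roi))      # step S3
            kind = classify_structure(vals)
            if kind in ("line", "line+plane"):
                kernel = line_filter_kernel(vecs[:, 0], roi_size)    # step S4
            else:
                kernel = np.full((roi_size,) * 3, 1.0 / roi_size**3) # smoothing
            out[idx] = apply_kernel(roi, kernel)
    return out
```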
<Second embodiment>
The second embodiment varies the weighting coefficient w1 within the three-dimensional spatial filter. There are mainly the following three ways of varying the weighting coefficient w1.
(2-1) Using the contrast between the structure in the three-dimensional region of interest and the background region
The structure detection unit 23 calculates the ratio of the eigenvalue with the largest absolute value among the three eigenvalues of the covariance matrix of the three-dimensional region of interest to the square root of the product of the eigenvalue with the second-largest absolute value and the eigenvalue with the smallest absolute value, and takes this ratio as the contrast C1 of the linear structure in the three-dimensional region of interest (hereinafter "line structure contrast"; see equation (13) below).
[Equation (13): line structure contrast C1 defined by the ratio described above, where |α| > |β| > |γ| — shown as an image in the original]
The structure detection unit 23 also calculates the ratio of the square root of the product of the eigenvalue with the largest absolute value and the eigenvalue with the second-largest absolute value to the eigenvalue with the smallest absolute value, and takes this ratio as the contrast C2 of the planar structure in the three-dimensional region of interest (hereinafter "surface structure contrast"; see equation (14) below).
[Equation (14): surface structure contrast C2 defined by the ratio described above, where |α| > |β| > |γ| — shown as an image in the original]
The larger the line structure contrast C1 and the surface structure contrast C2, the larger the difference between the pixel values of the structure in the three-dimensional region of interest and those of the background region. The filter processing unit 24 therefore sets the weighting coefficients so that the difference between the weighting coefficients w1 and w2 increases as the line structure contrast C1 increases; for example, w1 is set to a value relatively close to 1 and w2 to a value relatively close to 0, or to 0. The same applies to the surface structure contrast C2. As a result, in a three-dimensional region of interest with large line structure contrast C1 and surface structure contrast C2, the filtered pixel value of the pixel of interest can be calculated with a higher contribution from the pixels that make up the structure, so noise can be removed while suppressing loss of sharpness of the structure.
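A sketch of mode (2-1). Taking absolute values of the sorted eigenvalues and the particular mapping from contrast to the gap between w1 and w2 are assumptions made here for illustration; the patent only requires that the gap grow with the contrast.

```python
import numpy as np

def structure_contrasts(eigvals):
    """Line contrast C1 and surface contrast C2 from eigenvalues with
    |a| >= |b| >= |c| (cf. equations (13) and (14))."""
    a, b, c = np.abs(eigvals) + 1e-12
    c1 = a / np.sqrt(b * c)        # line structure contrast
    c2 = np.sqrt(a * b) / c        # surface structure contrast
    return c1, c2

def contrast_adaptive_weights(c1):
    """Larger contrast -> larger gap between w1 (on-structure) and w2 (background),
    keeping w1 > w2 >= 0."""
    gain = 1.0 - np.exp(-c1)       # in (0, 1), saturates for large contrast
    w1 = 0.5 + 0.5 * gain          # approaches 1
    w2 = 1.0 - w1                  # approaches 0
    return w1, w2
```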
(2-2) Varying the weighting coefficient w1 within the planar structure
The surface filter may be given directivity based on the eigenvalue with the largest absolute value and the eigenvalue with the second-largest absolute value. For example, the pixel values of the pixels located along the eigenvector of the eigenvalue with the largest absolute value may be multiplied by a weighting coefficient w11 (w11 > 0), and the pixel values of the pixels located along the eigenvector of the eigenvalue with the second-largest absolute value by a weighting coefficient w12 (0 < w12 < w11). In this way, even within the same plane, the filtered value of the pixel of interest can be calculated while the contribution of the pixel values with stronger directivity is made relatively higher.
(2-3) Varying the weighting coefficient w1 according to the distance from the pixel of interest
The weighting coefficient w1 may also be varied according to the distance from the pixel of interest. For example, the weighting coefficient w1 multiplying the pixel values of pixels close to the pixel of interest may be made relatively large. In this way, the filtered value of the pixel of interest is calculated with a higher contribution from the pixels of the three-dimensional spatial filter that are closer to the pixel of interest, which suppresses image blur caused by the noise removal processing. The mode shown in (2-3) can be applied alone or in combination with either (2-1) or (2-2).
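Mode (2-3) can be combined with the directional kernel by multiplying it element-wise with a fall-off centered on the pixel of interest. The Gaussian form and the value of sigma below are one common choice, used here only as an illustration.

```python
import numpy as np

def distance_weight(size=5, sigma=1.5):
    """Isotropic Gaussian fall-off with distance from the kernel center; multiply
    this element-wise into a line or surface kernel to favor nearby voxels."""
    c = (size - 1) / 2.0
    x, y, z = np.meshgrid(*(np.arange(size),) * 3, indexing="ij")
    r2 = (x - c) ** 2 + (y - c) ** 2 + (z - c) ** 2
    w = np.exp(-r2 / (2.0 * sigma ** 2))
    return w / w.sum()
```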
According to this embodiment, the weighting coefficients applied to the pixels within the three-dimensional spatial filter can be varied according to the difference from the background region, the directivity of the surface filter, and the distance from the pixel of interest, so that noise removal can be performed with even less blurring.
<Third embodiment>
In the third embodiment, the strengths of the line filter and the surface filter are set through an operation GUI. This embodiment is described below with reference to FIG. 5, which shows an example of an operation screen for setting the strengths of the line filter and the surface filter.
The operation screen 50 of FIG. 5 is a screen for setting the strengths of the line filter and the surface filter for each imaging region and imaging condition. On the operation screen 50, the imaging region selection section has four tabs, "head" 51, "chest" 52, "abdomen" 53, and "limbs" 54, together with an "add imaging region" button 55 for adding an imaging region; in FIG. 5, "abdomen" 53 is selected. A "standard reconstructed image" field 61 and a "DSA reconstructed image" field 62 are also provided for each imaging condition.
Each of the "standard reconstructed image" field 61 and the "DSA reconstructed image" field 62 contains setting bars 62 and 65 for setting the strength of the line filter and setting bars 63 and 66 for setting the strength of the surface filter. The example of FIG. 5 shows the case where the settable line filter strength ranges from a minimum of 0 to a maximum of 1.0 and the settable surface filter strength likewise from 0 to 1.0. Each setting bar carries a slider 62a, 63a, 65a, or 66a that can be moved along the axis of the bar, and the setting value corresponding to the slider position is shown in the setting value display fields 62b, 63b, 64b, and 65b. The strength of the line filter and the strength of the surface filter may be set independently, or they may be set with a correlation such that when one is strong the other is weakened.
The slider 62a sets a first coefficient by which the line structure contrast C1 of the three-dimensional spatial filter used for the standard reconstructed image is multiplied, and the slider 63a sets a second coefficient by which the surface structure contrast C2 of that filter is multiplied. When the first coefficient is relatively large, the line structure contrast C1 also becomes large, so the weighting coefficient w1 multiplying the pixels located along the traveling direction of the linear structure around the pixel of interest becomes large, and as a result the strength of the line filter can be increased. When the line filter strength and the surface filter strength are correlated, a large first coefficient for the line filter makes the second coefficient relatively small; the surface structure contrast C2 then also becomes small, the weighting coefficient w1 multiplying the pixels of the three-dimensional spatial filter set along the traveling direction of the planar structure becomes small, and the strength of the surface filter is weakened. The same applies to the first and second coefficients of the DSA reconstructed image 64.
In general, for a DSA reconstructed image there is a demand to highlight only the contrast-enhanced blood vessels, so it is desirable to render the structures (contrast-enhanced vessels) more sharply relative to the non-structure regions (outside the contrast-enhanced vessels) than in a standard reconstructed image. Since contrast-enhanced blood vessels are closer to linear structures, it is desirable to set the weighting coefficient for linear structures larger than that for planar structures. Moreover, which of the linear or planar structures should be emphasized more may differ with the imaging region: in general, one may want to emphasize linear structures more strongly in the limbs and planar structures more strongly in the chest or abdomen. According to this embodiment, the strength ratio of the line filter and the surface filter can be set according to the imaging region and imaging technique, and noise removal can be performed in accordance with the shape of the anatomy required for diagnosis.
<Fourth embodiment>
In the fourth embodiment, the structure detection thresholds (the first threshold TH1 and the second threshold TH2 described above) are set through an operation GUI. This embodiment is described below with reference to FIG. 6, which shows an example of an operation screen for setting the structure detection thresholds.
The operation screen 70 of FIG. 6 includes a graph 71 showing the type of eigenvalue on one axis and the absolute value of each eigenvalue on the other, a slider 72 for setting the first threshold TH1 on the graph, and a slider 73 for setting the second threshold TH2. Moving the slider 72 up and down increases or decreases the first threshold TH1, and moving the slider 73 up and down increases or decreases the second threshold TH2. When the operator wants to depict linear structures more sharply than planar structures, as in a DSA image, the first threshold TH1 is set relatively high. When the operator wants to depict planar structures more sharply, as in a standard reconstructed image of the abdomen or chest, the second threshold TH2 is set relatively low so that the surface shape is depicted in finer detail, and a three-dimensional reconstructed image rendered with smooth surfaces can be generated.
In the embodiments above, the traveling direction and contrast are obtained for each three-dimensional region of interest that is set, but the traveling direction and contrast of the set region may instead be obtained as weighted averages of the traveling directions and contrasts calculated from the surrounding three-dimensional regions of interest. For example, in the case of a three-dimensional region of interest containing a linear structure (for example, the three-dimensional region of interest 42a in FIG. 4), the traveling direction and contrast of that region may be obtained as the weighted average of the traveling directions and contrasts of the neighboring three-dimensional regions of interest adjacent along the traveling direction of the linear structure (for example, the regions adjacent to 42a in the vertical direction of the drawing in FIG. 4).
Similarly, in the case of a three-dimensional region of interest containing a planar structure (for example, the three-dimensional region of interest 42c in FIG. 4), the traveling direction and contrast of that region may be obtained as the weighted average of the traveling directions and contrasts of the neighboring three-dimensional regions of interest adjacent along the traveling direction of the planar structure (for example, the regions adjacent to 42c in the left-right and depth directions of the drawing in FIG. 4). In this way, changes in traveling direction and contrast between adjacent regions of interest can be smoothed.
Next, an embodiment in which the medical image processing apparatus of the above embodiments is mounted on a medical image imaging apparatus is described. In this embodiment a C-arm X-ray CT apparatus is used as an example of the medical image imaging apparatus, but the present invention can be applied to any type of medical image imaging apparatus that generates a three-dimensional reconstructed image of a subject, such as a gantry-type X-ray CT apparatus, an MRI apparatus, or an ultrasound diagnostic apparatus. The embodiment is described below with reference to FIG. 7, which is a functional block diagram of a cone-beam X-ray CT apparatus (C-arm type) to which the present invention is applied.
The cone-beam X-ray CT apparatus 200 of FIG. 7 includes an imaging unit 10 that irradiates a subject 2 with X-rays and captures projection data 111 of the subject 2, a control calculation unit 20 that controls the components of the imaging unit 10 and reconstructs a three-dimensional CT image of the subject 2 based on the projection data 111, an information input device 30 composed of a mouse, keyboard, trackball, or the like for inputting various parameters and instructions, and a display device 40 for displaying images.
The imaging unit 10 includes a bed 17; an X-ray source 11 that irradiates the subject 2 lying on the bed 17 with X-rays; a two-dimensional X-ray detector 12 that is installed facing the X-ray source 11 and outputs the projection data 111 by detecting the X-rays transmitted through the subject 2; a C-arm 13 that mechanically connects the X-ray source 11 and the two-dimensional X-ray detector 12; a C-arm holder 14 that holds the C-arm 13; a ceiling support 15 that attaches the C-arm holder 14 to the ceiling; a ceiling rail 16 that supports the ceiling support 15 so that it can move two-dimensionally front-back and left-right in the illustrated state; and an injector 18 that injects contrast medium into the subject 2.
The X-ray source 11 includes an X-ray tube 11t that generates X-rays and a collimator 11c that shapes the X-ray beam emitted from the X-ray tube 11t into a cone, a quadrangular pyramid, or a polygonal pyramid.
For the two-dimensional X-ray detector 12, for example, a flat panel detector (hereinafter "FPD") using TFT elements is used. As another example, the two-dimensional X-ray detector 12 may be a combination of an X-ray image intensifier that converts an X-ray transmission image into a visible-light image, an optical lens that forms the image of the X-ray image intensifier, and a CCD television camera or the like that captures the visible-light image formed by the optical lens. The imaging field of view of the two-dimensional X-ray detector 12 may have any shape, such as circular or rectangular.
When the subject 2 is imaged, the C-arm 13 rotates about the rotation center axis 4 by each predetermined projection angle. As a result, the X-ray source 11 and the two-dimensional X-ray detector 12, while remaining opposed to each other, move along circular orbits lying in substantially the same plane and perform X-ray imaging.
The control calculation unit 20 includes an imaging unit control unit 100 that controls the imaging unit 10; an image collection unit 110 that collects and stores the projection data 111 output by the imaging unit 10; an image reconstruction unit 21a that reconstructs a three-dimensional reconstructed image 211 based on the collected projection data 111; a region-of-interest setting unit 22 that sets a three-dimensional region of interest centered on a pixel of interest on the three-dimensional reconstructed image 211 generated by the image reconstruction unit 21a; a structure detection unit 23 that detects the traveling direction of a structure included in the three-dimensional region of interest; a filter processing unit 24 that sets a three-dimensional spatial filter based on the detected traveling direction of the structure and performs noise removal processing; and a display control unit 25 that performs display control for displaying the filtered three-dimensional reconstructed image 241 on the display device 40.
The information input device 30 receives input of the structure detection thresholds (the first threshold TH1 and the second threshold TH2 described above) 301 used when the structure detection unit 23 detects a structure in the three-dimensional region of interest, and of a first coefficient multiplying the line structure contrast C1 and a second coefficient multiplying the surface structure contrast C2 (hereinafter "contrast coefficients") 302 for setting the enhancement ratio between the line filter and the surface filter. The structure detection thresholds 301 are passed to the structure detection unit 23, and the contrast coefficients 302 are passed to the filter processing unit 24.
The processing performed by the region-of-interest setting unit 22, the structure detection unit 23, the filter processing unit 24, and the display control unit 25 is the same as that described for the control calculation unit 20 of FIG. 1 in the first and second embodiments, and duplicate description is omitted here.
The imaging unit control unit 100 includes an imaging system rotation control unit 101 that controls the rotational movement of the C-arm 13 about the rotation center axis 4; an imaging system position control unit 102 that controls the position of the ceiling support 15 on the ceiling rail 16 to control the position of the C-arm 13 relative to the subject 2 two-dimensionally; an X-ray irradiation control unit 103 that controls turning on and off the tube current supplied to the X-ray tube 11t; an injector control unit 104 that controls the injection amount and injection timing of the contrast medium injected into the subject 2 by the injector 18; a bed control unit 105 for adjusting the position of the subject 2 by controlling the position of the bed 17; and a detection system control unit 107 that controls the capture of the projection data 111 by the two-dimensional X-ray detector 12.
According to the medical image imaging apparatus of this embodiment, a three-dimensional reconstructed image 211 is generated from the projection data 111 obtained by the imaging unit 10, and noise reduction processing using the three-dimensional spatial filter can be applied to it. A three-dimensional reconstructed image that has undergone noise reduction without loss of linear and planar contrast is therefore obtained. Furthermore, by using the three-dimensional reconstructed image obtained in this way, three-dimensional image processing and the extraction of thin blood vessels and catheters become possible even with a low-dose X-ray imaging apparatus, so an X-ray imaging apparatus that keeps the patient's X-ray exposure low can be provided.
1 medical image processing apparatus, 10 imaging unit, 20 control calculation unit, 30 information input device, 40 display device, 200 cone-beam X-ray CT apparatus (medical image imaging apparatus)

Claims (15)

1. A medical image processing apparatus comprising:
     a region-of-interest setting unit that sets, on a medical image including a three-dimensional reconstructed image of a subject, a three-dimensional region of interest centered on a pixel of interest;
     a structure detection unit that detects a traveling direction of a structure included in the three-dimensional region of interest based on a distribution of pixel values included in the three-dimensional region of interest; and
     a filter processing unit that sets a three-dimensional spatial filter having a shape along the traveling direction of the structure and performs noise removal processing on the three-dimensional reconstructed image using the three-dimensional spatial filter.
2. The medical image processing apparatus according to claim 1, wherein the structure detection unit detects a traveling direction of at least one of a linear structure and a planar structure included in the three-dimensional region of interest.
3. The medical image processing apparatus according to claim 2, wherein the filter processing unit sets a line filter that relatively increases the weights of the pixel values of pixels aligned along the traveling direction of the linear structure and a surface filter that relatively increases the weights of the pixel values of pixels aligned along the traveling direction of the planar structure, and performs a convolution operation on the pixel values included in the three-dimensional region of interest using at least one of the line filter and the surface filter.
4. The medical image processing apparatus according to claim 3, wherein the filter processing unit generates the three-dimensional spatial filter by combining the line filter and the surface filter at a predetermined strength ratio.
5. The medical image processing apparatus according to claim 4, further comprising input means for setting the strengths of the line filter and the surface filter.
6. The medical image processing apparatus according to claim 2, wherein the structure detection unit detects the traveling direction of at least one of the linear structure and the planar structure based on a covariance using the product of the deviation of the coordinate values of each pixel included in the three-dimensional region of interest and the pixel value of that pixel.
7. The medical image processing apparatus according to claim 6, wherein the structure detection unit generates a third-order square matrix using the covariance, calculates three eigenvalues of the third-order square matrix and an eigenvector for each eigenvalue, and detects at least one of the traveling direction of the linear structure and the traveling direction of the planar structure using the three eigenvalues and eigenvectors.
8. The medical image processing apparatus according to claim 7, wherein the structure detection unit calculates the direction of the eigenvector corresponding to the eigenvalue having the largest absolute value among the three eigenvalues as the traveling direction of the linear structure included in the three-dimensional region of interest.
9. The medical image processing apparatus according to claim 7, wherein the structure detection unit calculates the plane orthogonal to the eigenvector corresponding to the eigenvalue having the smallest absolute value among the three eigenvalues as the traveling direction of the planar structure included in the three-dimensional region of interest.
10. The medical image processing apparatus according to claim 7, further comprising input means for setting a threshold used to detect, using the three eigenvalues, the traveling direction of the linear structure and/or the planar structure.
11. The medical image processing apparatus according to claim 6, wherein the structure detection unit generates a third-order square matrix using the covariance, calculates three eigenvalues of the third-order square matrix and an eigenvector for each eigenvalue, and calculates, using the three eigenvalues and eigenvectors, at least one of the contrast of the linear structure with respect to a background region in the three-dimensional region of interest and the contrast of the planar structure with respect to the background region in the three-dimensional region of interest, and
     the filter processing unit increases or decreases the weighting coefficients by which the pixel values in the three-dimensional spatial filter are multiplied according to the contrast of the linear structure and the contrast of the planar structure.
12. The medical image processing apparatus according to claim 11, wherein the structure detection unit calculates, as the contrast of the linear structure included in the three-dimensional region of interest, the ratio of the eigenvalue having the largest absolute value among the three eigenvalues to the square root of the product of the eigenvalue having the second-largest absolute value and the eigenvalue having the smallest absolute value.
13. The medical image processing apparatus according to claim 11, wherein the structure detection unit calculates, as the contrast of the planar structure in the three-dimensional region of interest, the ratio of the square root of the product of the eigenvalue having the largest absolute value and the eigenvalue having the second-largest absolute value among the three eigenvalues to the eigenvalue having the smallest absolute value.
14. The medical image processing apparatus according to claim 13, wherein the filter processing unit generates a surface filter having directivity based on the eigenvalue having the largest absolute value and the eigenvalue having the second-largest absolute value among the three eigenvalues.
15. A medical image imaging apparatus comprising:
     an imaging unit that images a subject and generates image data;
     an image reconstruction unit that performs a reconstruction operation based on the image data and generates a three-dimensional reconstructed image of the subject;
     a region-of-interest setting unit that sets, on a medical image including the three-dimensional reconstructed image, a three-dimensional region of interest centered on a pixel of interest;
     a structure detection unit that detects a traveling direction of a structure included in the three-dimensional region of interest based on a distribution of pixel values included in the three-dimensional region of interest;
     a filter processing unit that sets a three-dimensional spatial filter having a shape along the traveling direction of the structure and performs noise removal processing on the three-dimensional reconstructed image using the three-dimensional spatial filter; and
     a display control unit that performs control for displaying the three-dimensional reconstructed image after the noise removal processing.
PCT/JP2014/050944 2013-01-30 2014-01-20 Medical image processing device, and medical image capture device WO2014119412A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2014559632A JPWO2014119412A1 (en) 2013-01-30 2014-01-20 Medical image processing apparatus and medical image imaging apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013015614 2013-01-30
JP2013-015614 2013-01-30

Publications (1)

Publication Number Publication Date
WO2014119412A1 true WO2014119412A1 (en) 2014-08-07

Family

ID=51262127

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/050944 WO2014119412A1 (en) 2013-01-30 2014-01-20 Medical image processing device, and medical image capture device

Country Status (2)

Country Link
JP (1) JPWO2014119412A1 (en)
WO (1) WO2014119412A1 (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11501140A (en) * 1995-12-21 1999-01-26 フィリップス エレクトロニクス エヌ ベー Direction noise reduction
JPH11328395A (en) * 1998-04-09 1999-11-30 General Electric Co <Ge> Reducing method for noise in image
JP2001111835A (en) * 1999-10-05 2001-04-20 Toshiba Corp Image processing device and x-ray diagnostic device
JP2003024311A (en) * 2001-07-17 2003-01-28 Hitachi Medical Corp Image processor
JP2006204912A (en) * 2005-01-24 2006-08-10 Medison Co Ltd Ultrasonic image processing method
JP2009226141A (en) * 2008-03-25 2009-10-08 Institute Of National Colleges Of Technology Japan Image processing apparatus and method
WO2009145076A1 (en) * 2008-05-28 2009-12-03 株式会社 日立メディコ Image processing device, image processing method, and image processing program
JP2010227554A (en) * 2009-03-04 2010-10-14 Toshiba Corp Ultrasonic diagnostic device, image processor, control method for ultrasonic diagnostic device, and image processing method

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021118855A (en) * 2015-07-25 2021-08-12 ライトラボ・イメージング・インコーポレーテッド Intravascular data visualization methods
US11768593B2 (en) 2015-07-25 2023-09-26 Lightlab Imaging, Inc. Intravascular data visualization and interface systems and methods
CN112545541A (en) * 2019-09-26 2021-03-26 株式会社日立制作所 Medical image processing apparatus and medical image processing method
CN112545541B (en) * 2019-09-26 2024-04-23 富士胶片医疗健康株式会社 Medical image processing device and medical image processing method

Also Published As

Publication number Publication date
JPWO2014119412A1 (en) 2017-01-26

Similar Documents

Publication Publication Date Title
JP6534998B2 (en) Method and apparatus for displaying a medical image
JP6766045B2 (en) How to generate a synthetic mammogram from tomosynthesis data
JP6368779B2 (en) A method for generating edge-preserving synthetic mammograms from tomosynthesis data
JP4859446B2 (en) Angiographic X-ray diagnostic device for rotating angiography
CN103156629B (en) Image processing equipment and image processing method
JP5878119B2 (en) X-ray CT apparatus and control method thereof
JP2006043431A (en) Method of reducing helical windmill artifact with recovery noise for helical multi-slice ct
JP3987024B2 (en) Method and system for enhancing tomosynthesis images using lateral filtering
JP2005021345A (en) X-ray solid reconstruction processor, x-ray imaging apparatus, method for x-ray solid reconstruction processing, and x-ray solid imaging auxiliary tool
WO2006078085A1 (en) Method for reconstructing a local high resolution x-ray ct image and apparatus for reconstructing a local high resolution x-ray ct image
JP3897925B2 (en) Cone beam CT system
US10013778B2 (en) Tomography apparatus and method of reconstructing tomography image by using the tomography apparatus
US20180211420A1 (en) Tomographic device and tomographic image processing method according to same
JP4828920B2 (en) 3D image processing device
US20200240934A1 (en) Tomography apparatus and controlling method for the same
WO2014119412A1 (en) Medical image processing device, and medical image capture device
JP7267329B2 (en) Method and system for digital mammography imaging
Aliaksandrauna Adjusting videoendoscopic 3D reconstruction results using tomographic data
JP2021003240A (en) X-ray tomosynthesis device, image processing device and program
JP5514397B2 (en) Image display apparatus and X-ray tomography apparatus
JP2008154680A (en) X-ray ct apparatus
JP2013169359A (en) X-ray ct apparatus
KR20210147384A (en) Projection data correction method for truncation artifact reduction
JP2008119457A (en) X-ray diagnostic apparatus, image processing unit, and arithmetic program of filter coefficient used to reconstruct image
JP6373937B2 (en) X-ray CT apparatus and image display apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14746640

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2014559632

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14746640

Country of ref document: EP

Kind code of ref document: A1