WO2014031531A1 - System and method for image guided medical procedures - Google Patents
System and method for image guided medical procedures
- Publication number
- WO2014031531A1 (PCT/US2013/055561)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- imaging modality
- image
- imaging
- anatomical region
- features
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
- G06V10/755—Deformable models or variational models, e.g. snakes or active contours
- G06V10/7553—Deformable models or variational models, e.g. snakes or active contours based on shape, e.g. active shape models [ASM]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10072—Tomographic images
- G06T2207/10081—Computed x-ray tomography [CT]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10072—Tomographic images
- G06T2207/10084—Hybrid tomography; Concurrent acquisition with multiple different tomographic modalities
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10072—Tomographic images
- G06T2207/10088—Magnetic resonance imaging [MRI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10132—Ultrasound image
- G06T2207/10136—3D ultrasound image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30096—Tumor; Lesion
Definitions
- the present disclosure relates to systems and methods for image guided medical and surgical procedures.
- U.S. Pat. Pub. 2009/0054772 (EP20050781862), expressly incorporated herein by reference, entitled “Focused ultrasound therapy system”, provides a method for performing a High Intensity Focused Ultrasound (HIFU) procedure for a specific clinical application.
- Basic image registration is performed for fusion from a diagnostic modality such as Computed Tomography (CT) or Magnetic Resonance Imaging (MRI) to ultrasound only through body positioning, referred to as “immobilization”, resulting in image registration limited to horizontal movement and a zoom factor.
- U.S. Pat. No. 8,224,420, expressly incorporated herein by reference, provides a mechanical positioning means for moving an ultrasound energy applicator so that the energy application zone intersects the magnetic resonance volume within the region of the subject.
- U.S. Pat. Pub. 2007/0167762, expressly incorporated herein by reference, entitled “Ultrasound System for interventional treatment”, provides an ultrasound system in which a "wide-area" image such as CT or MRI can be loaded and fused with the ultrasound image, with the positions of lesions and the needle insertion point defined manually at the time of the procedure.
- U.S. Pub. App. 2010/02906853, expressly incorporated herein by reference, entitled “Fusion of 3D volumes with CT reconstruction”, discloses a method for registration of an ultrasound device in three dimensions to a C-arm scan, the method including acquiring a baseline volume, acquiring images in which the ultrasound device is disposed, locating the device within the images, registering the location of the device to the baseline volume, acquiring an ultrasound volume from the ultrasound device, registering the ultrasound volume to the baseline volume, and performing fusion imaging to display a view of the ultrasound device in the baseline volume.
- a mutual information based method is provided to register and display a 3D ultrasound image fused with a CT image.
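The mutual-information criterion mentioned here can be estimated from a joint intensity histogram of the two images. The following is a minimal sketch (not the referenced implementation; `mutual_information` is a hypothetical helper name):

```python
import numpy as np

def mutual_information(img_a, img_b, bins=32):
    """Estimate mutual information between two co-sampled images
    from their joint intensity histogram."""
    hist, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = hist / hist.sum()               # joint probability
    px = pxy.sum(axis=1, keepdims=True)   # marginal of img_a
    py = pxy.sum(axis=0, keepdims=True)   # marginal of img_b
    nz = pxy > 0                          # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# An image shares maximal information with itself, almost none with noise:
rng = np.random.default_rng(0)
img = rng.random((64, 64))
print(mutual_information(img, img) > mutual_information(img, rng.random((64, 64))))
# prints True
```

A registration driver would then search over transform parameters for the pose that maximizes this score between the CT and resampled ultrasound volumes.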
- U.S. Pub. App. 2011/0178389 expressly incorporated herein by reference, entitled “Fused image modalities guidance” discloses a system and method for registration of medical images, which registers a previously obtained volume(s) onto an ultrasound volume during an ultrasound procedure, to produce a multimodal image, which may be used to guide a medical procedure.
- the multimodal image includes MRI information presented in the framework of a Trans Rectal Ultrasound (TRUS) image during a TRUS procedure.
- Prostate cancer is one of the most common types of cancer affecting men. It is a slow growing cancer, which is easily treatable if identified at an early stage. A prostate cancer diagnosis often leads to surgery or radiation therapy. Such treatments are costly and can cause serious side effects, including incontinence and erectile dysfunction. Unlike many other types of cancer, prostate cancer is not always lethal and often is unlikely to spread or cause harm. Many patients who are diagnosed with prostate cancer receive radical treatment even though it would not prolong the patient's life, ease pain, or significantly improve the patient's health.
- Prostate cancer may be diagnosed by taking a biopsy of the prostate, which is conventionally conducted under the guidance of ultrasound imaging.
- Ultrasound imaging has high spatial resolution, and is relatively inexpensive and portable.
- ultrasound imaging has relatively low tissue discrimination ability. Accordingly, ultrasound imaging provides adequate imaging of the prostate organ, but it does not provide adequate imaging of tumors within the organ due to the similarity of cancer tissue and benign tissues, as well as the lack of tissue uniformity. Due to the inability to visualize the cancerous portions within the organ with ultrasound, the entire prostate must be considered during the biopsy. Thus, in the conventional prostate biopsy procedure, a urologist relies on the guidance of two-dimensional ultrasound to systematically remove tissue samples from various areas throughout the entire prostate, including areas that are free from cancer.
- Magnetic Resonance Imaging has long been used to evaluate the prostate and surrounding structures. MRI is in some ways superior to ultrasound imaging because it has very good soft tissue contrast. There are several types of MRI techniques, including T2 weighted imaging, diffusion weighted imaging, and dynamic contrast imaging. Standard T2-weighted imaging does not discriminate cancer from other processes with acceptable accuracy. Diffusion- weighted imaging and dynamic contrast imaging may be integrated with traditional T2-weighted imaging to produce multi-parametric MRI. The use of multi-parametric MRI has been shown to improve sensitivity over any single parameter and may enhance overall accuracy in cancer diagnosis.
- As with ultrasound imaging, MRI also has limitations. For instance, it has a relatively long imaging time, requires specialized and costly facilities, and is not well suited for performance by a urologist at a urology center. Furthermore, performing a direct prostate biopsy within an MRI machine is likewise impractical in that setting.
- To overcome these shortcomings and maximize the usefulness of the MRI and ultrasound imaging modalities, methods and devices have been developed for digitizing medical images generated by multiple imaging modalities (e.g., ultrasound and MRI) and fusing or integrating multiple images to form a single composite image.
- This composite image includes information from each of the original images that were fused together.
- a fusion or integration of Magnetic Resonance (MR) images with ultrasound-generated images has been useful in the analysis of prostate cancer within a patient.
- Image-guided biopsy systems are available, such as the Artemis produced by Eigen (see, e.g., U.S. Pub. App. Nos. 2012/0087557 and 2011/0184684, and U.S. Pat. No. 8,369,592, each expressly incorporated herein by reference).
- These systems are three-dimensional (3D) image-guided prostate biopsy systems that provide tracking of biopsy sites within the prostate.
- Such MRI data is not readily available to urologists, and it would be commercially impractical for such MRI data to be generated at a urology center. This is due to many reasons, including urologists' lack of training, expertise, or time to do so. Also, it is uncertain whether a urologist can profitably implement an image-guided biopsy system in his or her practice while contemporaneously attempting to learn to perform MRI scans.
- MRI is generally considered to offer the best soft tissue contrast of all imaging modalities.
- anatomical MRI, e.g., T1- and T2-weighted imaging
- functional MRI, e.g., dynamic contrast-enhanced (DCE) imaging, magnetic resonance spectroscopic imaging (MRSI) and diffusion-weighted imaging (DWI)
- DCE improves specificity over T2 imaging in detecting cancer. It measures the vascularity of tissue based on the flow of blood and permeability of vessels. Tumors can be detected based on their early enhancement and early washout of the contrast agent. DWI measures the water diffusion in tissues. Increased cellular density in tumors reduces the signal intensity on apparent diffusion maps.
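The DWI behavior described above is commonly quantified as an apparent diffusion coefficient (ADC) map, assuming mono-exponential signal decay S(b) = S0·exp(-b·ADC). The sketch below is illustrative only; the `adc_map` helper and the b-value are assumptions, not from the disclosure:

```python
import numpy as np

def adc_map(s0, sb, b=800.0, eps=1e-6):
    """Apparent diffusion coefficient from two DWI acquisitions,
    assuming mono-exponential decay S(b) = S0 * exp(-b * ADC).
    b in s/mm^2, ADC in mm^2/s."""
    return np.log(np.maximum(s0, eps) / np.maximum(sb, eps)) / b

# Restricted diffusion (tumor-like) shows a lower ADC than free fluid:
s0 = np.array([1000.0, 1000.0])
sb = np.array([600.0, 90.0])   # tumor-like vs. fluid-like signal at b=800
adc = adc_map(s0, sb)
print(adc[0] < adc[1])         # prints True
```

This is why increased cellular density in tumors reduces signal intensity on apparent diffusion maps: less water diffusion means slower decay with b, hence a lower computed ADC.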
- CT imaging is likewise expensive and has limited access, and poses a radiation risk for operators and patient.
- a pre-acquired image e.g., an MRI or CT image
- Regions of interest identifiable in the pre-acquired image volume may be tied to corresponding locations within the TRUS image such that they may be visualized prior to biopsy target planning or therapeutic application. This solution allows a radiologist to acquire, analyze and annotate the MRI/CT scan at the image acquisition facility, while a urologist can still perform the procedure using live ultrasound in his or her clinic.
- the present technology provides a method for combining information from a plurality of medical imaging modalities, such as Positron Emission Tomography (PET), Computed Tomography (CT), Magnetic Resonance Imaging (MRI), Magnetic Resonance Spectroscopic Imaging (MRSI), ultrasound, echocardiography and elastography, supplemented by information obtained in advance by at least one other modality, which is properly registered to the real time image despite soft tissue movement, deformation, or change in size.
- the real time image is of a soft tissue organ or gland such as prostate, skin, heart, lung, kidney, liver, bladder, ovaries, and thyroid, and the supplemented real time image is used for a medical image guided procedure.
- the real time image may also be used for orthopedic or musculoskeletal procedures, or exercise physiology.
- a further real-time imaging type is endoscopy, or more generally, videography, which is in growing use, especially for minimally invasive procedures.
- the medical procedure may be a needle based procedure, such as but not limited to biopsy, laser ablation, brachytherapy, stem cell injection for ischemia of the heart, cryotherapy, and/or direct injection of a photothermal or photodynamic agent.
- the medical professional seeks to treat a highly localized portion of an organ, while either avoiding a toxic or damaging therapy to adjacent structures, or to avoid waste of a valuable agent.
- the available real-time medical imaging modalities for guiding the localized treatment visualize the organ, but do not clearly delineate the portion of the organ to be treated.
- non-real time imaging modalities are available for defining locations sought to be treated with the localized treatment.
- the organ in the case of soft tissues, in the time between the non-real time imaging and the real time procedure, the organ can shift, deform (especially as a result of the procedure itself), or change in size, thus substantially distorting the relationship between the real time image used to guide the procedure and the non-real time diagnostic or tissue localization image.
- a further complication is that the non-real time image may have a different intrinsic coordinate system from the real time imaging, leading to artifacts.
- Typical medical procedures comprise image-guided non-needle-based procedures such as but not limited to HIFU, IMRT, and robotic surgery.
- the pre-operative imaging modality may thus be used to identify a target object or gland, and suspicious lesions of the object or gland, for targeted biopsy, targeted therapy, targeted dose delivery or a combination of the above.
- the pre-operative imaging modality may be used to identify and annotate surrounding structures that need to be spared in order to minimize the impact of the procedure on quality of life.
- such structures may be nerve bundles, the urethra, rectum and bladder identified in a magnetic resonance (MR) image.
- the pre-operative imaging modality may be used to identify and uniquely label anatomical landmarks for manual, semi-automated or automated registration.
- such landmarks may be the urethra at the prostate base, the urethra at the apex, the verumontanum, calcifications and cysts.
- the pre-operative imaging modality may be used to identify and uniquely label anatomical structures for manual, semi-automated or automated registration.
- such structures may be urethra, seminal vesicles, ejaculatory ducts, bladder and rectum.
- a targeted biopsy may be performed for a malignancy to determine the extent of malignancy and best treatment option.
- Needle guidance procedures may be provided where the pre-operative imaging modality is analyzed to plan the complete procedure or part of it, such that the anatomical locations of targets for needle placement are planned in advance, and the anatomical locations are guided by the real time imaging modality.
- the needle locations and trajectories may be identified in advance based on the non- real time, pre-operative imaging modality, such that the target region is adequately sampled for biopsy to maximize the accuracy while minimizing number of samples for each target region.
- the needle locations and trajectories may be identified in advance, such that a target region is effectively treated in a therapeutic procedure within the target area, while minimizing the damage to the surrounding tissue and structures.
- the trajectory may be optimized in a prostate procedure such that the needle insertion minimizes damage to important structures such as rectum and nerve bundles.
- the duration of needle placement at each location in a therapeutic procedure may be optimized using a pre-operative imaging modality, to effectively design a treatment for the target region locally while sparing the surrounding tissues and structures.
- Anatomical landmarks and/or structures identified in the pre-operative image may also be identified in the intra-operative (live) image and labeled consistently.
- the pre-operative image may also identify surfaces and boundaries, which can be defined or modeled as, for example, triangulated meshes.
- the surfaces may represent the entire anatomical structure/object or a part thereof.
- a boundary may have no real anatomical correlate, and be defined virtually; however, an advantage arises if the virtual boundary can be consistently and accurately identified in both the pre-operative imaging and the real-time intra-operative imaging, since this facilitates registration and alignment of the images.
- the virtual features of the images may be based on generic anatomy, e.g., of humans or animals, or patient-specific.
- Labeled surfaces and landmarks in pre-operative and intra-operative images may be used for rigid registration.
- if the bladder is labeled "1" in the pre-operative image, it is registered with the object labeled "1" in the intra-operative image. More generally, regions of an image are classified or segmented, and that classification or segment definition from the pre-operative imaging is applied to the intra-operative real time imaging.
- the landmark and object registration may be performed rigidly using a simultaneous landmark and surface registration algorithm.
- a rigid registration may be optionally followed by an affine registration.
- An elastic registration method may follow the rigid or affine registration.
- An elastic registration method may be at least one of intensity based, binary mask based, surfaces- and landmarks-based method or a combination of these methods.
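The rigid, landmark-based first stage of such a pipeline can be sketched with the classical Kabsch least-squares fit. This is a minimal illustration under the assumption of matched, consistently labeled landmark points, not the disclosed algorithm:

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (rotation R, translation t) mapping
    labeled landmarks src onto dst (Kabsch algorithm). Points are (N, 3)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)          # cross-covariance of landmarks
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    t = mu_d - R @ mu_s
    return R, t

# Recover a known rotation/translation from matched landmarks:
rng = np.random.default_rng(1)
pts = rng.random((6, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
moved = pts @ R_true.T + np.array([2.0, -1.0, 0.5])
R, t = rigid_register(pts, moved)
print(np.allclose(R, R_true))  # prints True
```

An affine stage would relax R to a general 3x3 matrix (adding scale and shear), and an elastic stage would then warp residual local mismatch.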
- a deformation model computed from a number of training datasets may be used for image registration.
- the deformation model models the deformation of the object of interest; for example, a prostate deforms upon application of an external tool such as an ultrasound transducer or endorectal coil.
- the training datasets may include sets of corresponding planning images and live-modality images for the same patient.
- preoperative imaging is obtained under conditions that model a soft tissue deformation that might occur during the real-time imaging.
- the correspondence may be further refined by identifying and defining mismatching corresponding features between the pre-procedure and intra-procedure images.
- a calcification may be seen in both MRI (pre-procedure) and ultrasound (intra-procedure) images, and if these anatomical landmarks mismatch slightly, a user may identify them visually and select them with a mouse click; alternately, an automated indication of mismatch may be generated.
- An algorithm can then refine the correspondence such that the boundaries of the object of interest do not move while the deformation inside the object gets updated. The deformation inside the object of interest may thus follow an elastic deformation model based on the new landmarks constrained by object boundaries.
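One standard way to realize a boundary-constrained elastic update of this kind is thin-plate-spline interpolation of the landmark displacements, with boundary points pinned by zero displacement. A 2D sketch, illustrative only and not the patent's method:

```python
import numpy as np

def tps_warp(ctrl, disp, query):
    """Thin-plate-spline interpolation of 2D control-point displacements.
    Boundary control points given zero displacement stay fixed, so the
    deformation is confined to the interior of the object."""
    def U(r2):  # TPS kernel r^2 * log(r^2), defined as 0 at r = 0
        with np.errstate(divide="ignore", invalid="ignore"):
            return np.where(r2 > 0, r2 * np.log(r2), 0.0)
    n = len(ctrl)
    K = U(((ctrl[:, None] - ctrl[None]) ** 2).sum(-1))
    P = np.hstack([np.ones((n, 1)), ctrl])
    L = np.block([[K, P], [P.T, np.zeros((3, 3))]])
    coef = np.linalg.solve(L, np.vstack([disp, np.zeros((3, 2))]))
    Kq = U(((query[:, None] - ctrl[None]) ** 2).sum(-1))
    Pq = np.hstack([np.ones((len(query), 1)), query])
    return query + np.hstack([Kq, Pq]) @ coef

# Four fixed boundary corners plus one interior landmark nudged by (0.1, 0):
ctrl = np.array([[0, 0], [1, 0], [1, 1], [0, 1], [0.5, 0.5]], float)
disp = np.array([[0, 0], [0, 0], [0, 0], [0, 0], [0.1, 0.0]])
warped = tps_warp(ctrl, disp, ctrl)
print(np.allclose(warped[:4], ctrl[:4]))   # boundary unmoved: prints True
```

The spline interpolates the new interior landmark exactly while the zero-displacement boundary points keep the object outline fixed, matching the constraint described above.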
- An image registration method may therefore be provided that maps a region of interest from a pre-procedural (planning) image to an intra-procedural (live) image, along with a complete plan such that the plan and the region of interest are mapped and deformed to conform to the shape of the object during the procedure.
- the technology provides a method of image fusion where the mapped plan may be displayed as one or more overlays on a live imaging modality display during an image guided procedure.
- the fusion need not be an overlay, and may be supplemental information through a different modality, such as voice or sonic feedback, force-feedback or proprioceptive feedback, a distinct display (without overlay), or the like.
- different types of information may be distinguished by color, intensity, depth (on a stereoscopic display), icons, or other known means.
- the plan may be indicated by static images or graphics, animated graphics, and/or acoustic information (e.g., voice synthesis feedback).
- a planning image can also be overlaid on the live imaging modality during an image guided procedure such that the images can be toggled back and forth, or displayed together in a real-time "fused" display.
- the mapped plan may be further adjusted to account for a new shape of object revealed during real-time imaging. This may be done using an automated method, semi-automatic method, or manual method or a combination thereof.
- a pre-procedure planning image may be used to plan the procedure such that the plan is embedded in an electronic, printed, or interactive web-based report.
- the present technology identifies features in the imaging modalities, including landmarks, objects and intensity information, to perform registration using a combination of rigid, affine and non-rigid elastic registration.
- the modeling of the objects within an image may thus comprise a segmentation of anatomical features.
- the method may further comprise transforming coordinate systems of various imaging modalities.
- the system may further comprise at least one modeling processor configured to perform real-time model updates of a patient soft-tissue to ensure that a pre-operative image remains accurately registered with an intra-operative image.
- the annotated regions of the medical imaging scan or the plan may be generated by a computer-aided diagnosis or therapeutic planning system. At least a portion of the pre-operative imaging may be conducted at a location remote from the therapeutic or diagnostic procedure, with the information conveyed between the two over the Internet, preferably through a virtual private network. A true private network may also be used, or simply encrypted files communicated over otherwise public channels. The physical separation of the imaging modalities facilitates professional specialization, since experts at different aspects of the process need not be collocated. The present technology permits porting information from a planning-image frame of reference to a live imaging modality for guiding a medical procedure. The plan may be defined as a region of interest and needle placement, or a method to plan a treatment or biopsy, for example.
- the present technology may employ not only object boundaries, but also surrounding or internal information for registration, and thus may be employed in applications where there is significant internal deformation that cannot be modeled using boundaries alone.
- image fusion is sometimes used to define the process of registering two images that are acquired via different imaging modalities or at different time instances.
- the registration/fusion of images obtained from different modalities creates a number of complications.
- the shape of soft tissues in two images may change between acquisitions of each image.
- a diagnostic or therapeutic procedure can alter the shape of the object that was previously imaged.
- the frame of reference (FOR) of the acquired images is typically different. That is, multiple MRI volumes are obtained in high-resolution transverse, coronal or sagittal planes respectively, with lower resolution in the slice direction. These planes are usually in rough alignment with the patient's head-toe, anterior-posterior or left-right orientations.
- TRUS images are often acquired while a patient lies on his side in a fetal position, by reconstructing multiple rotated 2D sample frames into a 3D volume.
- the 2D image frames are obtained at various instances of rotation of the TRUS probe after insertion into the rectal canal.
- the probe is inserted at an angle (approximately 30-45 degrees) to the patient's head-toe orientation.
- the gland in MRI and TRUS will need to be rigidly aligned because their relative orientations are unknown at scan time.
- well-defined and invariant anatomical landmarks may be used to register the images, though since the margins of landmarks themselves vary with imaging modality, the registration may be imperfect or require discretion in interpretation.
- a further difficulty with these different modalities is that the intensities of objects in the images do not necessarily correspond. For instance, structures that appear bright in one modality (e.g., MRI) may appear dark in another modality (e.g., ultrasound). Thus, the logistical process of overlaying or merging the images requires perceptual optimization. In addition, structures identified in one image (soft tissue in MRI) may be entirely absent in another image. TRUS imaging causes further deformation of the gland due to pressure exerted by the TRUS transducer on the prostate. As a result, rigid registration is not sufficient to account for the differences between MRI and TRUS images. Finally, the resolution of the images may also impact registration quality.
- in the boundary/surface model, the prostate is treated as an elastic object with a gland boundary or surface model that defines its volume.
- the boundary can then be used as a reference for aligning both images.
- each point of the volume defined within the gland boundary of the prostate in one image should correspond to a point within a volume defined by a gland boundary of the prostate in the other image, and vice versa.
- the data in each data set may be transformed, assuming elastic deformation of the prostate gland.
- the characteristics of soft tissue under shear and strain, compression, heating and/or inflammation, bleeding, coagulation, biopsy sampling, tissue resection, etc., as well as normal physiological changes of healthy and pathological tissue over time, are modeled, and these various effects are therefore accounted for between the pre-operative imaging and real-time intra-procedural imaging.
- a system and method are provided for use in medical imaging of the prostate of a patient.
- the utility includes obtaining a first 3D image volume from an MRI imaging device.
- this first 3D image volume is acquired from data storage. That is, the first 3D image volume is acquired at a time prior to a current procedure.
- a first shape or surface model may be obtained from the MRI image (e.g., a triangulated mesh describing the gland).
- the surface model can be manually or automatically extracted from all co-registered MRI image modalities. That is, multiple MRI images may themselves be registered with each other as a first step.
- the 3D image processing may be automated, so that a technician need not be solely occupied by the image processing, which may take seconds or minutes.
- the MRI images may be T1, T2, DCE (dynamic contrast-enhanced), DWI (diffusion-weighted imaging), ADC (apparent diffusion coefficient) or other types.
- CAT (computer aided tomography) scans may also be used.
- the surface of the prostate may not represent a high contrast feature, and therefore other aspects of the image may be used; typically, the CAT scan is used to identify radiodense features, such as calcifications, or brachytherapy seeds, and therefore the goal of the image registration process would be to ensure that these features are accurately located in the fused image model.
- a co-registered CT image with PET scan can also provide diagnostic information that can be mapped to TRUS frame of reference for image guidance.
- the pre-operative imaging comprises use of the same imaging modality as used intra-operatively, generally along with an additional imaging technology.
- an ultrasound volume of the patient's prostate may be obtained, for example, through rotation of a TRUS probe, and the gland boundary is segmented in an ultrasound image.
- the ultrasound images acquired at various angular positions of the TRUS probe during rotation can be reconstructed to a rectangular grid uniformly through intensity interpolation to generate a 3D TRUS volume.
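The reconstruction of rotated frames onto a rectangular grid amounts to a polar-to-Cartesian scan conversion with intensity interpolation. Below is a simplified 2D sketch of the idea (the real system reconstructs a 3D volume; names and grid sizes are illustrative):

```python
import numpy as np

def polar_to_cartesian(frames, n=64):
    """Reconstruct a Cartesian grid from samples on a rotational
    (angle x radius) grid by bilinear intensity interpolation.
    frames[i, j] is the sample at angle i*dtheta and radius j*dr."""
    n_ang, n_rad = frames.shape
    dtheta = 2 * np.pi / n_ang
    ys, xs = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
    r = np.hypot(xs, ys) * (n_rad - 1)                     # radius in sample units
    th = np.mod(np.arctan2(ys, xs), 2 * np.pi) / dtheta    # angle in frame units
    j0 = np.clip(np.floor(r).astype(int), 0, n_rad - 2)
    fj = np.clip(r - j0, 0.0, 1.0)
    i0 = np.floor(th).astype(int) % n_ang
    fi = th - np.floor(th)
    i1 = (i0 + 1) % n_ang                                  # angle wraps around
    out = ((1 - fi) * (1 - fj) * frames[i0, j0]
           + fi * (1 - fj) * frames[i1, j0]
           + (1 - fi) * fj * frames[i0, j0 + 1]
           + fi * fj * frames[i1, j0 + 1])
    out[np.hypot(xs, ys) > 1] = 0                          # outside the scanned disc
    return out

# A radius-only intensity pattern must reconstruct radially symmetric:
frames = np.tile(np.linspace(0, 1, 32), (90, 1))           # 90 angles x 32 radii
img = polar_to_cartesian(frames)
print(np.allclose(img, img.T, atol=1e-6))                  # prints True
```

In 3D the same pull-style interpolation runs over the fan geometry of each rotated frame, producing the uniform 3D TRUS volume described above.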
- other ultrasound methods may be employed without departing from the scope of the technology.
- the MRI or CAT scan volume is registered to the 3D TRUS volume (or vice versa), and a registered image of the 3D TRUS volume is generated in the same frame of reference (FOR) as the MRI or CAT scan image.
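Generating a registered image in a common frame of reference is typically done by inverse mapping: each voxel of the target grid pulls its intensity from the source volume through the estimated transform. A minimal sketch (nearest-neighbour lookup for brevity; trilinear interpolation would normally be used, and `resample_rigid` is a hypothetical helper):

```python
import numpy as np

def resample_rigid(vol, R, t, out_shape):
    """Resample volume `vol` into a target frame of reference related by
    x_src = R @ x_dst + t, using inverse mapping with nearest-neighbour
    lookup. Voxels mapping outside the source volume are zero-filled."""
    idx = np.indices(out_shape).reshape(3, -1)           # target voxel coords
    src = np.rint(R @ idx + t[:, None]).astype(int)      # pull from source
    ok = np.all((src >= 0) & (src < np.array(vol.shape)[:, None]), axis=0)
    out = np.zeros(np.prod(out_shape))
    out[ok] = vol[src[0, ok], src[1, ok], src[2, ok]]
    return out.reshape(out_shape)

# A pure translation by one voxel along the first axis:
vol = np.zeros((4, 4, 4)); vol[2, 1, 1] = 7.0
out = resample_rigid(vol, np.eye(3), np.array([1.0, 0, 0]), (4, 4, 4))
print(out[1, 1, 1])   # prints 7.0
```

Pulling (rather than pushing) guarantees every voxel of the registered output is defined exactly once, which is why resampling pipelines are written this way.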
- this registration occurs prior to a diagnostic or therapeutic intervention.
- the advantage here is that both data sets may be fully processed, with the registration of the 3D TRUS volume information completed.
- a fully fused volume model is available. In general, the deviation of a prior 3D TRUS scan from a subsequent one will be small, so features from the real-time scan can be aligned with those of the prior imaging procedure.
- the fused image from the MRI (or CAT) scan provides better localization of the suspect pathological tissue, and therefore guidance of the diagnostic biopsy or therapeutic intervention. Therefore, the suspect voxels from the MRI are highlighted in the TRUS image, which during a procedure would be presented in 2D on a display screen to guide the urologist.
- the process therefore seeks to register 3 sets of data; the MRI (or other scan) information, the pre-operative 3D TRUS information, and the real time TRUS used during the procedure.
- the preoperative 3D TRUS and the inter-operative TRUS are identical apparatus, and therefore would provide maximum similarity and either minimization of artifacts or present the same artifacts.
- the 3D TRUS preoperative scan can be obtained using the same TRUS scanner immediately pre-operatively, though it is preferred that the registration of the images proceed under the expertise of a radiologist or medical scanning technician, who may not be immediately available during that period.
- a plan may be defined manually, semi-automatically, or in certain cases, automatically.
- the plan for example in a prostate biopsy procedure, defines both the location of the samples to be acquired, as well as the path to be taken by the biopsy instrument in order to avoid undue damage to tissues.
- the plan may be updated in real-time. For example, if the goal of the plan is to sample a volume of tissue at 1.5 mm spatial distances, but the accuracy of sampling is ±0.5 mm, then subsequent sampling targets may be defined adaptively based on the actual sampling locations during the procedure.
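Adaptive replanning of this kind can be sketched as a greedy rule: after each actual needle placement, pick the planned target farthest from all sampled locations, and stop once every target is covered to within the desired spacing. Illustrative only; `next_target` is a hypothetical helper:

```python
import numpy as np

def next_target(candidates, sampled, spacing=1.5):
    """Pick the next biopsy target: the planned candidate farthest from all
    actually-sampled locations, or None once every candidate is already
    covered to within the desired spacing (all distances in mm)."""
    d = np.linalg.norm(candidates[:, None] - sampled[None], axis=-1).min(axis=1)
    if d.max() < spacing:
        return None                      # every candidate already covered
    return candidates[np.argmax(d)]

# Planned 1D line of targets; the first sample landed 0.4 mm off target 0:
candidates = np.array([[0.0], [1.5], [3.0]])
sampled = np.array([[0.4]])              # actual needle position
print(next_target(candidates, sampled))  # prints [3.]
```

Feeding the measured (rather than intended) needle positions back into this rule is what lets the plan absorb the ±0.5 mm placement error without leaving gaps in coverage.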
- the course of treatment including both the location of the laser and its excitation parameters, may be determined based on both the actual location of a fiber optic tip, as well as a measured temperature, and perhaps an inter-operatively determined physiological response to the therapy, such as changes in circulation pattern, swelling, and the like.
- the registered image and the geometric transformation that relates, for example, a MRI scan volume with an ultrasound volume can be used to guide a medical procedure such as, for example, biopsy or brachytherapy.
- regions of interest identified on the MRI scan are usually defined by a radiologist based on information available in MRI prior to biopsy, and may be a few points, point clouds representing regions, or triangulated meshes.
- the 3D TRUS may also reveal features of interest for biopsy, which may also be marked as regions of interest. Because of the importance of registering the regions of interest in the MRI scan with the TRUS used intra-operatively, in a manual or semi-automated pre-operative image processing method, the radiologist can override or control the image fusion process at his or her discretion.
- a segmented MRI and 3D TRUS are obtained from a patient for the prostate gland.
- the MRI and TRUS data is registered and transformations applied to form a fused image in which voxels of the MRI and TRUS images physically correspond to one another. Regions of interest are then identified either from the source images or from the fused image. The regions of interest are then communicated to the real-time ultrasound system, which tracks the earlier TRUS image. Because the ultrasound image is used for real time guidance, typically the transformation/alignment takes place on the MRI data, which can then be superposed or integrated with the ultrasound data.
- the real-time TRUS display is supplemented with the MRI (or CAT or other scan) data, and an integrated display presented to the operating urologist.
- haptic feedback may be provided so that the urologist can "feel" features when using a tracker.
- the MRI or CAT scan data may be used to provide a coordinate frame of reference for the procedure, and the TRUS image modified in real-time to reflect an inverse of the ultrasound distortion. That is, the MRI or CAT data typically has a precise and undistorted geometry.
- the ultrasound image may be geometrically distorted by phase velocity variations in the propagation of the ultrasound waves through the tissues, and to a lesser extent, by reflections and resonances. Since the biopsy instrument itself is rigid, it will correspond more closely to the MRI or CAT model than the TRUS model, and therefore a urologist seeking to acquire a biopsy sample may have to make corrections in course if guided by the TRUS image.
- if the TRUS image is normalized to the MRI coordinate system, then such corrections may be minimized. This requires that the TRUS data be modified according to the fused image volume model in real time.
- graphics processors (GPUs or APUs), multicore CPUs, FPGAs, and other computing technologies make this feasible.
- the urologist is presented with a 3D display of the patient's anatomy, supplemented by and registered to the real-time TRUS data.
- Such 3D displays may be effectively used with haptic feedback.
- the first is a frame of reference transformation, due to the fact that the MRI image is created as a set of slices in parallel planes (rectangular coordinate system) which will generally differ from the image plane of the TRUS, defined by the probe angle (cylindrical coordinate system, with none of the cylindrical axes aligned with a coordinate axis of the MRI).
- the second transformation represents the elastic deformation of the objects within the image to properly aligned surfaces, landmarks, etc.
- the segmentation and/or digitizing may be carried out semi-automatically (manual control over automated image processing tasks) or automatically using computer software.
- computer software which may be suitable includes 3D Slicer (www.slicer.org), an open source software package capable of automatic image segmentation, manual editing of images, fusion and co-registering of data using rigid and non-rigid algorithms, and tracking of devices for image-guided procedures.
- "Magnetic resonance imaging/ultrasound fusion guided prostate biopsy improves cancer detection following transrectal ultrasound biopsy and correlates with multiparametric magnetic resonance imaging." The Journal of Urology 186.4 (2011): 1281-1285;
- Reynier, Christophe, et al. "MRI/TRUS data fusion for prostate brachytherapy: preliminary results."
- "Image fusion of MR images and real-time ultrasonography: evaluation of fusion accuracy combining two commercial instruments, a neuronavigation system and an ultrasound system." Acta Neurochirurgica 146.3 (2004): 271-277; Wein, Wolfgang, Barbara Roper, and Nassir Navab.
- a further object provides a system for combining information from a plurality of medical imaging modalities, comprising: an input port configured to receive at least two first volumetric images using a first volumetric imaging modality of an anatomical region, representing respectively different states of elastic deformation, and at least two second volumetric images using a second volumetric imaging modality of the anatomical region, representing respectively different states of elastic deformation; at least one processor configured to define an elastic soft tissue model for at least a portion of the anatomical region encompassed by the first volumetric image, and to label features of the anatomical region based on at least the first volumetric image and the soft tissue model; and a memory configured to store the defined elastic soft tissue model.
- the first imaging modality may comprise at least one of positron emission tomography, computed tomography, magnetic resonance imaging, magnetic resonance spectrography imaging, and elastography.
- the anatomical region may comprise an organ selected from the group consisting of prostate, heart, lung, kidney, liver, bladder, ovaries, and thyroid.
- the therapeutic intervention may be selected from one or more selected from the group consisting of laser ablation, brachytherapy, stem cell injection for ischemia of the heart, cryotherapy, direct injection of a photothermal or photodynamic agent, and radiotherapy.
- the differentially visualized anatomical region may be at least one anatomical structure to be spared in an invasive procedure, selected from the group consisting of a nerve bundle, a urethra, a rectum and a bladder.
- the registered features may comprise anatomical landmarks selected from the group consisting of a urethra, a urethra at a prostate base, a urethra at an apex, a verumontanum, a calcification and a cyst, a seminal vesicle, an ejaculatory duct, a bladder and a rectum.
- the method may further comprise acquiring a tissue sample from a location determined based on at least the first imaging modality and the second imaging modality.
- the method may further comprise delivering a therapeutic intervention at a location determined based on at least the first imaging modality and the second imaging modality.
- the method may further comprise performing an image-guided at least partially automated procedure selected from the group consisting of laser ablation, high intensity focused ultrasound, cryotherapy, radio frequency, brachytherapy, IMRT, and robotic surgery.
- the differentially visualized anatomical region may comprise at least one of a suspicious lesion for targeted biopsy, a suspicious lesion for targeted therapy, and a lesion for targeted dose delivery.
- the method may further comprise automatically defining a plan comprising a target and an invasive path to reach the target.
- the plan may be defined based on the first imaging modality, and is adapted in real time based on at least the second imaging modality.
- the plan may comprise a plurality of targets.
- a plurality of anatomical features may be consistently labeled in the first volumetric image and the second volumetric image.
- the soft tissue model may comprise an elastic triangular mesh approximating a surface of an organ.
- the anatomical landmark registration may be performed rigidly using a simultaneous landmark and surface registration algorithm.
- An affine registration may be performed.
- the registering may comprise an elastic registration based on at least one of an intensity, a binary mask, and surfaces and landmarks.
- the model may be derived from a plurality of training datasets representing different states of deformation of an organ of a respective human using the first imaging modality and the second imaging modality.
- the method may further comprise identifying a mismatch of corresponding anatomical features of the first volumetric image and the second volumetric image, and updating the registration to converge the corresponding anatomical features to reduce the mismatch based on corrections of an elastic deformation model constrained by object boundaries.
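The mismatch-and-converge step just described can be illustrated with a toy sketch; the damped per-landmark update below merely stands in for the boundary-constrained elastic correction, and all names and values are hypothetical:

```python
# Toy mismatch-and-converge loop: the mismatch is the worst residual
# between corresponding features, and a damped update stands in for the
# boundary-constrained elastic correction described in the text.

def mismatch(a, b):
    """Largest distance between corresponding feature points."""
    return max(sum((p[i] - q[i]) ** 2 for i in range(3)) ** 0.5
               for p, q in zip(a, b))

def converge(a, b, alpha=0.5, tol=0.01, max_iter=50):
    """Move features of a toward b in damped steps until mismatch < tol."""
    a = [list(p) for p in a]
    for _ in range(max_iter):
        if mismatch(a, b) < tol:
            break
        for p, q in zip(a, b):
            for i in range(3):
                p[i] += alpha * (q[i] - p[i])
    return [tuple(p) for p in a]

a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
b = [(1.0, 1.0, 0.0), (2.0, 1.0, 0.0)]
out = converge(a, b)
print(mismatch(out, b) < 0.01)  # True: residual driven below tolerance
```

Each pass halves the residual, so the loop converges geometrically; a real implementation would regularize the step with the elastic model rather than a scalar damping factor.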
- Figure 1 shows a typical workflow for a surgeon using a fusion platform for mapping a plan from a pre-procedural planning image to the intra-procedural live image;
- Figure 4 shows an object process diagram for non-rigid elastic image registration, using rigid and/or affine registration as an initialization, wherein multiple labeled objects are used to compute the correspondence while satisfying the regularization constraints;
- Figure 5 shows an object process diagram for planning a laser ablation of the prostate gland, in which a radiologist/radiation oncologist analyzes multiparametric MRI (mpMRI) images of a prostate and plans the location of needle tip, trajectory and duration of needle application; and
- Figures 6A and 6B show a conceptual diagram for planning a laser ablation of the prostate gland, in which Figure 6A shows the target lesion identified by an expert in sagittal and transverse images, and Figure 6B shows the plan for laser ablation in the two orthogonal directions.
- the present invention will be described with respect to a process, which may be carried out through interaction with a user or automatically.
- imaging systems including but not limited to MRI, ultrasound, PET, CT, SPECT, X-ray, and the like may be used for either pre-operative or intra-operative imaging, but a preferred scheme employs a fusion of MRI and/or CT and/or PET and ultrasound imaging for the pre-operative imaging, and transrectal ultrasound for intra-operative real-time imaging in a prostate diagnostic or therapeutic procedure.
- one or more pre-procedure "planning" images are used to plan a procedure and one or more intra-procedure "live" images are used to guide the procedure.
- prostate biopsy and ablation is typically done under ultrasound guidance. While speed of imaging and cost make ultrasound an ideal imaging modality for guiding biopsy, ultrasound images are insufficient and ineffective at identifying prostate cancer.
- Multi-parametric MRI (mpMRI)
- mpMRI consists of various protocols for MR imaging including T2-weighted imaging, Diffusion-weighted imaging (DWI), Dynamic contrast-enhanced (DCE) and MR Spectroscopic imaging (MRSI).
- Radiologists are best placed to analyze MRI images for detecting and grading prostate cancer. However, it remains challenging to take the information from radiologists and present it to urologists or surgeons who use ultrasound as the imaging method for performing a biopsy. Likewise, MRI is ideal for identifying the sensitive surrounding structures that must be spared in order to preserve quality of life after the procedure.
- 2010/02906853 uses a prostate surface-based non-rigid image registration method.
- the method uses only triangulated prostate boundaries as input to registration and performs a point-wise image registration only at the surface boundaries and then interpolates the information from boundaries to inside the object.
- Significant drawbacks include the lack of information from surrounding structures, and the significant skill, knowledge and effort required to achieve a good manual image registration between MRI and ultrasound, which is very challenging, especially when surgeons are not very skilled at reading and interpreting MR images.
- the results can be variable since there can be significant difference in orientation and shape of gland between MRI and transrectal ultrasound.
- the prostate internal structures and details are also completely ignored.
- any internal twisting, rotation or non-rigid distortion is not accounted for, which may lead to poor results especially when an endo-rectal coil is used in MRI.
- the plan is mapped as a region of interest, leaving it up to the surgeon to interpret how to properly sample a certain region. Also, in case of misregistration, there is no way disclosed to edit or refine the registration.
- the method plans location, trajectory and depth of needle insertion optimized such that there is maximum likelihood of sampling the malignancy while minimizing number of biopsy cores.
- Figure 2 shows a method according to the present technology for rigid registration.
- I1(x) and I2(x) represent the planning and live images, respectively, with x being the coordinate system.
- wj's are relative weights for different costs
- Sim(A,B) represents the similarity cost between two objects A and B.
- the cost could be the sum of squared intensity differences or a mutual-information-based metric; in the case of binary objects, the cost may be relative overlap; in the case of surfaces, the cost could be a symmetric distance between corresponding points.
- R represents the rigid transformation matrix that includes rotation and translation in 3D frame of reference.
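The weighted cost described above (a sum of wj-weighted Sim(A,B) terms minimized over the rigid transform R) might be sketched as follows; for brevity the rotation is held at zero, only a landmark term and an intensity SSD helper are shown, and a brute-force search stands in for a proper optimizer:

```python
import math

# Sketch of the weighted registration cost: cost = sum_j wj * Sim_j,
# with an intensity SSD term and a symmetric landmark-distance term.
# Weights, data and the search grid are illustrative assumptions.

def rigid_2d(p, theta, tx, ty):
    """Apply a 2D rigid transform R (rotation + translation) to a point."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * p[0] - s * p[1] + tx, s * p[0] + c * p[1] + ty)

def ssd(a, b):
    """Sum of squared intensity differences between aligned samples."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def sym_dist(pts_a, pts_b):
    """Symmetric mean distance between two point sets."""
    fwd = sum(min(math.dist(p, q) for q in pts_b) for p in pts_a) / len(pts_a)
    bwd = sum(min(math.dist(q, p) for p in pts_a) for q in pts_b) / len(pts_b)
    return 0.5 * (fwd + bwd)

def cost(theta, tx, ty, land1, land2, w_land=1.0):
    moved = [rigid_2d(p, theta, tx, ty) for p in land1]
    return w_land * sym_dist(moved, land2)  # intensity term added similarly

# Live-image landmarks are the planning landmarks shifted by (2, -1):
land1 = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
land2 = [(2.0, -1.0), (3.0, -1.0), (2.0, 0.0)]

# Brute-force search over integer translations, rotation fixed at zero:
best = min(((cost(0.0, tx, ty, land1, land2), tx, ty)
            for tx in range(-3, 4) for ty in range(-3, 4)),
           key=lambda t: t[0])
print(best)  # (0.0, 2, -1): zero cost at the true shift
```

A practical optimizer would search all six rigid degrees of freedom in 3D and sum the intensity, overlap and surface terms with their respective weights.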
- Figure 3 shows a method for affine registration.
- I1(x) and I2(x) represent the planning and live images, respectively, with x being the coordinate system.
- wj's are relative weights for different costs and Sim(A,B) represents the similarity cost between two objects A and B.
- the cost could be the sum of squared intensity differences or a mutual-information-based metric; in the case of binary objects, the cost may be relative overlap; in the case of surfaces, the cost could be a symmetric distance between corresponding points.
- A represents the affine transformation matrix that registers image I1 to the frame of reference of image I2.
- the procedure is preferably performed under intra-procedural image guidance, with the information from the pre-procedure image mapped to an intra-procedure image using a combination of rigid, affine and elastic registration, as shown in Figure 4, which shows an object process diagram for non-rigid elastic image registration using rigid and/or affine registration as an initialization.
- the method uses multiple labeled objects to compute the correspondence while satisfying the regularization constraints.
- the surgeon identifies the same landmarks, features and structures as in the pre-procedure image and labels them consistently. This may be done automatically or manually after acquiring an initial intra-procedural scan.
- the registration method uses the labels in pre-procedure and intra-procedure images to identify the structural correspondence and registers the images using a combination of rigid, affine and elastic registration.
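The rigid-then-affine-then-elastic cascade can be illustrated on labeled point sets; each toy stage below is a drastic simplification (translation only, per-axis scaling, and a residual per-point displacement that assumes known correspondence), not the patented algorithms:

```python
# Toy coarse-to-fine cascade mirroring the Figure 4 idea: each stage
# initializes the next. Stage names and simplifications are assumptions
# made for illustration only.

def centroid(pts):
    n = len(pts)
    return tuple(sum(p[i] for p in pts) / n for i in range(3))

def rigid_stage(src, dst):
    """Translation aligning centroids (rotation omitted for brevity)."""
    cs, cd = centroid(src), centroid(dst)
    return [tuple(p[i] + cd[i] - cs[i] for i in range(3)) for p in src]

def affine_stage(src, dst):
    """Per-axis scaling about the centroids (a degenerate affine map)."""
    cs, cd = centroid(src), centroid(dst)
    def spread(pts, c, i):
        return sum(abs(p[i] - c[i]) for p in pts) or 1.0
    s = [spread(dst, cd, i) / spread(src, cs, i) for i in range(3)]
    return [tuple(cd[i] + s[i] * (p[i] - cs[i]) for i in range(3)) for p in src]

def elastic_stage(src, dst):
    """Residual per-point displacement, assuming known correspondence."""
    return [tuple(q) for q in dst]

src = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
dst = [(5.0, 5.0, 0.0), (7.0, 5.0, 0.0), (5.0, 7.0, 0.0)]  # shifted, scaled x2

out = elastic_stage(affine_stage(rigid_stage(src, dst), dst), dst)
print(out == dst)  # True: the cascade closes the full misalignment
```

The point is the structure: the rigid stage removes gross offset, the affine stage absorbs scale, and the elastic stage need only correct small residuals, which keeps the non-rigid step well regularized.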
- Figures 6A and 6B show a conceptual diagram for planning a laser ablation of the prostate gland.
- Figure 6A shows the target lesion identified by an expert in sagittal and transverse images.
- Figure 6B shows the plan for laser ablation in the two orthogonal directions.
- A and B represent the planned needles, which ablate the area shown in hatched lines. The ablated area covers the planned target.
- the registration provides a surgeon with image fusion such that the information from pre-procedure or planning images is mapped to the frame of reference of the intra-procedure or live images.
- the mapped information contains at least one structural image, target area to be treated and a plan for the procedure.
- the plan may be in the form of needle location and trajectory along with the duration of needle application, if needed.
- Figure 1 shows the overall workflow of a surgeon, where the images planned by an expert (radiologist/ radiation oncologist) are fused with a live imaging modality such as ultrasound for real-time guidance while taking advantage of diagnostic capabilities of the pre- procedural planning image.
- the pre-procedure image is registered with the live image using a combination of rigid, affine and non-rigid elastic registration.
- the registration provides a correspondence or a deformation map, which is used to map planning information from the frame of reference of the planning image to the live image.
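Applying such a deformation map to plan coordinates amounts to sampling a displacement field at each plan point; the unit-spaced field layout and the trilinear interpolation below are illustrative assumptions:

```python
# Sampling a (hypothetical) voxel displacement field with trilinear
# interpolation to carry one plan point from the planning frame to the
# live frame. Unit grid spacing is an illustrative assumption.

def trilerp(field, x, y, z):
    """field[i][j][k] -> (dx, dy, dz); returns the interpolated vector."""
    i, j, k = int(x), int(y), int(z)
    fx, fy, fz = x - i, y - j, z - k
    out = [0.0, 0.0, 0.0]
    for di in (0, 1):
        for dj in (0, 1):
            for dk in (0, 1):
                w = ((fx if di else 1 - fx) * (fy if dj else 1 - fy)
                     * (fz if dk else 1 - fz))
                v = field[i + di][j + dj][k + dk]
                for a in range(3):
                    out[a] += w * v[a]
    return tuple(out)

def map_point(field, p):
    """Plan point in the planning frame -> corresponding live-frame point."""
    d = trilerp(field, *p)
    return tuple(p[a] + d[a] for a in range(3))

# Uniform 2x2x2 field: every voxel corner displaced by (1, 0, 0).
field = [[[(1.0, 0.0, 0.0)] * 2 for _ in range(2)] for _ in range(2)]
print(map_point(field, (0.5, 0.5, 0.5)))  # (1.5, 0.5, 0.5)
```

Needle trajectories and target contours stored as point lists can be mapped the same way, point by point, into the live-image frame of reference.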
- the method permits a radiologist, radiation oncologist or an oncological image specialist to analyze pre-operative images, identify and label various structures including the objects of interest, say the prostate from the above detailed examples.
- the structures identified and labeled by the imaging specialist could include external and internal structures and landmarks such as bladder, urethra, rectum, seminal vesicles, nerve bundles, fibromuscular stroma and prostate zones. These structures are identified and stored as either points, binary masks or surface meshes. Each such structure is labeled uniquely.
- the method includes an automatically (or semi-automatically) generated plan for the entire procedure.
- Figure 2, 3 and 4 represent the rigid, affine and non-rigid elastic registration methods.
- An expert or a computer algorithm identifies and labels various anatomical structures and landmarks in the planning image.
- image I1(x) represents the structural planning image.
- the structural image could be a T2-weighted transversally acquired MRI image.
- the subscript 1 corresponds to the planning or pre-procedural image.
- objects may also be represented by surfaces, in which case, the objects will consist of the vertices and triangles joining the vertices.
- the expert provides the plan for a procedure on the structural image.
- a surgeon loads the planning image along with the object labels or surface meshes, landmarks and the plan.
- the planning image I1 is projected to the intra-procedure image I2 acquired during the procedure.
- the labels and landmarks may be defined in the image I2 either manually or automatically.
- the labels in the target image are automatically computed by letting the planning image I1 deform to the shape of the target image I2.
- the object maps defined in planning image also participate in the registration such that segmentation (object labeling) and registration (computation of correspondence) happens at the same time in the target image.
- Figure 4 shows one way of performing the registration between the pre-procedure planning image and the intra-operative image.
- the method uses the object maps along with the intensity information and the landmark correspondences to compute the correspondence between the images.
- the resulting deformation map is used to map the plan from frame of reference of the planning image to the intra-procedural image.
- Figures 5, 6A and 6B represent an embodiment where the plan is a needle-based laser ablation plan.
- the radiologist or radiation oncologist analyses the MRI image and automatically or manually computes a target region along with labeling the surrounding sensitive tissue, i.e., the safety zone.
- the automated method embedded in the current method computes the trajectory, location, energy settings and the duration of application of laser such that the target region is completely ablated while the safety zone is spared.
- MRI data, which may include post-segmented MR image data, pre-segmented interpreted MRI data, the original MRI scans, suspicion index data, and/or instructions or a plan, may be communicated to a urologist.
- the MRI data may be stored in a DICOM format, in another industry-standard format, or in a proprietary format unique to the imaging modality or processing platform generating the medical images.
- the urology center where the MRI data is received may contain an image-guided biopsy or therapy system such as the Artemis, UroStation (Koelis, La Tronche, France), or BiopSee (MedCom GmbH, Darmstadt, Germany).
- the image-guided biopsy system may comprise hardware and/or software configured to work in conjunction with a urology center's preexisting hardware and/or software.
- a mechanical tracking arm may be connected to a preexisting ultrasound machine, and a computer programmed with suitable software may be connected to the ultrasound machine or the arm.
- a tracking arm on the system may be attached to an ultrasound probe and an ultrasound scan performed.
- a two-dimensional (2-D) or 3D model of the prostate may be generated using the ultrasonic images produced by the scan, and segmentation of the model may be performed. Pre-processed ultrasound image data and post-processed ultrasound image data may be transmitted to the urology center. Volumetry may also be performed, including geometric or planimetric volumetry. Segmentation and/or volumetry may be performed manually or automatically by the image-guided biopsy system. Preselected biopsy sites (e.g., selected by the radiologist during the analysis) may be incorporated into and displayed on the model. All of the ultrasound data generated from these processes may be electronically stored on the urology center's server via a communications link.
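The planimetric volumetry mentioned above can be sketched as summing per-slice contour areas times the slice spacing; the shoelace formula and the toy square contours below are illustrative, not the system's actual computation:

```python
# Planimetric volumetry sketch: per-slice contour area (shoelace formula)
# times slice spacing. The square contours and 3 mm spacing are toy values.

def shoelace_area(contour):
    """Area of a simple polygon given as an ordered list of (x, y) points."""
    n = len(contour)
    s = sum(contour[i][0] * contour[(i + 1) % n][1]
            - contour[(i + 1) % n][0] * contour[i][1] for i in range(n))
    return abs(s) / 2.0

def planimetric_volume(contours, slice_spacing_mm):
    """Sum slice areas and multiply by the spacing between slices."""
    return slice_spacing_mm * sum(shoelace_area(c) for c in contours)

square = [(0, 0), (10, 0), (10, 10), (0, 10)]   # 10 mm x 10 mm contour
contours = [square] * 5                          # five parallel slices
print(planimetric_volume(contours, 3.0))         # 1500.0 (mm^3)
```

Geometric volumetry would instead fit a parametric shape (e.g., an ellipsoid) to the gland and use its closed-form volume.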
- processing of the MRI data or ultrasound data may be carried out manually, automatically, or semi- automatically.
- This may be accomplished through the use of segmentation software, such as Segasist Prostate Auto-Contouring, which may be included in the image-guided biopsy system.
- Such software may also be used to perform various types of contour modification, including manual delineation, smoothing, rotation, translation, and edge snapping.
- the software is capable of being trained or calibrated: it observes, captures, and saves the user's contouring and editing preferences over time, and applies this knowledge when contouring new images.
- This software need not be hosted locally, but rather, may be hosted on a remote server or in a cloud computing environment.
- MRI data may be integrated with the image-guided biopsy system.
- the fusion process may be aided by the use of the instructions included with the MRI data.
- the fusion process may include registration of the MR and ultrasonic images, which may include manual or automatic selection of fixed anatomical landmarks in each image modality. Such landmarks may include the base and apex of the prostatic urethra.
- the two images may be substantially aligned and then one image superimposed onto the other.
- Registration may also be performed with models of the regions of interest. These models of the regions of interest, or target areas, may also be superimposed on the digital prostate model.
- the fusion process thus seeks to anatomically align the 3D models obtained by the radiological imaging, e.g., MRI, with the 3D models obtained by the ultrasound imaging, using anatomical landmarks as anchors and performing a warping of at least one of the models to conform to the other.
- the radiological analysis is preserved, such that information from the analysis relevant to suspicious regions or areas of interest is conveyed to the urologist.
- the fused models are then provided for use with the real-time ultrasound system, to guide the urologist in obtaining biopsy samples or performing a therapeutic procedure.
- the 3D MR image is integrated or fused with real-time ultrasonic images, based on a 3D ultrasound model obtained prior to the procedure (perhaps immediately prior). This allows the regions of interest to be viewed under real-time ultrasonic imaging so that they can be targeted during biopsy or therapy.
- biopsy tracking and targeting using image fusion may be performed by the urologist for diagnosis and management of prostate cancer.
- Targeted biopsies may be more effective and efficient for revealing cancer than non-targeted, systematic biopsies.
- Such methods are particularly useful in diagnosing the ventral prostate gland, where malignancy may not always be detected with biopsy.
- Targeted biopsy addresses this problem by providing a more accurate diagnosis method. This may be particularly true when the procedure involves the use of multimodal MRI. Additionally, targeting of the suspicious areas may reduce the need for taking multiple biopsy samples or performing saturation biopsy.
- the described methods and systems may also be used to perform saturation biopsy.
- Saturation biopsy is a multicore biopsy procedure in which a greater number of samples are obtained from throughout the prostate than with a standard biopsy. Twenty or more samples may be obtained during saturation biopsy, and sometimes more than one hundred. This procedure may increase tumor detection in high-risk cases.
- the benefits of such a procedure are often outweighed by its drawbacks, such as the inherent trauma to the prostate, the higher incidence of side effects, the additional use of analgesia or anesthesia, and the high cost of processing the large number of samples.
- focused saturation biopsy may be performed to exploit the benefits of a saturation biopsy while minimizing the drawbacks.
- a physician may sample four or more cores, all from the suspected area. This procedure avoids the need for high-concentration sampling in healthy areas of the prostate. Further, this procedure will not only improve detection, but will enable one to determine the extent of the disease.
- non-targeted systematic biopsy may be performed under the guidance of 3D ultrasonic imaging. This may allow for more even distribution of biopsy sites and wider sampling over conventional techniques.
- the image data may be used as a map to assist the image-guided biopsy system in navigation of the biopsy needle, as well as tracking and recording the navigation.
- the process described above may further include making treatment decisions and carrying out the treatment of prostate cancer using the image-guided biopsy system.
- the current invention provides physicians with information that can help them and patients make decisions about the course of care, whether it be watchful waiting, hormone therapy, targeted thermal ablation, nerve sparing robotic surgery, or radiation therapy.
- CT computed tomography
- CT scans may be fused with MRI data to provide more accurate prediction of the correct staging, more precise target volume identification, and improved target delineation.
- MRI in combination with biopsy, will enhance patient selection for focal ablation by helping to localize clinically significant tumor foci.
- HIFU for treatment of prostate cancer in conjunction with the methods and apparatus previously described.
- An example of a commercially available HIFU system is the Sonablate 500 by Focus Surgery, Inc. (Indianapolis, IN), which is a HIFU therapy device that operates under the guidance of 3D ultrasound imaging.
- Such treatment systems can be improved by being configured to operate under the guidance of a fused MRI-ultrasound image.
- temperatures in the tissue being ablated may be closely monitored and the subsequent zone of necrosis (thermal lesion) visualized, and used to update a real-time tissue model.
- Temperature monitoring for the visualization of a treated region may reduce recurrence rates of local tumor after therapy.
- Techniques for the foregoing may include microwave radiometry, ultrasound, impedance tomography, MRI, monitoring shifts in diagnostic pulse-echo ultrasound, and the real-time and in vivo monitoring of the spatial distribution of heating and temperature elevation, by measuring the local propagation velocity of sound through an elemental volume of such tissue structure, or through analysis of changes in backscattered energy.
- Other traditional methods of monitoring tissue temperature include thermometry, such as ultrasound thermometry and the use of a thermocouple.
- MRI may also be used to monitor treatment, ensure tissue destruction, and avoid overheating surrounding structures. Further, because ultrasonic imaging is not always adequate for accurately defining areas that have been treated, MRI may be used to evaluate the success of the procedure. For instance, MRI may be used for assessment of extent of necrosis shortly after therapy and for long-term surveillance for residual or recurrent tumor that may then undergo targeted biopsy.
- post-operative image fusion that is, performing an imaging procedure after completion of an interventional procedure, and fusing or integrating pre-operative and/or intra-operative imaging data to help understand the post-operative anatomy. For example, after aggressive therapy, a standard anatomical model of soft tissue may no longer be accurate, but by integrating the therapeutic intervention data, a more accurate understanding, imaging, and image analysis may be provided.
- a diagnostic and treatment image generation system includes at least one database containing image data from two different modalities, such as MRI and ultrasound data, and an image-guided biopsy and/or therapy system.
- the diagnostic and treatment image generation system may also include a computer programmed to aid in the transmission of the image data and/or the fusion of the data using the image-guided biopsy system.
- a computer readable storage medium has a non-transitory computer program stored thereon, to control an automated system to carry out various methods disclosed herein.
Abstract
A system and method combine information from a plurality of medical imaging modalities, such as PET, CT, MRI, MRSI, ultrasound, echocardiography, photoacoustic imaging and elastography, for an image-guided medical procedure, such that a pre-procedure image acquired with one of these imaging modalities is fused with an intra-procedure imaging modality used for image-assisted real-time guidance of a medical procedure on a soft-tissue organ or gland, such as the prostate, skin, heart, lung, kidney, liver, bladder, ovaries and thyroid; the deformation and changes of the soft tissue between the two imaging instances are automatically modeled and represented.
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201261691758P | 2012-08-21 | 2012-08-21 | |
US61/691,758 | 2012-08-21 | ||
US13/835,479 | 2013-03-15 | ||
US13/835,479 US20140073907A1 (en) | 2012-09-12 | 2013-03-15 | System and method for image guided medical procedures |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014031531A1 (fr) | 2014-02-27
Family
ID=50150325
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
- PCT/US2013/055561 WO2014031531A1 (fr) | 2013-08-19 | System and method for image guided medical procedures
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2014031531A1 (fr) |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- WO2015008279A1 (fr) * | 2013-07-15 | 2015-01-22 | Tel Hashomer Medical Research Infrastructure And Services Ltd. | Methods of fusing MRI images and uses thereof
- CN105310776A (zh) * | 2014-12-02 | 2016-02-10 | Fudan University | A sub-block-based method for tracking soft tissue surface deformation
- WO2016033065A1 (fr) * | 2014-08-26 | 2016-03-03 | Rational Surgical Solutions, Llc | Image registration for CT or MR imagery and ultrasound imagery using a mobile device
- WO2016039763A1 (fr) * | 2014-09-12 | 2016-03-17 | Analogic Corporation | Image registration fiducials
- WO2017017498A1 (fr) * | 2015-07-29 | 2017-02-02 | Synaptive Medical (Barbados) Inc. | Method, system and apparatus for adjusting image data to compensate for modality-induced distortion
- WO2017192670A1 (fr) * | 2016-05-03 | 2017-11-09 | SonaCare Medical, LLC | System and method for pre-treating a volume of tissue to be treated
- EP3547252A4 (fr) * | 2016-12-28 | 2019-12-04 | Shanghai United Imaging Healthcare Co., Ltd. | System and method for multi-modal image processing
US10535149B2 (en) | 2014-06-25 | 2020-01-14 | Koninklijke Philips N.V. | Imaging device for registration of different imaging modalities |
CN111801705A (zh) * | 2018-03-02 | 2020-10-20 | 通用电气公司 | 用于加速临床工作流程的系统和方法 |
EP3600111A4 (fr) * | 2017-03-20 | 2020-12-30 | Exact Imaging Inc. | Procédé et système d'assistance visuelle à un opérateur d'un système à ultrasons |
CN112494028A (zh) * | 2019-09-13 | 2021-03-16 | 通用电气公司 | 使用多模态成像的活检工作流程 |
US11257219B2 (en) | 2017-12-08 | 2022-02-22 | Koninklijke Philips N.V. | Registration of static pre-procedural planning data to dynamic intra-procedural segmentation data |
EP4156100A1 (fr) * | 2021-09-23 | 2023-03-29 | Siemens Healthcare GmbH | Fourniture de données d'image de résultat |
WO2024215884A1 (fr) * | 2023-04-14 | 2024-10-17 | Medtronic Navigation, Inc. | Système et procédé d'imagerie et d'enregistrement pour navigation |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040234113A1 (en) * | 2003-02-24 | 2004-11-25 | Vanderbilt University | Elastography imaging modalities for characterizing properties of tissue |
US20080161687A1 (en) * | 2006-12-29 | 2008-07-03 | Suri Jasjit S | Repeat biopsy system |
US20090281422A1 (en) * | 2008-05-06 | 2009-11-12 | Salama Khaled N | Multi-modality system for imaging in dense compressive media and method of use thereof |
US20090316854A1 (en) * | 2008-06-23 | 2009-12-24 | Ismail Aly M | Multi-modality system for screening, imaging and diagnosis in dense compressive media and method of use thereof |
US20090326363A1 (en) * | 2008-05-02 | 2009-12-31 | Eigen, Llc | Fused image modalities guidance |
2013-08-19 | WO | PCT/US2013/055561 | WO2014031531A1 (fr) | Active | Application Filing
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015008279A1 (fr) * | 2013-07-15 | 2015-01-22 | Tel Hashomer Medical Research Infrastructure And Services Ltd. | MRI image fusion methods and uses thereof |
US10535149B2 (en) | 2014-06-25 | 2020-01-14 | Koninklijke Philips N.V. | Imaging device for registration of different imaging modalities |
WO2016033065A1 (fr) * | 2014-08-26 | 2016-03-03 | Rational Surgical Solutions, Llc | Image registration for CT or MR imagery and ultrasound imagery using mobile device |
US10966688B2 (en) | 2014-08-26 | 2021-04-06 | Rational Surgical Solutions, Llc | Image registration for CT or MR imagery and ultrasound imagery using mobile device |
WO2016039763A1 (fr) * | 2014-09-12 | 2016-03-17 | Analogic Corporation | Image registration fiducials |
CN105310776B (zh) * | 2014-12-02 | 2018-07-24 | 复旦大学 | A sub-block-based method for tracking soft tissue surface deformation |
CN105310776A (zh) * | 2014-12-02 | 2016-02-10 | 复旦大学 | A sub-block-based method for tracking soft tissue surface deformation |
US10102681B2 (en) | 2015-07-29 | 2018-10-16 | Synaptive Medical (Barbados) Inc. | Method, system and apparatus for adjusting image data to compensate for modality-induced distortion |
GB2556787A (en) * | 2015-07-29 | 2018-06-06 | Synaptive Medical Barbados Inc | Method, system and apparatus for adjusting image data to compensate for modality-induced distortion |
GB2556787B (en) * | 2015-07-29 | 2020-12-02 | Synaptive Medical Barbados Inc | Method, system and apparatus for adjusting image data to compensate for modality-induced distortion |
WO2017017498A1 (fr) * | 2015-07-29 | 2017-02-02 | Synaptive Medical (Barbados) Inc. | Method, system and apparatus for adjusting image data to compensate for modality-induced distortion |
US20170319875A1 (en) * | 2016-05-03 | 2017-11-09 | SonaCare Medical, LLC | System and method for pretreatment of a volume of tissue slated for treatment |
WO2017192670A1 (fr) * | 2016-05-03 | 2017-11-09 | SonaCare Medical, LLC | System and method for pretreatment of a volume of tissue slated for treatment |
EP3547252A4 (fr) * | 2016-12-28 | 2019-12-04 | Shanghai United Imaging Healthcare Co., Ltd. | System and method for multi-modality image processing |
US11037309B2 (en) | 2016-12-28 | 2021-06-15 | Shanghai United Imaging Healthcare Co., Ltd. | Method and system for processing multi-modality image |
US11869202B2 (en) | 2016-12-28 | 2024-01-09 | Shanghai United Imaging Healthcare Co., Ltd. | Method and system for processing multi-modality image |
EP3600111A4 (fr) * | 2017-03-20 | 2020-12-30 | Exact Imaging Inc. | Method and system for visual assistance to an operator of an ultrasound system |
US11257219B2 (en) | 2017-12-08 | 2022-02-22 | Koninklijke Philips N.V. | Registration of static pre-procedural planning data to dynamic intra-procedural segmentation data |
CN111801705A (zh) * | 2018-03-02 | 2020-10-20 | 通用电气公司 | System and method for accelerating clinical workflows |
CN112494028A (zh) * | 2019-09-13 | 2021-03-16 | 通用电气公司 | Biopsy workflow using multi-modality imaging |
EP4156100A1 (fr) * | 2021-09-23 | 2023-03-29 | Siemens Healthcare GmbH | Providing result image data |
WO2024215884A1 (fr) * | 2023-04-14 | 2024-10-17 | Medtronic Navigation, Inc. | Imaging and registration system and method for navigation |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20200085412A1 (en) | System and method for using medical image fusion | |
US20140073907A1 (en) | System and method for image guided medical procedures | |
US20210161507A1 (en) | System and method for integrated biopsy and therapy | |
WO2014031531A1 (fr) | System and method for image guided medical procedures | |
Wein et al. | Automatic CT-ultrasound registration for diagnostic imaging and image-guided intervention | |
JP5627677B2 (ja) | System and method for image-guided prostate cancer needle biopsy | |
Xu et al. | Real-time MRI-TRUS fusion for guidance of targeted prostate biopsies | |
Baumann et al. | Prostate biopsy tracking with deformation estimation | |
Hu et al. | MR to ultrasound registration for image-guided prostate interventions | |
Jolesz | Intraoperative imaging and image-guided therapy | |
Najmaei et al. | Image‐guided techniques in renal and hepatic interventions | |
Li et al. | Augmenting intraoperative ultrasound with preoperative magnetic resonance planning models for percutaneous renal access | |
Takamoto et al. | Feasibility of intraoperative navigation for liver resection using real-time virtual sonography with novel automatic registration system | |
Ma et al. | Surgical navigation system for laparoscopic lateral pelvic lymph node dissection in rectal cancer surgery using laparoscopic-vision-tracked ultrasonic imaging | |
Schumann | State of the art of ultrasound-based registration in computer assisted orthopedic interventions | |
Spinczyk | Towards the clinical integration of an image-guided navigation system for percutaneous liver tumor ablation using freehand 2D ultrasound images | |
Pohlman et al. | Two‐dimensional ultrasound‐computed tomography image registration for monitoring percutaneous hepatic intervention | |
Rapetti et al. | Virtual reality navigation system for prostate biopsy | |
US20130085383A1 (en) | Systems, methods and computer readable storage media storing instructions for image-guided therapies | |
Ipsen et al. | A visual probe positioning tool for 4D ultrasound-guided radiotherapy | |
Kadoury et al. | Realtime TRUS/MRI fusion targeted-biopsy for prostate cancer: a clinical demonstration of increased positive biopsy rates | |
Chen et al. | Towards transcervical ultrasound image guidance for transoral robotic surgery | |
Alameddine et al. | Image Fusion Principles: Theory | |
De Silva et al. | Evaluating the utility of intraprocedural 3D TRUS image information in guiding registration for displacement compensation during prostate biopsy | |
Lu et al. | Multimodality image-guided lung intervention systems |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | EP: The EPO has been informed by WIPO that EP was designated in this application |
Ref document number: 13831493 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | EP: PCT application non-entry in European phase |
Ref document number: 13831493 Country of ref document: EP Kind code of ref document: A1 |