
WO2000014668A1 - Method and system for improved detection of prostate cancer - Google Patents

Method and system for improved detection of prostate cancer

Info

Publication number
WO2000014668A1
WO2000014668A1 (international application PCT/US1999/020390)
Authority
WO
WIPO (PCT)
Prior art keywords
biopsy
tissue
dimensional
graphic
tumorous
Prior art date
Application number
PCT/US1999/020390
Other languages
English (en)
Inventor
T. Joseph Wang
Original Assignee
Catholic University Of America
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Catholic University Of America filed Critical Catholic University Of America
Priority to AU61369/99A priority Critical patent/AU6136999A/en
Publication of WO2000014668A1 publication Critical patent/WO2000014668A1/fr


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0012 Biomedical image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/23 Clustering techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/40 Software arrangements specially adapted for pattern recognition, e.g. user interfaces or toolboxes therefor

Definitions

  • Prostate cancer is the most prevalent male malignancy and the second leading cause of death by cancer in American men [1] .
  • PSA prostate-specific antigen
  • DRE digital rectal examination
  • TRUS transrectal ultrasound
  • a basic premise of prostate cancer diagnosis and staging is that detailed quantitative analysis of the extent and grade of cancer in systematic needle biopsy specimens provides useful prognostic information, especially when combined with standard clinical tests such as DRE, PSA, and PSA density [3].
  • the challenge remains whether it is possible to improve prostate biopsy strategy to yield more representative samples of the cancer that accurately reflect its biological potential prior to treatment.
  • the inventor has previously developed core technologies to reconstruct a 3-D graphic model of the prostate from excised prostates of previously imaged cancers [19,31], and to perform virtual simulation of various biopsy protocols [24,30].
  • This interactive environment has made it possible to study tumor patterns in locations that have previously been difficult to evaluate in true 3-D.
  • the preliminary results have shown promising clinical potential in that the data from such studies provide major contributions to the understanding of the early natural history of prostate cancer including its pattern of growth and progression [16] .
  • the data can also lead to biopsy strategies and recommendations regarding the clinical management of patients based on biopsy findings [7] .
  • the inventors also have correlated the findings in the simulated biopsies with the grade and volume of the cancer in the operative specimen of the entire prostate, and thereby have made it possible to study the intraprostatic location, multicentricity, and possible extraprostatic extension of a tumor, and subsequently determined the accuracy and pitfalls of currently used diagnosis and staging systems. It was found that 51% of the cases of prostate cancer were multicentric, ranging from 2 to 5 tumors, and the present procedure leads to underestimation of both size and grade of prostate cancer, due to possible limitations of conventional protocols and misinterpretation of these lesions [19,24,27] .
  • the present invention includes a method for conducting a tissue biopsy, which includes creating a plurality of three-dimensional graphic, electronic models of tumorous and non-tumorous individual patient tissue specimens from corresponding digitized cross-sectional sequences, where each of the sequences represents an actual patient tissue specimen.
  • the digitized cross-sectional sequences consist of two-dimensional cross-sectional slides. The slides represent slices of the tissue specimen at spaced intervals.
  • a three-dimensional graphic, electronic master model of a tissue specimen is then formed by mapping all of the graphic models of tumorous and non-tumorous individual patient tissue specimens.
  • a three-dimensional statistical probability distribution is then incorporated into the master model, that designates positions of potential tumors to be found during the tissue biopsy, based on locations of actual tumors in the tumorous individual patient tissue specimens.
  • the master model with the statistical probability distribution is then superimposed on a graphic, electronic patient's biopsy display during the tissue biopsy.
  • the graphic, electronic patient's biopsy display can then be stored in a computer memory.
  • the tissue biopsy produces trans-rectal ultrasound images as said graphic, electronic patient's biopsy display. Accordingly, the tissue biopsy includes the step of directing a trans-rectal ultrasound probe to the positions of potential tumors designated by the three-dimensional statistical probability distribution and the master model.
  • the method of the present invention also includes predicting a volume of a detected tumor using Bayesian theory principles, the three-dimensional statistical probability distribution, and neural network based algorithms.
  • the present invention also includes a method for determining ideal tissue biopsy procedures, which includes creating a plurality of three-dimensional graphic, electronic models of tumorous and non-tumorous individual patient tissue specimens from corresponding digitized cross-sectional sequences, where each of the sequences represents an actual patient tissue specimen.
  • a three-dimensional graphic, electronic model of a patient tissue specimen is then selected from a computer memory.
  • a probe is then connected to a computer that includes the computer memory and from which the three-dimensional graphic, electronic model of a patient tissue specimen is displayed.
  • An interactive simulation of a tissue biopsy is then performed on the three-dimensional graphic, electronic model of the patient tissue specimen, using the probe.
  • a biopsy protocol is thus determined, including optimal probe shapes and pathways, based on the interactive simulation of a tissue biopsy.
  • the method can further include the step of repeating the interactive simulation of a tissue biopsy on a plurality of three-dimensional graphic, electronic models of patient tissue specimens. Then, a statistical analysis can be conducted to evaluate the effectiveness of the biopsy protocol.
  • the method can further include the steps of forming a three-dimensional graphic, electronic master model of a tissue specimen by mapping all of the graphic models of tumorous and non-tumorous individual patient tissue specimens. Then, a three-dimensional statistical probability distribution can be incorporated into the master model. The probability distribution designates positions of potential tumors to be found during the tissue biopsy, based on locations of actual tumors in the tumorous individual patient tissue specimens. The master model, including the statistical probability distribution, is then superimposed on the three-dimensional graphic, electronic model of a patient tissue specimen selected from the computer memory.
  • the interactive simulation of a tissue biopsy on the three-dimensional graphic, electronic model of a patient tissue specimen can be stored in a computer memory.
  • the interactive simulation of a tissue biopsy can include the step of directing the probe to the positions of potential tumors as designated by the three- dimensional statistical probability distribution, and by the master model.
  • FIG. 1a shows digitally-imaged slides of a surgical prostate specimen, outlined slice by slice according to the present invention.
  • FIG. 1b shows the result of contour interpolation using non-linear deformable modeling according to the present invention.
  • FIGs. 2 to 4 show an example of 3-D reconstruction of a computer model of a prostate using the surface-spine deformable model according to the present invention, with FIG. 2 showing an initial model, FIG. 3 showing the reconstruction after only 5 iterations, and FIG. 4 showing the final model after 20 iterations.
  • FIGs. 5a to 5d show how, using a material editor, the surface of a prostate capsule can be rendered transparent so that the internal structures and any tumors can be viewed clearly in a multimodal view according to the present invention.
  • FIG. 6 shows a simulated TRUS guided needle biopsy based on the reconstructed prostate model of the present invention.
  • FIG. 7a shows the clinical setting of a TRUS guided prostate biopsy.
  • FIG. 7b shows how a real ultrasound image of the prostate gland is given in a clinical setting based on the coordination of the biopsy needle.
  • FIGs. 8a to 8d show several views of a virtual environment for simulation of a TRUS guided prostate needle biopsy, where multimodality visualization enables accurate needle positioning according to the present invention.
  • FIGs. 9a and 9b show a table providing numerical results of a computerized needle biopsy performed according to the present invention.
  • FIGs. 10a and 10b show results from 3-D nonlinear matching using a surface-spine deformable model.
  • FIG. 11 shows a flow diagram outlining the data preparation procedure of the present invention.
  • FIG. 12 shows a flow diagram outlining 3-D object reconstruction of the process of the present invention.
  • FIG. 13 shows a flow diagram outlining the virtual environment formation process of the present invention.
  • FIG. 14 shows a flow diagram outlining the process of construction and quantification of 3-D probability maps of the location of different cancer grades according to the present invention.
  • FIG. 15 shows a flow diagram outlining the process of superimposing and visualization of the master model with TRUS imaging features for on-line biopsy guidance according to the present invention.
  • FIG. 16 shows a graphical representation of statistical data mapping and clustering according to the present invention.
  • the statistical modeling and multimodality visualization of prostate cancer requires the acquisition of a clinically proven prostate cancer database (digitally imaged whole mount prostatectomy specimens), 3-D graphical reconstruction of the objects of interest (prostate structure and tumors of different grades), a virtual environment for interactive simulation of TRUS guided needle biopsy, graphics-based cross-object matching, and 3-D data mapping and statistical modeling.
  • the modeling and visualization process of the present invention is organized into the following parts: 1) data preparation; 2) 3-D object reconstruction; 3) development of virtual environment; 4) interactive simulation of TRUS guided needle biopsy; 5) 3-D nonlinear graphical matching, and 6) 3-D data mapping and statistical modeling.
  • the data preparation procedure of the present invention is outlined in FIG. 11.
  • a statistically significant database 21 is used to provide ground "truth" of the disease when present.
  • the database 21 includes digitized cross-sectional sequences 20 of hundreds of whole mount prostatectomy specimens, removed due to prostate cancer and provided by the AFIP [26]. All necessary clinical information for these surgical specimen sequences 20 is complete, including diagnostic medical images [19].
  • Each of these sequences 20 consists of ten to fourteen slices that are 4 μm sections at 2.5 mm intervals.
  • the corresponding digital images 25 of these slices are acquired at a resolution of 1500 dots per inch (dpi), and are shown in FIG. 1a.
  • the contours of the regions of interest, including the prostate capsule, urethra, seminal vesicles, ejaculatory ducts, surgical margin, any localized tumor, prostate carcinoma with high grade, and areas of prostatic intraepithelial neoplasia, are delineated using computer-aided methods, followed by a semi-automatic contour refining algorithm using a snake model [69].
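As an illustration of this semi-automatic refinement step, the following is a minimal sketch of a snake (active contour) fit using scikit-image; the file name, initial circle, and parameter values are hypothetical placeholders rather than the patent's actual settings.

```python
import numpy as np
from skimage import color, io
from skimage.filters import gaussian
from skimage.segmentation import active_contour

# Load one digitized whole-mount slide (hypothetical file name).
slide = color.rgb2gray(io.imread("specimen_slice_07.png"))

# Rough initial contour supplied by the operator (here: an ellipse).
s = np.linspace(0, 2 * np.pi, 200)
init = np.column_stack([300 + 120 * np.sin(s), 350 + 140 * np.cos(s)])  # (row, col)

# Let the snake settle onto the nearest strong boundary of the smoothed slide.
refined = active_contour(
    gaussian(slide, sigma=3),
    init,
    alpha=0.015,  # contour elasticity
    beta=10.0,    # contour rigidity (smoothness)
    gamma=0.001,  # time step of the iteration
)
print(refined.shape)  # (200, 2) refined boundary coordinates
```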
  • a PC3D™ software program 22 is used to preview the possible outcomes in 3-D so that the data can be re-arranged 23 to avoid any misinterpretation about the shape and spatial distribution of the cancer when transferred from 2-D to 3-D [19].
  • the parameter setting for both focus and resolution of the digitizer is optimized 24 to assure high image quality.
  • the 3-D object reconstruction of the process of the present invention is outlined in FIG. 12. Based on the original contours of the prostate and tumors (any kind or high grade), a 3-D surface of the object can be accurately and reliably reconstructed utilizing the elastic property of soft tissue deformation for mathematical implementation.
  • the prostate specimens with localized tumors are used as the first target, which produces a total of eighty computerized prostate models after 3-D object reconstruction.
  • mathematical interpolation is required to fill the gaps between a start and a goal contour [7], and is repeated for each of the contours.
  • a 3-D elastic contour model computes a 3-D force field between adjacent slices, thus enabling a "pulling and pushing" metaphor to move the starting contour gradually to the final contour [31,55], as shown in FIG. 1b.
  • the non-linear characteristics of the elastic contour model permit a meaningful interpolation result, yielding a high quality representation of the realistic nature (soft tissue modeling) of the object surface.
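The elastic force-field interpolation itself is not reproduced here, but a commonly used simplification conveys the idea: blend the signed distance fields of two adjacent contour masks and re-threshold them to synthesize intermediate contours. This is only a stand-in under that assumption, not the patent's elastic contour model.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def signed_distance(mask):
    """Signed distance field: positive inside the contour, negative outside."""
    mask = np.asarray(mask, dtype=bool)
    return distance_transform_edt(mask) - distance_transform_edt(~mask)

def interpolate_slices(mask_a, mask_b, n_steps=4):
    """Yield n_steps intermediate binary masks between two adjacent slices."""
    da, db = signed_distance(mask_a), signed_distance(mask_b)
    for t in np.linspace(0.0, 1.0, n_steps + 2)[1:-1]:
        yield ((1.0 - t) * da + t * db) > 0

# Usage with two hypothetical 512x512 boolean capsule masks:
# intermediates = list(interpolate_slices(mask_slice_3, mask_slice_4))
```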
  • Reconstruction of an object forms 3-D surfaces from the contours of successive 2-D slices.
  • One conventional way of doing this is to directly connect the contours by planar triangle elements where the reconstructed surfaces are usually coarse and static [7] .
  • a physics-based deformable surface model is preferably used to perform 3-D object reconstruction. Two major operations are involved: (1) triangulated patches are tiled between adjacent contours with a criterion of minimizing the surface area, and (2) the tiled triangulated patches are refined by using a deformable surface-spine model 26. The surface formation is governed by a second-order partial differential equation and is accomplished when the energy of the deformable surface model reaches its minimum [24,27,31].
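The surface-spine model is defined by a second-order PDE with a coupled spine, which is beyond a short example; the toy relaxation loop below only illustrates the general idea of balancing an internal smoothness force against an external data force on a triangulated surface. The vertex, neighbor, and contour-point inputs are assumed to come from the tiling step.

```python
import numpy as np

def relax_surface(verts, neighbors, data_points, w_smooth=0.5, w_data=0.3, iters=20):
    """Toy deformable-surface relaxation.

    verts: (V, 3) vertex positions of the tiled surface.
    neighbors: list of neighbor-index lists, one per vertex.
    data_points: (P, 3) contour points the surface should adhere to.
    """
    verts = np.asarray(verts, dtype=float).copy()
    data_points = np.asarray(data_points, dtype=float)
    for _ in range(iters):
        new = verts.copy()
        for i, nbrs in enumerate(neighbors):
            # Internal force: pull toward the average of the neighbors.
            laplacian = verts[nbrs].mean(axis=0) - verts[i]
            # External force: pull toward the nearest contour point.
            nearest = data_points[np.argmin(np.linalg.norm(data_points - verts[i], axis=1))]
            new[i] += w_smooth * laplacian + w_data * (nearest - verts[i])
        verts = new
    return verts
```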
  • 3-D graphical model images 27 of the prostates that are reconstructed according to the above method have been shown to realistically represent actual shapes and distributions of prostate specimens and cancers, and have superior properties compared to known methods [7,9,16,19].
  • the computer algorithms are automated, and several key parameters can be easily controlled by the user through a human-computer interface.
  • the shape information from high resolution medical images [28,43] 27 obtained as described above can be combined with other shape information so obtained to provide expanded views of prostate specimens, as well as other tissues. Since a realistic 3-D model can be reconstructed for any object and/or organ, other programs can be easily developed using the principles of the present invention to analyze many important cancer characteristics.
  • The virtual environment formation process of the present invention is outlined in FIG. 13.
  • the use of the reconstructed 3-D computer models, in visualization and simulation of clinical procedures, can provide an offline capability with which a large number of computerized "needle biopsies" can be taken from the models to address questions of sampling that simply are not amenable to study in the clinical setting.
  • An interactive virtual environment is required to enable a reproducible computerized "needle biopsy” experiment. The results from the simulation provide reliable information that reflects the clinical reality.
  • An interactive environment for visualizing the 3-D prostate models is created and displayed 30 according to the present invention, based on a state-of-the-art computer graphics toolkit 28 such as object-oriented OpenInventor™, commercially available from Silicon Graphics, Inc. [24,30].
  • the system of the present invention allows the user of the system to examine the prostate model in 3-D with any viewpoint, and dynamically walk through its internal structures to understand the spatial relationships among anatomical structures and the tumors present.
  • a force feedback system 31 can be further incorporated into the system to provide a tactile sensation to the user, using the PHANToM™ system for example. Equipped with this hardware human-machine interface, the system developed as described above presents a full view of the 3-D surgical prostate model right in front of the user, e.g., a surgeon or a pathologist, for examination of the cancer pattern or performance of surgical procedures.
  • a typical system is shown in FIG. 6.
  • Interactive Simulation of TRUS Guided Needle Biopsy: TRUS guided needle biopsy is considered a gold-standard clinical procedure, with its dual purposes of diagnosing and staging prostate cancer.
  • FIG. 7a shows a diagram of a transrectal biopsy procedure that is simulated by the system of the present invention. Specifically, FIG. 7a shows the clinical setting of a TRUS guided prostate biopsy.
  • FIG. 7b shows how a real ultrasound image of the prostate gland is given in a clinical setting based on the coordination of the biopsy needle.
  • Under TRUS guidance, the needle is placed through the guide into the targeted lesion or location.
  • a two-step TRUS guided needle biopsy simulation is performed.
  • First, various simulated TRUS probes are used to derive axially and/or longitudinally oriented sectional images for efficient planning of needle pathways.
  • Second, needles with or without triggers are constructed and simulated to perform an actual biopsy on the reconstructed 3-D prostate models according to the planned needle pathways.
  • This virtual system and process allows a surgeon to sit in front of the computer and simulate needle biopsies, plan optimal needle pathways when overlaid with TRUS imaging features, and further practice a designed biopsy procedure prior to actual clinical application to a patient.
  • a statistical analysis can be conducted to evaluate the effectiveness of selected biopsy protocols based on a sufficiently large number of "virtual" biopsies and, if necessary, recommend new biopsy techniques to improve prostate diagnostic accuracy.
  • the system of the present invention implements both sextant random core biopsy [12] and systematic 5-region biopsy techniques [4]. Selected biopsy techniques have been performed based on dozens of reconstructed computer models of prostate specimens [24,30]. The simulation results are shown in FIGs. 8a to 8d, and the detection probability of each needle can be calculated to indicate its clinical importance. The analysis of the estimated positive biopsy distribution (histogram) suggested that a spatial pattern of prostate cancer distribution exists. More results are shown in FIGs. 9a and 9b, where the clinical stage with positive biopsies in these dozens of patients was used to distinguish clinically important and unimportant tumors. When the simulation is also recorded electronically, the results can be further analyzed to study the various causes of a hit or miss in each individual case.
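A minimal sketch of how such a protocol can be scored against reconstructed models follows: each core is rasterized as a straight line of voxels, a core counts as a hit if any voxel along it is labeled tumor, and the detection rate is the fraction of models with at least one hit. The needle endpoints and label volumes are assumed inputs; needle angle variability, triggers, and grade weighting handled by the actual system are omitted here.

```python
import numpy as np

def core_hits(tumor_labels, start, end, samples=200):
    """True if the straight core from start to end crosses any tumor voxel."""
    pts = np.linspace(np.asarray(start, float), np.asarray(end, float), samples)
    idx = np.round(pts).astype(int)
    inside = np.all((idx >= 0) & (idx < np.array(tumor_labels.shape)), axis=1)
    idx = idx[inside]
    return bool(tumor_labels[idx[:, 0], idx[:, 1], idx[:, 2]].any())

def protocol_detection_rate(models, needle_pairs):
    """Fraction of patient models in which at least one core hits tumor.

    models: list of 3-D 0/1 tumor label volumes.
    needle_pairs: list of (start, end) voxel coordinates for each core.
    """
    detected = sum(any(core_hits(m, s, e) for s, e in needle_pairs) for m in models)
    return detected / len(models)
```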
  • the grade of the tumor can also be incorporated into the system so that a spectrum of different cancer grade distribution can be investigated.
  • the physician practicing on the simulated prostate can use one or more biopsy needles to practice implanting a small piece of a radioisotope into a preselected location of a tumor during a virtual brachytherapy treatment, or to practice directing a beam of radiant energy into a preselected location of a tumor during a virtual radiation oncology treatment.
  • cross-object matching requires 3-D object matching, which normally involves translation (i.e., positioning the origin), rotation (i.e., aligning the orientation), and scaling (i.e., adjusting the scale). Since most available image registration methods are only valid for rigid objects, the challenge becomes how to incorporate soft tissue modeling of the prostate gland into the required 3-D object matching. To meet this challenge, the present invention incorporates a 3-D elastic matching method based on object reconstruction from 2-D contours [31].
  • a 3-D nonlinear registration algorithm matches two surfaces by using a deformable surface-spine model.
  • the advantage of the deformable surface-spine model lies in its ability to respond dynamically to applied external forces according to physical principles formalized in continuum mechanics as partial differential equations.
  • the dynamic capability of this matching method is very effective to recover the non-rigid deformation between two surfaces, which is the case in the actual experimental setting [29].
  • FIGs. 10a and 10b show the results of the 3-D nonlinear matching method incorporated in the present invention using a surface-spine deformable model.
  • the 3-D matching model of the present invention can be described as the following coupled dynamic system.
  • the initial spine is the axis of the surface determined from its contours.
  • the present invention includes a 3-D principal axes algorithm to initially align two prostate glands [72] .
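The initial rigid alignment by principal axes can be sketched as follows: each gland's voxel (or surface-point) cloud is centered on its centroid and rotated into its principal frame via an SVD, and one cloud is then re-expressed in the other's frame. Axis sign and ordering ambiguities and the subsequent non-rigid surface-spine matching are ignored in this simplification.

```python
import numpy as np

def principal_axes_align(points_a, points_b):
    """Rigidly map points_b into the principal-axes frame of points_a."""
    def frame(pts):
        c = pts.mean(axis=0)
        # Rows of vt are the principal directions of the centered cloud.
        _, _, vt = np.linalg.svd(pts - c, full_matrices=False)
        return c, vt

    pa = np.asarray(points_a, dtype=float)
    pb = np.asarray(points_b, dtype=float)
    ca, ra = frame(pa)
    cb, rb = frame(pb)
    # Express B in its own principal frame, then re-embed it in A's frame.
    return (pb - cb) @ rb.T @ ra + ca
```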
  • two sets of complex- structured tumor distributions are matched, with the tumor of one prostate correspondingly transformed to a new location, and with the tumor shape being modified according to the recovered nonlinear deformation, consistent with the deformed prostate capsule after registration.
  • The system and process for construction and quantification of 3-D probability maps of the location of different cancer grades are outlined in FIG. 14.
  • hundreds of additional clinically proven surgical specimen sequences 32 of actual prostates are scanned into an electronic database, to create a 3-D probability map of clinically significant and representative high grade cancer.
  • Contour extraction 34 of the key structures in these specimens is then conducted. Because this requires a large data storage capacity, high-density CD-ROM and an on-line StorageTek™ robotic unit are examples of storage media that may be used with the present invention.
  • the above-described 3-D object reconstruction method is used to generate the computer graphical models for these specimens.
  • a 3-D binary map is then calculated 36: the voxels of localized prostate cancer are labeled "1" and the voxels of other internal structures are labeled "0", yielding a 3-D binary map of the prostate capsule and cancer which is simply a mutually exclusive random sampling of the underlying spatial probability distribution of cancer occurrence. All these binary maps are summed (geometrically normalized) together to obtain a 3-D histogram of the cancer distribution, in which a mathematical normalization in the random space is required.
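A minimal sketch of the voxel-wise accumulation is shown below: registered binary maps (1 = tumor, 0 = other tissue) from many specimens are summed and divided by the number of specimens, giving an empirical per-voxel probability of cancer occurrence. The FGNM fitting described next is not included.

```python
import numpy as np

def occurrence_map(binary_maps):
    """Empirical per-voxel probability of tumor occurrence.

    binary_maps: list of equally shaped, spatially registered 0/1 volumes,
    one per specimen.
    """
    stack = np.stack([np.asarray(m, dtype=np.float64) for m in binary_maps])
    return stack.sum(axis=0) / len(binary_maps)
```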
  • this 3-D histogram is mathematically modeled by a standard finite generalized normal mixture (FGNM) distribution [45].
  • FGNM finite generalized normal mixture
  • the optimal location of the biopsy site is determined quantitatively based on the master model.
  • a 3-D learning vector quantization method is applied to identify the best biopsy sites based on estimated probability maps [39] .
  • Such a method provides an optimal solution in the sense of minimum mean squared error [47]. Given this information, specific locations can be recommended for analysis, thereby developing more selective biopsy strategies.
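The patent names a 3-D learning vector quantization method for this step; the sketch below substitutes a simpler, explicitly labeled stand-in, a probability-weighted k-means over voxel coordinates, only to convey how a probability map can be summarized by a small set of target points.

```python
import numpy as np

def candidate_sites(prob_map, n_sites=6, iters=50, threshold=0.05, seed=0):
    """Suggest n_sites target points from a 3-D probability map."""
    rng = np.random.default_rng(seed)
    coords = np.argwhere(prob_map > threshold).astype(float)
    weights = prob_map[prob_map > threshold]
    centers = coords[rng.choice(len(coords), n_sites, replace=False)]
    for _ in range(iters):
        # Assign each voxel to its nearest center, then recompute
        # probability-weighted centroids.
        d = np.linalg.norm(coords[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for k in range(n_sites):
            members = labels == k
            if members.any():
                centers[k] = np.average(coords[members], axis=0, weights=weights[members])
    return centers  # voxel coordinates of suggested biopsy targets
```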
  • FIG. 15 is a flow diagram that shows the process of superimposing and visualization of the master model with TRUS imaging features for on-line biopsy guidance according to the present invention.
  • the master model, along with the spatial probability distribution pattern 41, is superimposed 39 on the TRUS image 40. Since TRUS can only provide a gray scale image 40 in which key anatomical structures cannot be directly identified, image segmentation is a key part of the present invention.
  • a statistical model-based method 38 is employed for quantifying different tissue types and then segmenting major internal structures from TRUS images.
  • the model-based method 38 [35,36,37,38,41,43,44,48] has been applied to enable image segmentation from magnetic resonance (MR) brain images, computed tomography (CT) liver images, and computed radiography (CR) breast images .
  • MR magnetic resonance
  • CT computed tomography
  • CR computed radiography
  • the master model is superimposed over the on-line TRUS imaging. Since only 2-D images from TRUS may be available in the current clinical setting, a 3-D to 2-D registration is computed by the computer database/memory 21. The whole prostate gland is pre-scanned with two views to generate a sequence of image slices. Then, a matched filter is used to identify the most appropriate correspondence between the master model and a particular TRUS slice. After that, a multiple feature-based method is used to register the master model with the TRUS image. This method has shown very robust performance when applied to PET and MRI brain image fusion [72]. Once again, soft tissue deformation needs to be considered, since the patient's body motion may be involved during the needle biopsy procedure. Furthermore, the method can also incorporate advances from site model based registration and mutual information maximization [71].
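As a rough illustration of the slice-correspondence step, the matched filter can be approximated by normalized cross-correlation between the live TRUS frame and each pre-rendered slice of the master model; the feature-based refinement, deformation handling, and mutual-information alternatives mentioned above are omitted.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equally sized images."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())

def best_matching_slice(trus_frame, model_slices):
    """Index of the master-model slice most similar to the live frame."""
    scores = [ncc(trus_frame, s) for s in model_slices]
    return int(np.argmax(scores))
```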
  • the above process of superimposing and visualization of the master model with TRUS imaging features is useful to direct the one or more biopsy needles 44a.
  • the one or more biopsy needles 44a may be displayed 39 and viewed on a monitor, along with any medical equipment that may be necessary for a desired procedure.
  • Such equipment may include a tool 44c for planting a small piece of a radioisotope into a preselected location of a tumor during a brachytherapy treatment, or an emitter 44b for directing a beam of radiant energy into a preselected location of a tumor during a radiation oncology treatment.
  • true tumor parameters will first be calculated.
  • the key quantities are the tumor location, distribution, and representative volume. Assuming the reconstructed object-surface is accurate, true tumor volume is simply the interior volume confined by the tumor surface.
  • a simulation of various biopsy protocols is conducted using two modes: purely computerized simulation and human controlled virtual biopsy.
  • Various computer programs can be easily generated to perform a computerized biopsy in which clinical factors will be incorporated, such as variability of needle angle and mispositioning.
  • force feedback is integrated into the routine practice, and the error caused by human factors is also addressed.
  • these trials can determine the likelihood of adequate tumor sampling using current standard transrectal sextant biopsy techniques [12]; spatial distribution, enumeration, symmetry, total tumor volume, and tumor volume as a fraction of prostate volume; volume and distribution of extraprostatic tumors; spatial distribution of tumor foci; and distribution correlation of prostatic intraepithelial neoplasia and invasive tumors.
  • Derivation of an Algorithm to Estimate Tumor Volume and Other Staging Parameters
  • the situation for estimating tumor volume differs in that the individual tumor volume must be estimated from limited samples, in comparison to the large number of pixels in medical images, but the same methodology pertains: the tumor volume can be "predicted" based on Bayesian theory and the underlying probability maps.
  • Currently used protocols generally underestimate the tumor volume. This implies that traditional formulation may be augmented by newly developed machine learning approaches [3] .
  • Neural networks can effectively learn relationships among data from large samples, and the inventors have successfully developed various neural network based algorithms and fuzzy logic for robust clinical decision making, such as prostatron treatment planning for BPH and breast cancer diagnosis [42]. Therefore, a probabilistic modular neural network is applied according to the present invention to estimate total tumor volume directly from the outcomes of needle biopsy cores.
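The probabilistic modular neural network is not specified here in enough detail to reproduce; as an explicitly simplified stand-in, the snippet below shows the general shape of learning a mapping from per-core biopsy outcomes to total tumor volume with a small regression network, assuming training features and known volumes from the reconstructed models already exist.

```python
from sklearn.neural_network import MLPRegressor

# X_train: (n_cases, n_features) per-core outcomes (e.g., hit flags, core
# lengths, positive fractions); y_train: known tumor volumes from the
# reconstructed surgical specimens. Both are assumed to exist already.
model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
# model.fit(X_train, y_train)
# predicted_volume = model.predict(X_new)
```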
  • the likelihood of tumor volume given the outcomes of needle biopsies can be calculated using a concentric, onion-like probability model together with Bayes' rule.
  • the likelihood of tumor volume will decrease as the cancer-proven biopsy core moves layer by layer from the centrum toward the periphery of the cancer foci. These layers are defined by the factorized standard deviations. This aspect of the invention further improves the ability to estimate the location of the detected cancer.
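The Bayesian estimate can be sketched as a discrete posterior over candidate tumor volumes updated with the observed core outcomes; the layer-dependent hit probabilities derived from the probability maps and the factorized standard deviations are abstracted into a hypothetical hit_prob callable in this sketch.

```python
import numpy as np

def posterior_volume(candidate_volumes, prior, core_outcomes, hit_prob):
    """Discrete Bayesian update of the tumor-volume belief.

    candidate_volumes: list of candidate volumes (e.g., in cc).
    prior: prior probability for each candidate volume.
    core_outcomes: list of (core_id, hit: bool) biopsy results.
    hit_prob(volume, core_id): assumed callable giving P(hit | volume, core),
    e.g., derived from the 3-D probability maps and concentric layers.
    """
    post = np.asarray(prior, dtype=float).copy()
    for core_id, hit in core_outcomes:
        like = np.array([hit_prob(v, core_id) if hit else 1.0 - hit_prob(v, core_id)
                         for v in candidate_volumes])
        post *= like
    return post / post.sum()
```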

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Quality & Reliability (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Human Computer Interaction (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

The present invention relates to a computerized method for analyzing biopsy data, which includes creating a plurality of three-dimensional graphic, electronic models of tumorous and non-tumorous patient tissue specimens from corresponding digitized cross-sectional sequences, where each sequence represents an actual patient tissue specimen. The digitized cross-sectional sequences consist of two-dimensional cross-sectional views. These views represent slices of the tissue specimen at spaced intervals. A three-dimensional graphic, electronic master model of a tissue specimen is then formed by mapping all of the graphic models of the tumorous and non-tumorous patient tissue specimens. A three-dimensional statistical probability distribution, which designates the positions of potential tumors to be found during the biopsy based on the locations of actual tumors in the patient tissue specimens, is then incorporated into the master model. The master model, with the probability distribution, is then superimposed on a graphic, electronic display of the patient's biopsy.
PCT/US1999/020390 1998-09-08 1999-09-08 Procede et systeme de detection perfectionnee du cancer de la prostate WO2000014668A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU61369/99A AU6136999A (en) 1998-09-08 1999-09-08 Method and system for improved detection of prostate cancer

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US9932998P 1998-09-08 1998-09-08
US60/099,329 1998-09-08
US10062298P 1998-09-17 1998-09-17
US60/100,622 1998-09-17

Publications (1)

Publication Number Publication Date
WO2000014668A1 true WO2000014668A1 (fr) 2000-03-16

Family

ID=26795981

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1999/020390 WO2000014668A1 (fr) 1998-09-08 1999-09-08 Procede et systeme de detection perfectionnee du cancer de la prostate

Country Status (2)

Country Link
AU (1) AU6136999A (fr)
WO (1) WO2000014668A1 (fr)

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10328765A1 (de) * 2003-06-25 2005-02-03 aviCOM Gesellschaft für angewandte visuelle Systeme mbH Vorrichtung und Verfahen zur Verbindung der Darstellung des elektrischen Herzfeldes mit der Darstellung des zugehörigen Herzens
US7804989B2 (en) 2006-10-30 2010-09-28 Eigen, Inc. Object recognition system for medical imaging
WO2010136584A1 (fr) * 2009-05-29 2010-12-02 Institut Telecom-Telecom Paris Tech Procede de quantification de l'evolution de pathologies impliquant des changements de volumes de corps, notamment de tumeurs
US7856130B2 (en) 2007-03-28 2010-12-21 Eigen, Inc. Object recognition system for medical imaging
WO2011015822A1 (fr) 2009-08-07 2011-02-10 Ucl Business Plc Appareil et procédé d'alignement de deux images médicales
US7894645B2 (en) 2000-08-10 2011-02-22 Ohio State University High-resolution digital image processing in the analysis of pathological materials
US7942829B2 (en) 2007-11-06 2011-05-17 Eigen, Inc. Biopsy planning and display apparatus
US8064664B2 (en) 2006-10-18 2011-11-22 Eigen, Inc. Alignment method for registering medical images
US8175350B2 (en) 2007-01-15 2012-05-08 Eigen, Inc. Method for tissue culture extraction
US8425418B2 (en) 2006-05-18 2013-04-23 Eigen, Llc Method of ultrasonic imaging and biopsy of the prostate
US8447384B2 (en) 2008-06-20 2013-05-21 Koninklijke Philips Electronics N.V. Method and system for performing biopsies
US8571277B2 (en) 2007-10-18 2013-10-29 Eigen, Llc Image interpolation for medical imaging
EP2673738A4 (fr) * 2011-02-11 2017-08-23 E-4 Endeavors, Inc. Système et procédé de modélisation de spécimen de biopsie
WO2018002265A1 (fr) * 2016-06-30 2018-01-04 Koninklijke Philips N.V. Génération et personnalisation d'un modèle statistique du sein
CN107743409A (zh) * 2015-06-12 2018-02-27 皇家飞利浦有限公司 剂量规划系统
US10426424B2 (en) 2017-11-21 2019-10-01 General Electric Company System and method for generating and performing imaging protocol simulations
US10716544B2 (en) 2015-10-08 2020-07-21 Zmk Medical Technologies Inc. System for 3D multi-parametric ultrasound imaging
US10963757B2 (en) 2018-12-14 2021-03-30 Industrial Technology Research Institute Neural network model fusion method and electronic device using the same
DE102022131177A1 (de) 2022-11-24 2024-05-29 B. Braun New Ventures GmbH Chirurgisches Navigationssystem und Navigationsverfahren

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
YUE WANG ET AL: "Statistical modeling and visualization of localized prostate cancer", MEDICAL IMAGING 1997: IMAGE DISPLAY, NEWPORT BEACH, CA, USA, 23-25 FEB. 1997, vol. 3031, Proceedings of the SPIE - The International Society for Optical Engineering, 1997, SPIE-Int. Soc. Opt. Eng, USA, pages 73 - 84, XP000874460, ISSN: 0277-786X *

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7894645B2 (en) 2000-08-10 2011-02-22 Ohio State University High-resolution digital image processing in the analysis of pathological materials
DE10328765A1 (de) * 2003-06-25 2005-02-03 aviCOM Gesellschaft für angewandte visuelle Systeme mbH Vorrichtung und Verfahen zur Verbindung der Darstellung des elektrischen Herzfeldes mit der Darstellung des zugehörigen Herzens
DE10328765B4 (de) * 2003-06-25 2005-11-24 aviCOM Gesellschaft für angewandte visuelle Systeme mbH Vorrichtung und Verfahren zur Verbindung der Darstellung des elektrischen Herzfeldes mit der Darstellung des zugehörigen Herzens
US8425418B2 (en) 2006-05-18 2013-04-23 Eigen, Llc Method of ultrasonic imaging and biopsy of the prostate
US8064664B2 (en) 2006-10-18 2011-11-22 Eigen, Inc. Alignment method for registering medical images
US7804989B2 (en) 2006-10-30 2010-09-28 Eigen, Inc. Object recognition system for medical imaging
US8175350B2 (en) 2007-01-15 2012-05-08 Eigen, Inc. Method for tissue culture extraction
US7856130B2 (en) 2007-03-28 2010-12-21 Eigen, Inc. Object recognition system for medical imaging
US8571277B2 (en) 2007-10-18 2013-10-29 Eigen, Llc Image interpolation for medical imaging
US7942829B2 (en) 2007-11-06 2011-05-17 Eigen, Inc. Biopsy planning and display apparatus
US8447384B2 (en) 2008-06-20 2013-05-21 Koninklijke Philips Electronics N.V. Method and system for performing biopsies
FR2946171A1 (fr) * 2009-05-29 2010-12-03 Groupe Ecoles Telecomm Procede de quantification de l'evolution de pathologies impliquant des changements de volumes de corps, notamment de tumeurs
WO2010136584A1 (fr) * 2009-05-29 2010-12-02 Institut Telecom-Telecom Paris Tech Procede de quantification de l'evolution de pathologies impliquant des changements de volumes de corps, notamment de tumeurs
US9026195B2 (en) 2009-05-29 2015-05-05 Institute Telecom-Telecom Paris Tech Method for characterizing the development of pathologies involving changes in volumes of bodies, notably tumors
WO2011015822A1 (fr) 2009-08-07 2011-02-10 Ucl Business Plc Appareil et procédé d'alignement de deux images médicales
US8620055B2 (en) 2009-08-07 2013-12-31 Ucl Business Plc Apparatus and method for registering two medical images
EP2673738A4 (fr) * 2011-02-11 2017-08-23 E-4 Endeavors, Inc. Système et procédé de modélisation de spécimen de biopsie
US10223825B2 (en) 2011-02-11 2019-03-05 E4 Endeavors, Inc. System and method for modeling a biopsy specimen
CN107743409A (zh) * 2015-06-12 2018-02-27 皇家飞利浦有限公司 剂量规划系统
US10716544B2 (en) 2015-10-08 2020-07-21 Zmk Medical Technologies Inc. System for 3D multi-parametric ultrasound imaging
WO2018002265A1 (fr) * 2016-06-30 2018-01-04 Koninklijke Philips N.V. Génération et personnalisation d'un modèle statistique du sein
CN109414233A (zh) * 2016-06-30 2019-03-01 皇家飞利浦有限公司 统计乳房模型的生成和个性化
US11213247B2 (en) 2016-06-30 2022-01-04 Koninklijke Philips N.V. Generation and personalization of a statistical breast model
CN109414233B (zh) * 2016-06-30 2023-09-12 皇家飞利浦有限公司 统计乳房模型的生成和个性化
US10426424B2 (en) 2017-11-21 2019-10-01 General Electric Company System and method for generating and performing imaging protocol simulations
US10963757B2 (en) 2018-12-14 2021-03-30 Industrial Technology Research Institute Neural network model fusion method and electronic device using the same
DE102022131177A1 (de) 2022-11-24 2024-05-29 B. Braun New Ventures GmbH Chirurgisches Navigationssystem und Navigationsverfahren

Also Published As

Publication number Publication date
AU6136999A (en) 2000-03-27

Similar Documents

Publication Publication Date Title
WO2000014668A1 (fr) Procede et systeme de detection perfectionnee du cancer de la prostate
Zhu et al. Computer technology in detection and staging of prostate carcinoma: A review
JP5520378B2 (ja) 2つの医用画像を位置合わせするための装置および方法
Gong et al. Parametric shape modeling using deformable superellipses for prostate segmentation
US8425418B2 (en) Method of ultrasonic imaging and biopsy of the prostate
CN103402453B (zh) 用于导航系统的自动初始化和配准的系统和方法
JP4657561B2 (ja) 医学的ドキュメント化のための生体乳房生検位置の視覚化強化
US20110178389A1 (en) Fused image moldalities guidance
US20070167784A1 (en) Real-time Elastic Registration to Determine Temporal Evolution of Internal Tissues for Image-Guided Interventions
US20100286517A1 (en) System and Method For Image Guided Prostate Cancer Needle Biopsy
Shao et al. Prostate boundary detection from ultrasonographic images
Astley et al. Automation in mammography: Computer vision and human perception
Mastmeyer et al. Accurate model-based segmentation of gynecologic brachytherapy catheter collections in MRI-images
Fei et al. A molecular image-directed, 3D ultrasound-guided biopsy system for the prostate
Cool et al. Design and evaluation of a 3D transrectal ultrasound prostate biopsy system
Schalk et al. 3D surface-based registration of ultrasound and histology in prostate cancer imaging
Cool et al. 3D prostate model formation from non-parallel 2D ultrasound biopsy images
WO2007137179A2 (fr) systÈme amÉliorÉ et procÉdÉ pour biopsie 3D
Xuan et al. 3-D model supported prostate biopsy simulation and evaluation
Lu et al. Statistical volumetric model for characterization and visualization of prostate cancer
Imran et al. Image Registration of In Vivo Micro-Ultrasound and Ex Vivo Pseudo-Whole Mount Histopathology Images of the Prostate: A Proof-of-Concept Study
Wang et al. Statistical modeling and visualization of localized prostate cancer
Yu et al. Model-supported virtual environment for prostate cancer pattern analysis
Wang et al. Statistical Modeling and Visualization of Localized
Gong Prostate ultrasound image segmentation and registration

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG US UZ VN YU ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SL SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase