
US20140129200A1 - Preoperative surgical simulation - Google Patents

Preoperative surgical simulation Download PDF

Info

Publication number
US20140129200A1
US20140129200A1 (application US13/958,954)
Authority
US
United States
Prior art keywords
image
medical image
procedure
medical
anatomical
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/958,954
Inventor
Ran Bronstein
Niv Fisher
Ofek Shilon
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Simbionix Ltd
Original Assignee
Simbionix Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Simbionix Ltd
Priority to US13/958,954
Assigned to SIMBIONIX LTD. Assignors: BRONSTEIN, RAN; FISHER, NIV; SHILON, OFEK
Publication of US20140129200A1
Assigned to 3D SYSTEMS, INC. Assignor: SIMBIONIX LTD.
Assigned to SIMBIONIX LTD. Assignor: 3D SYSTEMS, INC.
Current legal status: Abandoned

Classifications

    • A61B 19/50
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00 ICT specially adapted for the handling or processing of medical images
    • G16H 30/20 ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00 ICT specially adapted for the handling or processing of medical images
    • G16H 30/40 ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/50 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for simulation or modelling of medical disorders
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2210/00 Indexing scheme for image generation or computer graphics
    • G06T 2210/41 Medical

Definitions

  • the present invention relates to an apparatus and a method for performing a simulated image-guided medical procedure and, more particularly, but not exclusively to performing a simulated image-guided procedure according to a three-dimensional (3D) model of an organ that is based on a 3D medical image.
  • Medical imaging is generally recognized as important for diagnosis and patient care with the goal of improving treatment outcomes.
  • medical imaging has experienced an explosive growth due to advances in imaging modalities such as x-rays, computed tomography (CT), magnetic resonance imaging (MRI) and ultrasound.
  • These modalities provide noninvasive methods for studying internal organs in vivo, but the amount of data is relatively large and when presented as two dimensional (2D) images, it generally requires an anatomist/radiology specialist for interpretation. Unfortunately, the cost incurred in manual interpretation of this data is prohibitive for routine data analysis.
  • the 2D slices can be combined to generate a 3-D volumetric model.
  • Such medical imaging systems allow the performance of minimally invasive therapeutic procedures. These procedures are typically carried out in a CathLab, where a physician wishes to assess the functions of internal organs such as the heart and the coronary arteries or to perform procedures such as coronary angioplasty.
  • Radiology yields recorded images such as 2D X-ray films or 3D medical images such as CT and MRI scans.
  • Mild-dosage, interactively controlled X-ray, also known as fluoroscopy, allows a physician to actively monitor an operation in progress.
  • Interventional radiology is the specialty in which radiologists and cardiologists utilize real-time radiological images to perform therapeutic and diagnostic procedures. Interventional radiologists currently rely on the real-time fluoroscopic 2D images, available as analog video or digital information viewed on video monitors.
  • Medical simulators that can be used to train such medical specialists have significant potential in reducing healthcare costs through improved training, better pre-treatment planning, and more economic and rapid development of new medical devices. Hands-on experience becomes possible in training, before direct patient involvement, which carries a significant risk.
  • Image-guided procedures such as vascular catheterization, angioplasty, and stent placement are especially suited for simulation because they typically place the physician at a distance from the operative site, manipulating surgical instruments and viewing the procedure on video monitors.
  • the interface device includes a catheter unit assembly for receiving a catheter needle assembly, and a skin traction mechanism to simulate placing skin in traction or manipulating other anatomical sites for performing a medical procedure.
  • the catheter needle assembly and skin traction mechanism are manipulated by a user during a medical procedure.
  • the catheter unit assembly includes a base, a housing, a bearing assembly and a shaft that receives the catheter needle assembly.
  • the bearing assembly enables translation of the catheter needle assembly, and includes bearings that enable the shaft to translate in accordance with manipulation of the catheter needle assembly.
  • the shaft typically includes an encoder to measure translational motion of a needle of the catheter needle assembly, while the interface device further includes encoders to measure manipulation of the catheter needle assembly in various degrees of freedom and the skin traction mechanism.
  • the simulation system receives measurements from the interface device encoders and updates the simulation and display, while providing control signals to the force feedback device to enable application of force feedback to the catheter needle assembly.
  • simulation systems and other known simulation systems are based on predefined models, which are acquired and enhanced before the systems become operational or during a maintenance thereof, such as updating the system.
  • a library that comprises virtual models which are stored in a related database is connected to the simulation system.
  • the system simulates an image-guided procedure according to one of the virtual models that has been selected by the system user.
  • the simulated image-guided procedures are modeled according to predefined or randomly changed models of an organ, a human body system, or a section thereof.
  • the physician or the trainee is trained using a model of a virtual organ that is not identical to the organ that he or she is about to perform an operative image-guided procedure on.
  • the simulation system cannot be used for accurately simulating an operation that has been performed on a real patient. Therefore, the currently used simulation systems cannot be used for going back over an operation that went wrong or for didactic purposes.
  • an apparatus for simulating an image-guided procedure comprises an input for receiving a three-dimensional (3D) medical image depicting an organ of a patient, a model generation unit configured for generating a 3D anatomical model of the organ according to the 3D medical image, and a simulating unit configured for simulating an image-guided procedure planned for the patient according to the 3D anatomical model.
  • the apparatus further comprises a segmentation unit operatively connected to the model generation unit, the segmentation unit being configured for segmenting the organ in the 3D medical image to a plurality of areas, the segmented organ image being used for generating the 3D anatomical model.
  • the 3D anatomical model is a model of a tract.
  • the tract is a member of the following group: a vascular tract, a urinary tract, a gastrointestinal tract, and a fistula tract.
  • the 3D medical image is a member of the following group: computerized tomography (CT) scan images, magnetic resonance imager (MRI) scan images, ultrasound scan images, and positron emission tomography (PET)-CT scan images.
  • the planned image-guided procedure is an angioplasty procedure.
  • the apparatus further comprises a user interface operatively connected to the model generation unit, the user interface allows a user to instruct the model generation unit during the generation of the 3D anatomical model.
  • the simulated planned image-guided procedure is used as a study case during a learning process.
  • the simulated planned image-guided procedure is used to demonstrate a respective image-guided procedure to the patient.
  • the simulated planned image-guided procedure is used to document preparation for an operation.
  • the input is configured for receiving a four dimensional (4D) medical image depicting the organ during a certain period
  • the model generation unit configured for generating a 4D organ model of the organ according to the 4D medical image
  • the simulating unit configured for simulating an image-guided procedure planned for the patient according to the 4D organ model.
  • the organ is a member of a group comprising: an anatomical region, a human body system, an area of an organ, a number of areas of an organ, a section of an organ, and a section of a human body system.
  • a method for performing a simulated image-guided procedure comprises the following steps: a) obtaining a three-dimensional (3D) medical image depicting an organ of a patient, b) producing a 3D anatomical model of the organ according to the 3D medical image, and c) simulating an image-guided procedure planned for the patient according to the 3D model.
  • the method further comprises a step a1) between step a) and b) of segmenting the organ in the 3D medical image to a plurality of areas, the producing of step b) is performed according to the segmented 3D medical image.
  • the planned image-guided procedure is an angioplasty procedure.
  • the producing comprises a step of receiving generation instructions from a system user, the generation instructions being used for defining the 3D model.
  • the simulating comprises displaying the organ.
  • the method further comprises a step of allowing a system user to mark labels for the planned image-guided procedure according to the display.
  • the planned image-guided procedure is an angioplasty procedure.
  • the simulation is a pre-operative surgical simulation.
  • the 3D anatomical model is a model of a tract.
  • the 3D anatomical model is a tract model.
  • the tract model defines a member of the following group: a vascular tract, a urinary tract, a gastrointestinal tract, and a fistula tract.
  • the obtaining comprises a step of obtaining a four dimensional (4D) medical image depicting the organ during a certain period
  • the producing comprises a step of producing a 4D model of the organ according to the 4D medical image
  • the simulating is performed according to the 4D model.
  • the organ is a member of a group comprising: an anatomical region, a human body system, an area of an organ, a number of areas of an organ, a section of an organ, and a section of a human body system.
  • Implementation of the method and system of the present invention involves performing or completing certain selected tasks or steps manually, automatically, or a combination thereof.
  • several selected steps could be implemented by hardware or by software on any operating system of any firmware or a combination thereof.
  • selected steps of the invention could be implemented as a chip or a circuit.
  • selected steps of the invention could be implemented as a plurality of software instructions being executed by a computer using any suitable operating system.
  • selected steps of the method and system of the invention could be described as being performed by a data processor, such as a computing platform for executing a plurality of instructions.
  • FIG. 1 is a schematic representation of a pre-operative simulator for simulating an image-guided procedure, according to one preferred embodiment of the present invention
  • FIG. 2A is a graphical representation of the Hounsfield scale, which measures attenuation of X-Ray radiation by a medium. Hounsfield values of different human tissues are marked;
  • FIGS. 2B and 2C respectively illustrate schematically two triangular surface models of a femur bone, one directly generated from scan data, and a coarsened variant of the segment in FIG. 2B which is generated according to one preferred embodiment of the present invention
  • FIG. 3 is a schematic representation of the pre-operative simulator of FIG. 1 with a detailed description of the simulating unit, according to one embodiment of the present invention
  • FIG. 4 is an exemplary illustration of the pre-operative simulator of FIG. 3 , according to an embodiment of the present invention.
  • FIG. 5 is an exemplary illustration of a screen display taken during the simulation of an image-guide procedure, according to an embodiment of the present invention.
  • FIG. 6 is a flowchart of a method for performing a pre-operative simulation of an image-guided procedure, according to a preferred embodiment of the present invention.
  • the present embodiments comprise an apparatus and a method for simulating an image-guided procedure.
  • the apparatus and the method allow a physician to set a pre-operative simulation of an image-guided procedure.
  • the pre-operative simulation simulates an image-guided procedure that is about to be performed on a certain patient.
  • a 3D medical image that depicts an anatomical region of a certain patient who is about to be operated on is acquired and 3D anatomical models are generated based thereupon.
  • the 3D anatomical model defines the boundaries of a certain anatomy or an organ such as a vascular tract.
  • the 3D anatomical models are used for simulating an image-guided procedure on that region.
  • a 3D medical image may be understood as a sequence of CT scan images, a sequence of MRI scan images, a sequence of PET-CT scan images, a spatial image, etc.
  • a medical imaging system may be understood as an MRI imaging system, a CT imaging system, a PET-CT imaging system, etc.
  • An organ or an anatomical region may be understood as human body organ, a human body system, an area of an organ, a number of areas of an organ, a section of an organ, a section of a human body system, etc.
  • FIG. 1 is a schematic representation of a pre-operative simulator 1 for simulating an image-guided procedure, according to one preferred embodiment of the present invention.
  • the pre-operative simulator 1 comprises an input unit 2 for obtaining a 3D medical image that depicts an anatomy region of a patient and an anatomy-model generation unit 3 that is designed for generating a 3D anatomical model of an organ according to the received 3D medical image.
  • the pre-operative simulator 1 further comprises a simulating unit 4 for simulating an image-guided procedure according to the three-dimensional model, as described below.
  • the input unit 2 preferably allows the system for simulating image-guided procedure 1 to fetch the 3D medical image from a medical images server such as a picture archiving communication system (PACS) before being accessed by the physicians.
  • the PACS server comprises a number of computers, which are dedicated for storing, retrieving, distributing and presenting the stored 3D medical images.
  • the 3D medical images are stored in a number of formats. The most common format for image storage is digital imaging and communications in medicine (DICOM).
  • the fetched 3D medical image is represented in a 3D array, preferably of 512×512×150 voxels.
  • the input unit 2 receives as input a raw 3D data array, composed as a pre-fetched and pre-parsed DICOM image.
  • the segmentation is not limited to a specific modality.
  • the 3D medical image is a CT scan.
  • each voxel is represented by a single measured value, physically corresponding to the degree of X-ray attenuation of a respective location in the depicted organ.
  • the data acquisition modality is CT-angiography (CTA).
  • the input unit 2 may be adjusted to receive the 3D medical image from a PACS workstation, a computer network, or a portable memory device such as a DVD, a CD, a memory card, etc.
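  • To make the data flow concrete, the following is a minimal sketch, not taken from the patent, of how such a DICOM series might be assembled into the raw 3D data array described above; it assumes the pydicom and numpy libraries, a hypothetical directory of CT slices, and converts the stored voxel values to Hounsfield units using the standard DICOM rescale tags.

    from pathlib import Path

    import numpy as np
    import pydicom

    def load_ct_volume(series_dir):
        """Stack every DICOM slice in a directory into a single 3D array of HU values."""
        slices = [pydicom.dcmread(path) for path in Path(series_dir).glob("*.dcm")]
        # Order the slices along the scan axis using the z component of their position.
        slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))
        volume = np.stack([s.pixel_array.astype(np.float32) for s in slices])
        # Convert stored values to Hounsfield units with the DICOM rescale tags.
        return volume * float(slices[0].RescaleSlope) + float(slices[0].RescaleIntercept)

    # Example (hypothetical path): hu = load_ct_volume("/data/patient_0001/cta")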
  • the received 3D medical image is forwarded to the anatomy-model generation unit 3 that is designed for generating the 3D anatomical model, as described above.
  • the anatomy-model generation unit 3 comprises a 3D image segmentation unit that is used for segmenting the received 3D medical image into anatomical structures. The segmentation is performed either automatically or semi-automatically. In one embodiment, a standard automatic segmentation procedure is used for segmenting the image.
  • the segmentation is based on a procedure in which relevant voxels of the received raw 3D data array are isolated.
  • the physical attenuation is scaled in HUs, where the value −1000 HU is associated with air and the value 0 HU is associated with water, as shown in FIG. 2A.
  • different tissue types have different typical HU ranges.
  • the typical attenuation of a specific tissue is used to isolate it in a 3D array of CT data.
  • the value of voxels that depict lungs is usually between −550 HU and −450 HU and the value of voxels that depict bones is approximately between 450 HU and 1000 HU.
  • the HU values of voxels of the 3D medical image are used for isolating the voxels in the tissue of interest.
  • intravenous contrast enhancement (ICE) components such as Barium, Iodine or any other radiopharmaceutical component, are applied when the 3D medical image is taken.
  • ICE components increase the HU value of blood vessels to the HU value of bones and sometimes beyond. Such an increment results in a higher contrast between the vessel voxels and the surrounding that can improve the segmentation procedure.
  • the segmentation procedure is adapted to segment a subset of scanned voxels from the 3D medical image, wherein the stored values of the voxels in the subset are in a predefined range.
  • all the voxels with stored values in the range of blood vessels are segmented and tagged.
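  • A minimal sketch of such HU-range thresholding follows; it is an illustration rather than the patent's implementation, assumes numpy and scipy, and keeps only the largest connected component, which in a contrast-enhanced CTA is typically the vascular tract of interest.

    import numpy as np
    from scipy import ndimage

    def segment_by_hu_range(hu, lo, hi):
        """Tag voxels whose HU value lies in [lo, hi] and keep the largest component."""
        mask = (hu >= lo) & (hu <= hi)
        labels, count = ndimage.label(mask)  # connected-component tagging
        if count == 0:
            return mask
        sizes = ndimage.sum(mask, labels, index=range(1, count + 1))
        return labels == (int(np.argmax(sizes)) + 1)

    # Hypothetical ranges based on the values quoted above:
    # lungs roughly [-550, -450] HU; bone and contrast-filled vessels from ~450 HU upward.
    # vessel_mask = segment_by_hu_range(hu, 450, 3000)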
  • a triangle mesh is computed from the raw 3D data array of the HU values.
  • a variant of the marching cubes algorithm is used for initially generating the triangle mesh; see "Marching Cubes: A High Resolution 3D Surface Construction Algorithm", William E. Lorensen and Harvey E. Cline, Computer Graphics (Proceedings of SIGGRAPH '87), Vol. 21, No. 4, pp. 163-169.
  • the triangle mesh is used for surface construction of segments in the 3D medical image.
  • the mesh obtained by the variant of the marching cubes algorithm bounds the desired volume of the segment. As the segment is obtained at the resolution of the 3D medical image, it may be extremely fine. Therefore, preferably, an additional decimation processing stage is carried out, in which the mesh is coarsened and the level of surface approximation of the segments is reduced.
  • an Edge-Collapse operation is used for the coarsening, see Hoppe, H. Progressive meshes. In Proc. SIGGRAPH '96, pages 99-108, August 1996 and Hussain, M., Okada, Y. and Niijima, K. Fast, simple, feature-preserving and memory efficient simplification of triangle meshes. International Journal of Image and Graphics, 3(4):1-18, 2003.
  • An example of such decimation is depicted in FIGS. 2B and 2C, which respectively depict a schematic illustration of a segmented femur bone and a coarsened variant of the segmented femur bone that has been generated by applying the aforementioned decimation processing.
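  • As a rough sketch of this mesh-generation and coarsening stage (an assumption about one possible toolchain, not the implementation used in the patent), the standard marching cubes routine from scikit-image can extract the surface, and a quadric-decimation pass from Open3D can stand in for the cited edge-collapse simplification.

    import numpy as np
    import open3d as o3d
    from skimage import measure

    def mask_to_coarse_mesh(mask, target_triangles=20000):
        """Extract a triangle surface from a binary segmentation mask and coarsen it."""
        verts, faces, _normals, _values = measure.marching_cubes(mask.astype(np.float32), level=0.5)
        mesh = o3d.geometry.TriangleMesh(
            o3d.utility.Vector3dVector(verts),
            o3d.utility.Vector3iVector(faces.astype(np.int32)),
        )
        # Reduce the triangle count so the model is light enough for interactive
        # simulation, analogous to the coarsening shown in FIGS. 2B and 2C.
        return mesh.simplify_quadric_decimation(target_number_of_triangles=target_triangles)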
  • the 3D medical image is represented in a 3D array of 512×512×150, wherein each voxel is preferably represented by a value in one of the following formats: 8-bit (1 byte storage), 12-bit (2 byte storage), 16-bit (2 byte storage), and a single-precision floating point (4 byte storage).
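  • The listed storage formats map naturally onto array element types; the mapping below is only an illustration of one possible in-memory layout using numpy dtypes (12-bit data has no native type and is commonly widened to 16 bits).

    import numpy as np

    VOXEL_DTYPES = {
        "8-bit": np.uint8,    # 1 byte of storage per voxel
        "12-bit": np.int16,   # stored in 2 bytes per voxel
        "16-bit": np.int16,   # 2 bytes per voxel
        "float": np.float32,  # single precision, 4 bytes per voxel
    }
    # A 512x512x150 scan stored as 16-bit values occupies roughly 75 MiB.
    volume = np.zeros((150, 512, 512), dtype=VOXEL_DTYPES["16-bit"])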
  • the segmentation procedure is adapted to segment the anatomy that is depicted in the received 3D medical image. Different anatomic parts have different characteristics that affect segmentation.
  • the segmentation procedure's objective is to identify such a tract and to segment it, or to segment all the areas that delimit that tract.
  • the carotid artery is the tract through which the catheter or alike is conveyed.
  • the carotid artery should be segmented.
  • the artery network possesses a priori known traits that can be exploited to enhance and verify the fidelity of the segmentation stage.
  • when, for example, the area is a cervical portion and the procedure is carotid stenting, the following anatomical structures are exploited: the thoracic aorta, the brachiocephalic trunk, the subclavian arteries, the carotid arteries, and the vertebral arteries.
  • blood vessels in the image of the organ are identified and segmented during the segmentation procedure.
  • the centerline, the radius, and the inter-connectivity of each of the main blood vessels in the image are identified and registered.
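  • One way to obtain such centerlines and radii, sketched here as an assumption rather than the patent's method, is to skeletonize the binary vessel mask and read the local radius off a Euclidean distance transform; recent versions of scikit-image accept 3D input in skeletonize.

    import numpy as np
    from scipy import ndimage
    from skimage.morphology import skeletonize

    def centerline_and_radius(vessel_mask):
        """Reduce a binary vessel mask to a one-voxel-wide centerline and estimate
        the local vessel radius at every centerline voxel."""
        centerline = skeletonize(vessel_mask)                    # one-voxel-wide skeleton
        distance = ndimage.distance_transform_edt(vessel_mask)   # distance to the vessel wall
        coords = np.argwhere(centerline)                         # (N, 3) voxel coordinates
        radii = distance[centerline]                             # radius at each coordinate
        return coords, radii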
  • the anatomy-model generation unit 3 is connected to a user interface (not shown).
  • a simulator user may be asked, for example, to mark one or more points on a depicted tract. For example, if the received 3D medical image depicts a cervical portion of the human spine and the image-guided procedure is an angioplasty procedure, the simulator user may be required to mark the left carotid artery as a starting point for the automatic segmentation.
  • a segmented version of the 3D image, or an array that represents the segmented areas and the tracts, is generated.
  • the segmented areas can be represented in several formats and sets of data.
  • the segmented 3D image is represented by using one or more of the following sets of data:
  • the segmented 3D medical image or an array representing segments in the 3D medical image is forwarded to the simulating unit 4 .
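  • The patent does not fix a single representation, so the container below is purely hypothetical; it merely illustrates the kind of bundle the anatomy-model generation unit 3 could hand to the simulating unit 4 (label volume, centerlines, radii and surface meshes).

    from dataclasses import dataclass, field

    import numpy as np

    @dataclass
    class SegmentedAnatomy:
        """Hypothetical bundle of segmentation outputs forwarded to the simulator."""
        label_volume: np.ndarray                             # per-voxel segment tags
        centerlines: dict = field(default_factory=dict)      # vessel name -> (N, 3) coords
        radii: dict = field(default_factory=dict)            # vessel name -> (N,) radii
        surface_meshes: dict = field(default_factory=dict)   # vessel name -> triangle mesh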
  • the pre-operative simulator 1 may also be used to simulate an image-guided procedure according to a four dimensional (4D) image, which is a set of 3D medical images that depict a certain organ during a certain period.
  • a 4D image is received by the input unit 2.
  • the received 4D medical image is forwarded to the anatomy-model generation unit 3 that is designed for generating the 4D model.
  • the anatomy-model generation unit 3 comprises a 4D image segmentation unit that is used for segmenting the received 4D medical image into anatomical structures. The segmentation is performed either automatically or semi-automatically.
  • each one of the 3D medical images that comprise the received 4D medical image is separately segmented, as described below.
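  • Such per-frame treatment of a 4D acquisition can be pictured as a simple loop; the sketch below is only an illustration, under the assumption that each time frame is segmented with the same HU-range rule as a single 3D image.

    import numpy as np

    def segment_4d(frames, lo, hi):
        """Segment every time frame of a (T, Z, Y, X) acquisition independently,
        keeping the time axis so organ motion can be replayed by the simulator."""
        return np.stack([(frame >= lo) & (frame <= hi) for frame in frames])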
  • FIG. 3 is a block diagram representing the pre-operative simulator 1 , which is depicted in FIG. 1 , the components of the simulating unit 4 , and a planning module 51 , according to one embodiment of the present invention.
  • the simulating unit 4 preferably comprises two subsystems.
  • the first subsystem is an intervention simulator device 50 constituted by a dummy interventional instrument 52 , motion detectors 53 , a movement calculation unit 57 , an image display device 58 , and a force feedback mechanism 54 .
  • the second subsystem is a simulation module 55 that has the functions of receiving inputs from the motion detectors 53 , analyzing the inputs using the movement calculation unit 57 , translating the outcome to visual and tactile outputs and transferring them to the display device 58 and to the force feedback mechanism 54 .
  • the simulation module 55 also has the function of receiving the segmented 3D medical image from the anatomy-model generation unit 3, wherein the received segmented 3D medical image is already translated into a 3D model that simulates the organ that is depicted in the segmented 3D medical image. As described above, the segmented 3D medical image is based on a 3D medical image that is received from the actual patient who is about to be operated on.
  • FIG. 4 is an exemplary illustration of the aforementioned pre-operative simulator 1 for simulation of an image-guided procedure according to an embodiment of the present invention.
  • the dummy interventional instrument 52 and the image display device are as in FIG. 3; however, FIG. 4 further depicts an enclosure 62, a computer processor 64, and a user input interface 65.
  • a physician prepares himself for the operative image-guided procedure by manipulating the dummy interventional instrument 52 that is preferably a dummy catheter.
  • the dummy interventional instrument 52 is inserted into a cavity 66 within an enclosure 62 that comprises the motion detectors and force feedback components (not shown), such as resisting force generators, of the force feedback mechanism (not shown).
  • tactile and visual feedbacks are determined according to the position of dummy interventional instrument 52 within the enclosure 62 in respect to the aforementioned 3D model of the simulated organ.
  • Visual feedback is provided in the form of a display on the image display device 58 and tactile feedback is provided from the force feedback components within the enclosure 62 .
  • the visual and tactile feedbacks, which are respectively displayed on the image display device 58 and imparted on the dummy interventional instrument 52 are designed to improve technical and operational skills of the physician.
  • the visual feedback is given by a display device 58 that displays a sequence of consecutive images, which are based on a 3D model that is based on the received 3D medical image.
  • the tactile feedback is given by imparting different pressures on the dummy interventional instrument in accordance with the movement signals received from the simulation module, with respect to the 3D model that is based on the received 3D medical image.
  • the different pressures simulate the actual tactile feeling the physician experiences during a real image-guided procedure and reflect the actual reaction of the patient's tissues to the manipulation of the dummy interventional instrument 52.
  • the image display device 58 displays a real time feedback image as transferred from the simulation module (not shown).
  • the real time feedback image represents a visual image as seen if an interventional instrument was inserted into the organ of the patient which is about to be operated on.
  • the visual image is an accurate and realistic simulation of the visual data that would be received from the related organ.
  • the simulation module and the anatomy-model generation unit 3 are supported by a processor such as an Intel Pentium Core-Duo, with an nVidia GeForce-6+ (6600 onwards) GPU.
  • the simulation module 55, through the processor, is utilized to prepare simulated organ visual images as displayed on the screen during the operative image-guided procedure.
  • the visual feedback is rendered for simulating a visual display of the organ during the simulated image-guided procedure, as shown in FIG. 5, which is a simulated fluoroscopic image of carotid stenting.
  • the simulation module 55 simulates a number of vascular tracts, according to the received 3D medical image.
  • the simulation module 55 receives navigation signals from the motion detectors 53 , which are located along the enclosure cavity.
  • the simulation module 55 uses the processor to calculate the position of the dummy interventional instrument 52 within the enclosure cavity according to the navigation signals and updates the visual image of the organ, as described above, with the instantaneous respective position of the dummy interventional instrument 52 . Moreover, the simulation module 55 simulates realistic interaction between the simulated instrument, such as a catheter, and the simulated anatomy, including—but not limited to—catheter twist and bend, vessel flexing and optionally vessel rupture.
  • the simulation module 55 also instructs the components of the force feedback 54 to impart pressure on the dummy interventional instrument 52 in a manner that simulates the instantaneous tactile feedback of the procedure.
  • Such visual images and tactile feedback simulate the actual feedback as received during an actual medical procedure as performed on an actual subject and therefore reflect to the physician the current location and bending of the interventional instrument along the simulated organ.
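  • The update cycle described above can be pictured with the schematic sketch below; it is not the patent's code, and the device and display interfaces (read_translation_counts, apply_resisting_force, render_fluoroscopy_view) are hypothetical stand-ins for the motion detectors 53, the force feedback mechanism 54 and the image display device 58.

    import numpy as np

    def simulation_step(device, display, centerline, radii, depth):
        """One cycle: encoder counts become a new insertion depth along the vessel
        centerline, the fluoroscopy view is redrawn, and a resisting force is
        returned to the dummy catheter."""
        advance = device.read_translation_counts() * 0.1   # illustrative mm-per-count scale
        depth = float(np.clip(depth + advance, 0, len(centerline) - 1))
        tip = centerline[int(depth)]
        # Illustrative haptic rule: a narrower vessel resists the catheter more strongly.
        device.apply_resisting_force(1.0 / max(float(radii[int(depth)]), 1e-3))
        display.render_fluoroscopy_view(tip)
        return depth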
  • the pre-operative simulator 1 is not bound to the simulation of a particular organ, such as a vascular tract, but can reflect a visual display of various elements and organs relative to the instantaneous position of the interventional instrument. Simulators of image-guided procedures are not described here in greater detail as they are generally well known and already comprehensively described in the incorporated patents and in publications known to those skilled in the art.
  • the pre-operative simulator 1 is designed to allow a physician to conduct a pre-operative surgical simulation of the image-guided procedure he or she is about to perform on a certain patient.
  • the physician refers the certain patient to a medical imaging system for acquiring a 3D medical image of an organ that is about to be operated on.
  • the acquired 3D medical image is then forwarded to the PACS server. Later on, the acquired 3D medical image is obtained by the pre-operative simulator 1 from the PACS server.
  • the 3D medical image is used as the basis for a 3D anatomical model of the organ.
  • the 3D anatomical model is generated by a segmentation unit that is designed for segmenting the organ into a number of areas, as described in greater detail above.
  • Such a pre-operative simulator 1 can also be used for explaining and demonstrating to the patient the details of his pathology and the operation he is about to undergo.
  • the pre-operative simulator 1 can also be used as a learning tool.
  • Known simulators are designed to simulate an image-guided procedure on a predefined model of a virtual organ. As the simulated organ is a virtual organ, the trainee cannot gain experience in diagnosing a real patient in a manner that allows him to receive a more comprehensive overview of the related case.
  • the pre-operative simulator 1 allows the performance of patient-specific simulations of real anatomy, as described above. As such, the pre-operative simulator 1 can be used for teaching a very real case, with real anatomy, lesions, problems, conflicts and resolutions. Physicians can experience a more realistic image-guided procedure, and decisions may be taken during the simulated image-guided procedure based on the overall medical history and the medical condition of the patient himself.
  • the pre-operative simulator 1 can also be used as a planning tool.
  • the planning module 51, which is depicted in FIG. 3, is preferably connected to the image display device 58 or to any other display device and to a user interface.
  • the planning module 51 supports tools for allowing physicians to plan an operative image-guided procedure according to the aforementioned case-specific simulation.
  • the module preferably allows the physician to sketch and to take notes during the image-guided procedure simulation.
  • the image display device 58 is a touch screen that allows the physician to sketch a track that depicts the maneuvers that he intends to take during the operative image-guided medical procedure.
  • the physician can mark problematic areas of the depicted organ.
  • the image-guided procedure simulation is an angioplasty procedure simulation. The physician can use the touch screen to sketch the boundaries of the tract through which he intends to perform the procedure or a portion thereof.
  • the pre-operative simulator 1 can also be used as an analyzer tool for going back over performed operations.
  • the model of the operated organ is generated according to a medical image of an organ which is about to be operated on.
  • the pre-operative simulator 1 is used for performing a reenactment of the image-guided procedure that has been performed on the patient. Such a reenactment is performed as an image-guided procedure simulation, as described above.
  • as the model that is used by the pre-operative simulator 1 simulates the operated-on organ, the reenactment is realistic and allows the physicians to be better prepared for the operation.
  • FIG. 6 is a flowchart of a method for performing a simulated image-guided procedure, according to one embodiment of the present invention.
  • the method depicted in FIG. 6 allows a physician to conduct a clinical pre-operative simulation of the image-guided procedure he or she is about to perform. Such a simulation allows the physician to make safe and unrushed clinical decisions based on a 3D medical image of the patient that is about to be operated on.
  • a 3D medical image depicting an organ of a patient is obtained.
  • the 3D medical image has been taken using a medical imaging system and obtained, for example via a PACS server or a portable memory device, as described above.
  • the 3D medical image depicts an organ of a patient that is about to be operated on.
  • a 3D model of the anatomy is produced according to the received 3D medical image.
  • the 3D model defines the boundaries of areas in the anatomy such as a certain tract.
  • a simulation of an image-guided procedure on the patient is held according to the 3D model that has been constructed in the previous step.
  • the simulation of the image-guided procedure allows a physician to prepare himself for the operative image-guided procedure. Based on the simulation, the physician can choose the most suitable angles and tools. Furthermore, the user can mark pitfalls, such as hard-to-navigate zones or misleading view angles, in advance.
  • the physician can choose, in advance, the size and the type of the catheter, the balloon, and the stent he is going to use during the operation. Moreover, gaining acquaintance with the specific anatomy of the patient in advance may result in reducing contrast injection and X-ray exposure.
  • the duration of the X-ray exposure periods depends on the time it takes the physician to maneuver the catheter in the relevant anatomy region. If the physician already simulated the angioplasty procedure using the aforementioned system, he is already familiar with the specific region and therefore can easily maneuver the catheter during the actual angioplasty procedure.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Theoretical Computer Science (AREA)
  • Radiology & Medical Imaging (AREA)
  • Surgery (AREA)
  • Software Systems (AREA)
  • Animal Behavior & Ethology (AREA)
  • Robotics (AREA)
  • Geometry (AREA)
  • Veterinary Medicine (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Pathology (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)

Abstract

An apparatus for simulating an image-guided procedure. The system comprises an input for receiving a three-dimensional (3D) medical image depicting an organ of a patient, a model generation unit for generating a 3D anatomical model of the organ according to the 3D medical image, and a simulating unit for simulating a planned image-guided procedure on the patient, according to the 3D anatomical model.

Description

    FIELD AND BACKGROUND OF THE INVENTION
  • The present invention relates to an apparatus and a method for performing a simulated image-guided medical procedure and, more particularly, but not exclusively to performing a simulated image-guided procedure according to a three-dimensional (3D) model of an organ that is based on a 3D medical image.
  • Medical imaging is generally recognized as important for diagnosis and patient care with the goal of improving treatment outcomes. In recent years, medical imaging has experienced an explosive growth due to advances in imaging modalities such as x-rays, computed tomography (CT), magnetic resonance imaging (MRI) and ultrasound. These modalities provide noninvasive methods for studying internal organs in vivo, but the amount of data is relatively large and when presented as two dimensional (2D) images, it generally requires an anatomist/radiology specialist for interpretation. Unfortunately, the cost incurred in manual interpretation of this data is prohibitive for routine data analysis. The 2D slices can be combined to generate a 3-D volumetric model.
  • Such medical imaging systems allow the performance of minimally invasive therapeutic procedures. These procedures are typically carried out in a CathLab, where a physician wishes to assess the functions of internal organs such as the heart and the coronary arteries or to perform procedures such as coronary angioplasty.
  • Most radiology yields recorded images such as 2D X-ray films or 3D medical images such as CT and MRI scans. Mild-dosage, interactively controlled X-ray, also known as fluoroscopy, allows a physician to actively monitor an operation in progress. Interventional radiology is the specialty in which radiologists and cardiologists utilize real-time radiological images to perform therapeutic and diagnostic procedures. Interventional radiologists currently rely on the real-time fluoroscopic 2D images, available as analog video or digital information viewed on video monitors.
  • However, these procedures involve delicate and coordinated hand movements, spatially unrelated to the view on a video monitor of the remotely controlled surgical instruments. Depth perception is lacking on the flat video display and therefore it is not an easy task to learn to control the tools through the spatially arbitrary linkage. A mistake in this difficult environment can be dangerous. Therefore, a high level of skill is required, and a realistic training of these specialists is a complex task. In addition, usually there is no direct engagement of the depth perception of the radiologist, who must make assumptions about the patient's anatomy to deliver therapy and assess the results.
  • Medical simulators that can be used to train such medical specialists have significant potential in reducing healthcare costs through improved training, better pre-treatment planning, and more economic and rapid development of new medical devices. Hands-on experience becomes possible in training, before direct patient involvement, which carries a significant risk.
  • Image-guided procedures, such as vascular catheterization, angioplasty, and stent placement, are especially suited for simulation because they typically place the physician at a distance from the operative site, manipulating surgical instruments and viewing the procedure on video monitors.
  • For example, U.S. Pat. No. 6,062,866 published on May 16, 2000 describes a medical model for teaching and demonstrating invasive medical procedures such as angioplasty. The model is a plastic, transparent three-dimensional, anatomically correct representation of at least a portion of the vascular system and in a preferred embodiment would include the aorta, coronary artery, subclavian arteries, pulmonary artery and renal arteries each defining a passageway or lumen. An access port is provided so that actual medical devices, such as a guide and catheter may be inserted to the location-simulated blockage. Fluid may also be introduced to simulate realistically in vivo conditions. Simulated heart chambers of similar construction may also be attached to the aortic valve to enhance further the representation of invasive procedures.
  • More complex simulation systems that provide more accurate, linked visual and tactile feedback during training are disclosed, for example, in U.S. Patent Application No. 2003/0069719, published Apr. 10, 2003, which describes an interface device and method for interfacing instruments to a vascular access simulation system. The interface device and method serve to interface peripherals in the form of mock or actual medical instruments to the simulation system to enable simulation of medical procedures. The interface device includes a catheter unit assembly for receiving a catheter needle assembly, and a skin traction mechanism to simulate placing skin in traction or manipulating other anatomical sites for performing a medical procedure. The catheter needle assembly and skin traction mechanism are manipulated by a user during a medical procedure. The catheter unit assembly includes a base, a housing, a bearing assembly and a shaft that receives the catheter needle assembly. The bearing assembly enables translation of the catheter needle assembly, and includes bearings that enable the shaft to translate in accordance with manipulation of the catheter needle assembly. The shaft typically includes an encoder to measure translational motion of a needle of the catheter needle assembly, while the interface device further includes encoders to measure manipulation of the catheter needle assembly in various degrees of freedom and the skin traction mechanism. The simulation system receives measurements from the interface device encoders and updates the simulation and display, while providing control signals to the force feedback device to enable application of force feedback to the catheter needle assembly.
  • Another example of a simulating system that is designed to simulate an image-guided procedure according to a predefined and fixed model is disclosed in U.S. Pat. No. 6,538,634 published on Mar. 25, 2003.
  • These simulation systems and other known simulation systems are based on predefined models, which are acquired and enhanced before the systems become operational or during a maintenance thereof, such as updating the system. Usually, a library that comprises virtual models which are stored in a related database is connected to the simulation system. During the operational mode, the system simulates an image-guided procedure according to one of the virtual models that has been selected by the system user.
  • Though such systems allow physicians and trainees to simulate image-guided procedures, the simulated image-guided procedures are modeled according to predefined or randomly changed models of an organ, a human body system, or a section thereof. As such, the physician or the trainee is trained using a model of a virtual organ that is not identical to the organ that he or she is about to perform an operative image-guided procedure on.
  • Moreover, when a virtual model is used, the simulation system cannot be used for accurately simulating an operation that has been performed on a real patient. Therefore, the currently used simulation systems cannot be used for going back over an operation that went wrong or for didactic purposes.
  • There is thus a widely recognized need for, and it would be highly advantageous to have, a system for simulating image-guided procedures, devoid of the above limitations, that can simulate in a more realistic manner the image-guided procedure that the physician is about to perform.
  • SUMMARY OF THE INVENTION
  • According to one aspect of the present invention there is provided an apparatus for simulating an image-guided procedure. The apparatus comprises an input for receiving a three-dimensional (3D) medical image depicting an organ of a patient, a model generation unit configured for generating a 3D anatomical model of the organ according to the 3D medical image, and a simulating unit configured for simulating an image-guided procedure planned for the patient according to the 3D anatomical model.
  • Preferably, the apparatus further comprises a segmentation unit operatively connected to the model generation unit, the segmentation unit being configured for segmenting the organ in the 3D medical image to a plurality of areas, the segmented organ image being used for generating the 3D anatomical model.
  • Preferably, the 3D anatomical model is a model of a tract.
  • More preferably, the tract is a member of the following group: a vascular tract, a urinary tract, a gastrointestinal tract, and a fistula tract.
  • Preferably, the 3D medical image is a member of the following group: computerized tomography (CT) scan images, magnetic resonance imager (MRI) scan images, ultrasound scan images, and positron emission tomography (PET)-CT scan images.
  • Preferably, the planned image-guided procedure is an angioplasty procedure.
  • Preferably, the apparatus further comprises a user interface operatively connected to the model generation unit, the user interface allows a user to instruct the model generation unit during the generation of the 3D anatomical model.
  • Preferably, the simulated planned image-guided procedure is used as a study case during a learning process.
  • Preferably, the simulated planned image-guided procedure is used to demonstrate a respective image-guided procedure to the patient.
  • Preferably, the simulated planned image-guided procedure is used to document preparation for an operation.
  • Preferably, the input is configured for receiving a four dimensional (4D) medical image depicting the organ during a certain period, the model generation unit configured for generating a 4D organ model of the organ according to the 4D medical image, the simulating unit configured for simulating an image-guided procedure planned for the patient according to the 4D organ model.
  • Preferably, the organ is a member of a group comprising: an anatomical region, a human body system, an area of an organ, a number of areas of an organ, a section of an organ, and a section of a human body system.
  • According to one aspect of the present invention there is provided a method for performing a simulated image-guided procedure. The method comprises the following steps: a) obtaining a three-dimensional (3D) medical image depicting an organ of a patient, b) producing a 3D anatomical model of the organ according to the 3D medical image, and c) simulating an image-guided procedure planned for the patient according to the 3D model.
  • Preferably, the method further comprises a step a1) between step a) and b) of segmenting the organ in the 3D medical image to a plurality of areas, the producing of step b) is performed according to the segmented 3D medical image.
  • Preferably, the planned image-guided procedure is an angioplasty procedure.
  • Preferably, the producing comprises a step of receiving generation instructions from a system user, the generation instructions being used for defining the 3D model.
  • Preferably, the simulating comprises displaying the organ.
  • More preferably, the method further comprises a step of allowing a system user to mark labels for the planned image-guided procedure according to the display.
  • Preferably, the planned image-guided procedure is an angioplasty procedure.
  • Preferably, the simulation is a pre-operative surgical simulation.
  • Preferably, the 3D anatomical model is a model of a tract.
  • Preferably, the 3D anatomical model is a tract model.
  • More preferably, the tract model defines a member of the following group: a vascular tract, a urinary tract, a gastrointestinal tract, and a fistula tract.
  • Preferably, the obtaining comprises a step of obtaining a four dimensional (4D) medical image depicting the organ during a certain period, the producing comprises a step of producing a 4D model of the organ according to the 4D medical image, the simulating is performed according to the 4D model.
  • Preferably, the organ is a member of a group comprising: an anatomical region, a human body system, an area of an organ, a number of areas of an organ, a section of an organ, and a section of a human body system.
  • Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The materials, methods, and examples provided herein are illustrative only and not intended to be limiting.
  • Implementation of the method and system of the present invention involves performing or completing certain selected tasks or steps manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of preferred embodiments of the method and system of the present invention, several selected steps could be implemented by hardware or by software on any operating system of any firmware or a combination thereof. For example, as hardware, selected steps of the invention could be implemented as a chip or a circuit. As software, selected steps of the invention could be implemented as a plurality of software instructions being executed by a computer using any suitable operating system. In any case, selected steps of the method and system of the invention could be described as being performed by a data processor, such as a computing platform for executing a plurality of instructions.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The invention is herein described, by way of example only, with reference to the accompanying drawings. With specific reference now to the drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of the preferred embodiments of the present invention only, and are presented in order to provide what is believed to be the most useful and readily understood description of the principles and conceptual aspects of the invention. In this regard, no attempt is made to show structural details of the invention in more detail than is necessary for a fundamental understanding of the invention, the description taken with the drawings making apparent to those skilled in the art how the several forms of the invention may be embodied in practice.
  • In the drawings:
  • FIG. 1 is a schematic representation of a pre-operative simulator for simulating an image-guided procedure, according to one preferred embodiment of the present invention;
  • FIG. 2A is a graphical representation of the Hounsfield scale, which measures attenuation of X-Ray radiation by a medium. Hounsfield values of different human tissues are marked;
  • FIGS. 2B and 2C respectively illustrate schematically two triangular surface models of a femur bone, one directly generated from scan data, and a coarsened variant of the segment in FIG. 2B which is generated according to one preferred embodiment of the present invention;
  • FIG. 3 is a schematic representation of the pre-operative simulator of FIG. 1 with a detailed description of the simulating unit, according to one embodiment of the present invention;
  • FIG. 4 is an exemplary illustration of the pre-operative simulator of FIG. 3, according to an embodiment of the present invention;
  • FIG. 5 is an exemplary illustration of a screen display taken during the simulation of an image-guide procedure, according to an embodiment of the present invention; and
  • FIG. 6 is a flowchart of a method for performing a pre-operative simulation of an image-guided procedure, according to a preferred embodiment of the present invention.
  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • The present embodiments comprise an apparatus and a method for simulating an image-guided procedure. According to one embodiment of the present invention, the apparatus and the method allow a physician to set a pre-operative simulation of an image-guided procedure. The pre-operative simulation simulates an image-guided procedure that is about to be performed on a certain patient. In order to allow such a case-specific simulation, a 3D medical image that depicts an anatomical region of a certain patient who is about to be operated on is acquired and 3D anatomical models are generated based thereupon. Preferably, the 3D anatomical model defines the boundaries of a certain anatomy or an organ such as a vascular tract. During the pre-operative simulation, the 3D anatomical models are used for simulating an image-guided procedure on that region.
  • The principles and operation of an apparatus and method according to the present invention may be better understood with reference to the drawings and accompanying description.
  • Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not limited in its application to the details of construction and the arrangement of the components set forth in the following description or illustrated in the drawings. The invention is capable of other embodiments or of being practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein is for the purpose of description and should not be regarded as limiting.
  • A 3D medical image may be understood as a sequence of CT scan images, a sequence of MRI scan images, a sequence of PET-CT scan images, a spatial image, etc.
  • A medical imaging system may be understood as an MRI imaging system, a CT imaging system, a PET-CT imaging system, etc.
  • An organ or an anatomical region may be understood as human body organ, a human body system, an area of an organ, a number of areas of an organ, a section of an organ, a section of a human body system, etc.
  • Reference is now made to FIG. 1, which is a schematic representation of a pre-operative simulator 1 for simulating an image-guided procedure, according to one preferred embodiment of the present invention. The pre-operative simulator 1 comprises an input unit 2 for obtaining a 3D medical image that depicts an anatomical region of a patient and an anatomy-model generation unit 3 that is designed for generating a 3D anatomical model of an organ according to the received 3D medical image. The pre-operative simulator 1 further comprises a simulating unit 4 for simulating an image-guided procedure according to the three-dimensional model, as described below.
  • The input unit 2 preferably allows the pre-operative simulator 1 to fetch the 3D medical image from a medical image server, such as a picture archiving and communication system (PACS) server, before it is accessed by the physicians. The PACS server comprises a number of computers, which are dedicated to storing, retrieving, distributing, and presenting the stored 3D medical images. The 3D medical images are stored in a number of formats. The most common format for image storage is digital imaging and communications in medicine (DICOM). Preferably, the fetched 3D medical image is represented in a 3D array, typically of 512×512×150 voxels.
  • In one embodiment, the input unit 2 receives as input a raw 3D data array, composed as a pre-fetched and pre-parsed DICOM image. The segmentation is not limited to a specific modality. Preferably, the 3D medical image is a CT scan. In such an embodiment, each voxel is represented by a single measured value, physically corresponding to the degree of X-ray attenuation of a respective location in the depicted organ. Preferably, the data acquisition modality is CT-angiography (CTA).
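  • By way of illustration only, the following sketch shows how a pre-fetched DICOM series might be parsed into such a raw 3D data array of attenuation values. It is not the claimed apparatus; it assumes the third-party pydicom and numpy packages, that all slices of the series sit in one directory, and that every slice carries the ImagePositionPatient, RescaleSlope and RescaleIntercept tags.

    import numpy as np
    import pydicom
    from pathlib import Path

    def load_ct_volume(dicom_dir):
        # Read every slice of the series and order the slices along the scan axis.
        slices = [pydicom.dcmread(p) for p in Path(dicom_dir).glob("*.dcm")]
        slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))
        volume = np.stack([s.pixel_array.astype(np.int16) for s in slices])
        # Convert stored pixel values to Hounsfield units via the DICOM rescale tags.
        slope = float(slices[0].RescaleSlope)
        intercept = float(slices[0].RescaleIntercept)
        return volume * slope + intercept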
  • The input unit 2 may be adjusted to receive the 3D medical image from a PACS workstation, a computer network, or a portable memory device such as a DVD, a CD, a memory card, etc.
  • The received 3D medical image is forwarded to the anatomy-model generation unit 3 that is designed for generating the 3D anatomical model, as described above. Preferably, the anatomy-model generation unit 3 comprises a 3D image segmentation unit that is used for segmenting the received 3D medical image into anatomical structures. The segmentation is performed either automatically or semi-automatically. In one embodiment, a standard automatic segmentation procedure is used for segmenting the image.
  • Preferably, the segmentation is based on a procedure in which relevant voxels of the received raw 3D data array are isolated. For example, if the raw 3D data array is based on a CT scan, the physical attenuation is scaled in Hounsfield units (HUs), where the value −1000 HU is associated with air and the value 0 HU is associated with water, as shown in FIG. 2A. On such a scale, different tissue types have different typical HU ranges. The typical attenuation of a specific tissue is used to isolate it in a 3D array of CT data. For example, the value of voxels that depict lungs is usually between −550 HU and −450 HU, and the value of voxels that depict bones is approximately between 450 HU and 1000 HU.
  • In such an embodiment, the HU values of the voxels of the 3D medical image are used for isolating the voxels in the tissue of interest. Preferably, in order to improve the precision of the segmentation procedure, intravenous contrast enhancement (ICE) components, such as barium, iodine, or any other radiopharmaceutical component, are applied when the 3D medical image is taken. The ICE components increase the HU value of blood vessels to the HU value of bones and sometimes beyond. Such an increment results in a higher contrast between the vessel voxels and their surroundings, which can improve the segmentation procedure. Preferably, the segmentation procedure is adapted to segment a subset of scanned voxels from the 3D medical image, wherein the stored values of the voxels in the subset are in a predefined range. In one embodiment, all the voxels with stored values in the range of blood vessels are segmented and tagged.
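  • As a minimal sketch of such HU-range isolation, and assuming a numpy array of HU values such as the one produced by the loading sketch above, the following thresholding could be used; the ranges are the illustrative figures quoted above or assumed values, not calibrated clinical thresholds:

    import numpy as np

    def segment_by_hu_range(volume_hu, lo_hu, hi_hu):
        # Boolean mask of voxels whose Hounsfield value falls inside [lo_hu, hi_hu].
        return (volume_hu >= lo_hu) & (volume_hu <= hi_hu)

    lung_mask = segment_by_hu_range(ct_volume, -550, -450)    # illustrative lung range
    vessel_mask = segment_by_hu_range(ct_volume, 200, 600)    # assumed contrast-enhanced vessel range
    bone_mask = segment_by_hu_range(ct_volume, 450, 1000)     # illustrative bone range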
  • In one embodiment of the present invention, a triangle mesh is computed from the raw 3D data array of the HU values. Preferably, a variant of the marching cubes algorithm is used for initially generating the triangle mesh; see "Marching Cubes: A High Resolution 3D Surface Construction Algorithm", William E. Lorensen and Harvey E. Cline, Computer Graphics (Proceedings of SIGGRAPH '87), Vol. 21, No. 4, pp. 163-169. The triangle mesh is used for surface construction of segments in the 3D medical image. The mesh obtained by the variant of the marching cubes algorithm bounds the desired volume of the segment. As the segment is obtained in the resolution of the 3D medical image, it may be extremely fine. Therefore, preferably, an additional decimation processing stage is carried out, in which the mesh is coarsened and the level of surface approximation of the segments is reduced.
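  • A minimal sketch of this surface-construction step, assuming a recent scikit-image and a binary mask such as the ones above, might look as follows; it is an off-the-shelf stand-in for the marching-cubes variant referenced in the text rather than a reproduction of it:

    import numpy as np
    from skimage import measure

    def surface_mesh_from_mask(mask, voxel_spacing=(1.0, 1.0, 1.0)):
        # level=0.5 places the isosurface midway between segmented (1) and background (0) voxels.
        verts, faces, normals, values = measure.marching_cubes(
            mask.astype(np.float32), level=0.5, spacing=voxel_spacing)
        return verts, faces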
  • Preferably, an edge-collapse operation is used for the coarsening; see Hoppe, H., "Progressive meshes", in Proc. SIGGRAPH '96, pages 99-108, August 1996, and Hussain, M., Okada, Y. and Niijima, K., "Fast, simple, feature-preserving and memory efficient simplification of triangle meshes", International Journal of Image and Graphics, 3(4):1-18, 2003. An example of such decimation is depicted in FIGS. 2B and 2C, which respectively depict a schematic illustration of a segmented femur bone and a coarsened variant of the segmented femur bone that has been generated by applying the aforementioned decimation processing. Preferably, the 3D medical image is represented in a 3D array of 512×512×150 voxels, wherein each voxel is preferably represented by a value in one of the following formats: 8-bit (1-byte storage), 12-bit (2-byte storage), 16-bit (2-byte storage), and single-precision floating point (4-byte storage).
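  • The decimation stage can be illustrated, under the assumption that the Open3D library is available, with its quadric-error simplification, which collapses edges until a target triangle budget is met; this is an off-the-shelf stand-in for the edge-collapse schemes cited above, not the implementation of the preferred embodiment:

    import numpy as np
    import open3d as o3d

    def coarsen_mesh(verts, faces, target_triangles):
        # Wrap the raw arrays in an Open3D mesh and collapse edges down to the requested budget.
        mesh = o3d.geometry.TriangleMesh(
            o3d.utility.Vector3dVector(np.asarray(verts, dtype=np.float64)),
            o3d.utility.Vector3iVector(np.asarray(faces, dtype=np.int32)))
        return mesh.simplify_quadric_decimation(target_number_of_triangles=target_triangles)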
  • Preferably, the segmentation procedure is adapted to segment the anatomy that is depicted in the received 3D medical image. Different anatomic parts have different characteristics that affect segmentation.
  • During image-guided procedures, a catheter or the like is conveyed by a physician via a certain tract. Therefore, the objective of the segmentation procedure is to identify such a tract and to segment it, or to segment all the areas that delimit that tract.
  • For example, if the received 3D medical image depicts a cervical portion of the human spine and the image-guided procedure is an angioplasty procedure, such as carotid stenting, the carotid artery is the tract through which the catheter or the like is conveyed. In such a case, the carotid artery should be segmented. The arterial network possesses a priori known traits that can be exploited to enhance and verify the fidelity of the segmentation stage. For example, if the area is a cervical portion and the procedure is carotid stenting, the following anatomical structures are exploited: the thoracic aorta, the brachiocephalic trunk, the subclavian arteries, the carotid arteries, and the vertebral arteries.
  • Preferably, blood vessels in the image of the organ are identified and segmented during the segmentation procedure. Preferably, during the segmentation, the centerline, the radius, and the inter-connectivity of each one of the main blood vessels in the image are identified and registered.
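  • One possible sketch of extracting a centerline and a per-point radius estimate from a binary vessel mask, assuming scipy and scikit-image (the skeletonize_3d helper), is given below; branch inter-connectivity would require an additional graph analysis of the skeleton, which is omitted here:

    import numpy as np
    from scipy import ndimage
    from skimage.morphology import skeletonize_3d

    def centerline_and_radius(vessel_mask, voxel_size_mm=1.0):
        # Distance from every in-vessel voxel to the nearest background voxel (distance to the wall).
        dist_to_wall = ndimage.distance_transform_edt(vessel_mask) * voxel_size_mm
        skeleton = skeletonize_3d(vessel_mask.astype(np.uint8)) > 0
        centerline_voxels = np.argwhere(skeleton)
        radii = dist_to_wall[skeleton]   # wall distance along the skeleton approximates the local radius
        return centerline_voxels, radii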
  • Preferably, the anatomy-model generation unit 3 is connected to a user interface (not shown). In such an embodiment, a simulator user may be asked, for example, to mark one or more points on a depicted tract. For example, if the received 3D medical image depicts a cervical portion of the human spine and the image-guided procedure is an angioplasty procedure, the simulator user may be required to mark the left carotid artery as a starting point for the automatic segmentation.
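  • A user-marked point of this kind can seed the automatic segmentation. The sketch below, assuming scipy, keeps only the connected set of in-range voxels that contains the marked voxel; the helper name and the use of a simple connected-component pass are illustrative assumptions rather than the method of any particular embodiment:

    import numpy as np
    from scipy import ndimage

    def grow_from_seed(volume_hu, seed_voxel, lo_hu, hi_hu):
        # Threshold first, then keep the connected component that contains the user-marked seed.
        in_range = (volume_hu >= lo_hu) & (volume_hu <= hi_hu)
        labels, _ = ndimage.label(in_range)
        seed_label = labels[tuple(seed_voxel)]
        if seed_label == 0:
            raise ValueError("the marked voxel is outside the requested intensity range")
        return labels == seed_label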
  • When the segmentation process is completed, a segmented version of the 3D image, or an array that represents the segmented areas and the tracts, is generated. The segmented areas can be represented in several formats and sets of data. Preferably, the segmented 3D image is represented by using one or more of the following sets of data (an illustrative sketch of such sets follows the list below):
    • a. A cubic Catmull-Rom 3D spline description of a central curve of each artery or any other tract portion;
    • b. A tree description, graph description or any other description that describes the connectivity between arteries or any other tract portions. For example, such a description describes at which point an artery Y emanates from an artery X;
    • c. A cubic Catmull-Rom 2D spline description of the radius of each artery at each point on its central curve;
    • d. A triangular surface mesh that describes the surface of the vasculature anatomy;
    • e. Polygonal meshes describing other organs captured in the scan, such as the lungs, heart, kidneys, etc.; and
    • f. Classification of each raw data voxel to its designated part of anatomy (a vessel voxel, a kidney voxel, etc.).
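  • As noted above, the following is one possible in-memory arrangement of the sets of data (a) through (f); the class and field names are illustrative assumptions rather than a prescribed format:

    from dataclasses import dataclass, field
    from typing import Dict, List, Tuple
    import numpy as np

    @dataclass
    class VesselSegment:
        centerline_ctrl_pts: np.ndarray                       # set (a): Catmull-Rom control points of the central curve
        radius_ctrl_values: np.ndarray                        # set (c): radius value at each point along the curve
        children: List[int] = field(default_factory=list)     # set (b): vessels emanating from this one

    @dataclass
    class SegmentedAnatomy:
        vessels: Dict[int, VesselSegment]                             # the vascular tree, keyed by segment id
        vessel_surface: Tuple[np.ndarray, np.ndarray]                 # set (d): triangle mesh (vertices, faces)
        other_organ_meshes: Dict[str, Tuple[np.ndarray, np.ndarray]]  # set (e): e.g. "lungs", "heart", "kidneys"
        voxel_labels: np.ndarray                                      # set (f): per-voxel anatomical class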
  • The segmented 3D medical image or an array representing segments in the 3D medical image is forwarded to the simulating unit 4.
  • It should be noted that the pre-operative simulator 1 may also be used to simulate an image-guided procedure according to a four-dimensional (4D) image, which is a set of 3D medical images that depicts a certain organ during a certain period. In such an embodiment, a 4D image is received by the input unit 2. The received 4D medical image is forwarded to the anatomy-model generation unit 3 that is designed for generating the 4D model. Preferably, the anatomy-model generation unit 3 comprises a 4D image segmentation unit that is used for segmenting the received 4D medical image into anatomical structures. The segmentation is performed either automatically or semi-automatically. In one embodiment, each one of the 3D medical images that comprise the received 4D medical image is separately segmented, as described above.
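  • Under the same assumptions as the earlier thresholding sketch, and reusing the hypothetical segment_by_hu_range helper introduced there, treating the 4D image as an ordered list of 3D frames and segmenting each frame on its own could be written as:

    def segment_4d(frames_hu, lo_hu, hi_hu):
        # A 4D acquisition is handled here as a list of 3D frames, each segmented independently.
        return [segment_by_hu_range(frame, lo_hu, hi_hu) for frame in frames_hu]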
  • Reference is now made to FIG. 3, which is a block diagram representing the pre-operative simulator 1, which is depicted in FIG. 1, the components of the simulating unit 4, and a planning module 51, according to one embodiment of the present invention.
  • The simulating unit 4 preferably comprises two subsystems. The first subsystem is an intervention simulator device 50 constituted by a dummy interventional instrument 52, motion detectors 53, a movement calculation unit 57, an image display device 58, and a force feedback mechanism 54. The second subsystem is a simulation module 55 that has the functions of receiving inputs from the motion detectors 53, analyzing the inputs using the movement calculation unit 57, translating the outcome to visual and tactile outputs, and transferring them to the display device 58 and to the force feedback mechanism 54. The simulation module 55 also has the function of receiving the segmented 3D medical image from the anatomy-model generation unit 3, wherein the received segmented 3D medical image is already translated to a 3D model that simulates the organ that is depicted in the segmented 3D medical image. As described above, the segmented 3D medical image is based on a 3D medical image that is received from the actual patient who is about to be operated on.
  • Reference is now made to FIG. 4, which is an exemplary illustration of the aforementioned pre-operative simulator 1 for simulation of an image-guided procedure, according to an embodiment of the present invention. The dummy interventional instrument 52 and the image display device are as in FIG. 3; however, FIG. 4 further depicts an enclosure 62, a computer processor 64, and a user input interface 65. In use, a physician prepares himself for the operative image-guided procedure by manipulating the dummy interventional instrument 52, which is preferably a dummy catheter. The dummy interventional instrument 52 is inserted into a cavity 66 within an enclosure 62 that comprises the motion detectors and force feedback components (not shown), such as resisting force generators, of the force feedback mechanism (not shown). As the physician manipulates the dummy interventional instrument 52, tactile and visual feedbacks are determined according to the position of the dummy interventional instrument 52 within the enclosure 62 with respect to the aforementioned 3D model of the simulated organ. Visual feedback is provided in the form of a display on the image display device 58, and tactile feedback is provided by the force feedback components within the enclosure 62. The visual and tactile feedbacks, which are respectively displayed on the image display device 58 and imparted on the dummy interventional instrument 52, are designed to improve the technical and operational skills of the physician. The visual feedback is given by the display device 58, which displays a sequence of consecutive images that are based on a 3D model derived from the received 3D medical image. The tactile feedback is given by imparting different pressures on the dummy interventional instrument, according to the movement signals received from the imaging simulation module and with respect to the 3D model that is based on the received 3D medical image. The different pressures simulate the actual tactile feeling the physician experiences during a real image-guided procedure and reflect the actual reaction of the patient's tissues to the manipulation of the dummy interventional instrument 52.
  • The image display device 58 displays a real-time feedback image as transferred from the simulation module (not shown). The real-time feedback image represents the visual image that would be seen if an interventional instrument were inserted into the organ of the patient who is about to be operated on. The visual image is an accurate and realistic simulation of the visual data that would be received from the related organ.
  • Preferably, the simulation module and the anatomy-model generation unit 3 are supported by a processor such as an Intel Pentium Core-Duo, with an nVidia GeForce-6+ (6600 onwards) GPU.
  • Reference is now made, once again, to FIG. 3. The simulation module 55, through the processor, is utilized to prepare simulated visual images of the organ such as those displayed on the screen during the operative image-guided procedure. The visual feedback is rendered for simulating a visual display of the organ during the simulated image-guided procedure, as shown in FIG. 5, which is a simulated fluoroscopic image of carotid stenting. Preferably, the simulation module 55 simulates a number of vascular tracts, according to the received 3D medical image. At the same time, the simulation module 55 receives navigation signals from the motion detectors 53, which are located along the enclosure cavity. The simulation module 55 uses the processor to calculate the position of the dummy interventional instrument 52 within the enclosure cavity according to the navigation signals and updates the visual image of the organ, as described above, with the instantaneous respective position of the dummy interventional instrument 52. Moreover, the simulation module 55 simulates realistic interaction between the simulated instrument, such as a catheter, and the simulated anatomy, including, but not limited to, catheter twist and bend, vessel flexing, and optionally vessel rupture.
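  • The coupling between the calculated instrument position and the haptic output can be illustrated with the toy mapping below; the nearest-centerline lookup, the linear contact law, and the parameter names are assumptions made for illustration only and do not describe the force model of any particular embodiment:

    import numpy as np

    def haptic_resistance(tip_pos, centerline_pts, radii, stiffness=1.0):
        # Distance from the simulated catheter tip to every sampled centerline point.
        d = np.linalg.norm(centerline_pts - np.asarray(tip_pos), axis=1)
        i = int(np.argmin(d))                 # nearest centerline sample
        clearance = radii[i] - d[i]           # remaining distance to the vessel wall
        # No resistance while the tip stays inside the lumen; push back once it reaches the wall.
        return 0.0 if clearance > 0 else stiffness * (-clearance)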
  • In addition, and in correspondence with the visual information, the simulation module 55 also instructs the components of the force feedback mechanism 54 to impart pressure on the dummy interventional instrument 52 in a manner that simulates the instantaneous tactile feedback of the procedure. Such visual images and tactile feedback simulate the actual feedback received during an actual medical procedure performed on an actual subject and therefore reflect to the physician the current location and bending of the interventional instrument along the simulated organ. Clearly, the pre-operative simulator 1 is not bound to the simulation of a particular organ, such as a vascular tract, but can reflect a visual display of various elements and organs relative to the instantaneous position of the interventional instrument. Simulators of image-guided procedures are not described here in greater detail as they are generally well known and already comprehensively described in the incorporated patents and in publications known to those skilled in the art.
  • The pre-operative simulator 1 is designed to allow a physician to conduct a pre-operative surgical simulation of the image-guided procedure he or she is about to perform on a certain patient. In such an embodiment, the physician refers the certain patient to a medical imaging system for acquiring a 3D medical image of an organ that is about to be operated on. The acquired 3D medical image is then forwarded to the PACS server. Later on, the acquired 3D medical image is obtained by the pre-operative simulator 1 from the PACS server. The 3D medical image is used as the basis for a 3D anatomical model of the organ. The 3D anatomical model is generated by a segmentation unit that is designed for segmenting the organ into a number of areas, as described in greater detail above.
  • It should be noted that such a pre-operative simulator 1 can also be used for explaining and demonstrating to the patient the details of his pathology and the operation he is about to undergo.
  • In one embodiment of the present invention, the pre-operative simulator 1 can also be used as a learning tool. Known simulators are designed to simulate an image-guided procedure on a predefined model of a virtual organ. As the simulated organ is a virtual organ, the trainee cannot gain experience in diagnosing a real patient in a manner that allows him to receive a more comprehensive overview of the related case. In contrast, the pre-operative simulator 1 allows the performance of patient-specific simulations of real anatomy, as described above. As such, the pre-operative simulator 1 can be used for teaching a very real case, with real anatomy, lesions, problems, conflicts, and resolutions. Physicians can experience a more realistic image-guided procedure, and decisions may be taken during the simulated image-guided procedure based on the overall medical history and the medical condition of the patient himself.
  • In one embodiment of the present invention, the pre-operative simulator 1 can also be used as a planning tool. The planning module 51, which is depicted in FIG. 3, is preferably connected to the image display device 58 or to any other display device and to a user interface. The planning module 51 supports tools for allowing physicians to plan an operative image-guided procedure according to the aforementioned case-specific simulation. The module preferably allows the physician to sketch and to take notes during the image-guided procedure simulation. Preferably, the image display device 58 is a touch screen that allows the physician to sketch a track that depicts the maneuvers that he intends to take during the operative image-guided medical procedure. Moreover, in such an embodiment, the physician can mark problematic areas of the depicted organ. In one preferred embodiment, the image-guided procedure simulation is an angioplasty procedure simulation. The physician can use the touch screen to sketch the boundaries of the tract through which he intends to perform the procedure or a portion thereof.
  • In one embodiment of the present invention, the pre-operative simulator 1 can also be used as an analysis tool for going back over performed operations. As described above, the model of the operated organ is generated according to a medical image of an organ which is about to be operated on. In one embodiment of the present invention, the pre-operative simulator 1 is used for performing a reenactment of the image-guided procedure that has been performed on the patient. Such a reenactment is performed as an image-guided procedure simulation, as described above. As the model that is used by the pre-operative simulator 1 simulates the operated-on organ, the reenactment is realistic and allows the physicians to be better prepared for the operation.
  • Reference is now made to FIG. 6, which is a flowchart of a method for performing a simulated image-guided procedure, according to one embodiment of present invention.
  • The method depicted in FIG. 6 allows a physician to conduct a clinical pre-operative simulation of the image-guided procedure he or she is about to perform. Such a simulation allows the physician to make safe and unrushed clinical decisions based on a 3D medical image of the patient who is about to be operated on.
  • During the first step, as shown at 201, a 3D medical image depicting an organ of a patient is obtained. The 3D medical image has been taken using a medical imaging system and is obtained, for example, via a PACS server or a portable memory device, as described above. The 3D medical image depicts an organ of a patient that is about to be operated on. During the following step, as shown at 202, a 3D model of the anatomy is produced according to the received 3D medical image. The 3D model defines the boundaries of areas in the anatomy, such as a certain tract. In the following step, as shown at 203, a simulation of an image-guided procedure on the patient is performed according to the 3D model that has been constructed in the previous step. The simulation of the image-guided procedure allows a physician to prepare himself for the operative image-guided procedure. Based on the simulation, the physician can choose the most suitable angles and tools. Furthermore, the user can mark pitfalls, such as hard-to-navigate zones or misleading view angles, in advance.
  • For example, if the simulated image-guided procedure is an angioplasty procedure, the physician can choose, in advance, the size and the type of the catheter, the balloon, and the stent he is going to use during the operation. Moreover, gaining acquaintance with the specific anatomy of the patient in advance may result in reduced contrast injection and X-ray exposure. In an angioplasty procedure, for example, the duration of the X-ray exposure periods depends on the time it takes the physician to maneuver the catheter in the relevant anatomical region. If the physician has already simulated the angioplasty procedure using the aforementioned system, he is already familiar with the specific region and therefore can easily maneuver the catheter during the actual angioplasty procedure.
  • It is expected that during the life of this patent many relevant devices and systems will be developed and the scope of the terms herein, particularly of the terms a 3D model, an imaging device, a simulating unit, motion detectors, a 3D medical image, and an image-guided procedure are intended to include all such new technologies a priori.
  • It is appreciated that certain features of the invention, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the invention, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable subcombination.
  • Although the invention has been described in conjunction with specific embodiments thereof, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, it is intended to embrace all such alternatives, modifications and variations that fall within the spirit and broad scope of the appended claims. All publications, patents, and patent applications mentioned in this specification are herein incorporated in their entirety by reference into the specification, to the same extent as if each individual publication, patent or patent application was specifically and individually indicated to be incorporated herein by reference. In addition, citation or identification of any reference in this application shall not be construed as an admission that such reference is available as prior art to the present invention.

Claims (21)

1. An apparatus for simulating an image-guided procedure, comprising:
an input unit to receive a three-dimensional (3D) medical image specific to an actual patient undergoing a specific medical procedure obtained by a medical imaging system depicting an anatomical region of the patient undergoing the specific medical procedure wherein the medical image is obtained after administering an intravenous contrast enhancement (ICE) component to the patient in order to improve precision of an automatic 3D segmentation process related to a soft tissue;
a 3D segmentation unit to perform the automatic segmentation process on the 3D medical image specific to the patient and for producing a segmented 3D medical image, wherein the automatic segmentation process comprises classification of data voxels according to respective anatomical parts of said anatomical region and registration of said anatomical region;
a model generation unit to generate a 3D anatomical model of said anatomical region, according to said segmented 3D medical image; and
a simulating unit to simulate an image-guided procedure planned for said patient according to said 3D anatomical model.
2. The apparatus of claim 1, wherein the 3D medical image is represented in a 3D data array and the 3D segmentation unit receives as input the 3D data array.
3. The apparatus of claim 1, where said 3D medical image is represented in digital imaging and communication in medicine (DICOM) format and said 3D anatomical model is presented by sets of data comprising a 3D spline description and polygonal meshes representation.
4. The apparatus of claim 1, wherein said 3D anatomical model is a model of a tract and said tract is a member of the following group: a vascular tract, a urinary tract, a gastrointestinal tract, and a fistula tract.
5. The apparatus of claim 1, wherein said 3D medical image is a member of the following group: computerized tomography (CT) scan images, magnetic resonance imager (MRI) scan images, ultrasound scan images, and positron emission tomography (PET)-CT scan images.
6. The apparatus of claim 1, wherein said planned image-guided procedure is an angioplasty procedure.
7. The apparatus of claim 1, further comprising a user interface operatively connected to said model generation unit, said user interface is to accept input data that identifies a location in the 3D medical image.
8. The apparatus of claim 1, wherein said simulated planned image-guided procedure is used as a study case during a learning process.
9. The apparatus of claim 1, wherein said simulated planned image-guided procedure is used to demonstrate a respective image-guided procedure to said patient.
10. The apparatus of claim 1, wherein said simulated planned image-guided procedure is used to document preparation to an operation.
11. The apparatus of claim 1, wherein said input unit is configured for receiving a four dimensional (4D) medical image, which is a set of consecutive 3D medical images that depicts said anatomical region during a time period, said model generation unit is configured for generating a 4D anatomical model according to said 4D medical image, said simulating unit is configured for simulating an image-guided procedure planned for said patient according to said 4D anatomical model.
12. The apparatus of claim 1, wherein said anatomical region is a member of a group comprising: an organ, a human body system, an area of an organ, a number of areas of an organ, a section of an organ, and a section of a human body system.
13. A method for performing a simulated image-guided procedure, said method comprising:
obtaining, by a medical imaging system, a three-dimensional (3D) medical image specific to an actual patient undergoing a specific medical procedure, depicting an anatomical region of the patient undergoing the specific medical procedure, wherein the medical image is obtained after administering an intravenous contrast enhancement (ICE) component to the patient in order to improve precision of an automatic 3D segmentation process related to a soft tissue;
performing, by a computer processor, the automatic 3D segmentation process on the 3D medical image specific to the patient to produce a segmented 3D medical image, wherein the automatic segmentation process comprises classifying data voxels according to respective anatomical parts of said anatomical region and registering said anatomical region;
producing, by the computer processor, a 3D anatomical model of said anatomical region according to said segmented 3D medical image; and
simulating an image-guided procedure planned for said patient according to said 3D anatomical model.
14. The method of claim 13, wherein the 3D medical image is represented in a 3D data array and the 3D segmentation unit receives as input the 3D data array.
15. The method of claim 13, wherein said planned image-guided procedure is an angioplasty procedure.
16. The method of claim 13 comprising:
receiving input data that identifies a location in the 3D medical image in relation to the automatic segmentation process.
17. The method of claim 13, wherein said simulating comprises displaying said 3D anatomical model as a display on an image display device coupled to the computer processor.
18. The method of claim 17, further comprising a step of allowing a system user to mark labels for said planned image-guided procedure according to said display.
19. The method of claim 13, wherein said planned image-guided procedure is an angioplasty procedure.
20. The method of claim 13, wherein said step of simulating is performed as a pre-operative surgical simulation.
21-25. (canceled)
US13/958,954 2007-01-16 2013-08-05 Preoperative surgical simulation Abandoned US20140129200A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/958,954 US20140129200A1 (en) 2007-01-16 2013-08-05 Preoperative surgical simulation

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US88041507P 2007-01-16 2007-01-16
US12/224,314 US8500451B2 (en) 2007-01-16 2008-01-13 Preoperative surgical simulation
PCT/IL2008/000056 WO2008087629A2 (en) 2007-01-16 2008-01-13 Preoperative surgical simulation
US13/958,954 US20140129200A1 (en) 2007-01-16 2013-08-05 Preoperative surgical simulation

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
PCT/IL2008/000056 Continuation WO2008087629A2 (en) 2007-01-16 2008-01-13 Preoperative surgical simulation
US12/224,314 Continuation US8500451B2 (en) 2007-01-16 2008-01-13 Preoperative surgical simulation

Publications (1)

Publication Number Publication Date
US20140129200A1 true US20140129200A1 (en) 2014-05-08

Family

ID=39636461

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/224,314 Active 2029-05-25 US8500451B2 (en) 2007-01-16 2008-01-13 Preoperative surgical simulation
US13/958,954 Abandoned US20140129200A1 (en) 2007-01-16 2013-08-05 Preoperative surgical simulation

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US12/224,314 Active 2029-05-25 US8500451B2 (en) 2007-01-16 2008-01-13 Preoperative surgical simulation

Country Status (4)

Country Link
US (2) US8500451B2 (en)
CN (1) CN101627411B (en)
GB (1) GB2459225B (en)
WO (1) WO2008087629A2 (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104463965A (en) * 2014-12-17 2015-03-25 中国科学院自动化研究所 Training scene simulation system and method for minimally invasive cardiovascular interventional operation
EP3279865B1 (en) 2016-08-01 2018-11-21 3mensio Medical Imaging B.V. Method, device and system for simulating shadow images
US10657671B2 (en) 2016-12-02 2020-05-19 Avent, Inc. System and method for navigation to a target anatomical object in medical imaging-based procedures
EP3247300B1 (en) * 2015-01-09 2020-07-15 Azevedo Da Silva, Sara Isabel Orthopedic surgery planning system
US20200360089A1 (en) * 2017-12-28 2020-11-19 Hutom Co., Ltd. Method for generating surgical simulation information and program
US10849688B2 (en) 2016-03-02 2020-12-01 Truinject Corp. Sensory enhanced environments for injection aid and social training
US10896627B2 (en) 2014-01-17 2021-01-19 Truinjet Corp. Injection site training system
US10902746B2 (en) 2012-10-30 2021-01-26 Truinject Corp. System for cosmetic and therapeutic training
US11042778B2 (en) * 2018-11-29 2021-06-22 International Business Machines Corporation Generating realistic organ x-ray angiography (XA) images for deep learning consumption
US11151721B2 (en) 2016-07-08 2021-10-19 Avent, Inc. System and method for automatic detection, localization, and semantic segmentation of anatomical objects
US11291423B2 (en) 2017-07-14 2022-04-05 Materialise N.V. System and method of radiograph correction and visualization
EP4163872A1 (en) * 2021-10-11 2023-04-12 Koninklijke Philips N.V. Enhanced segmentation
US11710424B2 (en) 2017-01-23 2023-07-25 Truinject Corp. Syringe dose and position measuring apparatus
US12070581B2 (en) 2015-10-20 2024-08-27 Truinject Corp. Injection system

Families Citing this family (79)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6850788B2 (en) 2002-03-25 2005-02-01 Masimo Corporation Physiological measurement communications adapter
US9161696B2 (en) * 2006-09-22 2015-10-20 Masimo Corporation Modular patient monitor
US8840549B2 (en) 2006-09-22 2014-09-23 Masimo Corporation Modular patient monitor
US8543338B2 (en) * 2007-01-16 2013-09-24 Simbionix Ltd. System and method for performing computerized simulations for image-guided procedures using a patient specific model
IL184151A0 (en) 2007-06-21 2007-10-31 Diagnostica Imaging Software Ltd X-ray measurement method
JP5215828B2 (en) * 2008-12-02 2013-06-19 三菱プレシジョン株式会社 Model generation method for preoperative simulation
US8311791B1 (en) 2009-10-19 2012-11-13 Surgical Theater LLC Method and system for simulating surgical procedures
US9153112B1 (en) 2009-12-21 2015-10-06 Masimo Corporation Modular patient monitor
US10580325B2 (en) * 2010-03-24 2020-03-03 Simbionix Ltd. System and method for performing a computerized simulation of a medical procedure
US20120100517A1 (en) * 2010-09-30 2012-04-26 Andrew Bowditch Real-time, interactive, three-dimensional virtual surgery system and method thereof
US9224240B2 (en) * 2010-11-23 2015-12-29 Siemens Medical Solutions Usa, Inc. Depth-based information layering in medical diagnostic ultrasound
US9847044B1 (en) 2011-01-03 2017-12-19 Smith & Nephew Orthopaedics Ag Surgical implement training process
US20120208160A1 (en) * 2011-02-16 2012-08-16 RadOnc eLearning Center, Inc. Method and system for teaching and testing radiation oncology skills
JP6457262B2 (en) 2011-03-30 2019-01-23 アヴィザル,モルデチャイ Method and system for simulating surgery
US10354555B2 (en) * 2011-05-02 2019-07-16 Simbionix Ltd. System and method for performing a hybrid simulation of a medical procedure
US8532807B2 (en) * 2011-06-06 2013-09-10 Biomet Manufacturing, Llc Pre-operative planning and manufacturing method for orthopedic procedure
JP5984235B2 (en) * 2011-07-19 2016-09-06 東芝メディカルシステムズ株式会社 Image processing system, apparatus, method, and medical image diagnostic apparatus
JP5165782B2 (en) * 2011-08-11 2013-03-21 株式会社コナミデジタルエンタテインメント Image file processing apparatus and program
US10734116B2 (en) 2011-10-04 2020-08-04 Quantant Technology, Inc. Remote cloud based medical image sharing and rendering semi-automated or fully automated network and/or web-based, 3D and/or 4D imaging of anatomy for training, rehearsing and/or conducting medical procedures, using multiple standard X-ray and/or other imaging projections, without a need for special hardware and/or systems and/or pre-processing/analysis of a captured image data
US9943269B2 (en) 2011-10-13 2018-04-17 Masimo Corporation System for displaying medical monitoring data
EP3584799B1 (en) 2011-10-13 2022-11-09 Masimo Corporation Medical monitoring hub
US8908918B2 (en) * 2012-11-08 2014-12-09 Navigate Surgical Technologies, Inc. System and method for determining the three-dimensional location and orientation of identification markers
US9489472B2 (en) 2011-12-16 2016-11-08 Trimble Navigation Limited Method and apparatus for detecting interference in design environment
US20140195963A1 (en) * 2011-12-16 2014-07-10 Gehry Technologies Method and apparatus for representing 3d thumbnails
US20130211244A1 (en) * 2012-01-25 2013-08-15 Surgix Ltd. Methods, Devices, Systems, Circuits and Associated Computer Executable Code for Detecting and Predicting the Position, Orientation and Trajectory of Surgical Tools
US9152743B2 (en) 2012-02-02 2015-10-06 Gehry Technologies, Inc. Computer process for determining best-fitting materials for constructing architectural surfaces
US10307111B2 (en) 2012-02-09 2019-06-04 Masimo Corporation Patient position detection system
US10149616B2 (en) 2012-02-09 2018-12-11 Masimo Corporation Wireless patient monitoring device
US9622820B2 (en) 2012-05-03 2017-04-18 Siemens Product Lifecycle Management Software Inc. Feature-driven rule-based framework for orthopedic surgical planning
US10056012B2 (en) 2012-05-25 2018-08-21 Surgical Theatre LLC Hybrid image/scene renderer with hands free control
US20140071125A1 (en) * 2012-09-11 2014-03-13 The Johns Hopkins University Patient-Specific Segmentation, Analysis, and Modeling from 3-Dimensional Ultrasound Image Data
US20140081659A1 (en) 2012-09-17 2014-03-20 Depuy Orthopaedics, Inc. Systems and methods for surgical and interventional planning, support, post-operative follow-up, and functional recovery tracking
US9749232B2 (en) 2012-09-20 2017-08-29 Masimo Corporation Intelligent medical network edge router
US9601030B2 (en) * 2013-03-15 2017-03-21 Mark B. Ratcliffe System and method for performing virtual surgery
US9968408B1 (en) * 2013-03-15 2018-05-15 Nuvasive, Inc. Spinal balance assessment
US9280819B2 (en) * 2013-08-26 2016-03-08 International Business Machines Corporation Image segmentation techniques
US10832818B2 (en) 2013-10-11 2020-11-10 Masimo Corporation Alarm notification system
CN103700139A (en) * 2013-12-02 2014-04-02 北京像素软件科技股份有限公司 Method and device for creating river model in 3D (three-dimensional) online game
US10459594B2 (en) * 2013-12-31 2019-10-29 Vmware, Inc. Management of a pre-configured hyper-converged computing device
US9524582B2 (en) * 2014-01-28 2016-12-20 Siemens Healthcare Gmbh Method and system for constructing personalized avatars using a parameterized deformable mesh
WO2015154069A1 (en) * 2014-04-04 2015-10-08 Surgical Theater LLC Dynamic and interactive navigation in a surgical environment
CN104978872A (en) * 2014-04-04 2015-10-14 上海橘井泉网络科技有限公司 Surgery demonstration method, surgery demonstration device and surgery demonstration system
US9652590B2 (en) 2014-06-26 2017-05-16 General Electric Company System and method to simulate maintenance of a device
US11227509B2 (en) 2014-12-29 2022-01-18 Help Me See Inc. Surgical simulator systems and methods
US10695099B2 (en) 2015-02-13 2020-06-30 Nuvasive, Inc. Systems and methods for planning, performing, and assessing spinal correction during surgery
US10601950B2 (en) 2015-03-01 2020-03-24 ARIS MD, Inc. Reality-augmented morphological procedure
FR3037785B1 (en) * 2015-06-26 2017-08-18 Therenva METHOD AND SYSTEM FOR GUIDING A ENDOVASCULAR TOOL IN VASCULAR STRUCTURES
WO2017011337A1 (en) * 2015-07-10 2017-01-19 Quantant Technology Inc. Remote cloud based medical image sharing and rendering
JP6631072B2 (en) * 2015-07-31 2020-01-15 富士通株式会社 Biological simulation system and biological simulation method
US10448844B2 (en) 2015-08-31 2019-10-22 Masimo Corporation Systems and methods for patient fall detection
CN105096716B (en) * 2015-09-01 2019-01-25 深圳先进技术研究院 Intravascular intervention surgery simulation system
JP2018534011A (en) 2015-10-14 2018-11-22 サージカル シアター エルエルシー Augmented reality surgical navigation
US20190206134A1 (en) 2016-03-01 2019-07-04 ARIS MD, Inc. Systems and methods for rendering immersive environments
JP2019514450A (en) 2016-03-02 2019-06-06 ニューヴェイジヴ,インコーポレイテッド System and method for spinal orthopedic surgery planning
US10617302B2 (en) 2016-07-07 2020-04-14 Masimo Corporation Wearable pulse oximeter and respiration monitor
CN106295189B (en) * 2016-08-12 2019-06-28 上海鸿巍企业管理咨询有限公司 A kind of visualization operation whole process integral system
WO2018071715A1 (en) 2016-10-13 2018-04-19 Masimo Corporation Systems and methods for patient fall detection
US10810907B2 (en) 2016-12-19 2020-10-20 National Board Of Medical Examiners Medical training and performance assessment instruments, methods, and systems
US10636323B2 (en) * 2017-01-24 2020-04-28 Tienovix, Llc System and method for three-dimensional augmented reality guidance for use of medical equipment
RU2685961C2 (en) * 2017-07-17 2019-04-23 Общество с ограниченной ответственностью "ЭНСИМ" Surgical procedure preoperative modeling method and system
RU178470U1 (en) * 2017-08-24 2018-04-04 Общество с ограниченной ответственностью "ЭНСИМ" DEVICE FOR PREOPERATIVE SIMULATION OF SURGICAL PROCEDURE
US10861236B2 (en) 2017-09-08 2020-12-08 Surgical Theater, Inc. Dual mode augmented reality surgical system and method
US11272985B2 (en) 2017-11-14 2022-03-15 Stryker Corporation Patient-specific preoperative planning simulation techniques
EP3782165A1 (en) 2018-04-19 2021-02-24 Masimo Corporation Mobile patient alarm display
KR101940706B1 (en) * 2018-05-23 2019-04-10 (주)휴톰 Program and method for generating surgical simulation information
US11589776B2 (en) * 2018-11-06 2023-02-28 The Regents Of The University Of Colorado Non-contact breathing activity monitoring and analyzing through thermal and CO2 imaging
WO2020227661A1 (en) * 2019-05-09 2020-11-12 Materialise N.V. Surgery planning system with automated defect quantification
US11272988B2 (en) 2019-05-10 2022-03-15 Fvrvs Limited Virtual reality surgical training systems
CN110811831A (en) * 2019-05-27 2020-02-21 苏州六莲科技有限公司 Accurate automatic evaluation method and device for kidney surgery
US10698493B1 (en) * 2019-06-26 2020-06-30 Fvrvs Limited Virtual reality surgical training systems with advanced haptic feedback
EP3771449A1 (en) 2019-07-31 2021-02-03 Siemens Healthcare GmbH Method for deformation simulation and device
US20210296008A1 (en) 2020-03-20 2021-09-23 Masimo Corporation Health monitoring system for limiting the spread of an infection in an organization
USD980091S1 (en) 2020-07-27 2023-03-07 Masimo Corporation Wearable temperature measurement device
USD974193S1 (en) 2020-07-27 2023-01-03 Masimo Corporation Wearable temperature measurement device
US11532244B2 (en) 2020-09-17 2022-12-20 Simbionix Ltd. System and method for ultrasound simulation
CN113192177A (en) * 2021-04-20 2021-07-30 江苏瑞影医疗科技有限公司 Urinary system three-dimensional digital simulation model construction system and method
USD1000975S1 (en) 2021-09-22 2023-10-10 Masimo Corporation Wearable temperature measurement device
USD1048908S1 (en) 2022-10-04 2024-10-29 Masimo Corporation Wearable sensor
CN117444990B (en) * 2023-12-25 2024-02-27 深圳市普朗医疗科技发展有限公司 Mechanical arm injection control method and system based on 3D modeling

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5782762A (en) * 1994-10-27 1998-07-21 Wake Forest University Method and system for producing interactive, three-dimensional renderings of selected body organs having hollow lumens to enable simulated movement through the lumen
US5825909A (en) * 1996-02-29 1998-10-20 Eastman Kodak Company Automated method and system for image segmentation in digital radiographic images
US20010002310A1 (en) * 1997-06-20 2001-05-31 Align Technology, Inc. Clinician review of an orthodontic treatment plan and appliance
US20010055016A1 (en) * 1998-11-25 2001-12-27 Arun Krishnan System and method for volume rendering-based segmentation
US20030097068A1 (en) * 1998-06-02 2003-05-22 Acuson Corporation Medical diagnostic ultrasound system and method for versatile processing
US20040023493A1 (en) * 2002-05-20 2004-02-05 Katsuhiro Tomoda Isolating method and transferring method for semiconductor devices
US20070052724A1 (en) * 2005-09-02 2007-03-08 Alan Graham Method for navigating a virtual camera along a biological object with a lumen
US20070060799A1 (en) * 2005-09-13 2007-03-15 Lyon Torsten M Apparatus and method for automatic image guided accuracy verification
US20070116357A1 (en) * 2005-11-23 2007-05-24 Agfa-Gevaert Method for point-of-interest attraction in digital images
US20070116338A1 (en) * 2005-11-23 2007-05-24 General Electric Company Methods and systems for automatic segmentation of biological structure

Family Cites Families (372)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US1959490A (en) 1931-08-31 1934-05-22 Mistelski Theodor Wire pushing and pulling tool
US3024539A (en) * 1961-04-19 1962-03-13 Link Division Of General Prec Trainer control column apparatus
US3263824A (en) 1963-12-20 1966-08-02 Northrop Corp Servo controlled manipulator device
US3490059A (en) * 1966-06-06 1970-01-13 Martin Marietta Corp Three axis mounting and torque sensing apparatus
US3517446A (en) 1967-04-19 1970-06-30 Singer General Precision Vehicle trainer controls and control loading
US3406601A (en) 1967-09-19 1968-10-22 Clifford Francis Patrick Automatic measuring apparatus
US3520071A (en) 1968-01-29 1970-07-14 Aerojet General Co Anesthesiological training simulator
US3579842A (en) 1968-09-27 1971-05-25 Weber & Scher Mfg Co Inc Cable-measuring machine
US3573444A (en) 1969-06-04 1971-04-06 Contour Saws Gaging camber of lengthwise moving strip material
US3704529A (en) 1970-07-13 1972-12-05 Forrest J Cioppa Training and instruction device for performing cricothyroidotomy
US3814145A (en) 1970-07-31 1974-06-04 Evg Entwicklung Verwert Ges Wire mesh welding machine
US3919691A (en) 1971-05-26 1975-11-11 Bell Telephone Labor Inc Tactile man-machine communication system
US3739276A (en) 1971-06-29 1973-06-12 Western Electric Co Method of and system for measuring the speed and length of a moving elongated article
US3789518A (en) * 1972-04-12 1974-02-05 Weatherby Nasco Inc Simulated human limb
US3722108A (en) * 1972-04-12 1973-03-27 Weatherby Nasco Inc Injection training aid
US3775865A (en) 1972-07-24 1973-12-04 R Rowan Simulator for teaching suturing techniques
US3945593A (en) * 1972-10-10 1976-03-23 Bodenseewerk Geratetechnik Gmbh Flight control apparatus
US3795150A (en) * 1972-12-13 1974-03-05 Us Air Force System for rapidly positioning gimbaled objects
US3861065A (en) * 1973-01-08 1975-01-21 Lucas Aerospace Ltd Apparatus for simulating the effects of aerodynamic forces on aircraft control
US3875488A (en) 1973-02-15 1975-04-01 Raytheon Co Inertially stabilized gimbal platform
US3795061A (en) * 1973-03-21 1974-03-05 Survival Technology Training injector
US3991490A (en) 1973-04-09 1976-11-16 Markman H David Teaching aid for sigmoidoscope and the like
GB1516503A (en) 1974-07-01 1978-07-05 Strathearn Audio Ltd Arrangement for indicating speed of rotation of a body
US4183249A (en) * 1975-03-07 1980-01-15 Varian Associates, Inc. Lens system for acoustical imaging
US4033331A (en) 1975-07-17 1977-07-05 Guss Stephen B Cardiac catheter and method of using same
US4024873A (en) 1976-05-24 1977-05-24 Becton, Dickinson And Company Balloon catheter assembly
US4115755A (en) 1976-06-11 1978-09-19 United Technologies Corporation Aerodynamic surface load sensing
US4078317A (en) * 1976-07-30 1978-03-14 Wheatley Ronald B Flight simulator system
US4276702A (en) 1976-12-06 1981-07-07 Pacer Systems, Inc. Aircraft flight simulating trainer
US4089494A (en) 1976-12-16 1978-05-16 Mcdonnell Douglas Corporation Reduced servo noise control for a hydraulic actuator
US4148014A (en) 1977-04-06 1979-04-03 Texas Instruments Incorporated System with joystick to control velocity vector of a display cursor
US4136554A (en) * 1977-10-31 1979-01-30 Wells Electronics, Inc. Tester for inflated items
US4162582A (en) 1978-01-09 1979-07-31 Killeen George F Flight trainer and entertainment device for simulating aerial acrobatics
US4182054A (en) * 1978-02-16 1980-01-08 Medical Plastics Laboratory, Inc. Artificial arm
US4177984A (en) 1978-02-17 1979-12-11 Mattel, Inc. Captive flying toy airplane having simulated motor sounds
DE2807902C2 (en) 1978-02-24 1980-04-30 Messerschmitt-Boelkow-Blohm Gmbh, 8000 Muenchen Control device with active force feedback
FR2419548A1 (en) 1978-03-09 1979-10-05 Materiel Telephonique ELECTRO-HYDRAULIC FLIGHT CONTROL SIMULATOR
US4262549A (en) 1978-05-10 1981-04-21 Schwellenbach Donald D Variable mechanical vibrator
FR2442484A1 (en) 1978-11-22 1980-06-20 Materiel Telephonique ELECTRONIC DEVICE FOR MONITORING THE OPERATION OF A HYDRAULIC SERVOVERIN
US4307539A (en) 1979-03-10 1981-12-29 Klein Claus Dieter Toy simulating a physician's instrument
US4250887A (en) * 1979-04-18 1981-02-17 Dardik Surgical Associates, P.A. Remote manual injecting apparatus
US4250636A (en) * 1979-06-01 1981-02-17 Aviation Simulation Technology Yoke assembly for flight simulator
US4264312A (en) 1980-04-28 1981-04-28 The Kendall Company Body channel simulation device
US4360345A (en) 1980-07-14 1982-11-23 American Heart Association, Inc. Health education system
US4464117A (en) 1980-08-27 1984-08-07 Dr. Ing. Reiner Foerst Gmbh Driving simulator apparatus
NL8006091A (en) 1980-11-07 1982-06-01 Fokker Bv FLIGHTMATTER.
US4333070A (en) 1981-02-06 1982-06-01 Barnes Robert W Motor vehicle fuel-waste indicator
US4599070A (en) 1981-07-29 1986-07-08 Control Interface Company Limited Aircraft simulator and simulated control system therefor
ES260340Y (en) 1981-08-31 1982-10-16 LEARNING DEVICE FOR ENDOSCOPES
US4436188A (en) * 1981-11-18 1984-03-13 Jones Cecil R Controlled motion apparatus
JPS5877785U (en) 1981-11-24 1983-05-26 株式会社シグマ monitor game machine
DE3382431D1 (en) 1982-01-22 1991-11-14 British Aerospace CONTROL UNIT.
US4427388A (en) * 1982-08-11 1984-01-24 The United States Of America As Represented By The Secretary Of The Air Force Yoke mover
US4545390A (en) 1982-09-22 1985-10-08 C. R. Bard, Inc. Steerable guide wire for balloon dilatation procedure
US4504233A (en) * 1982-12-20 1985-03-12 The Singer Company High performance control loading system for manually-operable controls in a vehicle simulator
DE3366764D1 (en) 1983-01-28 1986-11-13 Ibm A stylus or pen for interactive use with a graphics input tablet
FR2545606B1 (en) 1983-05-06 1985-09-13 Hispano Suiza Sa FORCE TENSIONER SENSOR
US4655673A (en) 1983-05-10 1987-04-07 Graham S. Hawkes Apparatus providing tactile feedback to operators of remotely controlled manipulators
US4733214A (en) * 1983-05-23 1988-03-22 Andresen Herman J Multi-directional controller having resiliently biased cam and cam follower for tactile feedback
US4481001A (en) 1983-05-26 1984-11-06 Collagen Corporation Human skin model for intradermal injection demonstration or training
DE3327342A1 (en) 1983-07-29 1985-02-07 Peter 7800 Freiburg Pedersen DEVICE FOR DETECTING AND EVALUATING THE PRESSURE IN THE BALLOON CUFF OF A CLOSED TRACHEAL TUBE
US4604016A (en) 1983-08-03 1986-08-05 Joyce Stephen A Multi-dimensional force-torque hand controller having force feedback
EP0137870B1 (en) 1983-09-20 1987-01-14 ATELIERS DE CONSTRUCTIONS ELECTRIQUES DE CHARLEROI (ACEC) Société Anonyme Flight training simulator for pilots
US4688983A (en) 1984-05-21 1987-08-25 Unimation Inc. Low cost robot
US4573452A (en) 1984-07-12 1986-03-04 Greenberg I Melvin Surgical holder for a laparoscope or the like
US4794384A (en) 1984-09-27 1988-12-27 Xerox Corporation Optical translator device
US4712101A (en) 1984-12-04 1987-12-08 Cheetah Control, Inc. Control mechanism for electronic apparatus
US4654648A (en) * 1984-12-17 1987-03-31 Herrington Richard A Wireless cursor control system
US4782327A (en) 1985-01-02 1988-11-01 Victor B. Kley Computer control
US4605373A (en) 1985-01-10 1986-08-12 Rosen Bernard A Training device for setting broken limbs
US4632341A (en) 1985-02-06 1986-12-30 The United States Of America As Represented By The Secretary Of The Air Force Stabilizing force feedback in bio-actuated control systems
US5078152A (en) * 1985-06-23 1992-01-07 Loredan Biomedical, Inc. Method for diagnosis and/or training of proprioceptor feedback capabilities in a muscle and joint system of a human patient
US4642055A (en) * 1985-06-24 1987-02-10 Saliterman Steven S Hemodynamic monitoring trainer
DE3523188A1 (en) 1985-06-28 1987-01-08 Zeiss Carl Fa CONTROL FOR COORDINATE MEASURING DEVICES
US4713007A (en) 1985-10-11 1987-12-15 Alban Eugene P Aircraft controls simulator
JPH0778718B2 (en) 1985-10-16 1995-08-23 株式会社日立製作所 Image display device
US5275174B1 (en) * 1985-10-30 1998-08-04 Jonathan A Cook Repetitive strain injury assessment
US4659313A (en) 1985-11-01 1987-04-21 New Flite Inc. Control yoke apparatus for computerized aircraft simulation
NL8503096A (en) 1985-11-11 1987-06-01 Fokker Bv SIMULATOR OF MECHANICAL PROPERTIES OF OPERATING SYSTEM.
US4891764A (en) * 1985-12-06 1990-01-02 Tensor Development Inc. Program controlled force measurement and control system
US5103404A (en) 1985-12-06 1992-04-07 Tensor Development, Inc. Feedback for a manipulator
US4934694A (en) 1985-12-06 1990-06-19 Mcintosh James L Computer controlled exercise system
US5591924A (en) * 1985-12-18 1997-01-07 Spacetec Imc Corporation Force and torque converter
FR2592514B1 (en) 1985-12-27 1988-04-08 Beer Gabel Marc SIMULATION APPARATUS FOR STUDYING ENDOSCOPY
US4742815A (en) 1986-01-02 1988-05-10 Ninan Champil A Computer monitoring of endoscope
US4646742A (en) * 1986-01-27 1987-03-03 Angiomedics Incorporated Angioplasty catheter assembly
US4708650A (en) 1986-02-10 1987-11-24 Johnson & Johnson Dental Products Company Direct delivery system for dental materials
US4786892A (en) 1986-02-22 1988-11-22 Alps Electric Co., Ltd. X-Y direction input device having changeable orientation of input axes and switch activation
US4751662A (en) 1986-07-14 1988-06-14 United States Of America As Represented By The Secretary Of The Navy Dynamic flight simulator control system
US4803413A (en) * 1986-07-15 1989-02-07 Honeywell Inc. Magnetic isolating and pointing gimbal apparatus
US4909232A (en) * 1986-07-30 1990-03-20 Carella Richard F Shooting and training device for archery
EP0513474A1 (en) 1986-09-11 1992-11-19 Hughes Aircraft Company Digital visual and sensor simulation system for generating realistic scenes
GB8622257D0 (en) 1986-09-16 1986-10-22 Daykin A P Teaching aids
NL8602624A (en) 1986-10-20 1988-05-16 Oce Nederland Bv INPUT DEVICE WITH TAKTILE FEEDBACK.
NL8602697A (en) 1986-10-27 1988-05-16 Huka Bv Developments JOYSTICK.
US4706006A (en) 1986-10-31 1987-11-10 Altman Stage Lighting Co., Inc. Dual-axis tactile feedback light control device
DE3638192A1 (en) * 1986-11-08 1988-05-19 Laerdal Asmund S As SYSTEM AND METHOD FOR TESTING A PERSON IN CARDIOPULMONARY RESURRECTION (CPR) AND EVALUATING CPR EXERCISES
US4795296A (en) * 1986-11-17 1989-01-03 California Institute Of Technology Hand-held robot end effector controller having movement and force control
US4726772A (en) * 1986-12-01 1988-02-23 Kurt Amplatz Medical simulator
US6885361B1 (en) 1987-03-24 2005-04-26 Sun Microsystems, Inc. Tactile feedback mechanism for a data processing system
US5986643A (en) 1987-03-24 1999-11-16 Sun Microsystems, Inc. Tactile feedback mechanism for a data processing system
US5018922A (en) 1987-03-26 1991-05-28 Kabushiki Kaisha Komatsu Seisakusho Master/slave type manipulator
US4839838A (en) 1987-03-30 1989-06-13 Labiche Mitchell Spatial input apparatus
US4860215A (en) 1987-04-06 1989-08-22 California Institute Of Technology Method and apparatus for adaptive force and position control of manipulators
US4961138A (en) 1987-05-01 1990-10-02 General Datacomm, Inc. System and apparatus for providing three dimensions of input into a host processor
US4912638A (en) * 1987-05-04 1990-03-27 Pratt Jr G Andrew Biofeedback lifting monitor
US4868549A (en) 1987-05-18 1989-09-19 International Business Machines Corporation Feedback mouse
DE3717459A1 (en) 1987-05-23 1988-12-01 Zeiss Carl Fa HAND-HELD COORDINATE MEASURING DEVICE
US4748984A (en) 1987-05-29 1988-06-07 Patel Piyush V Catheter assembly and method of performing coronary angiography and angioplasty
US4874998A (en) 1987-06-11 1989-10-17 International Business Machines Corporation Magnetically levitated fine motion robot wrist with programmable compliance
US4907796A (en) * 1987-06-22 1990-03-13 Roel Rodriguez Santiago Ski simulator
JPH0769972B2 (en) 1987-07-16 1995-07-31 インタ−ナショナル・ビジネス・マシ−ンズ・コ−ポレ−ション Image generation method
USH703H (en) 1987-07-30 1989-11-07 The United States Of America As Represented By The Secretary Of The Air Force Manual control apparatus with electable mechanical impedance
US4789340A (en) 1987-08-18 1988-12-06 Zikria Bashir A Surgical student teaching aid
US4867685A (en) 1987-09-24 1989-09-19 The Trustees Of The College Of Aeronautics Audio visual instructional system
US4775289A (en) 1987-09-25 1988-10-04 Regents Of The University Of Minnesota Statically-balanced direct-drive robot arm
US4825875A (en) 1987-10-23 1989-05-02 Ninan Champil A Computer localization in pressure profile
US4896554A (en) * 1987-11-03 1990-01-30 Culver Craig F Multifunction tactile manipulatable control
US4982618A (en) * 1987-11-03 1991-01-08 Culver Craig F Multifunction tactile manipulatable control
US4823634A (en) 1987-11-03 1989-04-25 Culver Craig F Multifunction tactile manipulatable control
US4815313A (en) 1987-11-16 1989-03-28 Abbott Laboratories Syringe pressure calibration reference
US4820162A (en) 1987-11-23 1989-04-11 Robert Ross Joystick control accessory for computerized aircraft flight simulation program
GB2212888A (en) * 1987-12-02 1989-08-02 Philips Electronic Associated X-y signal generating device
JPH01164583A (en) 1987-12-21 1989-06-28 Hitachi Ltd End effector
GB8801951D0 (en) * 1988-01-29 1988-02-24 British Aerospace Control apparatus
EP0326768A3 (en) 1988-02-01 1991-01-23 Faro Medical Technologies Inc. Computer-aided surgery apparatus
US4907970A (en) * 1988-03-30 1990-03-13 Grumman Aerospace Corporation Sidestick-type thrust control simulator
GB2220252A (en) 1988-05-27 1990-01-04 Creative Devices Res Ltd Control device for data processing apparatus
US4885565A (en) 1988-06-01 1989-12-05 General Motors Corporation Touchscreen CRT with tactile feedback
JPH0719512Y2 (en) 1988-06-15 1995-05-10 Sega Enterprises, Ltd. Simulated pilot game device
US4870964A (en) 1988-06-16 1989-10-03 Paul F. Bailey, Jr. Ophthalmic surgical device and method with image data reflected off of the eye
US4887966A (en) 1988-06-30 1989-12-19 Gellerman Floyd R Flight simulation control apparatus
US4857881A (en) 1988-07-08 1989-08-15 Hayes Technology Joystick with spring disconnect
US5116180A (en) 1988-07-18 1992-05-26 Spar Aerospace Limited Human-in-the-loop machine control loop
GB8821675D0 (en) 1988-09-02 1988-10-19 Craig T R Rotation & displacement sensing apparatus
US5047952A (en) * 1988-10-14 1991-09-10 The Board Of Trustee Of The Leland Stanford Junior University Communication system for deaf, deaf-blind, or non-vocal individuals using instrumented glove
US4907973A (en) * 1988-11-14 1990-03-13 Hon David C Expert system simulator for modeling realistic internal environments and performance
US5009598A (en) 1988-11-23 1991-04-23 Bennington Thomas E Flight simulator apparatus using an inoperative aircraft
US4930770A (en) 1988-12-01 1990-06-05 Baker Norman A Eccentrically loaded computerized positive/negative exercise machine
US5153716A (en) 1988-12-14 1992-10-06 Horizonscan Inc. Panoramic interactive system
US5021982A (en) 1988-12-28 1991-06-04 Veda Incorporated Motion base control process and pilot perceptual simulator
US5353242A (en) 1988-12-28 1994-10-04 Veda Incorporated Motion base control process and operator perceptual simulator
US4881324A (en) 1988-12-30 1989-11-21 Gte Valenite Corporation Apparatus for linear measurements
US4998916A (en) * 1989-01-09 1991-03-12 Hammerslag Julius G Steerable medical device
US4949119A (en) 1989-01-12 1990-08-14 Atari Games Corporation Gearshift for a vehicle simulator using computer controlled realistic real world forces
US5044956A (en) 1989-01-12 1991-09-03 Atari Games Corporation Control device such as a steering wheel for video vehicle simulator with realistic feedback forces
US5116051A (en) 1989-01-12 1992-05-26 Atari Games Corporation Strain gauge pressure-sensitive video game control
US5033352A (en) 1989-01-19 1991-07-23 Yamaha Corporation Electronic musical instrument with frequency modulation
US5019761A (en) 1989-02-21 1991-05-28 Kraft Brett W Force feedback control for backhoe
GB8904955D0 (en) 1989-03-03 1989-04-12 Atomic Energy Authority Uk Multi-axis hand controller
US5135488A (en) 1989-03-17 1992-08-04 Merit Medical Systems, Inc. System and method for monitoring, displaying and recording balloon catheter inflation data
US5057078A (en) 1989-03-17 1991-10-15 Merit Medical Systems, Inc. Locking syringe
US5201753A (en) * 1989-03-17 1993-04-13 Merit Medical Systems, Inc. Totally self-contained, digitally controlled, disposable syringe inflation system, and method for monitoring, displaying and recording balloon catheter inflation data
US5062306A (en) 1989-04-20 1991-11-05 Kabushiki Kaisha Toyoda Jidoshokki Seisakusho Apparatus for detecting torque of rotating shaft
US5184306A (en) * 1989-06-09 1993-02-02 Regents Of The University Of Minnesota Automated high-precision fabrication of objects of complex and unique geometry
US5004391A (en) 1989-08-21 1991-04-02 Rutgers University Portable dextrous force feedback master for robot telemanipulation
US5139261A (en) 1989-09-15 1992-08-18 Openiano Renato M Foot-actuated computer game controller serving as a joystick
US5196017A (en) * 1989-10-11 1993-03-23 Silva Fidel H Method and apparatus for patient protection against vessel rupture from balloon-tipped catheters
US5661253A (en) 1989-11-01 1997-08-26 Yamaha Corporation Control apparatus and electronic musical instrument using the same
US5209131A (en) 1989-11-03 1993-05-11 Rank Taylor Hobson Metrology
US5126948A (en) 1989-11-08 1992-06-30 Ltv Aerospace And Defense Company Digital position encoder and data optimizer
US5112228A (en) 1989-11-13 1992-05-12 Advanced Cardiovascular Systems, Inc. Vascular model
US5048508A (en) 1989-12-23 1991-09-17 Karl Storz Endoscope having sealed shaft
US4964097A (en) 1990-01-02 1990-10-16 Conoco Inc. Three dimensional image construction using a grid of two dimensional depth sections
US5259894A (en) 1990-01-26 1993-11-09 Sampson Richard K Method for solvent bonding non-porous materials to automatically create variable bond characteristics
US5072361A (en) 1990-02-01 1991-12-10 Sarcos Group Force-reflective teleoperation control system
US5631861A (en) 1990-02-02 1997-05-20 Virtual Technologies, Inc. Force feedback and texture simulating interface device
US5184319A (en) * 1990-02-02 1993-02-02 Kramer James F Force feedback and textures simulating interface device
US5104328A (en) 1990-04-18 1992-04-14 Lounsbury Katherine L Anatomical model
US5086401A (en) 1990-05-11 1992-02-04 International Business Machines Corporation Image-directed robotic system for precise robotic surgery including redundant consistency checking
US5022384A (en) 1990-05-14 1991-06-11 Capitol Systems Vibrating/massage chair
KR950013780B1 (en) 1990-05-22 1995-11-16 The Furukawa Electric Co., Ltd. Apparatus and method for measuring length of moving elongated object
EP0459761A3 (en) 1990-05-31 1993-07-14 Hewlett-Packard Company Three dimensional computer graphics employing ray tracing to compute form factors in radiosity
US5311422A (en) 1990-06-28 1994-05-10 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration General purpose architecture for intelligent computer-aided training
US5547382A (en) 1990-06-28 1996-08-20 Honda Giken Kogyo Kabushiki Kaisha Riding simulation system for motorcycles
US5077769A (en) 1990-06-29 1991-12-31 Siemens Gammasonics, Inc. Device for aiding a radiologist during percutaneous transluminal coronary angioplasty
US5158459A (en) 1990-07-05 1992-10-27 Ralph Edelberg Freestanding integrated control stick, rudder pedals, and throttle for computerized aircraft flight simulation program
GB9015177D0 (en) 1990-07-10 1990-08-29 Secr Defence A helmet loader for flight simulation
US5197003A (en) * 1990-08-01 1993-03-23 Atari Games Corporation Gearshift for a vehicle simulator having a solenoid for imposing a resistance force
US5269519A (en) 1990-08-15 1993-12-14 David Malone Game simulation interface apparatus and method
US5423754A (en) 1990-09-20 1995-06-13 Scimed Life Systems, Inc. Intravascular catheter
US5680590A (en) 1990-09-21 1997-10-21 Parti; Michael Simulation system and method of using same
US5181181A (en) * 1990-09-27 1993-01-19 Triton Technologies, Inc. Computer apparatus input device for three-dimensional information
WO1992007350A1 (en) * 1990-10-15 1992-04-30 National Biomedical Research Foundation Three-dimensional cursor control device
JP3104249B2 (en) 1990-10-17 2000-10-30 Omron Corporation Feedback control device
US5149270A (en) 1990-10-29 1992-09-22 Mckeown M J Apparatus for practicing surgical procedures
US5209661A (en) 1990-10-29 1993-05-11 Systems Control Technology, Inc. Motor control desired dynamic load of a simulating system and method
EP0507935A4 (en) * 1990-10-29 1993-06-23 Angeion Corporation Digital display system for balloon catheter
US5193963A (en) * 1990-10-31 1993-03-16 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Force reflecting hand controller
US5062594A (en) 1990-11-29 1991-11-05 The United States Of America As Represented By The Secretary Of The Air Force Flight control system with tactile feedback
NL194053C (en) 1990-12-05 2001-05-03 Koninkl Philips Electronics Nv Device with a rotationally symmetrical body.
US5191320A (en) * 1990-12-15 1993-03-02 Sony Corporation Of America Variable scale input device
US5167159A (en) 1990-12-18 1992-12-01 Lucking William M Tension transducer
US5223776A (en) 1990-12-31 1993-06-29 Honeywell Inc. Six-degree virtual pivot controller
US5204600A (en) 1991-02-06 1993-04-20 Hewlett-Packard Company Mechanical detent simulating system
GB2252656B (en) 1991-02-11 1994-12-14 Keymed Improvements in endoscopy training apparatus
US5142931A (en) 1991-02-14 1992-09-01 Honeywell Inc. 3 degree of freedom hand controller
US5212473A (en) 1991-02-21 1993-05-18 Typeright Keyboard Corp. Membrane keyboard and method of using same
US5334027A (en) 1991-02-25 1994-08-02 Terry Wherlock Big game fish training and exercise device and method
US5354162A (en) 1991-02-26 1994-10-11 Rutgers University Actuator system for providing force feedback to portable master support
US5143505A (en) 1991-02-26 1992-09-01 Rutgers University Actuator system for providing force feedback to a dextrous master glove
DE9102864U1 (en) 1991-03-09 1991-05-29 Hacoba Textilmaschinen Gmbh & Co Kg, 5600 Wuppertal Device for measuring the length of thread-like textile material
US5240417A (en) 1991-03-14 1993-08-31 Atari Games Corporation System and method for bicycle riding simulation
US5203563A (en) 1991-03-21 1993-04-20 Atari Games Corporation Shaker control device
US5177473A (en) * 1991-03-28 1993-01-05 Drysdale Frank R Foot operated electrical control with potentiometers
GB9108497D0 (en) 1991-04-20 1991-06-05 W Industries Limited Human/computer interface
US5265034A (en) 1991-05-13 1993-11-23 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Feedback controlled optics with wavefront compensation
JPH06508222A (en) * 1991-05-23 1994-09-14 Atari Games Corporation Modular display simulator
US5146566A (en) 1991-05-29 1992-09-08 Ibm Corporation Input/output system for computer user interface using magnetic levitation
US5215523A (en) 1991-05-30 1993-06-01 Eli Williams Balloon catheter inflation syringe with remote display
US5279309A (en) 1991-06-13 1994-01-18 International Business Machines Corporation Signaling device and method for monitoring positions in a surgical operation
US5417210A (en) 1992-05-27 1995-05-23 International Business Machines Corporation System and method for augmentation of endoscopic surgery
US5222893A (en) 1991-07-05 1993-06-29 Hardesty David L Instrument panel cross check training device
JP2514490B2 (en) 1991-07-05 1996-07-10 Daihen Corporation Teaching control method by interlocking manual operation of industrial robot
US5185561A (en) * 1991-07-23 1993-02-09 Digital Equipment Corporation Torque motor as a tactile feedback device in a computer system
US5171299A (en) 1991-08-02 1992-12-15 Baxter International Inc. Balloon catheter inflation pressure and diameter display apparatus and method
US5186629A (en) * 1991-08-22 1993-02-16 International Business Machines Corporation Virtual graphics display capable of presenting icons and windows to the blind computer user and method
US5403191A (en) 1991-10-21 1995-04-04 Tuason; Leo B. Laparoscopic surgery simulator and method of use
US5180007A (en) * 1991-10-21 1993-01-19 Halliburton Company Low pressure responsive downhole tool with hydraulic lockout
US5180351A (en) * 1991-10-21 1993-01-19 Alpine Life Sports Simulated stair climbing exercise apparatus having variable sensory feedback
US5220260A (en) 1991-10-24 1993-06-15 Lex Computer And Management Corporation Actuator having electronically controllable tactile responsiveness
US5889670A (en) 1991-10-24 1999-03-30 Immersion Corporation Method and apparatus for tactilely responsive user interface
US5228356A (en) 1991-11-25 1993-07-20 Chuang Keh Shih K Variable effort joystick
US5335557A (en) 1991-11-26 1994-08-09 Taizo Yasutake Touch sensitive input control device
US5309140A (en) 1991-11-26 1994-05-03 The United States Of America As Represented By The Secretary Of The Navy Feedback system for remotely operated vehicles
US5371778A (en) 1991-11-29 1994-12-06 Picker International, Inc. Concurrent display and adjustment of 3D projection, coronal slice, sagittal slice, and transverse slice images
US5252068A (en) 1991-12-31 1993-10-12 Flight Dynamics, Incorporated Weight-shift flight control transducer and computer controlled flight simulator, hang gliders and ultralight aircraft utilizing the same
US5275169A (en) 1992-01-15 1994-01-04 Innovation Associates Apparatus and method for determining physiologic characteristics of body lumens
DE69312053T2 (en) 1992-01-21 1997-10-30 Stanford Res Inst Int TELEOPERATOR SYSTEM AND METHOD WITH TELEPRESENCE
GB9201214D0 (en) 1992-01-21 1992-03-11 Mcmahon Michael J Surgical retractors
US5631973A (en) 1994-05-05 1997-05-20 Sri International Method for telemanipulation with telepresence
US5318533A (en) 1992-02-21 1994-06-07 Scimed Life Systems, Inc. Balloon catheter inflation device including apparatus for monitoring and wireless transmission of inflation data, and system
CA2062147C (en) * 1992-03-02 1995-07-25 Kenji Hara Multi-axial joy stick device
US5246007A (en) 1992-03-13 1993-09-21 Cardiometrics, Inc. Vascular catheter for measuring flow characteristics and method
US5428355A (en) 1992-03-23 1995-06-27 Hewlett-Packard Corporation Position encoder system
US5999185A (en) 1992-03-30 1999-12-07 Kabushiki Kaisha Toshiba Virtual reality control using image, model and control data to manipulate interactions
JP3199130B2 (en) 1992-03-31 2001-08-13 Pioneer Corporation 3D coordinate input device
US5189355A (en) * 1992-04-10 1993-02-23 Ampex Corporation Interactive rotary controller system with tactile feedback
US5324260A (en) 1992-04-27 1994-06-28 Minnesota Mining And Manufacturing Company Retrograde coronary sinus catheter
JP2677315B2 (en) * 1992-04-27 1997-11-17 Tomy Company, Ltd. Driving toys
US5584701A (en) 1992-05-13 1996-12-17 University Of Florida Research Foundation, Incorporated Self regulating lung for simulated medical procedures
US5366376A (en) 1992-05-22 1994-11-22 Atari Games Corporation Driver training system and method with performance data feedback
US5368484A (en) 1992-05-22 1994-11-29 Atari Games Corp. Vehicle simulator with realistic operating feedback
US5515078A (en) 1992-06-12 1996-05-07 The Computer Museum, Inc. Virtual-reality positional input and display system
US5327790A (en) 1992-06-19 1994-07-12 Massachusetts Institute Of Technology Reaction sensing torque actuator
AU670311B2 (en) 1992-07-06 1996-07-11 Immersion Corporation Determination of kinematically constrained multi-articulated structures
US5245320A (en) * 1992-07-09 1993-09-14 Thrustmaster, Inc. Multiport game card with configurable address
US5313230A (en) 1992-07-24 1994-05-17 Apple Computer, Inc. Three degree of freedom graphic object controller
US5296871A (en) 1992-07-27 1994-03-22 Paley W Bradford Three-dimensional mouse with tactile feedback
US5368487A (en) 1992-07-31 1994-11-29 Medina; Marelyn Laparoscopic training device and method of use
US5259626A (en) 1992-08-07 1993-11-09 Std Electronic International Ltd. Programmable video game controller
US5657429A (en) 1992-08-10 1997-08-12 Computer Motion, Inc. Automated endoscope system optimal positioning
US5428748A (en) 1992-09-24 1995-06-27 National Semiconductor Corporation Method and apparatus for automatically configuring a computer peripheral
US5283970A (en) * 1992-09-25 1994-02-08 Strombecker Corporation Toy guns
US5368565A (en) 1992-09-28 1994-11-29 Medex, Inc. Balloon catheter pressure monitor for local and remote display
US5264768A (en) 1992-10-06 1993-11-23 Honeywell, Inc. Active hand controller feedback loop
US5286203A (en) * 1992-10-07 1994-02-15 Aai Microflite Simulation International Simulating horizontal stabilizer trimming in an aircraft
US5666473A (en) 1992-10-08 1997-09-09 Science & Technology Corporation & Unm Tactile computer aided sculpting device
US5295694A (en) * 1992-10-27 1994-03-22 Levin John M Laparoscopic surgery simulating game
US5397323A (en) * 1992-10-30 1995-03-14 International Business Machines Corporation Remote center-of-motion robot for surgery
US5370535A (en) 1992-11-16 1994-12-06 Cae-Link Corporation Apparatus and method for primary control loading for vehicle simulation
US5389865A (en) 1992-12-02 1995-02-14 Cybernet Systems Corporation Method and system for providing a tactile virtual reality and manipulator defining an interface device therefor
US5629594A (en) 1992-12-02 1997-05-13 Cybernet Systems Corporation Force feedback system
US5769640A (en) 1992-12-02 1998-06-23 Cybernet Systems Corporation Method and system for simulating medical procedures including virtual reality and control method and system for use therein
US5412189A (en) 1992-12-21 1995-05-02 International Business Machines Corporation Touch screen apparatus with tactile information
US5451924A (en) 1993-01-14 1995-09-19 Massachusetts Institute Of Technology Apparatus for providing sensory substitution of force feedback
US5355148A (en) 1993-01-14 1994-10-11 Ast Research, Inc. Fingerpoint mouse
US5542676A (en) 1993-02-11 1996-08-06 Soundadvice For Sports, Inc. Biosensor feedback device for sporting implements
US5412880A (en) 1993-02-23 1995-05-09 Faro Technologies Inc. Method of constructing a 3-dimensional map of a measurable quantity using three dimensional coordinate measuring apparatus
US5314339A (en) 1993-03-29 1994-05-24 Marivel Aponte Educational medical mannequin
DE69426664T2 (en) * 1993-04-09 2001-08-23 Sega Enterprises Kk MULTIPLE CONNECTOR FOR GAME APPARATUS
US5541831A (en) 1993-04-16 1996-07-30 Oliver Manufacturing Co., Inc. Computer controlled separator device
US5344354A (en) 1993-04-30 1994-09-06 Larry Wiley Flight-simulating airplane toy
JP3686686B2 (en) 1993-05-11 2005-08-24 Matsushita Electric Industrial Co., Ltd. Haptic device, data input device, and data input device device
US5425644A (en) 1993-05-13 1995-06-20 Gerhard Szinicz Surgical training apparatus and method
US5429140A (en) 1993-06-04 1995-07-04 Greenleaf Medical Systems, Inc. Integrated virtual reality rehabilitation system
US5739811A (en) 1993-07-16 1998-04-14 Immersion Human Interface Corporation Method and apparatus for controlling human-computer interface systems providing force feedback
US5731804A (en) * 1995-01-18 1998-03-24 Immersion Human Interface Corp. Method and apparatus for providing high bandwidth, low noise mechanical I/O for computer systems
US5724264A (en) * 1993-07-16 1998-03-03 Immersion Human Interface Corp. Method and apparatus for tracking the position and orientation of a stylus and for digitizing a 3-D object
WO1995002801A1 (en) 1993-07-16 1995-01-26 Immersion Human Interface Three-dimensional mechanical mouse
US5805140A (en) 1993-07-16 1998-09-08 Immersion Corporation High bandwidth force feedback interface using voice coils and flexures
US5467441A (en) 1993-07-21 1995-11-14 Xerox Corporation Method for operating on objects in a first image using an object-based model data structure to produce a second contextual image having added, replaced or deleted objects
US5425709A (en) 1993-07-22 1995-06-20 C. R. Bard, Inc. Sheath for a balloon catheter
CA2103626A1 (en) * 1993-08-09 1995-02-10 Septimiu Edmund Salcudean Motion scaling tele-operating system with force feedback suitable for microsurgery
US5776126A (en) 1993-09-23 1998-07-07 Wilk; Peter J. Laparoscopic surgical apparatus and associated method
DE4332580A1 (en) * 1993-09-24 1995-03-30 Deutsche Aerospace Apparatus for reconstructing or simulating the sense of touch in a surgical instrument
US5470232A (en) 1993-09-29 1995-11-28 The United States Of America As Represented By The Secretary Of The Navy Reconfigurable aircraft stick control and method for connecting and removing stick control from aircraft simulator
US5625576A (en) 1993-10-01 1997-04-29 Massachusetts Institute Of Technology Force reflecting haptic interface
US5397308A (en) * 1993-10-22 1995-03-14 Scimed Life Systems, Inc. Balloon inflation measurement apparatus
US5436640A (en) 1993-10-29 1995-07-25 Thrustmaster, Inc. Video game and simulator joystick controller with geared potentiometer actuation
US5384460A (en) * 1993-11-03 1995-01-24 Silitek Corporation Encoder with a light emitting editing wheel
US5599301A (en) * 1993-11-22 1997-02-04 Advanced Cardiovascular Systems, Inc. Motor control system for an automatic catheter inflation system
AU7601094A (en) 1993-12-15 1995-07-03 Computer Motion, Inc. Automated endoscope system for optimal positioning
US5473235A (en) 1993-12-21 1995-12-05 Honeywell Inc. Moment cell counterbalance for active hand controller
US5461711A (en) 1993-12-22 1995-10-24 Interval Research Corporation Method and system for spatial accessing of time-based information
JPH07200002A (en) 1993-12-28 1995-08-04 Canon Inc Feedback controller
US5577981A (en) 1994-01-19 1996-11-26 Jarvik; Robert Virtual reality exercise machine and computer controlled video system
US6120465A (en) 1994-01-24 2000-09-19 Radionics Software Applications, Inc. Virtual probe for a stereotactic digitizer for use in surgery
WO1995020787A1 (en) * 1994-01-27 1995-08-03 Exos, Inc. Multimode feedback display technology
US5492530A (en) * 1994-02-07 1996-02-20 Cathco, Inc. Method for accessing the coronary arteries from the radial or brachial artery in the arm
US5580249A (en) 1994-02-14 1996-12-03 Sarcos Group Apparatus for simulating mobility of a human
US5482051A (en) * 1994-03-10 1996-01-09 The University Of Akron Electromyographic virtual reality system
US5661667A (en) 1994-03-14 1997-08-26 Virtek Vision Corp. 3D imaging using a laser projector
GB9407936D0 (en) 1994-04-21 1994-06-15 Univ Bristol Training device
US6004134A (en) 1994-05-19 1999-12-21 Exos, Inc. Interactive simulation including force feedback
US5616030A (en) 1994-06-01 1997-04-01 Watson; Bruce L. Flight simulator employing an actual aircraft
US6160489A (en) 1994-06-23 2000-12-12 Motorola, Inc. Wireless communication device adapted to generate a plurality of distinctive tactile alert patterns
JP2843964B2 (en) * 1994-06-27 1999-01-06 Toray Engineering Co., Ltd. Turret type winder
US5524637A (en) 1994-06-29 1996-06-11 Erickson; Jon W. Interactive system for measuring physiological exertion
US5821920A (en) 1994-07-14 1998-10-13 Immersion Human Interface Corporation Control input device for interfacing an elongated flexible object with a computer system
US5623582A (en) 1994-07-14 1997-04-22 Immersion Human Interface Corporation Computer interface or control input device for laparoscopic surgical instrument and other elongated mechanical objects
US5575761A (en) 1994-07-27 1996-11-19 Hajianpour; Mohammed-Ali Massage device applying variable-frequency vibration in a variable pulse sequence
US5600348A (en) * 1994-08-19 1997-02-04 Ftg Data Systems Adjustable tip light pen
US5684722A (en) 1994-09-21 1997-11-04 Thorner; Craig Apparatus and method for generating a control signal for a tactile sensation generator
US5565840A (en) 1994-09-21 1996-10-15 Thorner; Craig Tactile sensation generator
US5669818A (en) 1995-03-23 1997-09-23 Thorner; Craig Seat-based tactile sensation generator
US5609485A (en) * 1994-10-03 1997-03-11 Medsim, Ltd. Medical reproduction system
US5765561A (en) * 1994-10-07 1998-06-16 Medical Media Systems Video-based surgical targeting system
US5766016A (en) 1994-11-14 1998-06-16 Georgia Tech Research Corporation Surgical simulator and method for simulating surgical procedure
US5771181A (en) 1994-12-14 1998-06-23 Moore; Robert S. Generation for virtual reality simulator systems
US5548694A (en) 1995-01-31 1996-08-20 Mitsubishi Electric Information Technology Center America, Inc. Collision avoidance system for voxel-based object representation
US5930741A (en) 1995-02-28 1999-07-27 Virtual Technologies, Inc. Accurate, rapid, reliable position sensing using multiple sensing technologies
US5592401A (en) * 1995-02-28 1997-01-07 Virtual Technologies, Inc. Accurate, rapid, reliable position sensing using multiple sensing technologies
US5542672A (en) 1995-03-17 1996-08-06 Meredith; Chris Fishing rod and reel electronic game controller
US5749853A (en) 1995-03-17 1998-05-12 Advanced Cardiovascular Systems, Inc. Inflation control system with elapsed time measurement
US5882206A (en) 1995-03-29 1999-03-16 Gillio; Robert G. Virtual surgery system
US5720619A (en) * 1995-04-24 1998-02-24 Fisslinger; Johannes Interactive computer assisted multi-media biofeedback system
US5736978A (en) 1995-05-26 1998-04-07 The United States Of America As Represented By The Secretary Of The Air Force Tactile graphics display
US5691898A (en) 1995-09-27 1997-11-25 Immersion Human Interface Corp. Safe and low cost computer peripherals with force feedback for consumer applications
US5651775A (en) 1995-07-12 1997-07-29 Walker; Richard Bradley Medication delivery and monitoring system and methods
US5776050A (en) 1995-07-24 1998-07-07 Medical Media Systems Anatomical visualization system
US5810007A (en) 1995-07-26 1998-09-22 Associates Of The Joint Center For Radiation Therapy, Inc. Ultrasound localization and image fusion for the treatment of prostate cancer
WO1997020305A1 (en) 1995-11-30 1997-06-05 Virtual Technologies, Inc. Tactile feedback man-machine interface device
US6219032B1 (en) 1995-12-01 2001-04-17 Immersion Corporation Method for providing force feedback to a user of an interface device based on interactions of a controlled cursor with graphical elements in a graphical user interface
US5956484A (en) 1995-12-13 1999-09-21 Immersion Corporation Method and apparatus for providing force feedback over a computer network
US5806521A (en) 1996-03-26 1998-09-15 Sandia Corporation Composite ultrasound imaging apparatus and method
US6111577A (en) 1996-04-04 2000-08-29 Massachusetts Institute Of Technology Method and apparatus for determining forces to be applied to a user through a haptic interface
US5807377A (en) 1996-05-20 1998-09-15 Intuitive Surgical, Inc. Force-reflecting surgical instrument and positioning mechanism for performing minimally invasive surgery with enhanced dexterity and sensitivity
US5797900A (en) 1996-05-20 1998-08-25 Intuitive Surgical, Inc. Wrist mechanism for surgical instrument for performing minimally invasive surgery with enhanced dexterity and sensitivity
US6047080A (en) * 1996-06-19 2000-04-04 Arch Development Corporation Method and apparatus for three-dimensional reconstruction of coronary vessels from angiographic images
US5800179A (en) 1996-07-23 1998-09-01 Medical Simulation Corporation System for training persons to perform minimally invasive surgical procedures
US6084587A (en) 1996-08-02 2000-07-04 Sensable Technologies, Inc. Method and apparatus for generating and interfacing with a haptic virtual reality environment
US6024576A (en) * 1996-09-06 2000-02-15 Immersion Corporation Hemispherical, high bandwidth mechanical interface for computer systems
US5694013A (en) 1996-09-06 1997-12-02 Ford Global Technologies, Inc. Force feedback haptic interface for a three-dimensional CAD surface
US6104379A (en) 1996-12-11 2000-08-15 Virtual Technologies, Inc. Forearm-supported exoskeleton hand-tracking device
US6038488A (en) * 1997-02-27 2000-03-14 Bertec Corporation Catheter simulation device
WO1998047426A1 (en) 1997-04-21 1998-10-29 Virtual Technologies, Inc. Goniometer-based body-tracking device and method
US6110130A (en) 1997-04-21 2000-08-29 Virtual Technologies, Inc. Exoskeleton device for directly measuring fingertip position and inferring finger joint angle
US6042555A (en) 1997-05-12 2000-03-28 Virtual Technologies, Inc. Force-feedback interface device for the hand
WO1999039315A2 (en) 1998-01-28 1999-08-05 Ht Medical Systems, Inc. Interface device and method for interfacing instruments to vascular access simulation systems
US6062866A (en) 1998-03-27 2000-05-16 Prom; James M. Medical angioplasty model
US6538634B1 (en) * 1998-12-18 2003-03-25 Kent Ridge Digital Labs Apparatus for the simulation of image-guided surgery
EP1253854A4 (en) * 1999-03-07 2010-01-06 Discure Ltd Method and apparatus for computerized surgery
US6674894B1 (en) * 1999-04-20 2004-01-06 University Of Utah Research Foundation Method and apparatus for enhancing an image using data optimization and segmentation
US7538764B2 (en) * 2001-01-05 2009-05-26 Interuniversitair Micro-Elektronica Centrum (Imec) System and method to obtain surface structures of multi-dimensional objects, and to represent those surface structures for animation, transmission and display
SE518252C2 (en) * 2001-01-24 2002-09-17 Goeteborg University Surgical Method of simulation of a surgical step, method of simulation of surgical operation and system of simulation of a surgical step
US7630750B2 (en) * 2001-02-05 2009-12-08 The Research Foundation For The State University Of New York Computer aided treatment planning
WO2002070980A1 (en) * 2001-03-06 2002-09-12 The Johns Hopkins University School Of Medicine Simulation system for image-guided medical procedures
US20040234933A1 (en) * 2001-09-07 2004-11-25 Dawson Steven L. Medical procedure training system
SG165160A1 (en) * 2002-05-06 2010-10-28 Univ Johns Hopkins Simulation system for medical procedures
CN1331100C (en) * 2003-12-22 2007-08-08 Li Haoyu Establishing method of 3D interacting model of human skeleton unknown body and its use
US20050196740A1 (en) * 2004-03-08 2005-09-08 Olympus Corporation Simulator system and training method for endoscopic manipulation using simulator
US20060211940A1 (en) * 2004-10-01 2006-09-21 Marco Antonelli Blood vessel structure segmentation system and method
US7563228B2 (en) * 2005-01-24 2009-07-21 Siemens Medical Solutions Usa, Inc. Stereoscopic three or four dimensional ultrasound imaging
US8388348B2 (en) 2005-04-19 2013-03-05 Regents Of The University Of Minnesota Disease treatment simulation
US8165908B2 (en) * 2005-07-29 2012-04-24 Siemens Aktiengesellschaft Tool tip with additional information and task-sensitive direct access help for a user
US7681579B2 (en) * 2005-08-02 2010-03-23 Biosense Webster, Inc. Guided procedures for treating atrial fibrillation
US7877128B2 (en) * 2005-08-02 2011-01-25 Biosense Webster, Inc. Simulation of invasive procedures
US20070049817A1 (en) * 2005-08-30 2007-03-01 Assaf Preiss Segmentation and registration of multimodal images using physiological data
US20070231779A1 (en) * 2006-02-15 2007-10-04 University Of Central Florida Research Foundation, Inc. Systems and Methods for Simulation of Organ Dynamics
US9345548B2 (en) 2006-02-27 2016-05-24 Biomet Manufacturing, Llc Patient-specific pre-operative planning
JP5641736B2 (en) 2008-03-25 2014-12-17 Toshiba Corporation Medical image processing apparatus and X-ray diagnostic apparatus

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5782762A (en) * 1994-10-27 1998-07-21 Wake Forest University Method and system for producing interactive, three-dimensional renderings of selected body organs having hollow lumens to enable simulated movement through the lumen
US5825909A (en) * 1996-02-29 1998-10-20 Eastman Kodak Company Automated method and system for image segmentation in digital radiographic images
US20010002310A1 (en) * 1997-06-20 2001-05-31 Align Technology, Inc. Clinician review of an orthodontic treatment plan and appliance
US20030097068A1 (en) * 1998-06-02 2003-05-22 Acuson Corporation Medical diagnostic ultrasound system and method for versatile processing
US20010055016A1 (en) * 1998-11-25 2001-12-27 Arun Krishnan System and method for volume rendering-based segmentation
US20040023493A1 (en) * 2002-05-20 2004-02-05 Katsuhiro Tomoda Isolating method and transferring method for semiconductor devices
US20070052724A1 (en) * 2005-09-02 2007-03-08 Alan Graham Method for navigating a virtual camera along a biological object with a lumen
US20070060799A1 (en) * 2005-09-13 2007-03-15 Lyon Torsten M Apparatus and method for automatic image guided accuracy verification
US20070116357A1 (en) * 2005-11-23 2007-05-24 Agfa-Gevaert Method for point-of-interest attraction in digital images
US20070116338A1 (en) * 2005-11-23 2007-05-24 General Electric Company Methods and systems for automatic segmentation of biological structure

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11854426B2 (en) 2012-10-30 2023-12-26 Truinject Corp. System for cosmetic and therapeutic training
US11403964B2 (en) 2012-10-30 2022-08-02 Truinject Corp. System for cosmetic and therapeutic training
US10902746B2 (en) 2012-10-30 2021-01-26 Truinject Corp. System for cosmetic and therapeutic training
US10896627B2 (en) 2014-01-17 2021-01-19 Truinject Corp. Injection site training system
CN104463965A (en) * 2014-12-17 2015-03-25 中国科学院自动化研究所 Training scene simulation system and method for minimally invasive cardiovascular interventional operation
EP3247300B1 (en) * 2015-01-09 2020-07-15 Azevedo Da Silva, Sara Isabel Orthopedic surgery planning system
US12070581B2 (en) 2015-10-20 2024-08-27 Truinject Corp. Injection system
US10849688B2 (en) 2016-03-02 2020-12-01 Truinject Corp. Sensory enhanced environments for injection aid and social training
US11730543B2 (en) 2016-03-02 2023-08-22 Truinject Corp. Sensory enhanced environments for injection aid and social training
US11151721B2 (en) 2016-07-08 2021-10-19 Avent, Inc. System and method for automatic detection, localization, and semantic segmentation of anatomical objects
EP3279865B1 (en) 2016-08-01 2018-11-21 3mensio Medical Imaging B.V. Method, device and system for simulating shadow images
US10657671B2 (en) 2016-12-02 2020-05-19 Avent, Inc. System and method for navigation to a target anatomical object in medical imaging-based procedures
US11710424B2 (en) 2017-01-23 2023-07-25 Truinject Corp. Syringe dose and position measuring apparatus
US11291423B2 (en) 2017-07-14 2022-04-05 Materialise N.V. System and method of radiograph correction and visualization
US20200360089A1 (en) * 2017-12-28 2020-11-19 Hutom Co., Ltd. Method for generating surgical simulation information and program
US11660142B2 (en) * 2017-12-28 2023-05-30 Hutom Co., Ltd. Method for generating surgical simulation information and program
US11042778B2 (en) * 2018-11-29 2021-06-22 International Business Machines Corporation Generating realistic organ x-ray angiography (XA) images for deep learning consumption
WO2023061843A1 (en) * 2021-10-11 2023-04-20 Koninklijke Philips N.V. Enhanced segmentation
EP4163872A1 (en) * 2021-10-11 2023-04-12 Koninklijke Philips N.V. Enhanced segmentation

Also Published As

Publication number Publication date
GB2459225A (en) 2009-10-21
WO2008087629A3 (en) 2009-03-26
CN101627411A (en) 2010-01-13
GB2459225B (en) 2011-07-20
CN101627411B (en) 2014-03-19
WO2008087629A2 (en) 2008-07-24
US8500451B2 (en) 2013-08-06
GB0914278D0 (en) 2009-09-30
US20090018808A1 (en) 2009-01-15

Similar Documents

Publication Publication Date Title
US8500451B2 (en) Preoperative surgical simulation
Vidal et al. Simulation of ultrasound guided needle puncture using patient specific data with 3D textures and volume haptics
US20040009459A1 (en) Simulation system for medical procedures
US20020168618A1 (en) Simulation system for image-guided medical procedures
Villard et al. Interventional radiology virtual simulator for liver biopsy
Shahidi et al. Clinical applications of three-dimensional rendering of medical data sets
Alderliesten et al. Simulation of minimally invasive vascular interventions for training purposes
Delingette et al. Computational models for image-guided robot-assisted and simulated medical interventions
Dev Imaging and visualization in medical education
Meglan Making surgical simulation real
Robb et al. Patient-specific anatomic models from three dimensional medical image data for clinical applications in surgery and endoscopy
Tahmasebi et al. A framework for the design of a novel haptic-based medical training simulator
Nakao et al. Haptic reproduction and interactive visualization of a beating heart for cardiovascular surgery simulation
Faso Haptic and virtual reality surgical simulator for training in percutaneous renal access
Villard et al. Virtual reality simulation of liver biopsy with a respiratory component
Satava et al. Medical applications of virtual environments
Vidal et al. Principles and Applications of Medical Virtual Environments.
Nicolau et al. A low cost simulator to practice ultrasound image interpretation and probe manipulation: Design and first evaluation
Nowinski Virtual reality in brain intervention: Models and applications
Soomro et al. Image-based modeling and precision medicine
Blezek et al. Virtual reality simulation of regional anesthesia for training of residents
Alghamdi Simulation System for Radiology Education Integration of Physical and Virtual Realities: Overview and Software Considerations.
RU2802129C1 (en) Method of virtual simulation of retrograde intrarenal surgery for treatment of urolithiasis, used in teaching endourological manipulation skills and in planning surgery using a flexible ureteroscope
Nowinski Virtual reality in brain intervention
Chung et al. Real color volume model of cadaver for learning cardiac computed tomographs and echocardiographs

Legal Events

Date Code Title Description
AS Assignment

Owner name: SIMBIONIX LTD., ISRAEL

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BRONSTEIN, RAN;FISHER, NIV;SHILON, OFEK;REEL/FRAME:032209/0726

Effective date: 20080818

AS Assignment

Owner name: 3D SYSTEMS, INC., SOUTH CAROLINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SIMBIONIX LTD.;REEL/FRAME:034261/0647

Effective date: 20141119

AS Assignment

Owner name: SIMBIONIX LTD., ISRAEL

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:3D SYSTEMS, INC.;REEL/FRAME:034653/0355

Effective date: 20141215

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION