EP3928293A1 - Image processing methods and systems - Google Patents
- Publication number
- EP3928293A1
- Authority
- EP
- European Patent Office
- Prior art keywords
- image
- dimensional digital
- digital image
- observation
- intensity values
- Prior art date
- Legal status
- Pending
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/40—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/08—Volume rendering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
- G06T7/68—Analysis of geometric attributes of symmetry
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/20—ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10072—Tomographic images
- G06T2207/10081—Computed x-ray tomography [CT]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10116—X-ray image
- G06T2207/10121—Fluoroscopy
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/41—Medical
Definitions
- TITLE: Image processing methods and systems
- the present invention relates to methods and systems for processing images, particularly for planning a surgical operation.
- CT-Scan: computer-assisted tomography
- Such methods can be used during surgical operations, for example to prepare and facilitate the placement of a surgical implant by a surgeon or by a surgical robot.
- these methods can be used during an operation for the surgical treatment of the spine of a patient, during which one or more spinal implants are placed, for example to perform arthrodesis of a segment of several vertebrae.
- Such spinal implants generally include pedicle screws, that is, screws placed in the pedicles of the patient's vertebrae.
- the surgical procedures necessary for the installation of these spinal implants, and more particularly for the installation of the pedicle screws, are difficult to carry out because of the small dimensions of the bone structures where the implants must be anchored, and because of the risks of damage to critical anatomical structures located nearby, such as the spinal cord.
- the pixel values of the resulting image are representative of the material density of the target object that has been imaged.
- the resulting image constructed from the acquired images makes it possible to immediately visualize the bone density of said structure, and in particular to visualize the contrast between areas of high bone density and areas of low bone density within the bone structure itself.
- the bone density information allows an operator to more easily find the optimal cutting plane of each vertebra. Once this cutting plane has been identified, the operator can easily define a sighting mark indicating the direction of insertion of a pedicle screw.
- the invention in particular allows the operator to find more easily and quickly where to place the target mark, for example when areas of high bone density must be favored.
- such a process can incorporate one or more of the following characteristics, taken in isolation or in any technically acceptable combination:
- the three-dimensional digital image is an X-ray image produced by a computer assisted tomography process, the voxel intensity values of the three-dimensional digital image being associated with the material density values of the target object.
- the method further comprises steps consisting of:
- the method further comprises calculating at least one target position of a surgical robot, or even a target trajectory of a surgical robot, from the acquired position of said virtual landmark.
- the calculation of at least one target position or of a target trajectory includes the calculation of the coordinates of the virtual coordinate system in a geometric reference system linked to a surgical robot from the coordinates of said virtual coordinate system in a geometric frame specific to the digital image.
- the method further comprises steps consisting of:
- after the acquisition of the position of the first virtual frame of reference, the coordinates of an axis of symmetry defined on a portion of the two-dimensional digital image by the operator by means of the man-machine interface are acquired;
- a calibration marker is placed in the field of view of the imaging device alongside the target object, at least a portion of the marker being made of a material with a predefined material density, so that a part of the generated three-dimensional digital fluoroscopic image includes the image of the calibration marker;
- the method further comprising a calibration step in which density values, determined automatically from the intensity values of a subset of pixels of this same image associated with the portion of the marker made of the material having the predefined material density, are automatically associated with the intensity values of the pixels of the two-dimensional digital image.
- a medical imaging system, in particular for a robotic surgery installation, is configured to implement steps consisting of:
- Figure 1 schematically shows a human vertebra in an axial sectional plane
- FIG 2 schematically shows a computer system according to one embodiment of the invention comprising an image processing system and a surgical robot;
- Figure 3 shows schematically a sighting mark positioned in a portion of human spine as well as images of said spine portion in anatomical section planes on which the sighting mark is displayed;
- FIG 4 is a flow diagram of an image processing method according to embodiments of the invention.
- Figure 5 shows schematically the construction of a resulting image from images acquired by tomography during the method of Figure 4;
- Figure 6 illustrates an example of an image of a portion of human spine according to a front view reconstructed by means of the method of Figure 4, as well as images of said portion of spine in anatomical section planes on which the sighting mark is displayed;
- Figure 7 shows schematically a spacer forming part of the system of Figure 2;
- FIG. 8 schematically shows a registration target
- FIG. 9 is a flow diagram of a method of operating a surgical robot according to embodiments for delivering a surgical implant.
- The description which follows is given by way of example with reference to an operation for the surgical treatment of the spine of a patient during which one or more spinal implants are placed.
- the invention is not limited to this example and other applications are possible, in particular orthopedic applications, such as surgery of the pelvis or, more generally, the placement of any surgical implant that must be at least partially anchored in a bone structure of a human or animal patient, or the cutting or drilling of such a bone structure.
- the description below can therefore be generalized and transposed to these other applications.
- In Figure 1 is shown a bone structure 2 in which a surgical implant 4 is placed in an implantation direction X4.
- bone structure 2 is a human vertebra, here shown in an axial sectional plane.
- Implant 4 here includes a pedicle screw inserted into vertebra 2 and aligned along the direction of implantation X4.
- the vertebra 2 comprises a body 6 crossed by a channel 8, two pedicles 10, two transverse processes 12 and a spinous process 14.
- the X4 direction of implantation extends along one of the pedicles 10.
- Reference X4' defines a corresponding implantation direction for another pedicle screw 4 (not shown in Figure 1), which extends along the other pedicle 10, generally symmetrically to the X4 direction.
- a notable difficulty that arises during a surgical operation to place implants 4 consists in determining the directions of implantation X4 and X4'.
- the pedicle screws 4 should not be placed too close to the canal 8 nor too close to the outer edge of the body 6 so as not to damage the vertebra 2. They should not be inserted too deeply so as not to protrude from the anterior body, nor too short so as not to risk being accidentally expelled.
- One of the aspects of the method described below makes it possible to facilitate this determination before placing the implants.
- In FIG. 2 is shown a robotic surgical installation 20 comprising a robotic surgical system 22 for operating on a patient 24.
- the surgical installation 20 is for example placed in an operating theater.
- the surgical robotic system 22 includes a robot arm carrying one or more effector tools, for example a bone piercing tool or a screwing tool. This system is simply called “surgical robot 22” in what follows.
- the robot arm is attached to a support table of the surgical robot 22.
- the support table is arranged near an operating table for receiving patient 24.
- the surgical robot 22 comprises an electronic control circuit configured to automatically move the effector tool(s) using actuators as a function of a setpoint position or a setpoint trajectory.
- Facility 20 includes a medical imaging system configured to acquire a three-dimensional digital fluoroscopic image of a target object, such as an anatomical region of patient 24.
- the medical imaging system includes a medical imaging device 26, an image processing unit 28, and a human-machine interface 30.
- apparatus 26 is a computer assisted x-ray tomography machine.
- the image processing unit 28 is configured to drive the apparatus 26 and to generate the three-dimensional digital fluoroscopic image from radiological measurements taken by the apparatus 26.
- the processing unit 28 comprises an electronic circuit or a computer programmed to automatically execute an image processing algorithm, for example by means of a microprocessor and of a software code recorded in a computer-readable data recording medium.
- the man-machine interface 30 allows an operator to control and / or supervise the operation of the imaging system.
- the interface 30 comprises a display screen and data entry means such as a keyboard and/or a touch screen and/or a pointing device such as a mouse or a stylus, or any other equivalent means.
- the installation 20 comprises an operating planning system comprising a man-machine interface 31, a planning unit 32 and a trajectory computer 34, this planning system here bearing the reference 36.
- the man-machine interface 31 allows an operator to interact with the processing unit 32 and the computer 34, or even to control and / or supervise the operation of the surgical robot 22.
- the man-machine interface 31 comprises a display screen and data entry means such as a keyboard and/or a touch screen and/or a pointing device such as a mouse or a stylus, or any equivalent means.
- the planning unit 32 is programmed to acquire position coordinates of one or more virtual landmarks defined by an operator by means of the man-machine interface 31 and, if necessary, to convert the coordinates of a geometric reference frame to another, for example from an image frame of reference to a robot 22 frame of reference.
- the trajectory computer 34 is programmed to automatically calculate the coordinates of one or more target positions, for example to form a target trajectory, in particular as a function of the virtual landmarks determined by the planning unit 32.
- the trajectory computer 34 supplies positioning instructions to the robot 22 in order to correctly place the effector tool (s) in order to carry out all or part of the steps for placing the implants 4.
- the planning unit 32 and the trajectory computer 34 comprise an electronic circuit or a computer comprising a microprocessor and software code stored in a computer readable data recording medium.
- In Figure 3 is shown a three-dimensional image 40 of a target object, such as an anatomical structure of the patient 24, preferably a bone structure, such as a portion of the spine of the patient 24.
- the three-dimensional image 40 is automatically reconstructed from raw data, in particular from a raw image generated by the imaging device 26, such as a digital image conforming to the DICOM standard ("Digital imaging and communications in medicine ”).
- the reconstruction is for example carried out by a computer comprising a graphics processing unit or by one of the units 28 or 32.
- the three-dimensional image 40 comprises a plurality of voxels distributed in a three-dimensional volume, each associated with a value representative of information on the local density of matter of the target object resulting from radiological measurements carried out by the imaging device 26. These values are, for example, expressed on the Hounsfield scale.
- High density regions of the target object are more opaque to x-rays than low density regions. According to one possible convention, high density regions are assigned a higher intensity value than low density regions.
- the intensity values can be normalized based on a predefined pixel value scale, such as an encoding scale of type RGB ("Red-Green-Blue").
- the normalized intensity is an integer between 0 and 255.
- the three-dimensional image 40 is for example reconstructed from a plurality of two-dimensional images corresponding to section planes of the apparatus 26.
- the distances between the voxels and between the section planes are known and can be stored in memory.
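The normalization described above, which maps Hounsfield-scale voxel values to an integer between 0 and 255, can be sketched as follows. This is only an illustrative sketch assuming NumPy; the window bounds `lo` and `hi` are hypothetical values, not taken from the patent.

```python
import numpy as np

def normalize_to_8bit(volume_hu, lo=-1000.0, hi=3000.0):
    """Map raw Hounsfield-scale voxel values to the 0-255 range.
    The window bounds lo/hi are illustrative assumptions."""
    clipped = np.clip(volume_hu, lo, hi)          # limit to the chosen window
    scaled = (clipped - lo) / (hi - lo) * 255.0   # linear rescale to [0, 255]
    return scaled.astype(np.uint8)
```

For example, a voxel at the lower bound maps to 0 and one at the upper bound maps to 255, so the full window is spread across the grayscale range.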
- the imaging unit 28 calculates and displays, on the interface 30, two-dimensional images 42 showing different anatomical section planes of the target object, such as a sagittal section 42a, a frontal section 42b and an axial section 42c.
- a virtual landmark 44 is illustrated on image 40 and may be superimposed on image 40 and on images 42a, 42b, 42c.
- the virtual landmark 44 comprises, for example, a set of coordinates stored in memory and expressed in the geometric frame of reference specific to image 40.
- An operator can change the orientation of the image 40 displayed on the interface 30, for example by rotating or tilting it, using the interface 31.
- the operator can also modify the position of the virtual marker 44, as illustrated by the arrows 46.
- the images 42a, 42b and 42c are then recalculated so that the marker 44 remains visible in each of the anatomical planes corresponding to the images 42a, 42b and 42c. This allows the operator to have a confirmation of the position of marker 44.
- In FIG 4 is shown an image processing method implemented automatically by the planning system 36.
- a raw image of the target object is acquired by means of the medical imaging system.
- the raw image is generated by the processing unit 28, from a set of radiological measurements taken by the imaging device 26 on the target object.
- the digital image 40 is automatically reconstructed from the raw image acquired.
- the raw image is transferred from the imaging system to the planning system 36 through interfaces 30 and 31.
- an observation point is defined relative to the digital image 40, for example by choosing a particular orientation of the image 40 by means of the man-machine interface 31.
- the coordinates of the observation point thus defined are for example stored in memory and expressed in the geometric frame of reference specific to image 40.
- a plurality of viewing directions, also called virtual rays, are defined in the three-dimensional image 40 as passing through the three-dimensional image 40 and emanating from the defined observation point.
- diagram (a) represents an illustrative example in which an observation point 50 is defined from which two virtual rays 52 and 54 depart which go towards the three-dimensional image 40 and pass successively through a plurality of voxels of the three-dimensional image 40.
- Virtual rays 52 and 54 are lines which diverge from observation point 50. They do not necessarily pass through the same voxels as they propagate through the image 40.
- Step S104 can be implemented in a manner analogous to computer graphics methods known as ray tracing, except that the projection step used in ray tracing methods is not used here.
- the number of rays 52, 54 and the number of pixels may be different from that illustrated in this example.
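The definition of the viewing directions in step S104, with rays diverging from the observation point 50 towards the image 40, might be sketched as below, assuming NumPy. The `pixel_centers` argument is a hypothetical parametrization of where each ray of the resulting image is aimed; the patent does not prescribe one.

```python
import numpy as np

def ray_directions(observation_point, pixel_centers):
    """Return one unit direction vector per pixel of the resulting image,
    each ray emanating from the observation point (cf. step S104).
    pixel_centers is an (N, 3) array of illustrative 3-D target positions."""
    d = pixel_centers - observation_point                     # diverging rays
    return d / np.linalg.norm(d, axis=1, keepdims=True)       # normalize each row
```

Each returned direction can then be stepped through the voxel grid to collect the intensity values crossed by the corresponding ray.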
- a resulting value is calculated from the respective intensity values of the voxels of the digital image crossed by said ray.
- diagram (b) represents the set 66 of the intensity values of the voxels encountered by the ray 52 during its path from the observation point 50.
- the resulting value 68 is calculated from the set 66 of intensity values.
- diagram (c) represents the set 70 of the intensity values of the voxels encountered by the ray 54 during its path from the observation point 50.
- the resulting value 72 is calculated from the set 70 of intensity values.
- the resulting value is calculated, for each direction of observation, as being equal to the product of the inverse of the intensity values of the voxels crossed.
- the resultant R is calculated using the following calculation formula: R = ∏ (1 / ISO_i) for i = 1 … Max, where:
- the index "i" identifies the voxels crossed by the ray;
- "ISO_i" denotes the normalized intensity value associated with the i-th voxel;
- "Max" denotes the maximum length of the ray, for example imposed by the dimensions of the digital image 40.
- a two-dimensional digital image is calculated from the calculated result values.
- the resulting image can then be automatically displayed on the screen of interface 31.
- the resulting image is a two-dimensional view of the three-dimensional image as seen from the chosen observation point.
- the pixel intensity values of the resulting image correspond to the resulting values calculated during the various iterations of step S106.
- the intensity values are preferably normalized to allow the resulting image to be displayed in grayscale on a screen.
- the regions of low resultant are visually represented on the image with a darker tint than the regions corresponding to a high resultant.
- In Figure 6 is shown a resulting image 80 constructed from image 40, showing a portion of the spine of patient 24.
- the images 42a, 42b and 42c are also displayed alongside the resulting image 80 and are recalculated according to the orientation given to the image 40.
- the method thus provides, through a guided process of human-machine interaction, a visual aid to a surgeon or an operator to define more easily the target position of a surgical implant by means of virtual sighting marks.
- the preferred sectional plane for easily affixing the sighting marks corresponds to an antero-posterior view of vertebra 2.
- the pedicles 10 are then aligned perpendicular to the section plane and are easily identifiable on the resulting image due to their greater density and the fact that their cross section, which is then aligned in the image plane, presents an easily identifiable specific shape, for example an oval shape, as evidenced by zone 82 in figure 6.
- the resulting values are automatically calibrated against a scale of density values so as to associate a density value with each resulting value.
- the density can be quantified and not just visually shown in image 80.
- This calibration is for example carried out with the help of a marker present in the field of view of the device 26 during the radiological measurements used to construct the image 40, as will be understood from the description given below with reference to figure 8.
- the marker is placed alongside the target object and at least a portion of the marker is made of a material with a predefined material density, so that part of the three-dimensional digital fluoroscopic image generated includes the image of the calibration marker.
- during the calibration, density values determined automatically from the intensity values of a subset of pixels of this same image associated with the portion of the marker made of the material having the predefined material density are automatically associated with the intensity values of the pixels of image 80.
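The calibration against the marker portion of predefined material density could look like the following sketch. The linear scaling from the mean marker intensity is an assumed implementation; the text only requires that density values be associated with the pixel intensities.

```python
import numpy as np

def calibrate_density(pixel_values, marker_pixels, marker_density):
    """Associate a material-density value with each pixel intensity,
    using the mean intensity of the pixels imaging the marker portion
    of known density as the reference (linear scaling is an assumption)."""
    reference = float(np.mean(marker_pixels))                 # marker's mean intensity
    scale = marker_density / reference                        # density per intensity unit
    return np.asarray(pixel_values, dtype=float) * scale
```

With such a correspondence the density can be quantified for any pixel of the resulting image, not just shown visually.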
- the viewing angle of the resulting image can be changed and a new resulting image is then automatically calculated based on the new orientation selected.
- a new position of the observation point is acquired, for example by means of the interface 31 in response to a selection by the operator.
- Steps S104, S106, S108 are then repeated with the new position of the observation point, to define new directions of observation from which new result values are calculated to construct a new resultant image, which differs from the previous resultant image only in the position from which the target object is seen.
- the resulting image 80 may be displayed in a specific area of the screen alternating with a two-dimensional image 42 showing the same region.
- An operator can alternate between viewing the resulting image and the two-dimensional image 42, for example if he wishes to confirm an anatomical interpretation of the image.
- In FIG. 9 is shown a method for automatically planning a surgical operation, in particular an operation for placing a surgical implant, implemented by means of the installation 20.
- a three-dimensional digital fluoroscopic image of a target object is acquired by means of the medical imaging system, and then a resulting image 80 is automatically constructed and displayed from the three-dimensional image 40 by means of an image processing method in accordance with one of the embodiments described above.
- the operator defines the location of the virtual landmark using the input means of interface 31. For example, the operator places or draws a line segment defining a direction and a position of the virtual landmark. Alternatively, the operator can simply point to a particular point, such as the center of the displayed cross section of pedicle 10.
- the virtual landmark can be displayed on image 80 and/or on image 40 and/or on images 42. Several virtual landmarks can thus be defined on the same image.
- in a step S122, the position of at least one virtual landmark 44 defined on the image 80 by an operator by means of a man-machine interface is acquired, for example by the planning unit 32.
- in a step S124, after the acquisition of the position of a virtual landmark, called the first virtual landmark, the coordinates of an axis of symmetry defined on a portion of the image 80 by the operator by means of the interface 31 are acquired.
- the axis of symmetry is drawn on the image 80 by the operator by means of the interface 31. Then, the position of a second virtual landmark is automatically calculated by symmetry of the first virtual landmark with respect to the defined axis of symmetry.
- the direction X4' can thus be determined automatically if the operator considers that the vertebra 2 is sufficiently symmetrical.
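The calculation of the second virtual landmark by symmetry of the first with respect to the axis drawn by the operator amounts to reflecting a point across a line in the image plane, which might be sketched as follows (an illustrative implementation, not taken from the patent):

```python
import numpy as np

def mirror_landmark(landmark, axis_point, axis_dir):
    """Position of the second virtual landmark obtained by symmetry of the
    first one with respect to the operator-drawn axis. All arguments are
    2-D image coordinates; axis_dir need not be normalized."""
    a = np.asarray(axis_point, dtype=float)
    d = np.asarray(axis_dir, dtype=float)
    d = d / np.linalg.norm(d)
    v = np.asarray(landmark, dtype=float) - a
    parallel = np.dot(v, d) * d            # component of v along the axis
    return a + 2.0 * parallel - v          # reflect the perpendicular component
```

Reflecting across a vertical axis, for instance, negates the horizontal offset of the landmark while keeping its vertical position.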
- One or more other virtual landmarks may be defined analogously in the rest of the image once a virtual landmark has been defined, for example between several successive vertebrae of a portion of a spine.
- At least one target position, or even a target trajectory of the surgical robot 22, is automatically calculated by the unit 34 from the previously acquired position of the virtual landmark. This calculation can take into account control laws of the robot 22 or a pre-established surgical program.
- this calculation comprises the calculation by unit 34 of the coordinates of the virtual landmark in a geometric frame of reference linked to the surgical robot 22 from the coordinates of said virtual landmark in a geometric frame of reference specific to the digital image.
- the frame of reference of the robot 22 is mechanically linked without any degree of freedom to the geometric frame of reference of the digital image 40, for example by immobilizing the patient 24 with the support table of the robot 22, which makes it possible to establish a correspondence between a geometric frame of reference of the surgical robot and a geometric frame of reference of the patient.
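The conversion of landmark coordinates from the geometric frame of the digital image to the frame linked to the surgical robot is a change of rigid frame. It might be sketched with a 4x4 homogeneous transform, here assumed to be known (e.g. fixed by the mechanical link described above):

```python
import numpy as np

def image_to_robot(p_image, T_robot_from_image):
    """Express a virtual-landmark position given in the image frame in the
    frame linked to the surgical robot, via a 4x4 homogeneous transform
    (the transform itself is an assumed, externally determined input)."""
    p = np.append(np.asarray(p_image, dtype=float), 1.0)  # homogeneous coords
    return (T_robot_from_image @ p)[:3]
```

A target trajectory can then be built by transforming each landmark-derived point the same way before sending it as a setpoint to the robot.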
- This immobilization is achieved here by means of retractors connected to the support table of the robot 22, as explained below.
- the density values can be used when calculating the trajectory or programming parameters of the robot 22.
- a bone drilling tool will have to apply a higher drilling torque in areas of bone where a higher bone density has been measured.
- the position and / or trajectory coordinates can then be transmitted to the robot 22 to position a tool so as to perform a surgical operation, in particular for placing a surgical implant, or at least to assist a surgeon to perform this surgical operation.
- In FIG. 7 is shown an example of a surgical instrument 90 making it possible to immobilize the patient 24 with the support table of the robot 22. It comprises a retractor for pushing back the walls of an incision 92 made in the body 94 of the patient 24, comprising spreader arms 96 mounted on a frame 98.
- Each spreader arm 96 has a spreader tool 100 mounted at one end of a bar 102 fixed to the frame 98 by means of a fixing device 104 adjustable by means of an adjustment knob 106.
- the frame 98 has a fastening system by which it can be fixed integrally, without any degree of freedom, to the robot 22, preferably to the support table of the robot 22.
- the frame 98 is here formed by assembling a plurality of bars, here of tubular shape, these bars comprising in particular a main bar 108 fixed without any degree of freedom to the support table of the robot 22, side bars 110 and a front bar 112 on which the spreader arms 96 are mounted.
- the bars are fixed together at their respective ends by fixing devices 114 similar to the devices 104.
- the frame 98 is arranged so as to overhang the body 94 of the patient and here has an essentially rectangular shape.
- the frame 98 and the spreader arms 96 are made of a radiolucent material, so as not to be visible in the image 40.
- the retractor can be configured to immobilize the spine of the patient 24, made accessible by the incision 92, which makes it possible to link the patient even better to the reference frame of the robot 22 and to avoid any movement liable to induce a spatial shift between the image and the actual position of the patient.
- a calibration marker 116 made of a radiopaque material, that is to say a material opaque to X-rays, can be used in the installation 20.
- the marker 116 can be attached to the instrument 90, for example held integral with the frame 98, although this is not necessary.
- as a variant, the marker 116 can be attached to the end of the robot arm.
- At least part of the marker 116 has a regular geometric shape, so as to be easily identifiable on images 40 and 80.
- the marker 116 comprises a body 118, for example of cylindrical shape, and one or more parts 120, 122, 124 in the form of a disc or of a sphere, preferably having different diameters. For example, these diameters are greater than the dimensions of the body 118.
- a spherical shape has the advantage of having the same appearance regardless of the viewing angle.
- At least a portion of the marker 116, preferably a portion having a recognizable shape, in particular spherical, is made of a material with a predefined density.
- the density scale calibration is carried out by identifying this portion of the marker in the image 40 or 80, by automatic pattern recognition or by manual pointing of the shape on the image by the operator through the interface 30.
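One simple way to turn the identified marker portion into a density scale is a two-point linear calibration, using the marker's predefined density and a reference intensity for air. This is a sketch of an assumed approach; the patent does not state the exact calibration law, and the measured values below are hypothetical.

```python
# Sketch of a two-point density-scale calibration (assumed approach): the mean
# intensity measured inside the marker portion of known density, together with
# a reference intensity for air (density taken as 0), defines a linear
# intensity-to-density mapping.

def make_density_scale(i_air, i_marker, d_marker):
    """Return a function mapping an image intensity to a density.

    i_air    -- mean intensity measured in air (density taken as 0)
    i_marker -- mean intensity measured inside the calibration marker
    d_marker -- predefined, known density of the marker material (g/cm3)
    """
    slope = d_marker / (i_marker - i_air)
    return lambda intensity: slope * (intensity - i_air)

# Hypothetical measurements on the image:
to_density = make_density_scale(i_air=50.0, i_marker=850.0, d_marker=1.6)
d = to_density(450.0)   # intensity halfway between air and the marker
```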
- the medical imaging system comprising the apparatus 26 and the unit 28 can be used independently of the surgical robot 22 and the planning system 36.
- the image processing method described above can therefore be used independently of the surgical operation planning methods described above.
- this image processing method can be used for non-destructive testing of mechanical parts using industrial imaging techniques.
- the instrument 90 and the image processing method can be used independently of each other.
- the instrument 90 may include a displacement sensor such as an inertial unit, bearing the reference 115 in FIG. 7, to measure movements of the patient 24 during the operation and to correct the calculated positions or trajectories accordingly.
- sensor 115 is connected to unit 32 by a data link.
- Unit 32 is programmed to record patient movements measured by sensor 115 and to automatically correct positions or trajectories of a robot arm based on the measured movements.
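The correction performed by unit 32 can be sketched as shifting the planned way-points by the measured patient displacement. The function name and the purely translational model are illustrative assumptions; a real system would likely also handle rotations.

```python
# Minimal sketch (assumed logic, not the patent's implementation): shift a
# planned trajectory by the patient displacement reported by sensor 115, so
# the robot arm follows the anatomy rather than the stale coordinates.

def correct_trajectory(trajectory, displacement):
    """Translate each (x, y, z) way-point by the measured displacement."""
    dx, dy, dz = displacement
    return [(x + dx, y + dy, z + dz) for (x, y, z) in trajectory]

planned = [(0.0, 0.0, 0.0), (5.0, 0.0, -10.0)]   # way-points in mm
measured_shift = (1.0, -0.5, 0.0)                # patient moved mid-operation
corrected = correct_trajectory(planned, measured_shift)
```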
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1901615A FR3092748A1 (fr) | 2019-02-18 | 2019-02-18 | Procédés et systèmes de traitement d’images |
PCT/EP2020/054055 WO2020169515A1 (fr) | 2019-02-18 | 2020-02-17 | Procédés et systèmes de traitement d'images |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3928293A1 true EP3928293A1 (fr) | 2021-12-29 |
Family
ID=67514741
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP20704054.4A Pending EP3928293A1 (fr) | 2019-02-18 | 2020-02-17 | Procédés et systèmes de traitement d'images |
Country Status (4)
Country | Link |
---|---|
US (1) | US20220130509A1 (fr) |
EP (1) | EP3928293A1 (fr) |
FR (1) | FR3092748A1 (fr) |
WO (1) | WO2020169515A1 (fr) |