CN111870211A - Three-dimensional endoscope with instrument pose navigation function and navigation method thereof - Google Patents
- Publication number
- CN111870211A (application CN202010739975.1A)
- Authority
- CN
- China
- Prior art keywords
- curved surface
- camera
- endoscope
- dimensional
- navigation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/045—Control thereof
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00147—Holding or positioning arrangements
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/07—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements using light-conductive means, e.g. optical fibres
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
Abstract
The invention relates to a three-dimensional endoscope with instrument pose navigation and a navigation method thereof. The navigation method comprises the following steps: the front end of the three-dimensional endoscope is placed at a certain distance from the measured surface to project a line laser; the structured light on the measured surface is collected, and the structure points on the camera plane are extracted and computed to obtain the three-dimensional coordinates of the curved surface point cloud in the endoscope coordinate system; the coordinates of the measured surface are converted from the endoscope coordinate system into the electromagnetic sensor coordinate system; the curved surface point cloud in the electromagnetic coordinate system is rendered to generate a solid curved surface; the weighted average of the three-dimensional coordinates of all curved surface point clouds of the generated solid curved surface is calculated to obtain the spatial coordinates of the center of gravity G of the curved surface; and the real-time pose vector of the instrument on the current curved surface is acquired from the spatial coordinates of the center of gravity G, completing the navigation. The invention can complete three-dimensional reconstruction and real-time reproduction of the measured curved surface, and the navigation is continuous and free from occlusion or signal interference.
Description
Technical Field
The invention relates to the field of surgical navigation equipment, in particular to a three-dimensional endoscope with instrument pose navigation and a navigation method thereof.
Background
An endoscope is an optical instrument that is introduced through a small surgical incision to help a physician view the surface structures of internal organs or tissues. Currently, endoscopes with 30°-70° viewing directions are widely used in operation-assisted examination. However, a conventional endoscope can only provide a two-dimensional image of the surgical field without depth information; it only allows intra-operative observation of the surgical field and cannot provide stereoscopic perception of the surgical target, especially perception of the relative distance between the endoscope and the target object. Moreover, a conventional endoscope is complicated in construction and expensive; yet, because of the lack of relative distance perception, collisions with hard objects in the body (such as bone) often occur, which can easily damage the distal lens of the endoscope. In terms of instrument navigation, the two common forms are photoelectric tracking and electromagnetic control. The former provides continuous navigation only on the precondition that the arm does not block the photoelectric signal, otherwise the navigation information is interrupted; the latter usually requires an electromagnetic guide to control the motion trajectory of the endoscope (common in capsule endoscopes). Both lack navigation stability and ease of operation.
Disclosure of Invention
In view of the above problems, an object of the present invention is to provide a three-dimensional endoscope with instrument pose navigation and a navigation method thereof, which can complete three-dimensional reconstruction and real-time reproduction of the measured curved surface, with navigation that is continuous and free from occlusion or signal interference.
In order to achieve the above purpose, the invention adopts the following technical scheme. A three-dimensional endoscope navigation method with instrument pose navigation comprises the following steps: S1, placing the front end of the three-dimensional endoscope at a certain distance from the measured surface to project a line laser, collecting the structured light on the measured surface, and extracting and computing the structure points on the camera plane to obtain the three-dimensional coordinates of the curved surface point cloud P_i^c in the endoscope coordinate system, where c denotes the endoscope coordinate system and i the index of the laser point; S2, converting the coordinates of the measured surface from the endoscope coordinate system into the electromagnetic sensor coordinate system; S3, rendering the curved surface point cloud in the electromagnetic coordinate system to generate a solid curved surface; S4, calculating the weighted average of the three-dimensional coordinates of all curved surface point clouds of the generated solid curved surface to obtain the spatial coordinates of the center of gravity G of the curved surface; and S5, acquiring the real-time pose vector of the instrument on the current curved surface from the spatial coordinates of the center of gravity G, completing the navigation.
Further, in the step S1, the certain distance is 3 to 10 mm.
Further, in step S1, the structured light of the measured surface is collected by a camera.
Further, in step S1, the optical axis of the camera is defined as the Z axis, i.e. the depth direction, the upward direction of the camera as the X axis, and the rightward direction as the Y axis; the three-dimensional coordinates of each detected laser point in the endoscope coordinate system are then:

Z = B·f / (x - c_x), X = (x - c_x)·Z / f, Y = (y - c_y)·Z / f

where B is the baseline length, i.e. the distance between the front end of the optical fiber and the front end of the camera; f is the focal length of the camera; (x, y) are the two-dimensional coordinates of the laser point on the camera plane; and (c_x, c_y) are the coordinates of the projection center of the camera plane.
Further, in step S2, the conversion formula is:

P_i^s = R_s · P_i^c + T

where R_s is the three-dimensional rotation matrix of the electromagnetic sensor, with R_s = R_z R_y R_x, where R_z, R_y and R_x are the 3 × 3 rotation matrices about the Z, Y and X axes respectively; T is the displacement matrix between the camera and the electromagnetic sensor; and s denotes the electromagnetic sensor coordinate system.
Further, in step S3, a Marching Cubes optimization method is adopted to render the curved surface point cloud in the electromagnetic coordinate system in OpenGL to generate a solid curved surface.
Further, the method for generating the solid curved surface specifically comprises the following steps:
S31, let the vertices of two adjacent grids be C_i and C_{i-1}, so that the spacing parameter of the point-cloud lattice is t = C_i - C_{i-1};
S32, for each point cloud within a cube of side length C_iC_{i-1}, compute its projection onto the side C_iC_{i-1}; if the projection point d_i lies on the left of the midpoint of the side, the point cloud is assigned to vertex C_i, otherwise to C_{i-1};
S33, extract the triangular surfaces meeting the preset conditions according to the distribution of the vertices, and generate an 8-bit binary index from the extracted triangular surfaces;
S34, obtain the surface-extraction case from the 8-bit binary index, complete the rendering of the surface, and generate the visualized surface.
Further, in step S5, the specific navigation method comprises the following steps:
S51, searching for three point clouds on a circle centered at the center of gravity G with radius r; if and only if they form an equilateral triangle, storing the three-dimensional coordinates of the three point clouds and denoting the vertices of the triangle P_1, P_2, P_3;
S52, calculating the vector product of any two sides of the equilateral triangle to obtain the normal vector in the minimal neighborhood of the equilateral triangle, the normal vector passing through the center of gravity G being the real-time pose vector of the instrument on the current curved surface, and the coordinates of the center of gravity G being the current position of the instrument;
and S53, displaying and reproducing the normal vector and the real-time instrument pose vector of step S52 on a display screen through OpenGL, completing the navigation.
A three-dimensional endoscope for implementing the above navigation method comprises: a camera, an optical fiber, an electromagnetic sensor and an endoscope probe. The camera, the optical fiber and the electromagnetic sensor are all arranged inside the probe tube of the endoscope in a triangular layout, and the triangular geometry formed by the three is stored in an upper computer. A helium-neon laser and a coupling mirror are arranged at one end of the optical fiber; the line laser projected by the helium-neon laser passes through the coupling mirror, is emitted from the optical fiber, and is projected onto the measured curved surface to form structured light conforming to that surface, which is collected by the camera and transmitted to the upper computer. The electromagnetic sensor collects the relative position and attitude of the current endoscope and transmits this information to the upper computer, thereby tracking the motion state of the camera.
Further, the electromagnetic sensor and the camera are fixed in a parallel constraint relationship, and the electromagnetic sensor is positioned between the camera and the optical fiber; the geometric connecting line between the camera and the optical fiber and the geometric connecting line between the optical fiber and the electromagnetic sensor are of a splayed structure.
Due to the adoption of the above technical scheme, the invention has the following advantages: 1. The invention uses an optical fiber to transmit the laser and a 6-degree-of-freedom micro electromagnetic sensor, 1 mm in diameter, fixed side by side with the camera in a splayed arrangement, to track the motion state of the camera; the tip of the endoscope can thus be kept within 5-6 mm in diameter, giving a compact structure. 2. The invention can complete three-dimensional reconstruction and real-time three-dimensional reproduction of the measured curved surface; the measurement requires no reference plane and no calibration, and measurement, reconstruction and three-dimensional reproduction can be completed within 5 seconds. 3. The navigation of the invention is continuous and free from occlusion or signal interference.
Drawings
Fig. 1 is a schematic view of the overall structure of a three-dimensional endoscope according to the present invention.
FIG. 2 is a schematic diagram of the normal vector of the center of gravity G and the real-time pose vector of the tool on the three-dimensional rendering of the measured surface; wherein the dashed line represents the normal vector of the center of gravity G and the solid line represents the real-time pose vector of the tool.
Detailed Description
In the description of the present invention, it is to be understood that the terms "upper", "lower", "inside", "outside", and the like, indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, are only for convenience in describing the present invention and simplifying the description, and do not indicate or imply that the referred device or element must have a specific orientation, be constructed in a specific orientation, and be operated, and thus, should not be construed as limiting the present invention. The invention is described in detail below with reference to the figures and examples.
As shown in fig. 1, the present invention provides a three-dimensional endoscope with instrument pose navigation, which includes a camera 1, an optical fiber 2, an electromagnetic sensor 3 (electromagnetic position tracker) and an endoscope probe 6. The camera 1, the optical fiber 2 and the electromagnetic sensor 3 are all arranged inside the endoscope probe 6 in a triangular layout, and the triangular geometry formed by the three is stored in the upper computer. A helium-neon laser and a coupling mirror are arranged at one end of the optical fiber; the line laser 4 projected by the helium-neon laser passes through the coupling mirror, is emitted from the optical fiber 2, and is projected onto the measured curved surface 5 to form structured light conforming to the measured curved surface 5, which is collected by the camera 1 and then transmitted to the upper computer. The electromagnetic sensor 3 collects the relative position (3 degrees of freedom) and attitude (3 degrees of freedom) of the current endoscope and transmits this information to the upper computer, thereby tracking the motion state of the camera 1.
In the above embodiment, the camera 1 is a miniature camera of model OVM6946 with a diameter of 2 mm, a resolution of 160,000 pixels, a viewing angle of 120°, an image plane of 400 × 400 pixels and a frame rate of 30 frames per second; it is waterproof and is provided with an LED light source with adjustable brightness.
In the above embodiments, the electromagnetic sensor 3 is a 6-degree-of-freedom micro electromagnetic sensor. For example, in the present embodiment, the electromagnetic sensor 3 is a Micro-Sensor 1.8 from Polhemus, USA, 17.3 mm in length and 1.8 mm in diameter.
In the above embodiments, the diameter of the endoscope probe 6 is 5-6 mm.
In the above embodiments, the camera 1, the optical fiber 2 and the electromagnetic sensor 3 are arranged in a triangular layout; within the endoscope probe 6, the electromagnetic sensor 3 and the camera 1 are fixed in a parallel constraint relationship, and the electromagnetic sensor 3 must be located between the camera 1 and the optical fiber 2. The geometric line between the camera 1 and the optical fiber 2 and the geometric line between the optical fiber 2 and the electromagnetic sensor 3 form a splayed structure, which increases the parallax during measurement and thus improves the sensitivity and precision of the measurement.
Based on the three-dimensional endoscope, the invention also provides a three-dimensional endoscope navigation method with instrument pose navigation, which comprises the following steps:
S1, placing the front end of the three-dimensional endoscope at a certain distance from the measured surface to project the line laser 4, collecting the structured light on the measured surface, and extracting and computing the structure points on the camera plane (i.e. the image plane) to obtain the three-dimensional coordinates of the curved surface point cloud P_i^c in the endoscope coordinate system, where c denotes the endoscope coordinate system and i the index of the laser point.
Preferably, the certain distance may be 3-10 mm;
preferably, the structured light of the measured surface can be collected by using the camera 1;
the optical axis of the camera is defined as the Z axis, i.e., the depth direction, the upper side of the camera is defined as the X axis, and the right side is the Y axis. Then the three-dimensional coordinate value of each detected laser point in the endoscope coordinate system is:
in the formula, B is the length of a base line, namely the distance value between the front end of the optical fiber and the front end of the camera; f is the focal length of the camera; (x, y) is a two-dimensional coordinate value of the laser point on the camera plane; (c)x,cy) Is the projected center coordinate of the camera plane.
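As a concrete illustration, the triangulation implied by these definitions can be sketched in code. The formula below is the standard line-laser/camera triangulation built from the variables defined above; the function name and the mapping of pixel axes to X/Y are assumptions for illustration, not the patent's reference implementation.

```python
import numpy as np

def laser_point_to_3d(x, y, B, f, cx, cy):
    """Triangulate one detected laser point (pixel coordinates x, y)
    into the endoscope coordinate system, given baseline B, focal
    length f and projection center (cx, cy).  Standard structured-light
    triangulation: depth Z = B * f / (x - cx)."""
    Z = B * f / (x - cx)       # depth along the optical (Z) axis
    X = (x - cx) * Z / f       # offset recovered from the pixel x coordinate
    Y = (y - cy) * Z / f       # offset recovered from the pixel y coordinate
    return np.array([X, Y, Z])

# Example: baseline 5 mm, focal length 500 px, projection center (200, 200)
p = laser_point_to_3d(300.0, 250.0, 5.0, 500.0, 200.0, 200.0)
```

Each laser point on the captured stripe is triangulated independently, so the full point cloud P_i^c is obtained by applying this function to every detected structure point.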
S2, converting the coordinates of the measured surface from the endoscope coordinate system to the electromagnetic sensor 3 and expressing them in the electromagnetic sensor coordinate system.
The conversion formula is:

P_i^s = R_s · P_i^c + T

where R_s is the three-dimensional rotation matrix of the electromagnetic sensor, with R_s = R_z R_y R_x, where R_z, R_y and R_x are the 3 × 3 rotation matrices about the Z, Y and X axes respectively; T is the displacement matrix between the camera and the electromagnetic sensor; and s denotes the electromagnetic sensor coordinate system.
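A minimal sketch of this coordinate conversion, assuming the attitude angles reported by the sensor are available and the camera-to-sensor translation T is known from the fixed probe geometry (function names and the angle convention are illustrative):

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def camera_to_sensor(p_c, angles, T):
    """Map a point from the endoscope (camera) frame to the
    electromagnetic sensor frame: p_s = R_s @ p_c + T,
    with R_s = Rz @ Ry @ Rx as in the text."""
    ax, ay, az = angles
    R_s = rot_z(az) @ rot_y(ay) @ rot_x(ax)
    return R_s @ p_c + T

# With zero rotation the mapping reduces to a pure translation:
p_s = camera_to_sensor(np.array([1.0, 2.0, 3.0]),
                       (0.0, 0.0, 0.0),
                       np.array([0.5, 0.0, 0.0]))
```

Applying this transform to every point of P_i^c expresses the whole measured surface in the electromagnetic tracker's frame, which is what makes the subsequent rendering and navigation independent of camera motion.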
And S3, rendering the curved surface point cloud in the electromagnetic coordinate system to generate a solid curved surface.
Preferably, a Marching Cubes optimization method is adopted, and the curved surface point cloud in the electromagnetic coordinate system is rendered in OpenGL to generate the solid curved surface;
the method for generating the solid curved surface comprises the following specific steps:
S31, let the vertices of two adjacent grids be C_i and C_{i-1}, so that the spacing parameter of the point-cloud lattice is t = C_i - C_{i-1};
S32, for each point cloud within a cube of side length C_iC_{i-1}, compute its projection onto the side C_iC_{i-1}; if the projection point d_i lies on the left of the midpoint of the side, the point cloud is assigned to vertex C_i, otherwise to C_{i-1};
S33, extract the triangular surfaces meeting the preset conditions according to the distribution of the vertices, and generate an 8-bit binary index from the extracted triangular surfaces;
within a cube, a vertex where a triangle exists is given the binary value 1, otherwise 0;
and S34, obtain the surface-extraction case from the 8-bit binary index, thereby completing the rendering of the surface and generating the visualized surface.
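Steps S33-S34 amount to packing the eight per-vertex flags of a cube into an 8-bit index that selects the surface-extraction case, as in the classic Marching Cubes lookup. A sketch under the assumption that bit i of the index corresponds to vertex i:

```python
def cube_index(vertex_flags):
    """Pack eight binary vertex flags (1 where a triangle vertex
    exists, 0 otherwise) into the 8-bit index used to look up the
    surface-extraction case of the cube."""
    assert len(vertex_flags) == 8
    idx = 0
    for bit, flag in enumerate(vertex_flags):
        if flag:
            idx |= 1 << bit
    return idx
```

The resulting value, 0-255, indexes a precomputed table of triangle configurations, so rendering each cube is a constant-time table lookup rather than a geometric computation.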
And S4, calculating the weighted average of the three-dimensional coordinates of all curved surface point clouds of the generated solid curved surface to obtain the spatial coordinates of the center of gravity G of the curved surface.
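Step S4 can be sketched as follows; uniform weights are used by default since the text does not specify the weighting scheme:

```python
import numpy as np

def surface_centroid(points, weights=None):
    """Weighted average of the 3-D coordinates of the rendered
    surface's point cloud, giving the center of gravity G."""
    pts = np.asarray(points, dtype=float)
    w = np.ones(len(pts)) if weights is None else np.asarray(weights, dtype=float)
    return (w[:, None] * pts).sum(axis=0) / w.sum()

# Center of gravity of a small example point cloud
G = surface_centroid([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0], [1.0, 3.0, 0.0]])
```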
And S5, acquiring a real-time attitude vector of the instrument on the current curved surface according to the space coordinate value of the gravity center G, and completing navigation.
Specifically, step S5 comprises the following steps:
S51, searching for three point clouds on a circle centered at the center of gravity G with radius r (preferably, r = 1 mm); if and only if they form an equilateral triangle, the three-dimensional coordinates of the three point clouds are stored, and the vertices of the triangle are denoted P_1, P_2, P_3.
S52, calculating the vector product of any two sides of the equilateral triangle obtained in step S51 to obtain the normal vector in the minimal neighborhood of the equilateral triangle; the normal vector passing through the center of gravity G is the real-time pose vector of the instrument on the current curved surface, and the coordinates of the center of gravity G give the current position of the instrument.
And S53, displaying and reproducing the normal vector and the real-time instrument pose vector of step S52 on a display screen through OpenGL (as shown in FIG. 2), completing the navigation.
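Steps S51-S52 reduce to a cross product of two triangle edges; a minimal sketch is given below (the search for the three point clouds on the circle of radius r around G is omitted, and the function name is illustrative):

```python
import numpy as np

def pose_vector(p1, p2, p3):
    """Unit normal of the equilateral triangle P1 P2 P3, obtained as
    the vector (cross) product of two of its sides.  Anchored at the
    center of gravity G, this is the instrument's real-time pose
    vector on the current curved surface."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    n = np.cross(p2 - p1, p3 - p1)
    return n / np.linalg.norm(n)

# Equilateral triangle lying in the XY plane -> normal along Z
n = pose_vector([0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.5, np.sqrt(3) / 2, 0.0])
```

Because the three vertices are constrained to an equilateral triangle in a small neighborhood of G, the cross product of any two sides gives a stable estimate of the local surface normal.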
The above embodiments are only for illustrating the present invention; the structure, size, arrangement position and shape of each component may be changed. On the basis of the technical scheme of the present invention, improvements and equivalent transformations of individual components according to the principle of the invention shall not be excluded from the protection scope of the present invention.
Claims (10)
1. A three-dimensional endoscope navigation method with instrument pose navigation is characterized by comprising the following steps:
S1, placing the front end of the three-dimensional endoscope at a certain distance from the measured surface to project a line laser, collecting the structured light on the measured surface, and extracting and computing the structure points on the camera plane to obtain the three-dimensional coordinates of the curved surface point cloud P_i^c in the endoscope coordinate system, wherein c represents the endoscope coordinate system, and i represents the index of the laser point;
S2, converting the coordinates of the measured surface from the endoscope coordinate system into the electromagnetic sensor coordinate system;
S3, rendering the curved surface point cloud in the electromagnetic coordinate system to generate a solid curved surface;
S4, calculating the weighted average of the three-dimensional coordinates of all curved surface point clouds of the generated solid curved surface to obtain the spatial coordinates of the center of gravity G of the curved surface;
and S5, acquiring a real-time attitude vector of the instrument on the current curved surface according to the space coordinate value of the gravity center G, and completing navigation.
2. The navigation method of claim 1, wherein: in the step S1, the certain distance is 3-10 mm.
3. The navigation method of claim 1, wherein: in step S1, the structured light of the measured surface is collected by a camera.
4. The navigation method of claim 1, wherein: in step S1, the optical axis of the camera is defined as the Z axis, i.e. the depth direction, the upward direction of the camera as the X axis, and the rightward direction as the Y axis; the three-dimensional coordinates of each detected laser point in the endoscope coordinate system are then:

Z = B·f / (x - c_x), X = (x - c_x)·Z / f, Y = (y - c_y)·Z / f

where B is the baseline length, i.e. the distance between the front end of the optical fiber and the front end of the camera; f is the focal length of the camera; (x, y) are the two-dimensional coordinates of the laser point on the camera plane; and (c_x, c_y) are the coordinates of the projection center of the camera plane.
5. The navigation method of claim 1, wherein: in step S2, the conversion formula is:

P_i^s = R_s · P_i^c + T

where R_s is the three-dimensional rotation matrix of the electromagnetic sensor, with R_s = R_z R_y R_x, where R_z, R_y and R_x are the 3 × 3 rotation matrices about the Z, Y and X axes respectively; T is the displacement matrix between the camera and the electromagnetic sensor; and s denotes the electromagnetic sensor coordinate system.
6. The navigation method of claim 1, wherein: in step S3, a Marching Cubes optimization method is adopted to render the curved surface point cloud in the electromagnetic coordinate system in OpenGL to generate a solid curved surface.
7. The navigation method according to claim 6, wherein the method for generating the solid curved surface comprises the following steps:
S31, let the vertices of two adjacent grids be C_i and C_{i-1}, so that the spacing parameter of the point-cloud lattice is t = C_i - C_{i-1};
S32, for each point cloud within a cube of side length C_iC_{i-1}, compute its projection onto the side C_iC_{i-1}; if the projection point d_i lies on the left of the midpoint of the side, the point cloud is assigned to vertex C_i, otherwise to C_{i-1};
S33, extract the triangular surfaces meeting the preset conditions according to the distribution of the vertices, and generate an 8-bit binary index from the extracted triangular surfaces;
and S34, obtain the surface-extraction case from the 8-bit binary index, complete the rendering of the surface, and generate the visualized surface.
8. The navigation method according to claim 1, wherein in step S5, the specific navigation method comprises the following steps:
S51, searching for three point clouds on a circle centered at the center of gravity G with radius r; if and only if they form an equilateral triangle, storing the three-dimensional coordinates of the three point clouds and denoting the vertices of the triangle P_1, P_2, P_3;
S52, calculating the vector product of any two sides of the equilateral triangle to obtain the normal vector in the minimal neighborhood of the equilateral triangle, the normal vector passing through the center of gravity G being the real-time pose vector of the instrument on the current curved surface, and the coordinates of the center of gravity G being the current position of the instrument;
and S53, displaying and reproducing the normal vector and the real-time instrument pose vector of step S52 on a display screen through OpenGL, completing the navigation.
9. A three-dimensional endoscope for implementing the navigation method according to any one of claims 1 to 8, comprising: the system comprises a camera, an optical fiber, an electromagnetic sensor and an endoscope probe; the camera, the optical fiber and the electromagnetic sensor are all arranged in the probe tube of the endoscope, and are arranged in a triangular shape, and a triangular geometric structure formed by the camera, the optical fiber and the electromagnetic sensor is stored in an upper computer; the helium-neon laser and the coupling mirror are arranged at one end of the optical fiber, line laser projected by the helium-neon laser is emitted by the optical fiber after passing through the coupling mirror, is projected on the measured curved surface to form structured light consistent with the measured curved surface, and is transmitted to the upper computer after being collected by the camera; the electromagnetic sensor collects the relative position and posture information of the current endoscope and transmits the information to the upper computer, and the motion state of the camera is tracked.
10. The three-dimensional endoscope according to claim 9, wherein said electromagnetic sensor is fixed in parallel constrained relationship with said camera head, and said electromagnetic sensor is located between said camera head and said optical fiber; the geometric connecting line between the camera and the optical fiber and the geometric connecting line between the optical fiber and the electromagnetic sensor are of a splayed structure.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010739975.1A CN111870211A (en) | 2020-07-28 | 2020-07-28 | Three-dimensional endoscope with instrument pose navigation function and navigation method thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111870211A true CN111870211A (en) | 2020-11-03 |
Family
ID=73201520
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010739975.1A Pending CN111870211A (en) | 2020-07-28 | 2020-07-28 | Three-dimensional endoscope with instrument pose navigation function and navigation method thereof |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111870211A (en) |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105030331A (en) * | 2015-04-24 | 2015-11-11 | 长春理工大学 | Position sensor and three-dimensional laparoscope camera calibration device and method |
CN107485447A (en) * | 2017-08-09 | 2017-12-19 | 北京信息科技大学 | Intraoperative instrument pose navigation device and method for knee cartilage transplantation |
CN208319312U (en) * | 2017-08-09 | 2019-01-04 | 北京信息科技大学 | Intraoperative instrument pose navigation device for knee cartilage transplantation |
CN109620104A (en) * | 2019-01-10 | 2019-04-16 | 深圳市资福医疗技术有限公司 | Capsule endoscope and its localization method and system |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113925441A (en) * | 2021-12-17 | 2022-01-14 | 极限人工智能有限公司 | Imaging method and imaging system based on endoscope |
CN113925441B (en) * | 2021-12-17 | 2022-05-03 | 极限人工智能有限公司 | Imaging method and imaging system based on endoscope |
CN115281850A (en) * | 2022-08-12 | 2022-11-04 | 北京信息科技大学 | Instrument attitude evaluation method based on hemispherical laser listing method |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210345855A1 (en) | Real time correlated depiction system of surgical tool | |
US9636188B2 (en) | System and method for 3-D tracking of surgical instrument in relation to patient body | |
EP2825087B1 (en) | Otoscanner | |
Fuchs et al. | Augmented reality visualization for laparoscopic surgery | |
US8248414B2 (en) | Multi-dimensional navigation of endoscopic video | |
US7945310B2 (en) | Surgical instrument path computation and display for endoluminal surgery | |
US7824328B2 (en) | Method and apparatus for tracking a surgical instrument during surgery | |
US8248413B2 (en) | Visual navigation system for endoscopic surgery | |
US20080071141A1 (en) | Method and apparatus for measuring attributes of an anatomical feature during a medical procedure | |
CN106890025A (en) | A minimally invasive surgical navigation system and navigation method | |
CN102448398A (en) | Distance-based position tracking method and system | |
CN111870211A (en) | Three-dimensional endoscope with instrument pose navigation function and navigation method thereof | |
CN107485447B (en) | Device and method for navigating pose of surgical instrument for knee cartilage grafting | |
CN110169821A (en) | An image processing method, apparatus and system | |
CN112184653A (en) | Binocular endoscope-based focus three-dimensional size measuring and displaying method | |
CN211484971U (en) | Intelligent auxiliary system for comprehensive vision of operation | |
CN208319312U (en) | Intraoperative instrument pose navigation device for knee cartilage transplantation | |
JP2001293006A (en) | Surgical navigation apparatus | |
Payandeh et al. | Application of imaging to the laparoscopic surgery | |
CN114061738A (en) | Wind turbine tower drum foundation ring vibration monitoring method based on calibration plate pose calculation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication |
Application publication date: 20201103 |