CN113547515B - Coordinate calibration method based on ultrasonic servo surgical robot - Google Patents
- Publication number: CN113547515B (application CN202110808016.5A)
- Authority
- CN
- China
- Prior art keywords
- coordinate system
- probe
- tail end
- mechanical arm
- guide frame
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- B25J 9/02 — Programme-controlled manipulators characterised by movement of the arms, e.g. cartesian coordinate type
- B25J 9/04 — Programme-controlled manipulators characterised by movement of the arms, by rotating at least one arm, excluding the head movement itself, e.g. cylindrical or polar coordinate type
- B25J 18/00 — Arms
- B25J 19/023 — Optical sensing devices including video camera means
- B25J 9/1679 — Programme controls characterised by the tasks executed
- B25J 9/1697 — Vision controlled systems
Abstract
The invention belongs to the technical field of surgical robots and discloses a coordinate calibration method based on an ultrasonic servo surgical robot, comprising the following steps: arranging a stereoscopic vision high-precision positioning camera obliquely above the surgical execution end and the probe, and acquiring its camera coordinate system; photographing the guide frame, the probe, the end of the mechanical arm and the surgical execution end with the camera to obtain, in the camera coordinate system, the guide frame, probe, mechanical arm and surgical execution end coordinate systems; from these, obtaining the transformations between the probe and the guide frame, between the guide frame and the mechanical arm base, and between the surgical execution end and the end of the mechanical arm; and multiplying these transformations step by step to obtain the transformation between the probe coordinate system and the surgical execution end coordinate system. The method can intuitively and quickly calibrate the coordinates of the detection end and the surgical execution end, unifying their coordinate systems.
Description
Technical Field
The invention belongs to the technical field of surgical robots, and particularly relates to a coordinate calibration method based on an ultrasonic servo surgical robot.
Background
Benign prostatic hyperplasia is among the most common causes of urinary dysfunction in men, and its incidence is highest in the elderly: statistics show a prevalence of about 16-25% in men aged 50-65, rising with age to 30-46% in men over 70. Transurethral resection of the prostate is currently the main treatment for prostatic hyperplasia, but the procedure raises problems of safety, effectiveness and consistency. As surgical-robot technology matures, prostate surgical robots offer a new approach to treating prostatic hyperplasia. In the prior art, however, an ultrasonic probe acquires images in real time while the robot operates, the two cannot be well synchronized during the operation, the detected images are not fed back to the robot's execution end in time, and the operation of the surgical robot is therefore slow.
Disclosure of Invention
To address the defects and improvement needs of the prior art, the invention provides a coordinate calibration method based on an ultrasonic servo surgical robot that can intuitively and quickly calibrate the coordinates of the detection end and the surgical execution end, unifying their coordinate systems.
To achieve the above object, according to one aspect of the present invention, there is provided a coordinate calibration method based on an ultrasonic servo surgical robot. The ultrasonic servo surgical robot comprises a mechanical arm, a surgical execution end arranged at the end of the mechanical arm, a guide frame, a motor arranged on the guide frame, and a probe driven in rotation by the motor. The method comprises: S1: arranging a stereoscopic vision high-precision positioning camera obliquely above the surgical execution end and the probe, and acquiring the coordinate system of this camera, i.e. the camera coordinate system; S2: photographing the guide frame, the probe, the end of the mechanical arm and the surgical execution end with the stereoscopic vision high-precision positioning camera to obtain, in the camera coordinate system, the guide frame coordinate system, the probe coordinate system, the mechanical arm coordinate system and the surgical execution end coordinate system; obtaining from these the transformations between the probe and guide frame coordinate systems, between the guide frame and mechanical arm base coordinate systems, and between the surgical execution end and mechanical arm end coordinate systems; and multiplying these transformations step by step to obtain the transformation between the probe coordinate system and the surgical execution end coordinate system.
Preferably, the method further comprises: S3: acquiring the transformation between a two-dimensional image coordinate system and the probe coordinate system, the two-dimensional image coordinate system coinciding with a coordinate plane of the probe coordinate system, and multiplying the transformation between the probe coordinate system and the surgical execution end coordinate system by the transformation between the two-dimensional image coordinate system and the probe coordinate system to obtain the transformation between the two-dimensional image coordinate system and the surgical execution end coordinate system.
Preferably, step S2 comprises the following sub-steps. In what follows, T(X, Y) denotes the 4 × 4 homogeneous matrix that transforms coordinates expressed in frame Y into frame X, the frames being the camera (CAMERA), probe (PROBE), guide frame (GUIDE), mechanical arm base (BASE), mechanical arm end (END) and surgical execution end (WORK) coordinate systems. S21: photograph the probe and the guide frame with the stereoscopic vision high-precision positioning camera to obtain, in the camera coordinate system, the probe coordinate system T(CAMERA, PROBE) in an arbitrary pose and the guide frame coordinate system T(CAMERA, GUIDE). S22: from T(CAMERA, PROBE) and T(CAMERA, GUIDE), obtain the transformation between the probe and guide frame coordinate systems, T(PROBE, GUIDE) = T(CAMERA, PROBE)⁻¹ · T(CAMERA, GUIDE). S23: photograph the end of the mechanical arm to obtain the transformation T(CAMERA, BASE) between the camera and mechanical arm base coordinate systems, and from it the transformation between the guide frame and base coordinate systems, T(GUIDE, BASE) = T(CAMERA, GUIDE)⁻¹ · T(CAMERA, BASE). S24: photograph the surgical execution end to obtain its coordinate system T(CAMERA, WORK), and obtain the transformation between the surgical execution end and mechanical arm end coordinate systems, T(END, WORK) = (T(CAMERA, BASE) · T(BASE, END))⁻¹ · T(CAMERA, WORK), where T(BASE, END), the transformation between the mechanical arm end and base coordinate systems, is obtained directly from the geometric structure of the mechanical arm. S25: from the above, obtain the transformation between the probe and surgical execution end coordinate systems, T(PROBE, WORK) = T(PROBE, GUIDE) · T(GUIDE, BASE) · T(BASE, END) · T(END, WORK).
Preferably, step S3 comprises: defining the two-dimensional image coordinate system to coincide with a coordinate plane of the probe coordinate system, so that the transformation T(PROBE, IMAGE) between the two-dimensional image and probe coordinate systems is obtained by construction; and multiplying the transformation between the probe and surgical execution end coordinate systems by T(PROBE, IMAGE) to obtain the transformation between the two-dimensional image and surgical execution end coordinate systems, T(WORK, IMAGE) = T(PROBE, WORK)⁻¹ · T(PROBE, IMAGE).
Preferably, step S21 specifically comprises: step 1, arranging a first reflective target ball on the probe and a second reflective target ball on the guide frame; step 2, keeping the guide frame from moving back and forth, starting the motor to drive the probe through one full rotation, and using the stereoscopic vision high-precision positioning camera to record a plurality of points on the motion trail of the first reflective target ball as well as the positions of the first and second reflective target balls after the rotation; step 3, fitting the plurality of points to a circle, whose centre serves as the origin of the guide frame coordinate system and the initial probe coordinate system and whose axis is the Z axis of both, and combining the origin and Z axis with the recorded positions of the first and second reflective target balls to obtain the probe coordinate system T(CAMERA, PROBE) in an arbitrary pose and the guide frame coordinate system T(CAMERA, GUIDE) in the camera coordinate system.
Preferably, obtaining T(CAMERA, PROBE) and T(CAMERA, GUIDE) in step 3 specifically comprises: obtaining the initial probe coordinate system and the guide frame coordinate system from the origin, the Z axis and the positions of the first and second reflective target balls; from these, acquiring the initial rotation angle between the initial probe coordinate system and the guide frame coordinate system while the probe is in its initial pose; and, when the probe moves to a given pose, acquiring the motion variables of its back-and-forth translation and its rotation about the Z axis and correcting them with the initial rotation angle to obtain the probe coordinate system T(CAMERA, PROBE) in an arbitrary pose.
Preferably, step S23 specifically comprises: arranging a third reflective target ball at the end of the mechanical arm, photographing it with the stereoscopic vision high-precision positioning camera to obtain the coordinates of the end of the mechanical arm in the camera coordinate system, and obtaining the transformation T(CAMERA, BASE) between the camera and mechanical arm base coordinate systems from the known transformation between the mechanical arm end and base coordinate systems; then multiplying T(CAMERA, BASE) with the guide frame coordinate system T(CAMERA, GUIDE) to obtain the transformation between the guide frame and mechanical arm base coordinate systems, T(GUIDE, BASE) = T(CAMERA, GUIDE)⁻¹ · T(CAMERA, BASE).
Preferably, step S24 specifically comprises: arranging a fourth reflective target ball at the surgical execution end and photographing it with the stereoscopic vision high-precision positioning camera to obtain the surgical execution end coordinate system T(CAMERA, WORK) in the camera coordinate system; then converting T(CAMERA, WORK) with T(CAMERA, BASE) and T(BASE, END) to obtain the transformation between the surgical execution end and mechanical arm end coordinate systems, T(END, WORK) = (T(CAMERA, BASE) · T(BASE, END))⁻¹ · T(CAMERA, WORK).
Generally, compared with the prior art, the coordinate calibration method based on the ultrasonic servo surgical robot provided by the invention has the following beneficial effects:
1. A camera is arranged in the working space of the robot and its coordinate system is used as the common reference; the coordinate systems of the different parts of the surgical robot, captured by the camera, are converted into one another, from which the coordinate calibration between the probe and the execution end is obtained.
2. Because the two-dimensional image coordinate system coincides with a plane of the probe coordinate system, the transformation between the two-dimensional image and the surgical execution end is easily obtained from the transformation between the probe and the execution end, thereby achieving a unified calibration of the two-dimensional image coordinate system and the surgical execution end coordinate system.
Drawings
FIG. 1 is a schematic diagram of coordinate calibration of an ultrasonic servo-based surgical robot according to the present embodiment;
FIG. 2 is a step diagram of a coordinate calibration method based on an ultrasonic servo surgical robot according to the present embodiment;
fig. 3 is a schematic diagram of the coordinate calibration method according to the embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
Referring to fig. 1, the present invention provides a coordinate calibration method based on an ultrasonic servo surgical robot. The ultrasonic servo surgical robot comprises a mechanical arm, a surgical execution end arranged at the end of the mechanical arm, a guide frame, a motor arranged on the guide frame, and a probe; during operation the motor drives the probe in rotation. The calibration involves a two-dimensional IMAGE coordinate system, a PROBE coordinate system, a GUIDE frame coordinate system, a mechanical arm BASE coordinate system, a mechanical arm END coordinate system, a continuum execution (WORK) coordinate system and a CAMERA coordinate system; the relationships between these coordinate systems are shown in fig. 1. In what follows, T(X, Y) denotes the 4 × 4 homogeneous matrix that transforms coordinates expressed in frame Y into frame X. The two-dimensional image coordinate system coincides with the YZ or XZ plane of the three-dimensional probe coordinate system and is therefore not marked separately in the figure.
As shown in FIG. 2, the method includes the following steps S1-S2.
S1: arranging a stereoscopic vision high-precision positioning camera obliquely above the surgical execution end and the probe, and acquiring the coordinate system of this camera, i.e. the camera coordinate system.
The camera coordinate system serves as the absolute reference coordinate system.
S2: photographing the guide frame, the probe, the end of the mechanical arm and the surgical execution end with the stereoscopic vision high-precision positioning camera to obtain, in the camera coordinate system, the guide frame coordinate system, the probe coordinate system, the mechanical arm coordinate system and the surgical execution end coordinate system; obtaining from these the transformations between the probe and guide frame coordinate systems, between the guide frame and mechanical arm base coordinate systems, and between the surgical execution end and mechanical arm end coordinate systems; and multiplying these transformations step by step to obtain the transformation between the probe coordinate system and the surgical execution end coordinate system.
Specifically, the method comprises steps S21-S25.
S21: photograph the probe and the guide frame with the stereoscopic vision high-precision positioning camera to obtain, in the camera coordinate system, the probe coordinate system T(CAMERA, PROBE) in an arbitrary pose and the guide frame coordinate system T(CAMERA, GUIDE), where each T is a 4 × 4 homogeneous matrix. This comprises the following steps 1-3.
Step 1, respectively arranging a first reflective target ball and a second reflective target ball on the probe and the guide frame;
step 2, keeping the guide frame from moving back and forth, starting the motor to drive the probe through one full rotation, and using the stereoscopic vision high-precision positioning camera to record a plurality of points on the motion trail of the first reflective target ball as well as the positions of the first and second reflective target balls after the rotation;
step 3, fitting the plurality of points to a circle, whose centre serves as the origin of the guide frame coordinate system and the initial probe coordinate system and whose axis is the Z axis of both; then combining the origin and Z axis with the positions of the first and second reflective target balls after rotation to obtain the probe coordinate system T(CAMERA, PROBE) in an arbitrary pose and the guide frame coordinate system T(CAMERA, GUIDE) in the camera coordinate system.
Specifically, the initial probe coordinate system and the guide frame coordinate system are obtained from the origin, the Z axis and the positions of the first and second reflective target balls: since the origin, the target-ball positions and the Z axis are all determined, both coordinate systems are determined.
From the initial probe coordinate system and the guide frame coordinate system, the initial rotation angle between them is acquired while the probe is in its initial pose.
When the probe then moves to a given pose, the motion variables of its back-and-forth translation and its rotation about the Z axis are acquired and corrected with the initial rotation angle to obtain the probe coordinate system T(CAMERA, PROBE) in an arbitrary pose.
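The fit in step 3 can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation: the function name and the use of an SVD plane fit plus a Kasa least-squares circle fit are assumptions; the input is taken to be the (N, 3) array of trail points recorded by the positioning camera.

```python
import numpy as np

def fit_circle_3d(points):
    """Fit a circle to 3-D points; return (center, axis, radius).

    The supporting plane is found by SVD of the centred points; the
    in-plane centre comes from a Kasa linear least-squares circle fit.
    The centre gives the frame origin and the axis gives the Z axis of
    the guide-frame / initial-probe coordinate systems.
    """
    P = np.asarray(points, dtype=float)
    centroid = P.mean(axis=0)
    # Right singular vectors: two in-plane directions and the plane normal.
    _, _, vt = np.linalg.svd(P - centroid)
    u, v, normal = vt[0], vt[1], vt[2]
    q = (P - centroid) @ np.stack([u, v]).T        # (N, 2) in-plane coords
    # Kasa fit: x^2 + y^2 = 2*a*x + 2*b*y + c, solved by least squares.
    A = np.column_stack([2.0 * q, np.ones(len(q))])
    (a, b, c), *_ = np.linalg.lstsq(A, (q ** 2).sum(axis=1), rcond=None)
    center = centroid + a * u + b * v
    radius = np.sqrt(c + a * a + b * b)
    return center, normal, radius
```

In practice the recorded trail points are noisy, which is why a least-squares fit over many points is preferable to constructing the circle from three of them.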
S22: from the probe coordinate system T(CAMERA, PROBE) and the guide frame coordinate system T(CAMERA, GUIDE), obtain the transformation between the probe and guide frame coordinate systems, T(PROBE, GUIDE) = T(CAMERA, PROBE)⁻¹ · T(CAMERA, GUIDE).
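The invert-and-multiply pattern used in S22 (and again in S23 and S24) can be sketched as follows, assuming both frames have been measured in the camera frame as 4 × 4 homogeneous matrices; the function names are illustrative, not from the patent.

```python
import numpy as np

def make_transform(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def relative_transform(T_cam_x, T_cam_y):
    """T(X, Y) = T(CAMERA, X)^-1 . T(CAMERA, Y): maps frame-Y coordinates
    into frame X when both frames were measured by the same camera."""
    return np.linalg.inv(T_cam_x) @ T_cam_y
```

Because both measurements share the camera frame, the camera's own pose cancels out, which is what lets one fixed camera serve as the common reference for all the coordinate systems in fig. 1.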
S23: photograph the end of the mechanical arm with the stereoscopic vision high-precision positioning camera to obtain the transformation T(CAMERA, BASE) between the camera and mechanical arm base coordinate systems, and from it the transformation T(GUIDE, BASE) between the guide frame and mechanical arm base coordinate systems.
Specifically, a third reflective target ball is arranged at the end of the mechanical arm and photographed with the stereoscopic vision high-precision positioning camera to obtain the coordinates of the end of the mechanical arm in the camera coordinate system; the transformation T(CAMERA, BASE) then follows from the known transformation between the mechanical arm end and base coordinate systems.
Combining T(CAMERA, BASE) with the guide frame coordinate system T(CAMERA, GUIDE) yields the transformation between the guide frame and mechanical arm base coordinate systems, T(GUIDE, BASE) = T(CAMERA, GUIDE)⁻¹ · T(CAMERA, BASE).
S24: photograph the surgical execution end with the stereoscopic vision high-precision positioning camera to obtain its coordinate system T(CAMERA, WORK), and obtain the transformation T(END, WORK) between the surgical execution end and mechanical arm end coordinate systems, where T(BASE, END), the transformation between the mechanical arm end and base coordinate systems, is obtained directly from the link parameters of the mechanical arm.
Specifically, a fourth reflective target ball is arranged at the surgical execution end and photographed with the stereoscopic vision high-precision positioning camera to obtain the surgical execution end coordinate system T(CAMERA, WORK) in the camera coordinate system.
Converting T(CAMERA, WORK) with T(CAMERA, BASE) and T(BASE, END) yields the transformation between the surgical execution end and mechanical arm end coordinate systems, T(END, WORK) = (T(CAMERA, BASE) · T(BASE, END))⁻¹ · T(CAMERA, WORK).
S25: from the above, obtain the transformation between the probe and surgical execution end coordinate systems, T(PROBE, WORK) = T(PROBE, GUIDE) · T(GUIDE, BASE) · T(BASE, END) · T(END, WORK).
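The step-by-step multiplication of S25 reduces to a single matrix product over the chain of frame-to-frame transforms. A minimal sketch (the helper name is illustrative):

```python
import numpy as np
from functools import reduce

def chain(*transforms):
    """Multiply successive frame-to-frame 4x4 homogeneous transforms, e.g.
    T(PROBE, WORK) = T(PROBE, GUIDE) @ T(GUIDE, BASE) @ T(BASE, END) @ T(END, WORK).
    With no arguments the identity is returned."""
    return reduce(np.matmul, transforms, np.eye(4))
```

Note that the intermediate frames telescope: substituting the camera measurements from S22-S24 into the chain cancels every camera factor except T(CAMERA, PROBE)⁻¹ · T(CAMERA, WORK), which is a useful consistency check on the calibration.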
The method of the present application further transforms the relation between the probe and surgical execution end coordinate systems into a relation between the two-dimensional image and surgical execution end coordinate systems, i.e. the following step S3.
S3: acquire the transformation between a two-dimensional image coordinate system and the probe coordinate system, the two-dimensional image coordinate system coinciding with a coordinate plane of the probe coordinate system, and multiply the transformation between the probe and surgical execution end coordinate systems by the transformation between the two-dimensional image and probe coordinate systems to obtain the transformation between the two-dimensional image and surgical execution end coordinate systems, T(WORK, IMAGE) = T(PROBE, WORK)⁻¹ · T(PROBE, IMAGE).
The present application calibrates coordinate-system transformations by recording the representations of the same points in different coordinate systems. As shown in fig. 3, given two coordinate systems A and B, the coordinates of N identical spatial points are recorded in each, giving point sets {pᵢᴬ} and {pᵢᴮ}, i = 1…N. An auxiliary coordinate system C is constructed from the recorded point set, and the transformations T(A, C) and T(B, C) between coordinate systems A and B and the auxiliary system C are obtained from the point correspondences. The transformation between A and B then follows from T(A, B) = T(A, C) · T(B, C)⁻¹, which completes the calibration.
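A sketch of the auxiliary-frame method of fig. 3. The patent does not specify how the auxiliary coordinate system C is built from the point set; constructing it from the first three non-collinear points is one concrete choice, and all names here are illustrative.

```python
import numpy as np

def frame_from_points(p):
    """Auxiliary frame C built from the first three non-collinear points;
    returns T(F, C), where F is the frame the points are expressed in."""
    p = np.asarray(p, dtype=float)
    x = p[1] - p[0]
    x /= np.linalg.norm(x)
    z = np.cross(x, p[2] - p[0])
    z /= np.linalg.norm(z)
    y = np.cross(z, x)
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2], T[:3, 3] = x, y, z, p[0]
    return T

def calibrate(points_a, points_b):
    """T(A, B) from the same physical points recorded in frames A and B:
    T(A, B) = T(A, C) . T(B, C)^-1."""
    return frame_from_points(points_a) @ np.linalg.inv(frame_from_points(points_b))
```

The construction is equivariant under rigid motion, so with exact measurements the recovered T(A, B) is exact; with noisy measurements a least-squares registration over all N points would be the robust alternative.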
Once a point is obtained in the two-dimensional image coordinate system from a medical image, the above calibration converts it into the coordinate system of the surgical execution end, which is then guided to complete the corresponding operation.
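Converting such an image point is a single matrix-vector product. A minimal sketch, under the assumptions that T(WORK, IMAGE) has been obtained as above and that image coordinates are already metric (both the function name and that scaling assumption are mine, not the patent's):

```python
import numpy as np

def image_to_work(T_work_image, u, v):
    """Map a 2-D image point (u, v) into the surgical execution end frame.
    The image plane coincides with a coordinate plane of the probe frame,
    so the point's out-of-plane coordinate is 0."""
    p = T_work_image @ np.array([u, v, 0.0, 1.0])  # homogeneous point
    return p[:3]
```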
It will be understood by those skilled in the art that the foregoing is only a preferred embodiment of the present invention, and is not intended to limit the invention, and that any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the scope of the present invention.
Claims (5)
1. A coordinate calibration method based on an ultrasonic servo surgical robot, characterized in that the ultrasonic servo surgical robot comprises a mechanical arm, a surgical execution end arranged at the end of the mechanical arm, a guide frame, a motor arranged on the guide frame, and a probe driven in rotation by the motor, the method comprising the following steps:
s1: arranging a stereoscopic vision high-precision positioning camera obliquely above the surgical execution end and the probe, and acquiring the coordinate system of this camera, i.e. the camera coordinate system;
s2: photographing the guide frame, the probe, the end of the mechanical arm and the surgical execution end with the stereoscopic vision high-precision positioning camera to obtain, in the camera coordinate system, the guide frame coordinate system, the probe coordinate system, the mechanical arm end coordinate system and the surgical execution end coordinate system; obtaining from these the transformations between the probe and guide frame coordinate systems, between the guide frame and mechanical arm base coordinate systems, and between the surgical execution end and mechanical arm end coordinate systems; and multiplying these transformations step by step to obtain the transformation between the probe and surgical execution end coordinate systems, specifically comprising:
s21: photographing the probe and the guide frame with the stereoscopic vision high-precision positioning camera to obtain, in the camera coordinate system, the probe coordinate system T(CAMERA, PROBE) in an arbitrary pose and the guide frame coordinate system T(CAMERA, GUIDE), where each T is a 4 × 4 homogeneous matrix and T(X, Y) denotes the transform taking coordinates expressed in frame Y into frame X; step S21 specifically comprises:
step 1, respectively arranging a first reflective target ball and a second reflective target ball on the probe and the guide frame;
step 2, keeping the guide frame from moving back and forth, starting a motor to drive the probe to rotate for a circle, simultaneously recording a plurality of points on the motion trail of the first reflective target ball by using the stereoscopic vision high-precision positioning camera, and recording the positions of the first reflective target ball and the second reflective target ball after rotation by using the stereoscopic vision high-precision positioning camera;
step 3, fitting the plurality of points to obtain a circle center serving as an origin of the guide frame coordinate system and the initial probe coordinate system, wherein a circle axis obtained by fitting is a Z axis of the guide frame coordinate system and the initial probe coordinate system; obtaining a probe coordinate system under any pose under a camera coordinate system by combining the positions of the first reflective target ball and the second reflective target ball after rotationAnd a guide frame coordinate system
S22: based on the probe coordinate systemAnd a guide frame coordinate systemObtaining a transformation relation between a probe coordinate system and a guide frame coordinate system
S23: the stereoscopic vision high-precision positioning camera is adopted to photograph the tail end of the mechanical arm to obtain the transformation relation between the camera coordinate system and the mechanical arm base coordinate systemFurther obtaining the transformation relation between the mechanical arm base coordinate system and the guide frame coordinate system
S24: the stereoscopic vision high-precision positioning camera is adopted to photograph the operation execution tail end to obtain the coordinate system of the operation execution tail endObtaining a transformation relation between the coordinate system of the surgical execution end and the coordinate system of the end of the mechanical arm Wherein,directly acquiring the transformation relation between a mechanical arm tail end coordinate system and a mechanical arm base coordinate system according to the connecting rod parameters of the mechanical arm;
S25: obtaining, from the transformation relations above, the transformation relation between the probe coordinate system and the surgical execution tail end coordinate system, T_SP = T_ES⁻¹ · T_BE⁻¹ · T_BG · T_GP (where T_XY denotes the pose of coordinate system Y expressed in coordinate system X; C: camera, P: probe, G: guide frame, B: mechanical arm base, E: mechanical arm tail end, S: surgical execution tail end);
S3: acquiring the transformation relation T_PI between a two-dimensional image coordinate system and the probe coordinate system, wherein the two-dimensional image coordinate system coincides with a coordinate plane of the probe coordinate system, and multiplying the transformation relation between the probe coordinate system and the surgical execution tail end coordinate system by the transformation relation between the two-dimensional image coordinate system and the probe coordinate system to obtain the transformation relation between the two-dimensional image coordinate system and the surgical execution tail end coordinate system, T_SI = T_SP · T_PI.
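The chain of homogeneous transformations recited in steps S22 to S25 can be sketched as follows, writing T_XY for the pose of frame {Y} expressed in frame {X}. The frame letters (C camera, P probe, G guide frame, B arm base, E arm end, S surgical execution end) and function names are illustrative, not from the patent:

```python
import numpy as np

def make_T(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def inv_T(T):
    """Invert a rigid transform without a general matrix inverse."""
    R, t = T[:3, :3], T[:3, 3]
    return make_T(R.T, -R.T @ t)

def probe_in_exec_end(T_CP, T_CG, T_CB, T_CS, T_BE):
    """Compose the claimed chain: probe frame {P} expressed in the
    surgical-execution-end frame {S} (steps S22-S25)."""
    T_GP = inv_T(T_CG) @ T_CP                       # S22: probe in guide frame
    T_BG = inv_T(T_CB) @ T_CG                       # S23: guide frame in arm base
    T_ES = inv_T(T_BE) @ inv_T(T_CB) @ T_CS         # S24: exec end in arm end
    T_SP = inv_T(T_ES) @ inv_T(T_BE) @ T_BG @ T_GP  # S25
    return T_SP
```

Composed this way the chain collapses algebraically to T_SP = T_CS⁻¹ · T_CP, which gives a quick consistency check on a calibration run.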
2. The method according to claim 1, wherein step S3 comprises:
defining a coordinate plane of the probe coordinate system to coincide with the two-dimensional image coordinate system, thereby obtaining the transformation relation T_PI between the two-dimensional image coordinate system and the probe coordinate system, and multiplying the transformation relation T_SP between the probe coordinate system and the surgical execution tail end coordinate system by the transformation relation T_PI to obtain the transformation relation T_SI = T_SP · T_PI between the two-dimensional image coordinate system and the surgical execution tail end coordinate system.
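Because claim 2 makes the image plane coincide with a coordinate plane of the probe frame, the image-to-probe transformation reduces to lifting a scaled pixel into that plane. A minimal sketch, in which the isotropic pixel spacing `s` and the function name are assumptions not found in the patent:

```python
import numpy as np

def pixel_to_exec_end(u, v, s, T_SP):
    """Map a 2D ultrasound pixel (u, v) to 3D coordinates in the
    surgical-execution-end frame {S}, assuming the image plane coincides
    with the X-Y plane of the probe frame {P} and pixel spacing s
    (mm per pixel); the image-to-probe step is then a pure scaling."""
    p_P = np.array([u * s, v * s, 0.0, 1.0])  # pixel lifted into probe frame
    return (T_SP @ p_P)[:3]
```

With T_SP equal to the identity, pixel (10, 20) at 0.1 mm/px maps to the probe-frame point (1.0, 2.0, 0.0).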
3. The method according to claim 1, wherein obtaining, in step 3, the probe coordinate system T_CP in any pose and the guide frame coordinate system T_CG in the camera coordinate system by combining the positions of the first and second reflective target balls after the rotation specifically comprises:
obtaining the initial probe coordinate system T_CP0 of the probe and the guide frame coordinate system T_CG from the origin, the Z axis, and the positions of the first and second reflective target balls;
acquiring, from the initial probe coordinate system T_CP0 and the guide frame coordinate system T_CG, the initial rotation angle between the initial probe coordinate system and the guide frame coordinate system when the probe is in its initial pose;
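The circle fit of step 3 (origin from the fitted centre, Z axis from the fitted circle axis) can be sketched as below. The plane-then-circle decomposition is one common way to fit a 3D circle and is an assumption, not the patent's prescribed solver:

```python
import numpy as np

def fit_rotation_axis(points):
    """From points sampled on a target ball's circular trajectory, recover
    the circle centre (frame origin) and circle axis (Z axis).
    Plane fit by SVD; in-plane circle centre by linear least squares."""
    P = np.asarray(points, float)
    centroid = P.mean(axis=0)
    # normal of best-fit plane = singular vector of smallest singular value
    _, _, Vt = np.linalg.svd(P - centroid)
    n = Vt[2]
    u, w = Vt[0], Vt[1]                          # in-plane basis
    xy = (P - centroid) @ np.column_stack([u, w])
    # circle: x^2 + y^2 + D*x + E*y + F = 0, linear in (D, E, F)
    A = np.column_stack([xy, np.ones(len(xy))])
    b = -(xy ** 2).sum(axis=1)
    (D, E, F), *_ = np.linalg.lstsq(A, b, rcond=None)
    cx, cy = -D / 2, -E / 2
    centre = centroid + cx * u + cy * w
    return centre, n
```

Note that the recovered axis is defined only up to sign; the second target ball in the claim is what disambiguates the remaining in-plane orientation.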
4. The method according to claim 1, wherein step S23 specifically comprises:
arranging a third reflective target ball at the tail end of the mechanical arm, photographing the third reflective target ball with the stereoscopic vision high-precision positioning camera to obtain the coordinates of the mechanical arm tail end in the camera coordinate system, and simultaneously obtaining, according to the transformation relation T_BE between the mechanical arm tail end coordinate system and the mechanical arm base coordinate system, the transformation relation T_CB between the camera coordinate system and the mechanical arm base coordinate system.
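One common way to realize step S23 is to move the arm through several poses, record the third target ball both in the camera frame and (via the known kinematic transformation to the base) in the base frame, and solve the rigid registration by SVD, i.e. the Kabsch method. This is an assumed implementation route, not the patent's recited one:

```python
import numpy as np

def rigid_register(Q_cam, Q_base):
    """Kabsch/SVD rigid registration: find R, t with Q_cam ≈ R @ Q_base + t,
    i.e. the transform T_CB taking base-frame points into the camera frame.
    Q_cam, Q_base: (N, 3) arrays of corresponding target-ball positions."""
    Pc = np.asarray(Q_cam, float)
    Pb = np.asarray(Q_base, float)
    mc, mb = Pc.mean(axis=0), Pb.mean(axis=0)
    H = (Pb - mb).T @ (Pc - mc)          # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mc - R @ mb
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T
```

At least three non-collinear correspondences are needed; in practice more poses are collected and the least-squares fit averages out measurement noise.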
5. The method according to claim 1, wherein step S24 specifically comprises:
arranging a fourth reflective target ball at the surgical execution tail end, and photographing the fourth reflective target ball with the stereoscopic vision high-precision positioning camera to obtain the surgical execution tail end coordinate system T_CS of the surgical execution tail end in the camera coordinate system.
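Optical trackers typically recover a full pose such as T_CS from a rigid cluster of several reflective balls rather than a single one. A minimal frame construction from three non-collinear ball positions (an illustrative assumption going beyond the single fourth ball recited in claim 5):

```python
import numpy as np

def frame_from_markers(p0, p1, p2):
    """Build a right-handed orthonormal frame from three non-collinear
    marker positions: origin at p0, X toward p1, Z normal to the marker
    plane (Gram-Schmidt). Returns a 4x4 pose in the camera frame."""
    p0, p1, p2 = (np.asarray(p, float) for p in (p0, p1, p2))
    x = p1 - p0
    x /= np.linalg.norm(x)
    z = np.cross(x, p2 - p0)
    z /= np.linalg.norm(z)
    y = np.cross(z, x)
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2], T[:3, 3] = x, y, z, p0
    return T
```

The resulting rotation block is orthonormal by construction, so the returned matrix can be composed directly with the other transforms in the calibration chain.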
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110808016.5A CN113547515B (en) | 2021-07-16 | 2021-07-16 | Coordinate calibration method based on ultrasonic servo surgical robot |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113547515A CN113547515A (en) | 2021-10-26 |
CN113547515B true CN113547515B (en) | 2022-07-12 |
Family
ID=78131970
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110808016.5A Active CN113547515B (en) | 2021-07-16 | 2021-07-16 | Coordinate calibration method based on ultrasonic servo surgical robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113547515B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114137085A (en) * | 2021-12-01 | 2022-03-04 | 仲恺农业工程学院 | Ultrasonic flaw detection robot based on vision-assisted positioning and detection method thereof |
CN115153782A (en) * | 2022-08-12 | 2022-10-11 | 哈尔滨理工大学 | Puncture robot space registration method under ultrasonic guidance |
CN116473681B (en) * | 2023-03-28 | 2024-02-20 | 北京维卓致远医疗科技发展有限责任公司 | Control system and method of surgical robot |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005342832A (en) * | 2004-06-02 | 2005-12-15 | Fanuc Ltd | Robot system |
CN103706568A (en) * | 2013-11-26 | 2014-04-09 | 中国船舶重工集团公司第七一六研究所 | System and method for machine vision-based robot sorting |
CN104858870A (en) * | 2015-05-15 | 2015-08-26 | 江南大学 | Industrial robot measurement method based on tail end numbered tool |
JP2016007277A (en) * | 2014-06-23 | 2016-01-18 | 公立大学法人公立はこだて未来大学 | Surgery support device and surgery support system |
CN105716525A (en) * | 2016-03-30 | 2016-06-29 | 西北工业大学 | Robot end effector coordinate system calibration method based on laser tracker |
CN108778179A (en) * | 2016-02-26 | 2018-11-09 | 思想外科有限公司 | Method and system for instructing user positioning robot |
CN109199586A (en) * | 2018-11-09 | 2019-01-15 | 山东大学 | A kind of laser bone-culting operation robot system and its paths planning method |
CN110202560A (en) * | 2019-07-12 | 2019-09-06 | 易思维(杭州)科技有限公司 | A kind of hand and eye calibrating method based on single feature point |
CN110225720A (en) * | 2017-01-13 | 2019-09-10 | 株式会社卓越牵引力 | Operation auxiliary device and its control method, program and surgical assistant system |
CN110279467A (en) * | 2019-06-19 | 2019-09-27 | 天津大学 | Ultrasound image under optical alignment and information fusion method in the art of puncture biopsy needle |
CN111070199A (en) * | 2018-10-18 | 2020-04-28 | 杭州海康威视数字技术股份有限公司 | Hand-eye calibration assessment method and robot |
CN112525074A (en) * | 2020-11-24 | 2021-03-19 | 杭州素问九州医疗科技有限公司 | Calibration method, calibration system, robot, computer device and navigation system |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI672206B (en) * | 2018-12-19 | 2019-09-21 | 財團法人工業技術研究院 | Method and apparatus of non-contact tool center point calibration for a mechanical arm, and a mechanical arm system with said calibration function |
Non-Patent Citations (1)
Title |
---|
Multimodal perception-based control of a prostate interventional surgery robot; Liu Bojian; China Master's Theses Full-text Database, Engineering Science and Technology II; 20190815; pp. 13, 16-28 *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113547515B (en) | Coordinate calibration method based on ultrasonic servo surgical robot | |
CN110051436B (en) | Automated cooperative work assembly and application thereof in surgical instrument | |
CN111801198B (en) | Hand-eye calibration method, system and computer storage medium | |
CN111156925B (en) | Three-dimensional measurement method for large component based on line structured light and industrial robot | |
CN110728715B (en) | Intelligent inspection robot camera angle self-adaptive adjustment method | |
CN107883929B (en) | Monocular vision positioning device and method based on multi-joint mechanical arm | |
CN113133832B (en) | Calibration method and system for double-arm robot puncture system | |
CN114289934B (en) | Automatic welding system and method for large structural part based on three-dimensional vision | |
CN109974584A (en) | The calibration system and scaling method of a kind of auxiliary laser bone-culting operation robot | |
CN111300384B (en) | Registration system and method for robot augmented reality teaching based on identification card movement | |
CN109794963A (en) | A kind of robot method for rapidly positioning towards curved surface member | |
WO2020063058A1 (en) | Calibration method for multi-degree-of-freedom movable vision system | |
CN113208731B (en) | Binocular vision system-based hand and eye calibration method for surgical puncture robot | |
CN109900251A (en) | A kind of robotic positioning device and method of view-based access control model technology | |
CN113305851A (en) | Online detection device for robot micro-assembly | |
CN112958974A (en) | Interactive automatic welding system based on three-dimensional vision | |
CN114670199B (en) | Identification positioning device, system and real-time tracking system | |
CN108347561B (en) | Laser guide scanning system and scanning method | |
CN114777676A (en) | Self-adaptive terahertz three-dimensional tomography device and method | |
CN110961583A (en) | Steel ladle positioning device adopting laser scanning and using method thereof | |
CN212256370U (en) | Optical motion capture system | |
US20230181263A1 (en) | Dynamic 3d scanning robotic laparoscope | |
CN111862170A (en) | Optical motion capture system and method | |
CN114046889B (en) | Automatic calibration method for infrared camera | |
CN112743546B (en) | Robot hand-eye calibration pose selection method and device, robot system and medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||