
CN113547515B - Coordinate calibration method based on ultrasonic servo surgical robot - Google Patents


Info

Publication number
CN113547515B
CN113547515B
Authority
CN
China
Prior art keywords
coordinate system
probe
tail end
mechanical arm
guide frame
Prior art date
Legal status
Active
Application number
CN202110808016.5A
Other languages
Chinese (zh)
Other versions
CN113547515A (English)
Inventor
赵兴炜
郑果
陶波
凌青
Current Assignee
Huazhong University of Science and Technology
Original Assignee
Huazhong University of Science and Technology
Priority date
Filing date
Publication date
Application filed by Huazhong University of Science and Technology
Priority to CN202110808016.5A
Publication of CN113547515A
Application granted
Publication of CN113547515B
Legal status: Active

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/02Programme-controlled manipulators characterised by movement of the arms, e.g. cartesian coordinate type
    • B25J9/04Programme-controlled manipulators characterised by movement of the arms, e.g. cartesian coordinate type by rotating at least one arm, excluding the head movement itself, e.g. cylindrical coordinate type or polar coordinate type
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J18/00Arms
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/021Optical sensing devices
    • B25J19/023Optical sensing devices including video camera means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Manipulator (AREA)

Abstract

The invention belongs to the technical field of surgical robots and discloses a coordinate calibration method based on an ultrasonic servo surgical robot, comprising the following steps: a stereoscopic vision high-precision positioning camera is arranged obliquely above the surgical execution end and the probe, and its coordinate system (the camera coordinate system) is acquired; the camera then photographs the guide frame, the probe, the end of the mechanical arm and the surgical execution end respectively, obtaining the guide frame, probe, mechanical arm and surgical execution end coordinate systems in the camera coordinate system; from these, the transformations between the probe and the guide frame, between the guide frame and the mechanical arm base, and between the surgical execution end and the mechanical arm end coordinate systems are obtained; multiplying these transformations step by step yields the transformation between the probe coordinate system and the surgical execution end coordinate system. The method calibrates the coordinates of the detection end and the surgical execution end intuitively and quickly, unifying the coordinates of the two ends.

Description

Coordinate calibration method based on ultrasonic servo surgical robot
Technical Field
The invention belongs to the technical field of surgical robots, and particularly relates to a coordinate calibration method based on an ultrasonic servo surgical robot.
Background
Benign prostatic hyperplasia is one of the most common causes of urinary dysfunction in men, and its incidence is highest among the elderly: statistics show a morbidity of about 16%-25% in men aged 50-65, rising to 30%-46% in men over 70 and increasing further with age. Transurethral resection of the prostate is currently the main treatment for prostatic hyperplasia, but the operation raises problems of safety, effectiveness and consistency. As surgical robot technology gradually matures, the prostate surgical robot offers a brand-new approach to treating prostatic hyperplasia. In the prior art, however, an ultrasonic probe detects images in real time but cannot be well synchronized with the robot during the operation; the detected images are not fed back to the robot execution end in time, and the surgical robot therefore operates slowly.
Disclosure of Invention
Aiming at the defects or improvement requirements of the prior art, the invention provides a coordinate calibration method based on an ultrasonic servo surgical robot, which intuitively and quickly calibrates the coordinates of the detection end and the surgical execution end and unifies the coordinates of the two ends.
To achieve the above object, according to one aspect of the present invention, there is provided a coordinate calibration method based on an ultrasonic servo surgical robot, the robot comprising a mechanical arm, a surgical execution end disposed at the end of the mechanical arm, a guide frame, a motor disposed on the guide frame, and a probe driven in rotation by the motor. The method comprises: S1: arranging a stereoscopic vision high-precision positioning camera obliquely above the surgical execution end and the probe, and acquiring its coordinate system, namely the camera coordinate system; S2: photographing the guide frame, the probe, the end of the mechanical arm and the surgical execution end respectively with the camera to obtain the guide frame, probe, mechanical arm and surgical execution end coordinate systems in the camera coordinate system; from these, obtaining respectively the transformations between the probe and guide frame coordinate systems, between the guide frame and mechanical arm base coordinate systems, and between the surgical execution end and mechanical arm end coordinate systems; and multiplying these transformations step by step to obtain the transformation between the probe coordinate system and the surgical execution end coordinate system.
Preferably, the method further comprises: S3: acquiring the transformation between a two-dimensional image coordinate system and the probe coordinate system, the two-dimensional image coordinate system coinciding with a coordinate plane of the probe coordinate system, and multiplying the probe-to-surgical-execution-end transformation by the image-to-probe transformation to obtain the transformation between the two-dimensional image coordinate system and the surgical execution end coordinate system.
Preferably, step S2 comprises:

S21: photographing the probe and the guide frame with the stereoscopic vision high-precision positioning camera to obtain, in the camera coordinate system, the probe coordinate system at an arbitrary pose $^{cam}T_{probe}$ and the guide frame coordinate system $^{cam}T_{guide}$, where each $T$ is a 4 × 4 homogeneous matrix;

S22: from $^{cam}T_{probe}$ and $^{cam}T_{guide}$, obtaining the transformation between the probe and guide frame coordinate systems $^{guide}T_{probe} = (^{cam}T_{guide})^{-1}\,{}^{cam}T_{probe}$;

S23: photographing the end of the mechanical arm to obtain the transformation between the camera and mechanical arm base coordinate systems $^{cam}T_{base}$, and further obtaining the transformation between the mechanical arm base and guide frame coordinate systems $^{base}T_{guide} = (^{cam}T_{base})^{-1}\,{}^{cam}T_{guide}$;

S24: photographing the surgical execution end with the camera to obtain the surgical execution end coordinate system $^{cam}T_{work}$, then obtaining the transformation between the surgical execution end and mechanical arm end coordinate systems $^{end}T_{work} = (^{cam}T_{base}\,{}^{base}T_{end})^{-1}\,{}^{cam}T_{work}$, where $^{base}T_{end}$, the transformation between the mechanical arm end and base coordinate systems, is acquired directly from the geometric structure of the mechanical arm;

S25: from the above, obtaining the transformation between the probe and surgical execution end coordinate systems $^{work}T_{probe} = (^{end}T_{work})^{-1}\,(^{base}T_{end})^{-1}\,{}^{base}T_{guide}\,{}^{guide}T_{probe}$.
Preferably, step S3 comprises: defining a coordinate plane of the probe coordinate system to coincide with the two-dimensional image coordinate system, then obtaining the transformation between the two-dimensional image and probe coordinate systems $^{probe}T_{image}$, and multiplying the probe-to-surgical-execution-end transformation by it to obtain the transformation between the two-dimensional image and surgical execution end coordinate systems $^{work}T_{image} = {}^{work}T_{probe}\,{}^{probe}T_{image}$.
Preferably, step S21 specifically comprises: step 1, arranging a first reflective target ball and a second reflective target ball on the probe and the guide frame respectively; step 2, keeping the guide frame from moving back and forth, starting the motor to drive the probe through one full rotation, while recording with the stereoscopic vision high-precision positioning camera a plurality of points on the motion trajectory of the first reflective target ball, and recording the positions of the first and second reflective target balls after the rotation; step 3, fitting the plurality of points to obtain a circle whose centre serves as the origin of the guide frame coordinate system and the initial probe coordinate system and whose axis is the Z axis of both; combining these with the post-rotation positions of the first and second reflective target balls to obtain the probe coordinate system at an arbitrary pose in the camera coordinate system $^{cam}T_{probe}$ and the guide frame coordinate system $^{cam}T_{guide}$.
Preferably, obtaining $^{cam}T_{probe}$ and $^{cam}T_{guide}$ in step 3 by combining the post-rotation positions of the first and second reflective target balls specifically comprises: obtaining the initial probe coordinate system $^{cam}T_{probe,0}$ and the guide frame coordinate system $^{cam}T_{guide}$ from the origin, the Z axis and the positions of the first and second reflective target balls; from $^{cam}T_{probe,0}$ and $^{cam}T_{guide}$, acquiring the initial rotation angle between the initial probe coordinate system and the guide frame coordinate system; and, when the probe moves to a given pose, acquiring its motion variables of back-and-forth translation and rotation about the Z axis and correcting them with the initial rotation angle to obtain the probe coordinate system at the arbitrary pose $^{cam}T_{probe}$.
Preferably, step S23 specifically comprises: arranging a third reflective target ball at the end of the mechanical arm; photographing it with the stereoscopic vision high-precision positioning camera to obtain the coordinates of the mechanical arm end in the camera coordinate system and, together with the transformation $^{base}T_{end}$ between the mechanical arm end and base coordinate systems, obtaining the transformation between the camera and mechanical arm base coordinate systems $^{cam}T_{base}$; and multiplying $^{cam}T_{base}$ and the guide frame coordinate system $^{cam}T_{guide}$ to obtain the transformation between the mechanical arm base and guide frame coordinate systems $^{base}T_{guide} = (^{cam}T_{base})^{-1}\,{}^{cam}T_{guide}$.
Preferably, step S24 specifically comprises: arranging a fourth reflective target ball at the surgical execution end; photographing it with the stereoscopic vision high-precision positioning camera to obtain the surgical execution end coordinate system in the camera coordinate system $^{cam}T_{work}$; and converting $^{cam}T_{work}$ with $^{cam}T_{base}$ and $^{base}T_{end}$ to obtain the transformation between the surgical execution end and mechanical arm end coordinate systems $^{end}T_{work} = (^{cam}T_{base}\,{}^{base}T_{end})^{-1}\,{}^{cam}T_{work}$.
Generally, compared with the prior art, the coordinate calibration method based on the ultrasonic servo surgical robot provided by the invention has the following beneficial effects:
1. A camera is arranged in the working space of the robot and its coordinate system is used as a common reference; the coordinate systems of the different parts of the surgical robot captured by the camera can then be transformed into one another, yielding the coordinate calibration between the probe and the execution end.
2. Because the two-dimensional image coordinate system coincides with a plane of the probe coordinate system, the transformation between the two-dimensional image and the surgical execution end is easily obtained from the transformation between the probe and the execution end, so that the two-dimensional image coordinate system and the surgical execution end coordinate system are calibrated in a unified manner.
Drawings
FIG. 1 is a schematic diagram of coordinate calibration of an ultrasonic servo-based surgical robot according to the present embodiment;
FIG. 2 is a step diagram of a coordinate calibration method based on an ultrasonic servo surgical robot according to the present embodiment;
fig. 3 is a schematic diagram of the coordinate calibration method according to the embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
Referring to fig. 1, the present invention provides a coordinate calibration method based on an ultrasonic servo surgical robot. The ultrasonic servo surgical robot comprises a mechanical arm, a surgical execution end disposed at the end of the mechanical arm, a guide frame, a motor disposed on the guide frame, and a probe; during operation the motor drives the probe to rotate. The calibration of the present application involves a two-dimensional IMAGE coordinate system, a PROBE coordinate system, a GUIDE frame coordinate system, a mechanical arm BASE coordinate system, a mechanical arm END coordinate system, a continuum execution coordinate system (WORK) and a CAMERA coordinate system; the relationships between the coordinate systems are shown in fig. 1. Because the two-dimensional image coordinate system coincides with the YZ or XZ plane of the three-dimensional probe coordinate system, it is not marked separately in the figure.
As shown in FIG. 2, the method includes the following steps S1-S2.
S1: A stereoscopic vision high-precision positioning camera is arranged obliquely above the surgical execution end and the probe, and its coordinate system, namely the camera coordinate system, is acquired.
The camera coordinate system serves as the absolute reference coordinate system.
S2: The stereoscopic vision high-precision positioning camera is adopted to photograph the guide frame, the probe, the end of the mechanical arm and the surgical execution end respectively, obtaining the guide frame, probe, mechanical arm and surgical execution end coordinate systems in the camera coordinate system. From these, the transformations between the probe and guide frame coordinate systems, between the guide frame and mechanical arm base coordinate systems, and between the surgical execution end and mechanical arm end coordinate systems are obtained respectively, and multiplying these transformations step by step gives the transformation between the probe coordinate system and the surgical execution end coordinate system.
Specifically, the method comprises steps S21-S25.
S21: The stereoscopic vision high-precision positioning camera is adopted to photograph the probe and the guide frame, obtaining, in the camera coordinate system, the probe coordinate system at an arbitrary pose $^{cam}T_{probe}$ and the guide frame coordinate system $^{cam}T_{guide}$, where each $T$ is a 4 × 4 homogeneous matrix. This specifically comprises the following steps 1-3.
Step 1, respectively arranging a first reflective target ball and a second reflective target ball on the probe and the guide frame;
step 2, keeping the guide frame from moving back and forth, starting a motor to drive the probe to rotate for a circle, simultaneously recording a plurality of points on the motion trail of the first reflective target ball by using the stereoscopic vision high-precision positioning camera, and recording the positions of the first reflective target ball and the second reflective target ball after rotation by using the stereoscopic vision high-precision positioning camera;
Step 3: fitting the plurality of points yields a circle whose centre serves as the origin of the guide frame coordinate system and the initial probe coordinate system, and whose axis is the Z axis of both; combining these with the post-rotation positions of the first and second reflective target balls yields the probe coordinate system at an arbitrary pose in the camera coordinate system $^{cam}T_{probe}$ and the guide frame coordinate system $^{cam}T_{guide}$.
Specifically, the initial probe coordinate system $^{cam}T_{probe,0}$ and the guide frame coordinate system $^{cam}T_{guide}$ are obtained from the origin, the Z axis and the positions of the first and second reflective target balls: once the origin, the Z axis and the target ball positions are fixed, both frames are fully determined. From $^{cam}T_{probe,0}$ and $^{cam}T_{guide}$, the initial rotation angle between the initial probe coordinate system and the guide frame coordinate system is acquired at the initial pose. When the probe then moves to a given pose, its motion variables (back-and-forth translation and rotation about the Z axis) are acquired and corrected with the initial rotation angle to obtain the probe coordinate system at the arbitrary pose $^{cam}T_{probe}$.
S22: From the probe coordinate system $^{cam}T_{probe}$ and the guide frame coordinate system $^{cam}T_{guide}$, the transformation between the probe and guide frame coordinate systems is obtained as $^{guide}T_{probe} = (^{cam}T_{guide})^{-1}\,{}^{cam}T_{probe}$.
S23: Photographing the end of the mechanical arm with the stereoscopic vision high-precision positioning camera yields the transformation between the camera and mechanical arm base coordinate systems $^{cam}T_{base}$, and from it the transformation between the mechanical arm base and guide frame coordinate systems $^{base}T_{guide}$. Specifically, a third reflective target ball is arranged at the end of the mechanical arm and photographed with the camera to obtain the coordinates of the mechanical arm end in the camera coordinate system; combining these with the transformation $^{base}T_{end}$ between the mechanical arm end and base coordinate systems gives $^{cam}T_{base}$. Multiplying $^{cam}T_{base}$ and the guide frame coordinate system $^{cam}T_{guide}$ then gives $^{base}T_{guide} = (^{cam}T_{base})^{-1}\,{}^{cam}T_{guide}$.
S24: Photographing the surgical execution end with the stereoscopic vision high-precision positioning camera yields its coordinate system $^{cam}T_{work}$, from which the transformation between the surgical execution end and mechanical arm end coordinate systems $^{end}T_{work}$ is obtained. Here $^{base}T_{end}$, the transformation between the mechanical arm end and base coordinate systems, is acquired directly from the link parameters of the mechanical arm. Specifically, a fourth reflective target ball is arranged at the surgical execution end and photographed with the camera to obtain the surgical execution end coordinate system in the camera coordinate system $^{cam}T_{work}$; converting it with $^{cam}T_{base}$ and $^{base}T_{end}$ gives $^{end}T_{work} = (^{cam}T_{base}\,{}^{base}T_{end})^{-1}\,{}^{cam}T_{work}$.
S25: From the above transformations, the transformation between the probe coordinate system and the surgical execution end coordinate system is obtained as $^{work}T_{probe} = (^{end}T_{work})^{-1}\,(^{base}T_{end})^{-1}\,{}^{base}T_{guide}\,{}^{guide}T_{probe}$.
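Steps S21-S25 reduce to products of 4 × 4 homogeneous matrices. The chain can be sketched as below; the frame names follow the text, but the numeric values are hypothetical placeholders standing in for camera observations and arm kinematics, not measurements from the patent.

```python
import numpy as np

def hom(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def inv(T):
    """Invert a rigid transform using R^T instead of a general matrix inverse."""
    R, t = T[:3, :3], T[:3, 3]
    return hom(R.T, -R.T @ t)

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Placeholder camera observations of each frame (S21, S23, S24) and the
# arm's forward kinematics (base -> end); values are illustrative only.
cam_T_probe = hom(rot_z(0.3), [0.10, 0.20, 0.50])
cam_T_guide = hom(rot_z(0.1), [0.10, 0.20, 0.40])
cam_T_base  = hom(rot_z(-0.5), [1.00, 0.00, 0.00])
cam_T_work  = hom(rot_z(0.8), [0.40, 0.10, 0.60])
base_T_end  = hom(rot_z(0.2), [0.00, 0.30, 0.90])

# S22: probe relative to the guide frame.
guide_T_probe = inv(cam_T_guide) @ cam_T_probe
# S23: guide frame relative to the mechanical arm base.
base_T_guide = inv(cam_T_base) @ cam_T_guide
# S24: surgical execution end relative to the arm end.
end_T_work = inv(cam_T_base @ base_T_end) @ cam_T_work
# S25: chain everything to relate the probe to the execution end.
work_T_probe = inv(end_T_work) @ inv(base_T_end) @ base_T_guide @ guide_T_probe
```

A useful consistency check: re-expressing the probe in the camera frame through the work frame must reproduce the direct observation, i.e. `cam_T_work @ work_T_probe` equals `cam_T_probe` up to numerical error.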
The method of the present application further converts the transformation between the probe coordinate system and the surgical execution end coordinate system into one between the two-dimensional image coordinate system and the surgical execution end coordinate system, as step S3 below.

S3: Acquire the transformation between a two-dimensional image coordinate system and the probe coordinate system, the two-dimensional image coordinate system coinciding with a coordinate plane of the probe coordinate system, and multiply the probe-to-surgical-execution-end transformation by the image-to-probe transformation to obtain the transformation between the two-dimensional image coordinate system and the surgical execution end coordinate system, $^{work}T_{image} = {}^{work}T_{probe}\,{}^{probe}T_{image}$.
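As a sketch of how S3 is used, assume the image plane sits on the probe frame's YZ plane (one of the two options the text allows) and take an illustrative `work_T_probe` as a stand-in for the S25 result; a pixel already scaled to millimetres then maps into the execution end frame by one matrix product:

```python
import numpy as np

# probe_T_image: rotation sending image x to probe y and image y to probe z,
# so the image plane coincides with the probe's YZ plane (assumed convention).
probe_T_image = np.eye(4)
probe_T_image[:3, :3] = np.array([[0.0, 0.0, 1.0],
                                  [1.0, 0.0, 0.0],
                                  [0.0, 1.0, 0.0]])

# Illustrative placeholder for the S25 calibration result: the work frame
# is pretended to be offset 10 mm along x from the probe frame.
work_T_probe = np.eye(4)
work_T_probe[:3, 3] = [10.0, 0.0, 0.0]

# S3: compose once, then reuse for every detected image point.
work_T_image = work_T_probe @ probe_T_image

def image_to_work(u_mm, v_mm):
    """Map an image point (already scaled to mm) into the execution end frame."""
    p = work_T_image @ np.array([u_mm, v_mm, 0.0, 1.0])
    return p[:3]
```

With these placeholder transforms, the image point (2, 3) lands at probe coordinates (0, 2, 3) and is then shifted by the assumed 10 mm offset.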
The present application calibrates coordinate system transformations by means of representations of the same points in different coordinate systems. As shown in fig. 3, given two coordinate systems A and B, the coordinates of N identical spatial points are recorded in each, as $p_i^A$ and $p_i^B$, $i = 1 \ldots N$. The recorded point sets are used to construct an auxiliary coordinate system C; the transformations between A, B and C, namely $^{A}T_{C}$ and $^{B}T_{C}$, follow from the mapping of the point sets, and from the formula $^{A}T_{B}\,{}^{B}T_{C} = {}^{A}T_{C}$ one obtains $^{A}T_{B} = {}^{A}T_{C}\,(^{B}T_{C})^{-1}$, which calibrates the transformation between coordinate systems A and B.
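The auxiliary-frame construction above can be sketched as follows. The patent leaves the construction of C open, so this assumes C is built from three non-collinear corresponding points (with exact correspondences, three points suffice); the final line is exactly the formula A_T_B = A_T_C · (B_T_C)^(-1).

```python
import numpy as np

def frame_from_points(p0, p1, p2):
    """Auxiliary frame C from three non-collinear points: origin at p0,
    x axis toward p1, z axis normal to the plane of the three points.
    Returns the transform mapping C coordinates into the points' frame."""
    p0, p1, p2 = (np.asarray(p, dtype=float) for p in (p0, p1, p2))
    x = p1 - p0
    x /= np.linalg.norm(x)
    z = np.cross(x, p2 - p0)
    z /= np.linalg.norm(z)
    y = np.cross(z, x)
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2], T[:3, 3] = x, y, z, p0
    return T

def calibrate_A_T_B(pts_A, pts_B):
    """Transformation A_T_B from the same three points seen in frames A and B:
    A_T_B = A_T_C @ inv(B_T_C), with C the shared auxiliary frame."""
    A_T_C = frame_from_points(*pts_A)
    B_T_C = frame_from_points(*pts_B)
    return A_T_C @ np.linalg.inv(B_T_C)
```

Because the frame construction is equivariant under rigid motions, any rigid transform applied to the point set in B is recovered exactly by `calibrate_A_T_B`.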
Once a point is located in the two-dimensional image coordinate system from the medical image, the above calibration converts it into the surgical execution end coordinate system, which then guides the surgical execution end to complete the corresponding operation.
It will be understood by those skilled in the art that the foregoing is only a preferred embodiment of the present invention, and is not intended to limit the invention, and that any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (5)

1. A coordinate calibration method based on an ultrasonic servo surgical robot is characterized in that the ultrasonic servo surgical robot comprises a mechanical arm, a surgical execution tail end arranged at the tail end of the mechanical arm, a guide frame, a motor arranged on the guide frame and a probe, wherein the motor drives the probe to rotate, and the method comprises the following steps:
s1: arranging a stereoscopic vision high-precision positioning camera at the operation execution tail end and obliquely above the probe, and acquiring a coordinate system of the stereoscopic vision high-precision positioning camera, namely a camera coordinate system;
s2: photographing the guide frame, the probe, the end of the mechanical arm and the surgical execution end respectively with the stereoscopic vision high-precision positioning camera to obtain a guide frame coordinate system, a probe coordinate system, a mechanical arm end coordinate system and a surgical execution end coordinate system in the camera coordinate system; further obtaining respectively the transformations between the probe and guide frame coordinate systems, between the guide frame and mechanical arm base coordinate systems, and between the surgical execution end and mechanical arm end coordinate systems; and multiplying these transformations step by step to obtain the transformation between the probe coordinate system and the surgical execution end coordinate system, specifically comprising the following steps:
s21: photographing the probe and the guide frame with the stereoscopic vision high-precision positioning camera to obtain, in the camera coordinate system, a probe coordinate system at an arbitrary pose $^{cam}T_{probe}$ and a guide frame coordinate system $^{cam}T_{guide}$, wherein each $T$ is a 4 × 4 homogeneous matrix; step S21 specifically comprises:
step 1: arranging a first reflective target ball on the probe and a second reflective target ball on the guide frame;
step 2: keeping the guide frame stationary (no back-and-forth movement), starting the motor to drive the probe through one full revolution, recording with the stereoscopic vision high-precision positioning camera a plurality of points on the motion trajectory of the first reflective target ball, and recording the positions of the first and second reflective target balls after the rotation;
step 3: fitting a circle to the plurality of points; the fitted circle center serves as the common origin of the guide frame coordinate system and the initial probe coordinate system, and the fitted circle axis serves as their common Z axis; combining the post-rotation positions of the first and second reflective target balls then yields, under the camera coordinate system, the probe coordinate system T_CP at an arbitrary pose and the guide frame coordinate system T_CG;
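As an illustrative sketch (not part of the claimed method), the circle fitting described in steps 2 and 3 can be done by first fitting the trajectory plane with an SVD and then solving a linear least-squares circle fit in that plane; the synthetic trajectory, radius, and center below are invented stand-ins for the recorded target-ball points:

```python
import numpy as np

def fit_circle_3d(pts):
    """Fit a circle to 3-D points: returns (center, unit axis vector).
    The plane is found by SVD; the circle center by linear least squares."""
    centroid = pts.mean(axis=0)
    # Smallest right singular vector of the centred points = plane normal
    # (this normal is the circle axis, i.e. the shared Z axis of step 3).
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[2]
    # Express the points in 2-D in-plane coordinates spanned by vt[0], vt[1].
    u, v = vt[0], vt[1]
    xy = np.column_stack([(pts - centroid) @ u, (pts - centroid) @ v])
    # |p - c|^2 = r^2 is linear in (cx, cy, r^2 - |c|^2).
    A = np.column_stack([2 * xy, np.ones(len(xy))])
    b = (xy ** 2).sum(axis=1)
    (cx, cy, _), *_ = np.linalg.lstsq(A, b, rcond=None)
    center = centroid + cx * u + cy * v
    return center, normal

# Synthetic trajectory of the first target ball: radius 40 mm about
# (10, 20, 30) in the camera frame, axis along Z (assumed numbers).
ang = np.linspace(0, 2 * np.pi, 36, endpoint=False)
pts = np.column_stack([10 + 40 * np.cos(ang),
                       20 + 40 * np.sin(ang),
                       np.full_like(ang, 30.0)])
center, axis = fit_circle_3d(pts)
print(np.round(center, 6))  # -> [10. 20. 30.]
```

The fitted center and axis give the origin and Z axis of the guide frame and initial probe coordinate systems; the remaining two axes can be fixed from the post-rotation target-ball positions.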
S22: based on the probe coordinate system
Figure FDA0003634934760000023
And a guide frame coordinate system
Figure FDA0003634934760000024
Obtaining a transformation relation between a probe coordinate system and a guide frame coordinate system
Figure FDA0003634934760000025
Figure FDA0003634934760000026
S23: the stereoscopic vision high-precision positioning camera is adopted to photograph the tail end of the mechanical arm to obtain the transformation relation between the camera coordinate system and the mechanical arm base coordinate system
Figure FDA0003634934760000027
Further obtaining the transformation relation between the mechanical arm base coordinate system and the guide frame coordinate system
Figure FDA0003634934760000028
Figure FDA0003634934760000029
S24: the stereoscopic vision high-precision positioning camera is adopted to photograph the operation execution tail end to obtain the coordinate system of the operation execution tail end
Figure FDA00036349347600000210
Obtaining a transformation relation between the coordinate system of the surgical execution end and the coordinate system of the end of the mechanical arm
Figure FDA00036349347600000211
Figure FDA00036349347600000212
Wherein,
Figure FDA00036349347600000213
directly acquiring the transformation relation between a mechanical arm tail end coordinate system and a mechanical arm base coordinate system according to the connecting rod parameters of the mechanical arm;
S25: from the above transformation relations, obtaining the transformation relation between the probe coordinate system and the surgical execution end coordinate system: T_EP = (T_FE)^-1 · (T_BF)^-1 · T_BG · T_GP;
S3: and acquiring a transformation relation between a two-dimensional image coordinate system and the probe coordinate system, wherein the two-dimensional image coordinate system is superposed with a coordinate plane of the probe coordinate system, and multiplying the transformation relation between the probe coordinate system and the operation execution tail end coordinate system and the transformation relation between the two-dimensional image coordinate system and the probe coordinate system to obtain the transformation relation between the two-dimensional image coordinate system and the operation execution tail end coordinate system.
2. The method according to claim 1, wherein step S3 specifically comprises:
defining the two-dimensional image coordinate system to coincide with a coordinate plane of the probe coordinate system, thereby obtaining the transformation relation T_PI between the two-dimensional image coordinate system and the probe coordinate system; and
multiplying the transformation relation T_EP between the probe coordinate system and the surgical execution end coordinate system by T_PI to obtain the transformation relation between the two-dimensional image coordinate system and the surgical execution end coordinate system: T_EI = T_EP · T_PI.
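A minimal sketch of what this product is used for: mapping an ultrasound image pixel into the surgical execution end frame. The 0.2 mm pixel spacing, the coincidence of the image x-y plane with the probe X-Y plane, and the identity transforms are all assumptions for illustration:

```python
import numpy as np

# Assumed for illustration: image x-y plane coincides with the probe's
# X-Y plane, and one pixel corresponds to 0.2 mm.
PIXEL_MM = 0.2

def image_to_end(u, v, T_end_probe, T_probe_image=np.eye(4)):
    """Map image pixel (u, v) to millimetre coordinates in the execution-end frame."""
    p_img = np.array([u * PIXEL_MM, v * PIXEL_MM, 0.0, 1.0])  # homogeneous point
    T_end_image = T_end_probe @ T_probe_image                 # the product of claim 2
    return (T_end_image @ p_img)[:3]

# With probe and end frames coincident, a pixel maps straight to millimetres.
print(image_to_end(100, 50, np.eye(4)))  # -> [20. 10.  0.]
```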
3. The method according to claim 1, wherein obtaining, in step 3, the probe coordinate system T_CP at an arbitrary pose and the guide frame coordinate system T_CG under the camera coordinate system by combining the post-rotation positions of the first and second reflective target balls specifically comprises:
obtaining the initial probe coordinate system T_CP0 and the guide frame coordinate system T_CG from the origin, the Z axis, and the positions of the first and second reflective target balls;
obtaining, from the initial probe coordinate system T_CP0 and the guide frame coordinate system T_CG, the initial rotation angle between the initial probe coordinate system and the guide frame coordinate system when the probe is at its initial pose; and
when the probe moves to a given pose, acquiring the motion variables of the probe's back-and-forth translation and rotation about the Z axis, and correcting these motion variables with the initial rotation angle to obtain the probe coordinate system T_CP of the probe at the arbitrary pose.
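A sketch of this correction under the assumption that the probe's motion consists of a translation d_z along the common Z axis followed by a rotation theta about it, with theta0 the initial rotation angle; the composition order and sign convention here are assumptions, not taken from the patent:

```python
import numpy as np

def rot_z(theta):
    """4x4 homogeneous rotation about the Z axis."""
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    return T

def probe_pose(T_cam_probe0, d_z, theta, theta0):
    """Update the initial probe pose by a translation d_z along its Z axis
    and a rotation theta about it, corrected by the initial angle theta0."""
    T_move = np.eye(4)
    T_move[2, 3] = d_z
    return T_cam_probe0 @ T_move @ rot_z(theta - theta0)

# Initial pose at the fitted circle center, then a 50 mm advance (in metres)
# and a quarter-turn of the motor, with zero initial angle (assumed values).
T0 = np.eye(4)
T = probe_pose(T0, 0.05, np.pi / 2, 0.0)
print(np.round(T, 3))
```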
4. The method according to claim 1, wherein step S23 specifically comprises:
arranging a third reflective target ball at the tail end of the mechanical arm, photographing it with the stereoscopic vision high-precision positioning camera to obtain the coordinates of the tail end of the mechanical arm in the camera coordinate system, and obtaining, together with the transformation relation T_BF between the mechanical arm tail end coordinate system and the mechanical arm base coordinate system, the transformation relation T_BC between the camera coordinate system and the mechanical arm base coordinate system; and
multiplying the transformation relation T_BC by the guide frame coordinate system T_CG to obtain the transformation relation between the mechanical arm base coordinate system and the guide frame coordinate system: T_BG = T_BC · T_CG.
5. The method according to claim 1, wherein step S24 specifically comprises:
arranging a fourth reflective target ball at the surgical execution end, and photographing it with the stereoscopic vision high-precision positioning camera to obtain the surgical execution end coordinate system T_CE under the camera coordinate system; and
converting the surgical execution end coordinate system T_CE with T_BC and T_BF to obtain the transformation relation between the surgical execution end coordinate system and the mechanical arm tail end coordinate system: T_FE = (T_BF)^-1 · T_BC · T_CE.
CN202110808016.5A 2021-07-16 2021-07-16 Coordinate calibration method based on ultrasonic servo surgical robot Active CN113547515B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110808016.5A CN113547515B (en) 2021-07-16 2021-07-16 Coordinate calibration method based on ultrasonic servo surgical robot


Publications (2)

Publication Number Publication Date
CN113547515A CN113547515A (en) 2021-10-26
CN113547515B true CN113547515B (en) 2022-07-12

Family

ID=78131970

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110808016.5A Active CN113547515B (en) 2021-07-16 2021-07-16 Coordinate calibration method based on ultrasonic servo surgical robot

Country Status (1)

Country Link
CN (1) CN113547515B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114137085A (en) * 2021-12-01 2022-03-04 仲恺农业工程学院 Ultrasonic flaw detection robot based on vision-assisted positioning and detection method thereof
CN115153782A (en) * 2022-08-12 2022-10-11 哈尔滨理工大学 Puncture robot space registration method under ultrasonic guidance
CN116473681B (en) * 2023-03-28 2024-02-20 北京维卓致远医疗科技发展有限责任公司 Control system and method of surgical robot

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005342832A (en) * 2004-06-02 2005-12-15 Fanuc Ltd Robot system
CN103706568A (en) * 2013-11-26 2014-04-09 中国船舶重工集团公司第七一六研究所 System and method for machine vision-based robot sorting
CN104858870A (en) * 2015-05-15 2015-08-26 江南大学 Industrial robot measurement method based on tail end numbered tool
JP2016007277A (en) * 2014-06-23 2016-01-18 公立大学法人公立はこだて未来大学 Surgery support device and surgery support system
CN105716525A (en) * 2016-03-30 2016-06-29 西北工业大学 Robot end effector coordinate system calibration method based on laser tracker
CN108778179A (en) * 2016-02-26 2018-11-09 思想外科有限公司 Method and system for instructing user positioning robot
CN109199586A (en) * 2018-11-09 2019-01-15 山东大学 A kind of laser bone-culting operation robot system and its paths planning method
CN110202560A (en) * 2019-07-12 2019-09-06 易思维(杭州)科技有限公司 A kind of hand and eye calibrating method based on single feature point
CN110225720A (en) * 2017-01-13 2019-09-10 株式会社卓越牵引力 Operation auxiliary device and its control method, program and surgical assistant system
CN110279467A (en) * 2019-06-19 2019-09-27 天津大学 Ultrasound image under optical alignment and information fusion method in the art of puncture biopsy needle
CN111070199A (en) * 2018-10-18 2020-04-28 杭州海康威视数字技术股份有限公司 Hand-eye calibration assessment method and robot
CN112525074A (en) * 2020-11-24 2021-03-19 杭州素问九州医疗科技有限公司 Calibration method, calibration system, robot, computer device and navigation system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI672206B (en) * 2018-12-19 2019-09-21 財團法人工業技術研究院 Method and apparatus of non-contact tool center point calibration for a mechanical arm, and a mechanical arm system with said calibration function


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Liu Bojian; "Control of a prostate interventional surgical robot based on multi-modal perception"; China Masters' Theses Full-text Database, Engineering Science and Technology II; 2019-08-15; pp. 13, 16-28 *


Similar Documents

Publication Publication Date Title
CN113547515B (en) Coordinate calibration method based on ultrasonic servo surgical robot
CN110051436B (en) Automated cooperative work assembly and application thereof in surgical instrument
CN111801198B (en) Hand-eye calibration method, system and computer storage medium
CN111156925B (en) Three-dimensional measurement method for large component based on line structured light and industrial robot
CN110728715B (en) Intelligent inspection robot camera angle self-adaptive adjustment method
CN107883929B (en) Monocular vision positioning device and method based on multi-joint mechanical arm
CN113133832B (en) Calibration method and system for double-arm robot puncture system
CN114289934B (en) Automatic welding system and method for large structural part based on three-dimensional vision
CN109974584A (en) The calibration system and scaling method of a kind of auxiliary laser bone-culting operation robot
CN111300384B (en) Registration system and method for robot augmented reality teaching based on identification card movement
CN109794963A (en) A kind of robot method for rapidly positioning towards curved surface member
WO2020063058A1 (en) Calibration method for multi-degree-of-freedom movable vision system
CN113208731B (en) Binocular vision system-based hand and eye calibration method for surgical puncture robot
CN109900251A (en) A kind of robotic positioning device and method of view-based access control model technology
CN113305851A (en) Online detection device for robot micro-assembly
CN112958974A (en) Interactive automatic welding system based on three-dimensional vision
CN114670199B (en) Identification positioning device, system and real-time tracking system
CN108347561B (en) Laser guide scanning system and scanning method
CN114777676A (en) Self-adaptive terahertz three-dimensional tomography device and method
CN110961583A (en) Steel ladle positioning device adopting laser scanning and using method thereof
CN212256370U (en) Optical motion capture system
US20230181263A1 (en) Dynamic 3d scanning robotic laparoscope
CN111862170A (en) Optical motion capture system and method
CN114046889B (en) Automatic calibration method for infrared camera
CN112743546B (en) Robot hand-eye calibration pose selection method and device, robot system and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant