CN109421050A - A kind of control method and device of robot - Google Patents
A kind of control method and device of robot
- Publication number
- CN109421050A (application CN201811037353.3A)
- Authority
- CN
- China
- Prior art keywords
- camera
- tool
- arm
- actual
- mechanical arm
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Manipulator (AREA)
Abstract
The application provides a control method and apparatus for a robot, for use in a robot system comprising a mechanical arm, an end-of-arm tool, and a camera. The method comprises: acquiring an image captured by the camera and converting the image into a reference image; detecting, according to the reference image, a target object to be operated on; in a case where it is determined that the pose of the camera satisfies an operating condition of the end-of-arm tool on the target object, determining a first actual motion strategy of the end-of-arm tool based on a motion control model, wherein the position between the camera and the end-of-arm tool is relatively fixed and the motion control model is obtained by training in a reference environment; and controlling the end-of-arm tool to move according to the first actual motion strategy to complete the action on the target object, thereby avoiding failures in grabbing the object caused by the installation error between the camera and the end of the mechanical arm.
Description
Technical Field
The present disclosure relates to the field of robot technologies, and in particular, to a method and an apparatus for controlling a robot.
Background
A robot is generally provided with a camera, and the robot controls the end of its mechanical arm to perform various actions using the images the camera acquires. The camera is equivalent to the robot's eyes and the end of the arm to its hands; preset action tasks are completed through hand-eye cooperation.
In the process of controlling the end of the robot arm to perform an operation on a target object, an important step is solving the relative position relationship between the target object and the end of the robot. One specific method is as follows: acquire image data of the target object and label, in each image, the relative position relationship between the target object and the camera; train a motion control model on these data; at each step, feed the model the current image data and use the motion strategy it outputs to move the end of the arm toward the target object; and, once the motion control model judges that the position relationship between the target object and the end of the arm satisfies the operating condition, have the end-of-arm tool execute the operation instruction to complete the operation on the target object. Because the motion model is trained on data acquired in a reference environment, directly applying it when the camera in the actual environment has an installation error, or when its parameters differ from those of the camera used to collect the sample data, introduces large errors and causes the operation of the end-of-arm tool on the target object to fail.
Disclosure of Invention
In view of this, embodiments of the present application provide a method and an apparatus for controlling a robot to solve technical defects in the prior art.
The embodiment of the application discloses a control method of a robot, which is used in a robot system, wherein the robot system comprises a mechanical arm, a mechanical arm tail end tool and a camera;
the method comprises the following steps:
A2, acquiring an image captured by the camera, and converting the image into a reference image;
A4, detecting a target object to be operated on according to the reference image;
A6, in a case where it is determined that the pose of the camera satisfies an operating condition of the end-of-arm tool on the target object, determining a first actual motion strategy of the end-of-arm tool based on a motion control model; wherein the position between the camera and the end-of-arm tool is relatively fixed, and the motion control model is obtained by training in a reference environment;
A8, controlling the end-of-arm tool to move according to the first actual motion strategy, so as to complete the action on the target object.
In an exemplary embodiment of the present application, determining a first actual motion strategy of the end-of-arm-tool based on a motion control model comprises:
calculating an installation error between the reference environment and the actual environment according to a reference pose relationship between the camera and the end-of-arm tool in the reference environment and an actual pose relationship in the actual environment;
obtaining a first reference motion strategy of the end-of-arm tool in the reference environment according to the motion control model;
and calculating a first actual motion strategy of the end-of-arm tool in the actual environment according to the installation error and the first reference motion strategy of the end-of-arm tool in the reference environment.
In an exemplary embodiment of the present application, the first actual motion strategy of the end-of-arm-tool in the actual environment is obtained by the following formula:
T1 = E1^(-1) * E0 * T0
wherein:
T0 is the first reference motion strategy of the end-of-arm tool in the reference environment;
T1 is the first actual motion strategy of the end-of-arm tool in the actual environment;
E0 is the reference pose relationship matrix between the camera and the end-of-arm tool in the reference environment;
E1 is the actual pose relationship matrix between the camera and the end-of-arm tool in the actual environment.
In an exemplary embodiment of the present application, after step A4, the method further comprises:
A10, if the pose of the camera does not satisfy the operating condition of the end-of-arm tool on the target object, adjusting the pose of the camera to a target pose, acquiring an image captured by the camera at the target pose, and then executing step A2.
In an exemplary embodiment of the present application, converting the image captured by the camera into the reference image comprises:
acquiring internal parameters of a reference camera in a reference environment;
acquiring current internal parameters of the camera;
obtaining the current parameter error between the camera and the reference camera according to the internal parameters of the camera and the internal parameters of the reference camera;
and converting the image captured by the camera according to the parameter error to obtain the reference image in the reference environment.
In an exemplary embodiment of the present application, the image captured by the camera is converted into the reference image by the following formula:
u = fu * (u' - u0') / fu' + u0
v = fv * (v' - v0') / fv' + v0
wherein:
the reference image is the image the camera would produce in the reference environment;
(u, v) are the coordinate values of the target point in the reference camera coordinate system;
(u', v') are the coordinate values of the target point in the current camera coordinate system;
u0, v0, fu, fv are the intrinsic parameters of the reference camera;
u0', v0', fu', fv' are the current intrinsic parameters of the camera.
In an exemplary embodiment of the present application, the operating condition is that a distance between the end-of-arm-tool and the target object is equal to or less than a first threshold value.
The embodiment of the application discloses a control apparatus of a robot, provided in a robot system, the robot system comprising a mechanical arm, an end-of-arm tool, and a camera; the apparatus comprises:
the conversion module is used for acquiring the image shot by the camera and converting the image into a reference image;
the detection module is used for detecting a target object to be operated according to the reference image;
an actual motion strategy determination module, configured to determine a first actual motion strategy of an end-of-arm-tool based on a motion control model in a case that it is determined that a pose of the camera satisfies an operating condition of the end-of-arm-tool on the target object, where a position between the camera and the end-of-arm-tool is relatively fixed, and the motion control model is trained in a reference environment;
and the motion control module is used for controlling the motion of the mechanical arm end tool according to the first actual motion strategy so as to finish the action on the target object.
In an exemplary embodiment of the present application, the actual motion strategy determining module includes:
the error calculation module is used for calculating the installation error between the reference environment and the actual environment according to the reference pose relationship between the camera and the mechanical arm end tool in the reference environment and the actual pose relationship in the actual environment;
the reference motion strategy determining module is used for obtaining a first reference motion strategy of the mechanical arm terminal tool in a reference environment according to the motion control model;
and the actual motion strategy calculation module is used for calculating to obtain a first actual motion strategy of the mechanical arm end tool in the actual environment according to the installation error and the first reference motion strategy of the mechanical arm end tool in the reference environment.
In an exemplary embodiment of the present application, the actual motion strategy calculation module calculates a first actual motion strategy of the end-of-arm-tool in the actual environment by the following formula:
T1 = E1^(-1) * E0 * T0
wherein:
T0 is the first reference motion strategy of the end-of-arm tool in the reference environment;
T1 is the first actual motion strategy of the end-of-arm tool in the actual environment;
E0 is the reference pose relationship matrix between the camera and the end-of-arm tool in the reference environment;
E1 is the actual pose relationship matrix between the camera and the end-of-arm tool in the actual environment.
In an exemplary embodiment of the present application, the apparatus further comprises:
and the adjusting module is used for adjusting the pose of the camera to the target pose if the pose of the camera does not meet the operation condition of the end-of-arm tool on the target object, acquiring an image shot by the camera in the target pose, and executing the converting module.
In an exemplary embodiment of the present application, the conversion module includes:
the first parameter acquisition module is used for acquiring the internal parameters of the reference camera in the reference environment;
the second parameter acquisition module is used for acquiring the current internal parameters of the camera;
the parameter error acquisition module is used for acquiring the current parameter error between the camera and the reference camera according to the internal parameters of the camera and the internal parameters of the reference camera;
and the reference image acquisition module is used for converting according to the image shot by the camera and the parameter error to obtain the reference image in a reference environment.
In an exemplary embodiment of the present application, the conversion module converts the image captured by the camera into the reference image by the following formula:
u = fu * (u' - u0') / fu' + u0
v = fv * (v' - v0') / fv' + v0
wherein:
the reference image is the image the camera would produce in the reference environment;
(u, v) are the coordinate values of the target point in the reference camera coordinate system;
(u', v') are the coordinate values of the target point in the current camera coordinate system;
u0, v0, fu, fv are the intrinsic parameters of the reference camera;
u0', v0', fu', fv' are the current intrinsic parameters of the camera.
In an exemplary embodiment of the present application, the operating condition is that a distance between the end-of-arm-tool and the target object is equal to or less than a first threshold value.
The embodiment of the application discloses a computing device, comprising a memory, a processor, and computer instructions stored in the memory and executable on the processor, wherein the processor implements the following steps when executing the instructions:
A2, acquiring an image captured by the camera, and converting the image into a reference image;
A4, detecting a target object to be operated on according to the reference image;
A6, in a case where it is determined that the pose of the camera satisfies an operating condition of the end-of-arm tool on the target object, determining a first actual motion strategy of the end-of-arm tool based on a motion control model; wherein the position between the camera and the end-of-arm tool is relatively fixed, and the motion control model is obtained by training in a reference environment;
A8, controlling the end-of-arm tool to move according to the first actual motion strategy, so as to complete the action on the target object.
The embodiment of the application discloses a computer-readable storage medium storing computer instructions which, when executed by a processor, implement the steps of the control method of the robot.
According to the control method and apparatus for a robot provided above, in the process of controlling the robot to grab a target object, the camera is regarded as a part of the mechanical arm, so the position relationship between the target object and the end of the arm does not need to be calculated. When the pose of the camera satisfies the operating condition of the end-of-arm tool on the target object, a first actual motion strategy of the end-of-arm tool is determined based on the motion control model, and the end-of-arm tool is controlled to move according to that strategy to complete the action on the target object. In this way, failures in grabbing the object caused by the installation error between the camera and the end of the mechanical arm are avoided.
Drawings
Fig. 1 is a schematic flowchart of a control method of a robot according to an embodiment of the present application;
fig. 2 is a schematic flowchart of a control method of a robot according to an embodiment of the present application;
fig. 3 is a flowchart illustrating a control method of a robot according to an embodiment of the present application;
fig. 4 is a flowchart illustrating a control method of a robot according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a control device of a robot according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a computing device according to an embodiment of the present application.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. However, the application can be implemented in many ways other than those described herein, and those skilled in the art can make similar modifications without departing from its spirit; the application is therefore not limited to the specific implementations disclosed below.
In the present application, a control method and apparatus of a robot, a computing device, and a computer-readable storage medium are provided, which are described in detail one by one in the following embodiments.
The embodiment of the application discloses a control method of a robot, which is used in a robot system, wherein the robot system comprises a mechanical arm, a mechanical arm end tool and a camera.
In this embodiment, the camera may be fixed to the end of the robot arm and move with it, or it may be mounted elsewhere. In either case, the actual pose relation matrix E1 between the end-of-arm tool and the camera in the actual environment needs to be solved; the relative position of the camera and the end of the arm can be determined whether the two are fixedly connected to each other or installed separately in the actual environment. After the actual pose relation matrix E1 has been solved, the camera can be regarded as a part of the mechanical arm, and whether the operating condition of the end-of-arm tool on the target object is satisfied can be judged from the pose of the camera.
In the prior art, the actual pose relationship matrix E1 from the end-of-arm tool to the camera can be obtained by an eye-in-hand calibration method, such as conventional calibration methods (e.g., the OpenCV implementation of Tsai-Lenz), self-calibration methods (e.g., the self-calibration methods of Ma and of Lei), active vision calibration methods, and the like. Each method has its advantages and disadvantages: the conventional calibration method has high precision but a complex calculation process; the self-calibration method is not robust; the active vision calibration method is simple to compute and robust, but requires a high-precision active vision platform, and because of machining and installation errors the ideal structural parameters usually deviate considerably from the actual ones, so its precision is limited. In general, there are many eye-in-hand calibration methods for obtaining the pose relationship matrix E1, and the requirements differ across scenarios.
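As an illustration only (not part of the patent text), an eye-in-hand calibration of this kind can be sketched with OpenCV's calibrateHandEye, which implements Tsai-Lenz among other methods; the pose data, variable names, and 4x4 matrix assembly here are assumptions:

```python
import numpy as np
import cv2  # OpenCV >= 4.1 provides calibrateHandEye

# One entry per calibration pose of the arm (the data themselves are
# assumed here): R_gripper2base/t_gripper2base come from the robot's
# forward kinematics, R_target2cam/t_target2cam from detecting a
# calibration board in the camera image (e.g., with cv2.solvePnP).
R_gripper2base, t_gripper2base = [], []
R_target2cam, t_target2cam = [], []
# ... append at least three sufficiently different poses ...

# Tsai-Lenz eye-in-hand calibration: yields the camera pose relative to
# the end-of-arm tool, i.e., the matrix called E1 in the text.
R_cam2tool, t_cam2tool = cv2.calibrateHandEye(
    R_gripper2base, t_gripper2base,
    R_target2cam, t_target2cam,
    method=cv2.CALIB_HAND_EYE_TSAI)

E1 = np.eye(4)                 # 4x4 homogeneous pose matrix
E1[:3, :3] = R_cam2tool
E1[:3, 3] = t_cam2tool.ravel()
```

In practice the quality of E1 tends to depend more on collecting diverse, well-spread calibration poses than on the choice of method flag.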
Referring to FIG. 1, the method comprises the following steps 102-108:
102. An image captured by the camera is acquired and converted into a reference image.
Optionally, referring to fig. 2, the step 102 includes:
202. the intrinsic parameters of a reference camera in a reference environment are acquired.
In this step, the reference camera may be the first camera used in acquiring the training sample.
A camera's own characteristics, including its focal length and optical-axis offset, are described by its intrinsic parameters.
204. The current intrinsic parameters of the camera are acquired.
206. The current parameter error between the camera and the reference camera is obtained according to the intrinsic parameters of the camera and those of the reference camera.
208. The image captured by the camera is converted according to the parameter error to obtain the reference image in the reference environment.
Specifically, the key points in the current image captured by the camera may be acquired, and then the coordinate information of the key points in the reference environment may be calculated according to the coordinate information of the key points and the parameter error, so as to obtain the reference image in the reference environment.
The following description is given with reference to a specific example. If the spatial position of a target point in the camera coordinate system is P = (x, y, z) and its image coordinate is p = (u, v), they are related as follows:
u = fu * (x / z) + u0
v = fv * (y / z) + v0    (1)
wherein u0, v0, fu, fv are the intrinsic parameters of the reference camera, which can be calibrated using Zhang's calibration method (the method of Zhang Zhengyou).
fu = f/dx and fv = f/dy are called the normalized focal lengths on the u-axis and the v-axis, respectively;
f is the focal length of the camera;
dx and dy denote the size of a unit pixel on the u-axis and the v-axis, respectively;
u0 and v0 denote the optical center, i.e., the intersection of the camera's optical axis with the image plane; it is usually located at the image center, so their values are often half the resolution.
The first camera used when acquiring the training samples is taken as the reference camera, with intrinsic parameters u0, v0, fu, fv. When a different camera is used for sample collection or actual measurement, the new camera must first be calibrated to obtain its intrinsic parameters u0', v0', fu', fv'; an image is then acquired. Let the pixel coordinates of the target point in the current camera be (u', v'). Substituting into equation (1) above yields the following relationship:
u = fu * (u' - u0') / fu' + u0
v = fv * (v' - v0') / fv' + v0    (2)
wherein:
(u, v) are the coordinate values of the target point in the reference camera coordinate system;
(u', v') are the coordinate values of the target point in the current camera coordinate system;
u0, v0, fu, fv are the intrinsic parameters of the reference camera;
u0', v0', fu', fv' are the current intrinsic parameters of the camera.
After a picture is taken with the current camera, each pixel is stored as {(u', v', R, G, B)}. Converting the stored pixel information according to equation (2) into the pixel set {(u, v, R, G, B)} that the reference camera would have produced amounts to converting the image captured by the camera into the reference image.
It should be noted that, in actual use, the coordinate values u and v obtained by direct conversion into the reference camera coordinate system are generally not integers, so an interpolation step is required. Two interpolation methods are illustrated below:
1) Nearest-neighbor interpolation: the pixel value of a point in the transformed target image equals the pixel value of the closest corresponding point in the source image before transformation. For example, if the horizontal and vertical scaling ratios are w and h respectively, the point des(x, y) in the target image corresponds to the point src(x0, y0) = (x/w, y/h) in the source image. Since x0 and y0 may be fractional, they are rounded: (x0, y0) = int(x0 + 0.5, y0 + 0.5); the pixel value of des(x, y) is then the pixel value of src(x0, y0).
2) Bilinear interpolation can also be used; the processed image is smoother and clearer than one produced by nearest-neighbor interpolation, and the RGB value of each pixel of the corrected image is finally obtained.
When a new camera is used for sample collection or pose estimation, the acquired image is converted into a standard image using the conversion above, yielding the image coordinates of the new image and the RGB value at each coordinate. At this stage no relationship with the spatial position coordinates (x, y, z) is involved; subsequent processing is then performed, which ensures consistency between the sample data and the actual data. A minimal sketch of this conversion follows.
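The following Python sketch (not part of the patent; the NumPy array layout, dictionary-based intrinsics, and clamping at image borders are assumptions) applies equation (2) with the nearest-neighbor rounding described above:

```python
import numpy as np

def to_reference_image(img, cur, ref):
    """Resample an image from the current camera so it matches the
    reference camera (equation (2)), with nearest-neighbor rounding.

    img: HxWx3 array from the current camera.
    cur, ref: dicts of intrinsics u0, v0, fu, fv (layout is assumed)."""
    h, w = img.shape[:2]
    v, u = np.mgrid[0:h, 0:w].astype(np.float64)
    # Invert equation (2): for each reference pixel (u, v), find the
    # current-camera pixel (u', v') that observes the same viewing ray.
    u_src = cur['fu'] * (u - ref['u0']) / ref['fu'] + cur['u0']
    v_src = cur['fv'] * (v - ref['v0']) / ref['fv'] + cur['v0']
    # Nearest-neighbor interpolation as described: round via int(x + 0.5).
    u_src = np.clip((u_src + 0.5).astype(int), 0, w - 1)
    v_src = np.clip((v_src + 0.5).astype(int), 0, h - 1)
    return img[v_src, u_src]
```

Here the reference image is built by mapping each of its pixels back into the current image, which avoids the holes a forward per-pixel mapping would leave; clamping edge pixels instead of leaving them blank is a simplification.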
104. A target object to be operated on is detected according to the reference image.
106. Determining a first actual motion strategy of an end-of-arm-tool based on a motion control model if it is determined that the pose of the camera satisfies an operating condition of the end-of-arm-tool on the target object.
In this embodiment, the pose refers to a position value and a posture value, where the position value includes a spatial coordinate value (x, y, z) and the posture value includes a spatial angle value (Rx, Ry, Rz).
Wherein the position between the camera and the end-of-arm tool is relatively fixed, and the motion control model is trained in a reference environment.
In addition, because the position and posture relation matrix between the camera and the mechanical arm end tool is determined, whether the mechanical arm end tool meets the operation condition of the target object or not can be judged through the position and posture of the camera.
The operating condition may be various, for example, the distance of the end-of-arm-tool to the target object is equal to or less than a first threshold value. The first threshold may be set according to actual requirements, and may be set to 10 centimeters, 20 centimeters, or the like, for example.
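As a side note (an illustrative assumption, since the patent does not fix a representation), the position value (x, y, z) and posture value (Rx, Ry, Rz) defined above can be packed into the 4x4 homogeneous matrices that the pose relationship matrices E0 and E1 and the motion strategies T0 and T1 are treated as below; the 'xyz' Euler convention and radians are assumptions of this sketch:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def pose_to_matrix(x, y, z, rx, ry, rz):
    # Build a 4x4 homogeneous transform from a position value (x, y, z)
    # and a posture value (Rx, Ry, Rz); interpreting (Rx, Ry, Rz) as
    # 'xyz' Euler angles in radians is an assumption of this sketch.
    T = np.eye(4)
    T[:3, :3] = Rotation.from_euler('xyz', [rx, ry, rz]).as_matrix()
    T[:3, 3] = [x, y, z]
    return T
```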
More specifically, referring to fig. 3, determining a first actual motion strategy for the end-of-arm-tool based on a motion control model comprises:
302. The installation error between the reference environment and the actual environment is calculated according to the reference pose relationship between the camera and the end-of-arm tool in the reference environment and the actual pose relationship in the actual environment.
In actual use, errors between the reference environment and the actual environment exist for most cameras: cameras of different brands and models differ, installation errors are almost always present, and differing machining accuracies make those installation errors larger or smaller.
304. A first reference motion strategy of the end-of-arm tool in the reference environment is obtained according to the motion control model.
306. A first actual motion strategy of the end-of-arm tool in the actual environment is calculated according to the installation error and the first reference motion strategy of the end-of-arm tool in the reference environment.
In a specific application example, a first actual motion strategy of the end-of-arm tool in an actual environment is obtained by the following formula:
T1 = E1^(-1) * E0 * T0
wherein:
T0 is the first reference motion strategy of the end-of-arm tool in the reference environment;
T1 is the first actual motion strategy of the end-of-arm tool in the actual environment;
E0 is the reference pose relationship matrix between the camera and the end-of-arm tool in the reference environment;
E1 is the actual pose relationship matrix between the camera and the end-of-arm tool in the actual environment.
108. The end-of-arm tool is controlled to move according to the first actual motion strategy, so as to complete the action on the target object.
The actions of the end-of-arm-tool on the target object include: grabbing, moving, turning and the like.
Taking grabbing the target object as an example, the first actual motion strategy may be: move forward 2 cm and close the gripper jaws; the action of the end-of-arm tool on the target object is then the grabbing action.
According to the control method of the robot, in the process of controlling the robot to grab a target object, the camera is regarded as a part of the mechanical arm. When the pose of the camera satisfies the operating condition of the end-of-arm tool on the target object, a first actual motion strategy of the end-of-arm tool is determined based on the motion control model, and the end-of-arm tool is controlled to move according to that strategy to complete the action on the target object. When the pose of the camera does not satisfy the operating condition, the pose of the camera is adjusted until it does. In this way, failures in grabbing the object caused by the installation error between the camera and the end of the mechanical arm are avoided.
The embodiment of the application also discloses a control method of a robot; referring to FIG. 4, it comprises steps 402 to 412. Steps 402, 404, 408, and 410 correspond to steps 102, 104, 106, and 108 of the above embodiment, respectively; for their detailed explanation, refer to the descriptions of steps 102, 104, 106, and 108 above.
402. An image captured by the camera is acquired and converted into a reference image.
404. A target object to be operated on is detected according to the reference image.
406. Whether the pose of the camera satisfies the operating condition of the end-of-arm tool on the target object is judged; if so, step 408 is executed; if not, step 412 is executed.
408. Determining a first actual motion strategy of an end-of-arm-tool based on a motion control model if it is determined that the pose of the camera satisfies an operating condition of the end-of-arm-tool on the target object.
Wherein the position between the camera and the end-of-arm tool is relatively fixed, and the motion control model is trained in a reference environment.
410. The end-of-arm tool is controlled to move according to the first actual motion strategy, so as to complete the action on the target object.
412. If the pose of the camera does not satisfy the operating condition of the end-of-arm tool on the target object, the pose of the camera is adjusted to a target pose, an image captured by the camera at the target pose is acquired, and step 402 is executed.
In this embodiment, the movement of the mechanical arm can be controlled in various ways: the end-of-arm tool may be moved to satisfy the operating condition in a single motion, or it may be moved in multiple steps, each step covering a set amount of movement. In the latter case, after each step it is necessary to judge again, according to the reference image, whether the pose of the camera satisfies the operating condition of the end-of-arm tool on the target object.
For example, in adjusting the pose of the camera to the target pose, the adjusting may include: move forward, move up, move down, move left, move right, etc.
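Putting steps 402 to 412 together, the adjust-and-retry flow of FIG. 4 can be sketched as follows; capture_image, detect_target, meets_operating_condition, adjust_camera_pose, and the camera, arm, and model objects are hypothetical stand-ins for the modules described in the text, while to_reference_image and compensate_motion are the sketches given earlier:

```python
def control_robot(camera, arm, model, E0, E1, ref_intrinsics):
    # Loop of FIG. 4: keep adjusting the camera pose until the operating
    # condition of the end-of-arm tool on the target object is satisfied.
    while True:
        img = capture_image(camera)                               # step 402
        ref_img = to_reference_image(img, camera.intrinsics,
                                     ref_intrinsics)              # step 402
        target = detect_target(ref_img)                           # step 404
        if meets_operating_condition(camera.pose, target):        # step 406
            break
        adjust_camera_pose(arm, target)                           # step 412
    T0 = model.reference_strategy(ref_img, target)                # step 408
    T1 = compensate_motion(T0, E0, E1)  # T1 = E1^(-1) * E0 * T0
    arm.execute(T1)                                               # step 410
```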
According to the control method of the robot, in the process of controlling the robot to grab a target object, the camera is regarded as a part of the mechanical arm. When the pose of the camera satisfies the operating condition of the end-of-arm tool on the target object, a first actual motion strategy of the end-of-arm tool is determined based on the motion control model, and the end-of-arm tool is controlled to move according to that strategy to complete the action on the target object. When the pose of the camera does not satisfy the operating condition, the pose of the camera is adjusted until it does. In this way, failures in grabbing the object caused by the installation error between the camera and the end of the mechanical arm are avoided.
Before data acquisition, the actual pose relation matrix E1 from the end-of-arm tool to the camera is calibrated, and E1 is added to the robot motion model. When sample data are collected, it is no longer the motion of the end of the arm that is controlled but the pose of the camera (in the prior art, if the installation directions of the camera and the end of the arm are inconsistent, the original algorithm uses the end of the arm as the reference for translation, advancing, and similar operations during data collection, so the camera may actually move obliquely). The robot motion model then only reflects the relationship between the image acquired by the camera and the relative position change between the camera and the target object. Finally, when an operation is performed (such as grabbing a target), compensation is applied through the calibrated pose parameters between the camera and the end of the arm, ensuring that the arm moves to the specified operating position. With this control method, model errors caused by installation errors between the camera and the end of the mechanical arm can be avoided.
The embodiment of the present application also discloses a control apparatus of a robot; referring to FIG. 5, the apparatus is provided in a robot system comprising a mechanical arm, an end-of-arm tool, and a camera, and the apparatus comprises:
a conversion module 502, configured to acquire an image captured by the camera and convert the image into a reference image;
a detection module 504, configured to detect a target object to be operated according to the reference image;
an actual motion strategy determination module 506, configured to determine a first actual motion strategy of the end-of-arm-tool based on a motion control model in a case that it is determined that the pose of the camera satisfies an operation condition of the end-of-arm-tool on the target object, where a position between the camera and the end-of-arm-tool is relatively fixed, and the motion control model is trained in a reference environment;
and a motion control module 508, configured to control the end-of-arm-tool to move according to the first actual motion strategy, so as to complete an action on the target object.
Optionally, the operating condition is that a distance between the end-of-arm-tool and the target object is less than or equal to a first threshold value.
Optionally, the actual motion strategy determination module 506 comprises:
an error calculation module 5062, configured to calculate an installation error between the reference environment and the actual environment according to a reference pose relationship between the camera and the end-of-arm-tool in the reference environment and an actual pose relationship in the actual environment;
a reference motion strategy determination module 5064, configured to obtain a first reference motion strategy of the end-of-arm tool in a reference environment according to a motion control model;
and the actual motion strategy calculation module 5066 is configured to calculate a first actual motion strategy of the end-of-arm-tool in an actual environment according to the installation error and the first reference motion strategy of the end-of-arm-tool in the reference environment.
Optionally, the actual motion strategy determination module 506 calculates a first actual motion strategy of the end-of-arm-tool in the actual environment by the following formula:
T1 = E1^(-1) * E0 * T0
wherein:
T0 is the first reference motion strategy of the end-of-arm tool in the reference environment;
T1 is the first actual motion strategy of the end-of-arm tool in the actual environment;
E0 is the reference pose relationship matrix between the camera and the end-of-arm tool in the reference environment;
E1 is the actual pose relationship matrix between the camera and the end-of-arm tool in the actual environment.
Optionally, the apparatus further comprises:
an adjusting module 510, configured to adjust the pose of the camera to the target pose if the pose of the camera does not meet the operation condition of the end-of-arm-tool on the target object, acquire an image captured by the camera in the target pose, and then execute the converting module.
Optionally, the conversion module 502 comprises:
a first parameter acquiring module 5022, configured to acquire internal parameters of a reference camera in a reference environment;
a second parameter acquiring module 5024, configured to acquire current internal parameters of the camera;
a parameter error obtaining module 5026, configured to obtain a current parameter error between the camera and the reference camera according to the internal parameter of the camera and the internal parameter of the reference camera;
a reference image obtaining module 5028, configured to perform conversion according to the image captured by the camera and the parameter error to obtain the reference image in a reference environment.
Optionally, the conversion module 502 converts the image captured by the camera into the reference image by the following formula:
u = fu * (u' - u0') / fu' + u0
v = fv * (v' - v0') / fv' + v0
wherein:
the reference image is the image the camera would produce in the reference environment;
(u, v) are the coordinate values of the target point in the reference camera coordinate system;
(u', v') are the coordinate values of the target point in the current camera coordinate system;
u0, v0, fu, fv are the intrinsic parameters of the reference camera;
u0', v0', fu', fv' are the current intrinsic parameters of the camera.
According to the control apparatus of the robot, in the process of controlling the robot to grab a target object, the camera is regarded as a part of the mechanical arm, so the position relationship between the target object and the end of the arm does not need to be calculated. When it is determined that the pose of the camera satisfies the operating condition of the end-of-arm tool on the target object, a first actual motion strategy of the end-of-arm tool is determined based on the motion control model, and the end-of-arm tool is controlled to move according to that strategy to complete the action on the target object. In this way, failures in grabbing the object caused by the installation error between the camera and the end of the mechanical arm are avoided.
The above is a schematic description of the control apparatus of the robot according to this embodiment. It should be noted that the technical solution of the control apparatus belongs to the same concept as that of the control method of the robot described above; for details not described here, refer to the description of the technical solution of the control method.
FIG. 6 is a block diagram illustrating a configuration of a computing device 600 according to an embodiment of the present application. The components of the computing device 600 include, but are not limited to, a memory 610 and a processor 620. The processor 620 is coupled to the memory 610.
Although not shown in fig. 6, it is to be appreciated that computing device 600 can also include a network interface that enables computing device 600 to communicate via one or more networks. Examples of such networks include a Local Area Network (LAN), a Wide Area Network (WAN), a Personal Area Network (PAN), or a combination of communication networks such as the internet. The network interface may include one or more of any type of network interface (e.g., a Network Interface Card (NIC)) whether wired or wireless, such as an IEEE802.11 Wireless Local Area Network (WLAN) wireless interface, a worldwide interoperability for microwave access (Wi-MAX) interface, an ethernet interface, a Universal Serial Bus (USB) interface, a cellular network interface, a bluetooth interface, a Near Field Communication (NFC) interface, and so forth.
In one embodiment of the present application, other components of the computing device 600 described above and not shown in FIG. 6 may also be connected to each other, such as by a bus. It should be understood that the block diagram of the computing device architecture shown in FIG. 6 is for purposes of example only and is not limiting as to the scope of the present application. Those skilled in the art may add or replace other components as desired.
Computing device 600 may be any type of stationary or mobile computing device, including a mobile computer or mobile computing device (e.g., tablet, personal digital assistant, laptop, notebook, netbook, etc.), mobile phone (e.g., smartphone), wearable computing device (e.g., smartwatch, smartglasses, etc.), or other type of mobile device, or a stationary computing device such as a desktop computer or PC. Computing device 600 may also be a mobile or stationary server.
Wherein the processor implements the following steps A2 to A8 when executing the instructions:
A2, acquiring an image captured by the camera, and converting the image into a reference image.
A4, detecting a target object to be operated on according to the reference image.
A6, in a case where it is determined that the pose of the camera satisfies the operating condition of the end-of-arm tool on the target object, determining a first actual motion strategy of the end-of-arm tool based on a motion control model.
Wherein the position between the camera and the end-of-arm tool is relatively fixed, and the motion control model is trained in a reference environment.
A8, controlling the end-of-arm tool to move according to the first actual motion strategy, so as to complete the action on the target object.
The above is an illustrative scheme of a computing device of the present embodiment. It should be noted that the technical solution of the computing device and the technical solution of the control method of the robot belong to the same concept, and for details that are not described in detail in the technical solution of the computing device, reference may be made to the description of the technical solution of the control method of the robot.
An embodiment of the present application also provides a computer readable storage medium, which stores computer instructions, and the instructions are executed by a processor to implement the steps of the control method of the robot.
The above is an illustrative scheme of a computer-readable storage medium of the present embodiment. It should be noted that the technical solution of the storage medium is the same as the technical solution of the control method of the robot described above, and for details that are not described in detail in the technical solution of the storage medium, reference may be made to the description of the technical solution of the control method of the robot described above.
The computer instructions comprise computer program code which may be in the form of source code, object code, an executable file or some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, recording medium, usb disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), electrical carrier wave signals, telecommunications signals, software distribution medium, and the like. It should be noted that the computer readable medium may contain content that is subject to appropriate increase or decrease as required by legislation and patent practice in jurisdictions, for example, in some jurisdictions, computer readable media does not include electrical carrier signals and telecommunications signals as is required by legislation and patent practice.
It should be noted that, for the sake of simplicity, the above-mentioned method embodiments are described as a series of acts or combinations, but those skilled in the art should understand that the present application is not limited by the described order of acts, as some steps may be performed in other orders or simultaneously according to the present application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
The preferred embodiments of the present application disclosed above are intended only to aid in the explanation of the application. Alternative embodiments are not exhaustive and do not limit the invention to the precise embodiments described. Obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the application and the practical application, to thereby enable others skilled in the art to best understand and utilize the application. The application is limited only by the claims and their full scope and equivalents.
Claims (10)
1. A control method of a robot is characterized by being used in a robot system, wherein the robot system comprises a mechanical arm, a mechanical arm end tool and a camera;
the method comprises the following steps:
A2, acquiring an image captured by the camera, and converting the image into a reference image;
A4, detecting a target object to be operated on according to the reference image;
A6, in a case where it is determined that the pose of the camera satisfies an operating condition of the end-of-arm tool on the target object, determining a first actual motion strategy of the end-of-arm tool based on a motion control model; wherein the position between the camera and the end-of-arm tool is relatively fixed, and the motion control model is trained in a reference environment;
A8, controlling the end-of-arm tool to move according to the first actual motion strategy, so as to complete the action on the target object.
2. The method of controlling a robot according to claim 1,
determining a first actual motion strategy for the end-of-arm-tool based on a motion control model, comprising:
calculating an installation error between a reference environment and an actual environment according to a reference pose relation between the camera and the mechanical arm end tool in the reference environment and an actual pose relation in the actual environment;
obtaining a first reference motion strategy of the mechanical arm tail end tool in a reference environment according to a motion control model;
and calculating to obtain a first actual motion strategy of the tail end tool of the mechanical arm in the actual environment according to the installation error and the first reference motion strategy of the tail end tool of the mechanical arm in the reference environment.
3. The method of controlling a robot according to claim 2, wherein the first actual motion strategy of the end-of-arm-tool in the actual environment is obtained by the following equation:
T1 = E1^(-1) * E0 * T0
wherein:
T0 is a first reference motion strategy of the end-of-arm tool in the reference environment;
T1 is a first actual motion strategy of the end-of-arm tool in the actual environment;
E0 is a reference pose relationship matrix between the camera and the end-of-arm tool in the reference environment;
E1 is an actual pose relationship matrix between the camera and the end-of-arm tool in the actual environment.
4. The method of controlling a robot according to claim 1, wherein after step A4, the method further comprises:
A10, if the pose of the camera does not satisfy the operating condition of the end-of-arm tool on the target object, adjusting the pose of the camera to a target pose, acquiring an image captured by the camera at the target pose, and then executing step A2.
5. The method of controlling a robot according to claim 1,
converting an image captured by the camera into the reference image, including:
acquiring internal parameters of a reference camera in a reference environment;
acquiring current internal parameters of the camera;
obtaining the current parameter error between the camera and the reference camera according to the internal parameters of the camera and the internal parameters of the reference camera;
and converting according to the image shot by the camera and the parameter error to obtain the reference image in the reference environment.
6. The control method of a robot according to claim 5,
converting the image captured by the camera into the reference image by the following formula:
u = fu * (u' - u0') / fu' + u0
v = fv * (v' - v0') / fv' + v0
wherein:
the reference image is the image the camera would produce in the reference environment;
(u, v) are the coordinate values of the target point in the reference camera coordinate system;
(u', v') are the coordinate values of the target point in the current camera coordinate system;
u0, v0, fu, fv are the intrinsic parameters of the reference camera;
u0', v0', fu', fv' are the current intrinsic parameters of the camera.
7. The method of controlling a robot according to claim 1,
the operating condition is that a distance between the end-of-arm-tool and the target object is equal to or less than a first threshold value.
8. A control apparatus of a robot, provided in a robot system including a robot arm, an end-of-arm tool, and a camera, the apparatus comprising:
the conversion module is used for acquiring the image shot by the camera and converting the image into a reference image;
the detection module is used for detecting a target object to be operated according to the reference image;
an actual motion strategy determination module, configured to determine a first actual motion strategy of an end-of-arm-tool based on a motion control model in a case that it is determined that a pose of the camera satisfies an operating condition of the end-of-arm-tool on the target object, where a position between the camera and the end-of-arm-tool is relatively fixed, and the motion control model is trained in a reference environment;
and the motion control module is used for controlling the motion of the mechanical arm end tool according to the first actual motion strategy so as to finish the action on the target object.
9. The control device of a robot according to claim 8, wherein the actual motion strategy determination module comprises:
the error calculation module is used for calculating the installation error between the reference environment and the actual environment according to the reference pose relationship between the camera and the mechanical arm end tool in the reference environment and the actual pose relationship in the actual environment;
the reference motion strategy determining module is used for obtaining a first reference motion strategy of the mechanical arm terminal tool in a reference environment according to the motion control model;
and the actual motion strategy calculation module is used for calculating to obtain a first actual motion strategy of the mechanical arm end tool in the actual environment according to the installation error and the first reference motion strategy of the mechanical arm end tool in the reference environment.
10. A computer-readable storage medium storing computer instructions, which when executed by a processor, implement the steps of the control method of the robot of any one of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811037353.3A CN109421050B (en) | 2018-09-06 | 2018-09-06 | Robot control method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811037353.3A CN109421050B (en) | 2018-09-06 | 2018-09-06 | Robot control method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109421050A true CN109421050A (en) | 2019-03-05 |
CN109421050B CN109421050B (en) | 2021-03-26 |
Family ID: 65514860
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811037353.3A Active CN109421050B (en) | 2018-09-06 | 2018-09-06 | Robot control method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109421050B (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110421565A (en) * | 2019-08-07 | 2019-11-08 | 江苏汇博机器人技术股份有限公司 | Robot global positioning and measuring system and method for practical training |
CN110464471A (en) * | 2019-09-10 | 2019-11-19 | 深圳市精锋医疗科技有限公司 | The control method of operating robot and its end instrument, control device |
CN111015655A (en) * | 2019-12-18 | 2020-04-17 | 深圳市优必选科技股份有限公司 | Mechanical arm grabbing method and device, computer readable storage medium and robot |
CN113284192A (en) * | 2021-06-18 | 2021-08-20 | 广东智源机器人科技有限公司 | Motion capture method and device, electronic equipment and mechanical arm control system |
CN113510697A (en) * | 2021-04-23 | 2021-10-19 | 知守科技(杭州)有限公司 | Manipulator positioning method, device, system, electronic device and storage medium |
CN113733078A (en) * | 2020-05-27 | 2021-12-03 | 中国人民解放军63920部队 | Method for interpreting fine control quantity of mechanical arm and computer-readable storage medium |
WO2022021156A1 (en) * | 2020-07-29 | 2022-02-03 | 西门子(中国)有限公司 | Method and apparatus for robot to grab three-dimensional object |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1727839A (en) * | 2004-07-28 | 2006-02-01 | 发那科株式会社 | Method of and device for re-calibrating three-dimensional visual sensor in robot system |
CN101092035A (en) * | 2006-06-20 | 2007-12-26 | 发那科株式会社 | Robot control apparatus |
CN103406905A (en) * | 2013-08-20 | 2013-11-27 | 西北工业大学 | Robot system with visual servo and detection functions |
CN106695792A (en) * | 2017-01-05 | 2017-05-24 | 中国计量大学 | Tracking and monitoring system and method of stacking robot based on machine vision |
WO2017065308A9 (en) * | 2015-10-13 | 2017-08-31 | Canon Kabushiki Kaisha | Imaging apparatus, production system, imaging method, program, and recording medium |
- 2018-09-06: application CN201811037353.3A filed in China; granted as CN109421050B (status: Active)
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1727839A (en) * | 2004-07-28 | 2006-02-01 | 发那科株式会社 | Method of and device for re-calibrating three-dimensional visual sensor in robot system |
CN101092035A (en) * | 2006-06-20 | 2007-12-26 | 发那科株式会社 | Robot control apparatus |
CN103406905A (en) * | 2013-08-20 | 2013-11-27 | 西北工业大学 | Robot system with visual servo and detection functions |
WO2017065308A9 (en) * | 2015-10-13 | 2017-08-31 | Canon Kabushiki Kaisha | Imaging apparatus, production system, imaging method, program, and recording medium |
CN106695792A (en) * | 2017-01-05 | 2017-05-24 | 中国计量大学 | Tracking and monitoring system and method of stacking robot based on machine vision |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110421565A (en) * | 2019-08-07 | 2019-11-08 | 江苏汇博机器人技术股份有限公司 | Robot global positioning and measuring system and method for practical training |
CN110421565B (en) * | 2019-08-07 | 2022-05-13 | 江苏汇博机器人技术股份有限公司 | Robot global positioning and measuring system and method for practical training |
CN110464471A (en) * | 2019-09-10 | 2019-11-19 | 深圳市精锋医疗科技有限公司 | The control method of operating robot and its end instrument, control device |
CN111015655A (en) * | 2019-12-18 | 2020-04-17 | 深圳市优必选科技股份有限公司 | Mechanical arm grabbing method and device, computer readable storage medium and robot |
CN111015655B (en) * | 2019-12-18 | 2022-02-22 | 深圳市优必选科技股份有限公司 | Mechanical arm grabbing method and device, computer readable storage medium and robot |
CN113733078A (en) * | 2020-05-27 | 2021-12-03 | 中国人民解放军63920部队 | Method for interpreting fine control quantity of mechanical arm and computer-readable storage medium |
WO2022021156A1 (en) * | 2020-07-29 | 2022-02-03 | 西门子(中国)有限公司 | Method and apparatus for robot to grab three-dimensional object |
CN116249607A (en) * | 2020-07-29 | 2023-06-09 | 西门子(中国)有限公司 | Method and device for robotically gripping three-dimensional objects |
CN113510697A (en) * | 2021-04-23 | 2021-10-19 | 知守科技(杭州)有限公司 | Manipulator positioning method, device, system, electronic device and storage medium |
CN113510697B (en) * | 2021-04-23 | 2023-02-14 | 知守科技(杭州)有限公司 | Manipulator positioning method, device, system, electronic device and storage medium |
CN113284192A (en) * | 2021-06-18 | 2021-08-20 | 广东智源机器人科技有限公司 | Motion capture method and device, electronic equipment and mechanical arm control system |
Also Published As
Publication number | Publication date |
---|---|
CN109421050B (en) | 2021-03-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109421050B (en) | Robot control method and device | |
CN110193849B (en) | Method and device for calibrating hands and eyes of robot | |
CN107255476B (en) | Indoor positioning method and device based on inertial data and visual features | |
US9639913B2 (en) | Image processing device, image processing method, image processing program, and storage medium | |
US20190043216A1 (en) | Information processing apparatus and estimating method for estimating line-of-sight direction of person, and learning apparatus and learning method | |
CN107030699B (en) | Pose error correction method and device, robot and storage medium | |
US9128366B2 (en) | Image processing system, image processing method, and computer program product | |
EP3771198B1 (en) | Target tracking method and device, movable platform and storage medium | |
CN113409391B (en) | Visual positioning method and related device, equipment and storage medium | |
CN112022355A (en) | Hand-eye calibration method and device based on computer vision and storage medium | |
JP2017092592A (en) | Tracking controller, tracking controlling method, optical equipment, and imaging apparatus | |
CN114445506A (en) | Camera calibration processing method, device, equipment and storage medium | |
JP5926462B2 (en) | Method and system for automatically adjusting optical seismic camera module | |
JP2018009918A (en) | Self-position detection device, moving body device, and self-position detection method | |
KR100944094B1 (en) | Robot and method for quality inspection | |
KR102220173B1 (en) | Automatic calibration method and apparatus for robot vision system | |
CN110619664B (en) | Laser pattern-assisted camera distance posture calculation method and server | |
JP2008298589A (en) | Device and method for detecting positions | |
CN113592907B (en) | Visual servo tracking method and device based on optical flow | |
CN112643718B (en) | Image processing apparatus, control method therefor, and storage medium storing control program therefor | |
CN112669388B (en) | Calibration method and device for laser radar and camera device and readable storage medium | |
CN112184819A (en) | Robot guiding method and device, computer equipment and storage medium | |
JP6315542B2 (en) | Image generating apparatus and image generating method | |
JP2017034616A (en) | Image processing apparatus and control method therefor | |
WO2024230054A1 (en) | Reinforcement learning-based image stitching method, apparatus and device, and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |