CN110636923A - Motion control method of robot, robot and controller - Google Patents
- Publication number: CN110636923A (application CN201780090902.0A)
- Authority
- CN
- China
- Prior art keywords
- robot
- compensation value
- compensation
- marker
- motion
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
Abstract
Disclosed are a motion control method for a robot, a robot, and a controller, which are used to calibrate the robot's motion trajectory while the robot is in motion. The control method comprises: acquiring, at a target moment of a target period, a first image containing a marker with the shooting device (201); calculating a compensation value from the first image and a pre-stored reference image containing the marker (202); judging whether the compensation value is greater than a preset error value (203); and, if the compensation value is greater than the preset error value, calibrating the motion of the robot according to the compensation value (204). The method allows the robot's motion trajectory to be calibrated while the robot is moving.
Description
The invention relates to the field of motion control of equipment, in particular to a motion control method of a robot, the robot and a controller.
With the continuous development of science and technology, industrial robots are used in many industrial production processes. An industrial robot generally performs its movements periodically according to a preset control program, and its precision gradually degrades due to long-term operation or other factors. Once precision degrades, the robot can no longer complete its preset actions accurately; when gripping an industrial product, for example, the gripping position may drift so that the product cannot be gripped precisely. In the prior art, a technician typically corrects such precision loss by adjusting the robot's control program, which requires manual intervention and suspension of the robot's work.
Disclosure of Invention
The embodiment of the invention provides a motion control method of a robot and the robot, which are used for calibrating a motion track of the robot.
A first aspect of the embodiments of the present invention provides a motion control method applied to a robot whose robot arm is provided with a shooting device. The method specifically includes:
acquiring a first image containing a marker by the shooting device at a target moment of a target period;
calculating a compensation value according to the first image and a pre-stored reference image containing a marker;
judging whether the compensation value is larger than a preset error value or not;
and if the compensation value is larger than the preset error value, calibrating the motion of the robot according to the compensation value.
A second aspect of the embodiments of the present invention provides a robot whose robot arm is provided with a shooting device. The robot specifically includes:
an acquisition unit configured to acquire a first image including a marker at a target time of a target period by the photographing device;
a calculation unit configured to calculate a compensation value from the first image and a reference image including a marker stored in advance;
a judging unit configured to judge whether the compensation value is greater than a preset error value; and
a calibration unit configured to calibrate the motion of the robot according to the compensation value when the compensation value is greater than the preset error value.
A third aspect of an embodiment of the present invention provides a robot, specifically including:
a mechanical arm, a shooting device, and a controller, wherein the shooting device and the controller are installed on the mechanical arm;
the controller is configured to:
acquiring a first image containing a marker by the shooting device at a target moment of a target period;
calculating a compensation value according to the first image and a pre-stored reference image containing a marker;
judging whether the compensation value is larger than a preset error value or not;
and if the compensation value is larger than the preset error value, calibrating the motion of the robot according to the compensation value.
A fourth aspect of the embodiments of the present invention provides a controller, where the controller is connected to a shooting device and a robot arm of a robot, the shooting device is installed on the robot arm, and the controller specifically includes:
a memory and a processor;
the memory is used for storing operation instructions and related data;
the processor is used for calling the operation instruction to:
acquiring a first image containing a marker by the shooting device at a target moment of a target period;
calculating a compensation value according to the first image and a pre-stored reference image containing a marker;
judging whether the compensation value is larger than a preset error value or not;
and if the compensation value is larger than the preset error value, calibrating the motion of the robot according to the compensation value.
According to the technical scheme, the embodiment of the invention has the following advantages:
in the embodiment of the invention, the robot acquires a first image containing a marker through the shooting device at the target moment of a target period; calculating a compensation value according to the first image and a pre-stored reference image containing a marker; judging whether the compensation value is larger than a preset error value or not; and if the compensation value is larger than the preset error value, calibrating the motion of the robot according to the compensation value. The technical scheme of the invention can calibrate the motion trail of the robot in the motion process of the robot.
Fig. 1 is a schematic view of the scene architecture of a motion control method of a robot according to an embodiment of the present invention;
Fig. 2 is a flowchart of an embodiment of a motion control method of a robot according to the present invention;
Fig. 3 is a schematic diagram of the functional modules of a robot according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of other functional modules of the robot according to an embodiment of the present invention;
Fig. 5 is a schematic structural diagram of a controller of a robot according to an embodiment of the present invention.
The embodiment of the invention provides a motion control method of a robot and the robot, which are used for calibrating a motion track of the robot.
To make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present invention.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the data so used may be interchanged under appropriate circumstances such that the embodiments described herein may be practiced otherwise than as specifically illustrated or described herein. Furthermore, the terms "comprises," "comprising," or "having," and any variations thereof, are intended to cover non-exclusive inclusions, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The embodiments of the present invention can be applied in the architecture shown in fig. 1, in which a shooting device is installed on the mechanical arm of an industrial robot and one or more markers are placed along the robot's motion trajectory. Specifically, the shooting device is installed at the end of the mechanical arm of the industrial robot.
The image captured by the shooting device is a depth image, that is, it includes the depth from any point in the image to the shooting device. In this embodiment, the shooting device may be a binocular camera, a depth camera, or the like; in other embodiments it may be another type of camera, and the specific type is not limited here.
Referring to fig. 2, an embodiment of a method for controlling a motion of a robot according to an embodiment of the present invention includes:
201. A first image containing a marker is acquired by the shooting device at a target moment of a target period.
In this embodiment, when the robot's motion reaches a target period in which the marker needs to be captured, a first image containing the marker is captured by the shooting device at the target moment of that period.
The target moment is the time point, within a period, at which the pre-stored reference image was captured, measured as an offset from the start of the period. For example, if one period lasts 60 s and the reference image was captured at the 10th second of the first period, then the 10th second counted from the start of each period is that period's target moment. Note that the reference image may be acquired in the first period or in another period; in this embodiment it is acquired in the first period.
It should be noted that the robot's control commands are identical in every period, or at least the control commands from the start of each period up to the target moment are identical.
It should be noted that the target period may be every movement period of the robot, or one period out of every several periods; this is not limited here.
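The choice between checking every period and checking one period out of every several can be expressed as a simple predicate. Below is a minimal sketch in Python, assuming periods are indexed from zero and a fixed checking interval; the function name and the modulo scheme are illustrative, not from the patent:

```python
def is_target_period(cycle_index: int, interval: int = 1) -> bool:
    """Decide whether a given motion period is a target period.

    interval=1 treats every period as a target period; interval=N checks
    one period out of every N. (Illustrative helper; the names and the
    modulo scheme are assumptions, not specified by the patent.)
    """
    if interval < 1:
        raise ValueError("interval must be >= 1")
    return cycle_index % interval == 0
```

With `interval=3`, for example, periods 0, 3, 6, ... would be target periods and the marker would be photographed only in those.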
202. A compensation value is calculated from the first image and a pre-stored reference image containing a marker.
In this embodiment, calculating the compensation value from the first image and the pre-stored reference image includes: the robot determines first position information of the marker relative to the shooting device from the first image and the marker's attribute information; determines second position information of the marker relative to the shooting device from the pre-stored reference image and the marker's attribute information; and then determines the compensation value from the first and second position information. Calculating the compensation value from these two pieces of position information includes calculating a position compensation value in three-dimensional space and an angle compensation value about the central axis of the shooting device's camera.
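The two differences can be sketched in a few lines of Python. This is a hedged illustration, not the patent's implementation: the function names are invented, and the sign convention (reference minus current) is chosen to match the worked example later in this section, where coordinates (7,7,8) in the reference image and (7,6,6) in the first image yield the compensation value (0,1,2):

```python
def position_compensation(reference_xyz, current_xyz):
    """Position compensation value in three-dimensional space:
    reference-image marker coordinates minus current (first-image)
    coordinates, component-wise."""
    return tuple(r - c for r, c in zip(reference_xyz, current_xyz))


def angle_compensation(reference_angle_deg, current_angle_deg):
    """Angle compensation value about the camera's central axis,
    in degrees: reference angle minus current angle."""
    return reference_angle_deg - current_angle_deg
```

For instance, `position_compensation((7, 7, 8), (7, 6, 6))` gives `(0, 1, 2)`, and `angle_compensation(45.0, 44.5)` gives `0.5`.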
The robot may determine the first position information of the marker relative to the shooting device from the first image and the marker's attribute information by performing a 3D calculation on the marker image information in the first image, which proceeds as follows:
After acquiring the first image at the target moment of the target period through the shooting device, the robot analyses the marker image in the first image according to the marker's attribute information and extracts the marker's three-dimensional coordinates at that moment. To extract these coordinates, a coordinate system must be established: its origin may be located at the centre of the shooting device's camera, with the Z axis pointing vertically upward, the X axis pointing from west to east, and the Y axis pointing from south to north. It should be noted that the coordinate system may also be placed elsewhere, as long as the marker's coordinates computed in the target period and those computed in the first period are in the same coordinate system; the specific way of establishing the axes is therefore not limited here.
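Since the shooting device supplies a depth value for every image point, one common way to obtain camera-frame 3-D coordinates is back-projection through a pinhole model. The sketch below makes that assumption explicitly: the patent only states that per-point depth is available, so the pinhole model and the intrinsic parameters (`fx`, `fy`, `cx`, `cy`) are assumptions for illustration:

```python
def pixel_to_camera_xyz(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with its measured depth into camera-frame
    3-D coordinates using a pinhole model.

    The pinhole model and the intrinsics (fx, fy, cx, cy) are assumptions;
    the patent only states that the depth image gives the distance from any
    image point to the shooting device."""
    x = (u - cx) * depth / fx  # horizontal offset scaled by depth
    y = (v - cy) * depth / fy  # vertical offset scaled by depth
    return (x, y, depth)
```

A marker at the image centre, for example, back-projects to `(0, 0, depth)` regardless of the focal lengths.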
To calculate the position compensation value in three-dimensional space from the first and second position information, the robot extracts the marker's three-dimensional coordinates from the first image and from the reference image, and obtains the position compensation value by subtracting one set of coordinates from the other.
In this embodiment, besides the position compensation value obtained in three-dimensional space from the first image, an angle compensation value about the central axis of the shooting device's camera is calculated from the first and second position information. In other embodiments, only the position compensation value or only the angle compensation value may be calculated.
The shooting device can obtain the depth from any point of a captured image to the camera, and the centre of each captured image lies on the camera's axis. The marker's three-dimensional coordinates are calculated from the captured image and this per-point depth information. Moreover, if the robot rotates, the camera on the robot rotates with it, so the captured image rotates about its centre; the rotation of the marker in the first image relative to the reference image, measured about the image centre point, is the angle compensation value.
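The rotation about the image centre can be measured as the angle between the centre-to-marker line and the image X axis in each image, with the compensation being the difference of the two angles. A minimal sketch, assuming 2-D image coordinates and a quadrant-aware `atan2`; the patent does not prescribe how the angle is computed, so this is one plausible realization:

```python
import math


def marker_angle_deg(image_center, marker_px):
    """Angle, in degrees, between the line from the image centre to the
    marker and the image X axis. The angle compensation value is the
    difference of this angle between the reference image and the first
    image. (Illustrative; the measurement method is an assumption.)"""
    dx = marker_px[0] - image_center[0]
    dy = marker_px[1] - image_center[1]
    return math.degrees(math.atan2(dy, dx))
```

A marker diagonally up-and-right of the centre, for instance, yields an angle close to 45 degrees.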
This embodiment is described below with a specific example:
For example, the robot captures a picture (the first image) with the shooting device at the target moment of the target period and extracts the marker's coordinates from it, say (7,6,6). The marker's coordinates extracted from the pre-stored picture taken at the target moment of the first period (or another period), i.e. the reference image, are (7,7,8). The difference of the two coordinates gives the position compensation value (0,1,2) in three-dimensional space: relative to its position in the first period, the shooting device in the current period has moved 1 unit in the negative Y direction and 2 units in the negative Z direction. Meanwhile, the angle between the line from the image centre point to the marker and the X axis is 44.5° in the first image (the first angle) and 45° in the reference image (the second angle); the difference of the two angles gives an angle compensation value of 0.5°, i.e. the shooting device in the current period has rotated 0.5° clockwise relative to its orientation in the first period.
It should be noted that, when the reference marker position information at the target moments of the first period is collected, the marker should be photographed both while the robot performs a translation and while it performs a rotation, and these two moments are recorded as two target moments. The images taken at these two moments are used to detect, respectively, the position information and the posture information of the robot during movement, so that position and posture can be separated as far as possible: precision is calibrated for position in the translation segment and for posture in the rotation segment. The number of target moments and markers in a period depends on the length and complexity of the robot's trajectory: the longer or the more complex the trajectory, the more target moments and markers are needed. When the trajectory within a period includes both translation and rotation, the robot should set one or more target moments during translation, one or more during rotation, and one or more after the rotation is completed, each target moment corresponding to a marker. The number of target moments and markers in a period may therefore be one or more, and is not limited here.
It should be noted that, when capturing an image, the shooting device should as far as possible directly face the marker, or the plane on which the marker is located, to achieve the best effect.
203. Judge whether the compensation value is greater than the preset error value; if so, execute step 204, and if not, execute step 205.
In this embodiment, the preset error value is entered by the programmer in advance and may differ according to the robot's type of work; the specific value is not limited here.
For example, if the preset position error value in three-dimensional space is (1,1,1), meaning the allowed error on each of the X, Y, and Z coordinates of the compensation value is less than 1, then as soon as the compensation value on any one axis is greater than or equal to the corresponding preset error value, the compensation value is judged to be greater than the preset error value. Likewise, if the preset angle error value is 1°, the compensation value is judged to be greater than the preset error value when the angle in the compensation value is greater than or equal to 1°.
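The any-axis rule just described can be written as a one-line check. A hedged sketch (the function name is invented; the greater-than-or-equal comparison follows the rule stated above):

```python
def exceeds_threshold(compensation, error_limits):
    """True as soon as the compensation value on any single axis is greater
    than or equal to its preset error value, per the any-axis rule.
    Works for the 3-tuple position case; the angle case is the same check
    with single-element sequences."""
    return any(abs(c) >= e for c, e in zip(compensation, error_limits))
```

With the example values, `exceeds_threshold((0, 1, 2), (1, 1, 1))` is true (the Y and Z components reach their limits), while `exceeds_threshold((0.5,), (1.0,))` is false for a 0.5° angle against a 1° limit.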
204. Calibrate the motion of the robot according to the compensation value.
In this embodiment, the calibration process includes: compensating the coordinate sequence in the robot control command with the compensation value to obtain a compensated coordinate sequence, and controlling the robot's motion according to that sequence. Specifically, the coordinate sequence in the control command is compensated with the position compensation value in three-dimensional space, and the robot is then driven along the compensated coordinate sequence so that its motion trajectory is corrected.
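Compensating a coordinate sequence amounts to adding the position compensation value to each coordinate. A minimal sketch (invented name; the component-wise addition matches the worked example in step 204, where command coordinate (9,7,8) plus compensation (0,1,2) yields (9,8,10)):

```python
def compensate_sequence(coords, compensation):
    """Add the position compensation value to every coordinate in a control
    command's coordinate sequence, giving the compensated sequence."""
    dx, dy, dz = compensation
    return [(x + dx, y + dy, z + dz) for (x, y, z) in coords]
```

For instance, `compensate_sequence([(9, 7, 8)], (0, 1, 2))` returns `[(9, 8, 10)]`.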
In this embodiment, the calibration process further includes: compensating the rotation angle in the robot control command with the compensation value to obtain a compensated angle, and controlling the robot's motion according to that angle.
In this embodiment, the coordinates and the rotation angle of the robot are compensated at the same time, so as to calibrate the motion trajectory of the robot and the pose of the robot at the same time. In other embodiments, the coordinates or the rotation angle of the robot may be individually compensated to individually calibrate the motion trajectory of the robot or the pose of the robot.
Continuing the example from step 202: the marker's coordinates obtained from the first image are (7,6,6) and from the reference image (7,7,8), and the difference of the two gives the position compensation value (0,1,2) in three-dimensional space. The preset position error value is (1,1,1); since the Z component of the compensation value, 2, is greater than 1, the compensation value is judged to be greater than the preset error value, and the coordinate sequence in the current period's control command is compensated with (0,1,2). For instance, a coordinate value (9,7,8) in the current control command becomes the compensated coordinate (9,8,10). Driving the robot according to the compensated coordinates corrects the corresponding error. Similarly, the angle compensation value is 0.5° against a preset angle error value of 1°; since 0.5° < 1°, the robot's angle need not be adjusted. If the robot's posture did need correcting, the rotation angle in the current period's control command would be compensated with 0.5° and the robot driven according to the compensated angle, correcting the related error.
205. The motion of the robot is not calibrated.
In this embodiment, when the obtained compensation value is judged not greater than the preset error value, the robot continues to move according to the original preset control command; no calibration of its motion is needed.
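The whole decision flow of steps 202 to 205 can be summarized in one function. This is an illustrative skeleton, not the patent's implementation: it assumes the marker coordinates have already been extracted from the first and reference images (step 201), and the names are invented:

```python
def motion_control_step(first_image_xyz, reference_xyz, error_limits):
    """One pass of the control flow of steps 202-205, starting from marker
    coordinates already extracted from the first and reference images.
    Returns the compensation value to apply, or None when the motion is
    left uncalibrated."""
    # Step 202: compensation value = reference coordinates minus current ones.
    compensation = tuple(r - c for r, c in zip(reference_xyz, first_image_xyz))
    # Step 203: compare against the preset error value, axis by axis.
    if any(abs(c) >= e for c, e in zip(compensation, error_limits)):
        return compensation  # step 204: calibrate with this value
    return None              # step 205: keep the preset control command
```

With the section's example numbers, `motion_control_step((7, 6, 6), (7, 7, 8), (1, 1, 1))` returns the compensation value `(0, 1, 2)`, while identical marker positions return `None` and leave the preset command untouched.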
In the embodiment of the invention, a first image containing a marker is acquired by the shooting device at the target moment of a target period; calculating a compensation value according to the first image and a pre-stored reference image containing a marker; judging whether the compensation value is larger than a preset error value or not; and if the compensation value is larger than the preset error value, calibrating the motion of the robot according to the compensation value. By the method and the device, the motion trail of the robot can be calibrated in the running process of the robot.
The motion control method of the robot in the embodiment of the present invention has been described above; the robot in the embodiment of the present invention is described below. Referring to fig. 3, an embodiment of the robot includes:
An acquisition unit 301 for acquiring a first image containing a marker by a camera at a target time of a target cycle;
a calculating unit 302 for calculating a compensation value from the first image and a pre-stored reference image containing the marker;
a judging unit 303, configured to judge whether the compensation value is greater than a preset error value;
and the calibration unit 304 is configured to perform calibration processing on the motion of the robot according to the compensation value when the compensation value is greater than the preset error value.
And a motion unit 305, configured to not calibrate the motion of the robot when the compensation value is smaller than the preset error value.
In the embodiment of the present invention, the obtaining unit 301 obtains a first image including a marker through a camera at a target time of a target period; the calculation unit 302 calculates a compensation value from the first image and a reference image containing a marker stored in advance; the judging unit 303 judges whether the compensation value is greater than a preset error value; if the compensation value is greater than the preset error value, the calibration unit 304 performs calibration processing on the motion of the robot according to the compensation value. The technical scheme of the invention can calibrate the motion trail of the robot in the motion process of the robot.
Referring to fig. 4, another embodiment of the robot according to the embodiment of the present invention includes:
an acquisition unit 401 for acquiring a first image containing a marker by a camera at a target time of a target cycle;
a calculation unit 402 for calculating a compensation value from the first image and a reference image containing a marker stored in advance;
wherein, the calculating unit 402 comprises:
a first determining subunit 4021 configured to determine first position information of the marker with respect to the imaging apparatus, based on the first image and the attribute information of the marker;
a second determination subunit 4022 configured to determine second position information of the marker with respect to the imaging apparatus, based on a reference image including the marker and attribute information of the marker stored in advance; and
the calculating subunit 4023 is configured to calculate a compensation value according to the first position information and the second position information.
A judging unit 403, configured to judge whether the compensation value is greater than a preset error value;
and the calibration unit 404 is configured to perform calibration processing on the motion of the robot according to the compensation value when the compensation value is greater than the preset error value.
Among them, the calibration unit 404 includes:
the compensation subunit 4041 is configured to perform compensation processing on the coordinate sequence in the robot control command by using the compensation value to obtain a compensation coordinate sequence;
a calibration subunit 4042 for controlling the movement of the robot in accordance with the compensated coordinate sequence.
And the motion unit 405 is used for not calibrating the motion trail of the robot when the compensation value is smaller than the preset error value.
In the embodiment of the present invention, the obtaining unit 401 obtains a first image containing a marker through a shooting device at a target time of a target period; the calculation unit 402 calculates a compensation value from the first image and a reference image containing a marker stored in advance; the judgment unit 403 judges whether the compensation value is greater than a preset error value; if the compensation value is greater than the preset error value, the calibration unit 404 calibrates the motion of the robot according to the compensation value. The technical scheme of the invention can calibrate the motion trail of the robot in the motion process of the robot.
The invention also provides a controller, which is connected with the shooting device and the mechanical arm of the robot, and the shooting device is arranged on the mechanical arm. Referring to fig. 5, a device diagram of a controller according to an embodiment of the present invention is shown, in which the controller 50 includes: memory 510, processor 520.
The memory 510 is used for storing operation instructions and related data;
the processor 520 is used for executing the following steps by calling the operation instructions stored in the memory 510:
acquiring a first image containing a marker by a shooting device at a target moment of a target period; calculating a compensation value according to the first image and a pre-stored reference image containing the marker; judging whether the compensation value is larger than a preset error value or not; and if the compensation value is greater than the preset error value, calibrating the motion of the robot according to the compensation value.
Both the reference image and the first image captured by the shooting device include depth information.
In this embodiment, the processor 520 may also be referred to as a Central Processing Unit (CPU).
The memory 510 is used for storing operation instructions and data so that the processor 520 calls the operation instructions to implement corresponding operations, and may include a read-only memory and a random access memory. A portion of Memory 510 may also include Non-Volatile Random Access Memory (NVRAM).
The controller 50 further includes a bus system 530, the bus system 530 coupling together the various components of the controller 50, including the memory 510, the processor 520, wherein the bus system 530 may include a power bus, a control bus, a status signal bus, and the like, in addition to a data bus. For clarity of illustration, however, the various buses are designated in the figure as bus system 530.
In this embodiment, it should be further noted that the method disclosed in the above embodiments of the present invention may be applied to, or implemented by, the processor 520. The processor 520 may be an integrated circuit chip with signal-processing capability. During implementation, the steps of the above method may be completed by integrated logic circuits in hardware or by instructions in the form of software in the processor 520. The processor 520 may be a general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or execute the methods, steps, and logic blocks disclosed in the embodiments of the present invention. A general-purpose processor may be a microprocessor or any conventional processor. The steps of the method disclosed in the embodiments of the present invention may be performed directly by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software module may reside in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM or EEPROM, or a register. The storage medium is located in the memory 510; the processor 520 reads the information in the memory 510 and completes the steps of the above method in combination with its hardware.
In another possible embodiment, the processor 520 may also call the operation instructions in the memory 510 to perform the following steps:
determining first position information of the marker relative to the shooting device according to the first image and the attribute information of the marker;
determining second position information of the marker relative to the shooting device according to a pre-stored reference image containing the marker and the attribute information of the marker; and
and calculating a compensation value according to the first position information and the second position information.
The compensation values include position compensation values and/or angle compensation values.
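A minimal sketch of this calculation follows. The patent gives no code, so the (x, y, z, yaw) pose layout, units, and function names are illustrative assumptions: each position information item is taken as the marker's position in metres and yaw angle in degrees relative to the shooting device, and the compensation value is the offset between the current observation and the reference.

```python
def compute_compensation(first_position, second_position):
    """Compensation value as the offset between the marker pose observed in
    the first image (first_position) and the pose in the pre-stored reference
    image (second_position).  Pose layout (x, y, z, yaw) is an assumption."""
    dx = second_position[0] - first_position[0]
    dy = second_position[1] - first_position[1]
    dz = second_position[2] - first_position[2]
    dyaw = second_position[3] - first_position[3]
    position_comp = (dx, dy, dz)   # position compensation value
    angle_comp = dyaw              # angle compensation value
    return position_comp, angle_comp

# Example: the marker appears 2 cm off in x, 2 cm off in y, and 3 deg off in yaw.
pos, ang = compute_compensation((0.10, 0.02, 0.50, 5.0),
                                (0.12, 0.00, 0.50, 2.0))
```

The judging step then compares these offsets against the preset error value; only drift above the threshold triggers calibration.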
The specific implementation of the calibration process performed by the processor 520 according to the compensation value in the above embodiment may be:
compensating the coordinate sequence in the robot control command by using the compensation value to obtain a compensation coordinate sequence; and controlling the motion of the robot according to the compensation coordinate sequence.
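The coordinate-sequence compensation above can be sketched as follows; the waypoint tuple layout and all names are assumptions for illustration, not taken from the patent:

```python
def compensate_coordinates(coord_seq, position_comp):
    # Add the position compensation value to every waypoint in the
    # coordinate sequence of the robot control command.
    cx, cy, cz = position_comp
    return [(x + cx, y + cy, z + cz) for (x, y, z) in coord_seq]

# Two hypothetical waypoints, shifted by the 2 cm drift computed earlier.
waypoints = [(0.0, 0.0, 0.2), (0.1, 0.0, 0.2)]
compensated = compensate_coordinates(waypoints, (0.02, -0.02, 0.0))
```

Controlling the robot along `compensated` rather than `waypoints` is what the embodiment calls moving according to the compensation coordinate sequence.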
The specific implementation of the calibration process performed by the processor 520 on the motion of the robot according to the compensation value may be as follows:
compensating the rotation angle in the robot control command by using the compensation value to obtain a compensation angle; and controlling the motion of the robot according to the compensation angle.
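The angle compensation admits an equally small sketch. The patent does not specify units or a wrapping convention, so degrees and wrapping into [-180, 180) are assumptions:

```python
def compensate_angle(commanded_deg, angle_comp_deg):
    # Add the angle compensation value to the commanded rotation angle and
    # wrap the result into [-180, 180) degrees (wrapping convention assumed).
    return (commanded_deg + angle_comp_deg + 180.0) % 360.0 - 180.0

print(compensate_angle(170.0, 15.0))  # wraps past +180 to -175.0
print(compensate_angle(10.0, -3.0))   # 7.0
```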
In the above embodiment, the controller acquires the first image containing the marker through the shooting device at the target moment of the target period, calculates a compensation value according to the first image and the pre-stored reference image containing the marker, and judges whether the compensation value is larger than the preset error value; if it is, the controller calibrates the motion of the robot according to the compensation value. In this way, the motion trail of the robot can be calibrated while the robot is moving.
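The acquire-compare-calibrate cycle of the embodiment can be sketched as one periodic step. Everything here is an illustrative assumption: the 5 mm threshold, the scalar compensation, and the function names do not come from the patent, which leaves the preset error value unspecified.

```python
PRESET_ERROR = 0.005  # hypothetical 5 mm tolerance; the patent gives no value

def control_step(capture, compute_compensation, calibrate):
    """One target-period iteration: acquire the first image through the
    shooting device, compute the compensation value against the reference,
    and calibrate the motion only when the value exceeds the preset error."""
    first_image = capture()
    comp = compute_compensation(first_image)
    if comp > PRESET_ERROR:
        calibrate(comp)
        return True   # motion calibrated this period
    return False      # drift within tolerance; no calibration needed

# Minimal stand-ins for the camera and the compensation pipeline:
calibrated = []
did = control_step(lambda: "first-image",
                   lambda img: 0.02,        # pretend the marker drifted 2 cm
                   calibrated.append)
```

Running this each target period gives the behaviour claimed: small drift is ignored, and drift beyond the preset error value is corrected mid-motion.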
The embodiment of the invention also provides a robot, which comprises a mechanical arm, and a shooting device and a controller installed on the mechanical arm. The robot executes the above motion control method through the controller, so that if the motion track of the robot deviates from the preset motion track to a certain extent during motion, the motion track can be calibrated while the robot is moving.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.
Claims (24)
- 1. A motion control method of a robot, characterized in that the method is applied to a robot having a shooting device installed on a mechanical arm of the robot, and the method comprises the following steps:
acquiring a first image containing a marker by the shooting device at a target moment of a target period;
calculating a compensation value according to the first image and a pre-stored reference image containing the marker;
judging whether the compensation value is larger than a preset error value; and
if the compensation value is larger than the preset error value, calibrating the motion of the robot according to the compensation value.
- 2. The method of claim 1, wherein calculating a compensation value according to the first image and a pre-stored reference image containing a marker comprises:
determining first position information of the marker relative to the shooting device according to the first image and attribute information of the marker;
determining second position information of the marker relative to the shooting device according to the pre-stored reference image containing the marker and the attribute information of the marker; and
calculating the compensation value according to the first position information and the second position information.
- 3. The method of claim 2, wherein the compensation value comprises a position compensation value and/or an angle compensation value.
- 4. The method of claim 2, wherein the reference image and the first image captured by the shooting device include depth information.
- 5. The method according to any one of claims 1 to 4, wherein calibrating the motion of the robot according to the compensation value comprises:
compensating the coordinate sequence in a robot control command by using the compensation value to obtain a compensation coordinate sequence; and
controlling the motion of the robot according to the compensation coordinate sequence.
- 6. The method according to any one of claims 1 to 4, wherein calibrating the motion of the robot according to the compensation value comprises:
compensating the rotation angle in a robot control command by using the compensation value to obtain a compensation angle; and
controlling the motion of the robot according to the compensation angle.
- 7. A robot, characterized in that a shooting device is installed on a mechanical arm of the robot, and the robot comprises:
an acquisition unit, configured to acquire a first image containing a marker through the shooting device at a target moment of a target period;
a calculation unit, configured to calculate a compensation value according to the first image and a pre-stored reference image containing a marker;
a judging unit, configured to judge whether the compensation value is larger than a preset error value; and
a calibration unit, configured to calibrate the motion of the robot according to the compensation value when the compensation value is larger than the preset error value.
- 8. The robot of claim 7, wherein the calculation unit comprises:
a first determining subunit, configured to determine first position information of the marker relative to the shooting device according to the first image and attribute information of the marker;
a second determining subunit, configured to determine second position information of the marker relative to the shooting device according to the pre-stored reference image containing the marker and the attribute information of the marker; and
a calculating subunit, configured to calculate the compensation value according to the first position information and the second position information.
- 9. The robot of claim 8, wherein the compensation value comprises a position compensation value and/or an angle compensation value.
- 10. The robot of claim 8, wherein the reference image and the first image captured by the shooting device include depth information.
- 11. The robot according to any one of claims 7 to 9, wherein the calibration unit comprises:
a compensation subunit, configured to compensate the coordinate sequence in a robot control command by using the compensation value to obtain a compensation coordinate sequence; and
a calibration subunit, configured to control the motion of the robot according to the compensation coordinate sequence.
- 12. The robot of claim 11, wherein the compensation subunit is further configured to perform compensation processing on a rotation angle in the robot control command by using the compensation value to obtain a compensation angle; and
the calibration subunit is further configured to control the motion of the robot according to the compensation angle.
- 13. A robot, characterized in that the robot comprises a mechanical arm, and a shooting device and a controller installed on the mechanical arm, the controller being configured to:
acquire a first image containing a marker through the shooting device at a target moment of a target period;
calculate a compensation value according to the first image and a pre-stored reference image containing a marker;
judge whether the compensation value is larger than a preset error value; and
if the compensation value is larger than the preset error value, calibrate the motion of the robot according to the compensation value.
- 14. The robot of claim 13, wherein the controller calculating a compensation value according to the first image and a pre-stored reference image containing a marker comprises:
determining first position information of the marker relative to the shooting device according to the first image and attribute information of the marker;
determining second position information of the marker relative to the shooting device according to the pre-stored reference image containing the marker and the attribute information of the marker; and
calculating the compensation value according to the first position information and the second position information.
- 15. The robot of claim 14, wherein the compensation value comprises a position compensation value and/or an angle compensation value.
- 16. The robot of claim 14, wherein the reference image and the first image captured by the shooting device include depth information.
- 17. The robot according to any one of claims 13 to 16, wherein the controller calibrating the motion of the robot according to the compensation value comprises:
compensating the coordinate sequence in a robot control command by using the compensation value to obtain a compensation coordinate sequence; and
controlling the motion of the robot according to the compensation coordinate sequence.
- 18. The robot according to any one of claims 13 to 16, wherein the controller calibrating the motion of the robot according to the compensation value comprises:
compensating the rotation angle in a robot control command by using the compensation value to obtain a compensation angle; and
controlling the motion of the robot according to the compensation angle.
- 19. A controller, characterized in that the controller is connected with a shooting device and a mechanical arm of a robot, the shooting device is installed on the mechanical arm, and the controller comprises:
a memory and a processor;
wherein the memory is used for storing operation instructions and related data; and
the processor is used for calling the operation instructions to:
acquire a first image containing a marker through the shooting device at a target moment of a target period;
calculate a compensation value according to the first image and a pre-stored reference image containing a marker;
judge whether the compensation value is larger than a preset error value; and
if the compensation value is larger than the preset error value, calibrate the motion of the robot according to the compensation value.
- 20. The controller of claim 19, wherein the processor calculating a compensation value according to the first image and a pre-stored reference image containing a marker comprises:
determining first position information of the marker relative to the shooting device according to the first image and attribute information of the marker;
determining second position information of the marker relative to the shooting device according to the pre-stored reference image containing the marker and the attribute information of the marker; and
calculating the compensation value according to the first position information and the second position information.
- 21. The controller of claim 20, wherein the compensation value comprises a position compensation value and/or an angle compensation value.
- 22. The controller of claim 20, wherein the reference image and the first image captured by the shooting device include depth information.
- 23. The controller according to any one of claims 19 to 22, wherein the processor calibrating the motion of the robot according to the compensation value comprises:
compensating the coordinate sequence in a robot control command by using the compensation value to obtain a compensation coordinate sequence; and
controlling the motion of the robot according to the compensation coordinate sequence.
- 24. The controller according to any one of claims 19 to 22, wherein the processor calibrating the motion of the robot according to the compensation value comprises:
compensating the rotation angle in a robot control command by using the compensation value to obtain a compensation angle; and
controlling the motion of the robot according to the compensation angle.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2017/084702 WO2018209592A1 (en) | 2017-05-17 | 2017-05-17 | Movement control method for robot, robot and controller |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110636923A true CN110636923A (en) | 2019-12-31 |
CN110636923B CN110636923B (en) | 2023-03-21 |
Family
ID=64273115
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201780090902.0A Active CN110636923B (en) | 2017-05-17 | 2017-05-17 | Motion control method of robot, robot and controller |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN110636923B (en) |
WO (1) | WO2018209592A1 (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111739092A (en) * | 2020-06-12 | 2020-10-02 | 广东博智林机器人有限公司 | Hanging basket, detection robot, detection control system and detection method |
CN113768626A (en) * | 2020-09-25 | 2021-12-10 | 武汉联影智融医疗科技有限公司 | Surgical robot control method, computer equipment and surgical robot system |
CN115331174A (en) * | 2022-08-19 | 2022-11-11 | 中国安全生产科学研究院 | Enterprise safety production standardization intelligent supervision system and method |
CN115519544A (en) * | 2022-10-10 | 2022-12-27 | 深圳进化动力数码科技有限公司 | Fresh sorting robot grabbing method, device and equipment and storage medium |
CN118279865A (en) * | 2023-10-24 | 2024-07-02 | 中建一局集团第五建筑有限公司 | Robot walking correction method, device, computer equipment and medium thereof |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113514050B (en) * | 2021-04-01 | 2024-05-28 | 佛山中车四方轨道车辆有限公司 | Positioning method, system, medium, equipment, mobile platform and overhaul production line |
CN116423505B (en) * | 2023-03-30 | 2024-04-23 | 杭州邦杰星医疗科技有限公司 | Error calibration method for mechanical arm registration module in mechanical arm navigation operation |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08272414A (en) * | 1995-03-29 | 1996-10-18 | Fanuc Ltd | Calibrating method for robot and visual sensor using hand camera |
JP2005149299A (en) * | 2003-11-18 | 2005-06-09 | Fanuc Ltd | Teaching position correction apparatus |
JP2009006452A (en) * | 2007-06-29 | 2009-01-15 | Nissan Motor Co Ltd | Method for calibrating between camera and robot, and device therefor |
CN101637908A (en) * | 2008-07-29 | 2010-02-03 | 上海发那科机器人有限公司 | Visual positioning method for robot transport operation |
CN102448679A (en) * | 2009-05-27 | 2012-05-09 | 莱卡地球系统公开股份有限公司 | Method and system for positioning at least one object with high precision in a final position in space |
CN104540648A (en) * | 2012-08-02 | 2015-04-22 | 富士机械制造株式会社 | Work machine provided with articulated robot and electric component mounting machine |
CN105643369A (en) * | 2014-11-28 | 2016-06-08 | 发那科株式会社 | Cooperation system having machine tool and robot |
CN106217372A (en) * | 2015-06-02 | 2016-12-14 | 精工爱普生株式会社 | Robot, robot controller and robot system |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102922521B (en) * | 2012-08-07 | 2015-09-09 | 中国科学技术大学 | A kind of mechanical arm system based on stereoscopic vision servo and real-time calibration method thereof |
JP2016187844A (en) * | 2015-03-30 | 2016-11-04 | セイコーエプソン株式会社 | Robot, robot control device and robot system |
- 2017-05-17 WO PCT/CN2017/084702 patent/WO2018209592A1/en active Application Filing
- 2017-05-17 CN CN201780090902.0A patent/CN110636923B/en active Active
Non-Patent Citations (1)
Title |
---|
LIN, Qiang: "Behavior Recognition and Intelligent Computing" (《行为识别与智能计算》), Xidian University Press, 30 November 2016 * |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111739092A (en) * | 2020-06-12 | 2020-10-02 | 广东博智林机器人有限公司 | Hanging basket, detection robot, detection control system and detection method |
CN113768626A (en) * | 2020-09-25 | 2021-12-10 | 武汉联影智融医疗科技有限公司 | Surgical robot control method, computer equipment and surgical robot system |
CN113768626B (en) * | 2020-09-25 | 2024-03-22 | 武汉联影智融医疗科技有限公司 | Surgical robot control method, computer device and surgical robot system |
CN115331174A (en) * | 2022-08-19 | 2022-11-11 | 中国安全生产科学研究院 | Enterprise safety production standardization intelligent supervision system and method |
CN115331174B (en) * | 2022-08-19 | 2023-06-13 | 中国安全生产科学研究院 | Enterprise safety production standardized intelligent supervision system and method |
CN115519544A (en) * | 2022-10-10 | 2022-12-27 | 深圳进化动力数码科技有限公司 | Fresh sorting robot grabbing method, device and equipment and storage medium |
CN115519544B (en) * | 2022-10-10 | 2023-12-15 | 深圳进化动力数码科技有限公司 | Fresh sorting robot grabbing method, device, equipment and storage medium |
CN118279865A (en) * | 2023-10-24 | 2024-07-02 | 中建一局集团第五建筑有限公司 | Robot walking correction method, device, computer equipment and medium thereof |
Also Published As
Publication number | Publication date |
---|---|
WO2018209592A1 (en) | 2018-11-22 |
CN110636923B (en) | 2023-03-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110636923B (en) | Motion control method of robot, robot and controller | |
CN106426172B (en) | A kind of scaling method and system of industrial robot tool coordinates system | |
JP7326911B2 (en) | Control system and control method | |
US10173324B2 (en) | Facilitating robot positioning | |
CN107871328B (en) | Machine vision system and calibration method implemented by machine vision system | |
CN110125926B (en) | Automatic workpiece picking and placing method and system | |
CN103020952B (en) | Messaging device and information processing method | |
JP6657469B2 (en) | Automatic calibration method for robot system | |
US20160346932A1 (en) | Automatic Calibration Method For Robot Systems Using a Vision Sensor | |
CN107030699B (en) | Pose error correction method and device, robot and storage medium | |
JP5815761B2 (en) | Visual sensor data creation system and detection simulation system | |
KR20180120647A (en) | System and method for tying together machine vision coordinate spaces in a guided assembly environment | |
JP2005201824A (en) | Measuring device | |
CN101733755A (en) | Robot system, robot control device and method for controlling robot | |
JP2019119027A (en) | Method of controlling robot system, and robot system | |
JP7078894B2 (en) | Control systems, controls, image processing devices and programs | |
US11340576B2 (en) | Method and apparatus for estimating system error of commissioning tool of industrial robot | |
JP2020075327A (en) | Control system | |
JP6924455B1 (en) | Trajectory calculation device, trajectory calculation method, trajectory calculation program | |
CN104424601B (en) | Centering assembly method and device for special-shaped body assembly parts | |
JP2019077026A (en) | Control device, robot system, and control device operating method and program | |
US11267129B2 (en) | Automatic positioning method and automatic control device | |
US20210178601A1 (en) | Device to monitor state of balance of robot, method of operation for such device, and computer readable medium | |
CN108748155B (en) | The automatic aligning method of more scenes | |
EP3416007B1 (en) | Control device, position control system, position control method, and position control program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |