WO2023013698A1 - Robot control device, robot control system, and robot control method - Google Patents
Robot control device, robot control system, and robot control method
- Publication number
- WO2023013698A1 (PCT/JP2022/029851)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robot
- control unit
- calibration
- coordinate system
- spatial information
- Prior art date
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/10—Programme-controlled manipulators characterised by positioning means for manipulator elements
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1692—Calibration of manipulator
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
Definitions
- The present disclosure relates to a robot control device, a robot control system, and a robot control method.
- A robot control device includes a control unit that controls a robot.
- The control unit acquires the measurement position of the destination of the robot, acquires the measurement result of the robot's position calculated based on the spatial information about the robot's motion space acquired by the spatial information acquisition unit, and determines whether to perform calibration of the robot based on the measurement position and the measurement result.
- FIG. 1 is a block diagram showing a configuration example of a robot control system according to an embodiment.
- FIG. 2 is a schematic diagram showing a configuration example of a robot control system according to an embodiment.
- FIG. 3 is a schematic diagram showing an example of a difference between a sensor-based tip position and orientation and an image-based tip position and orientation.
- FIG. 4 is a flowchart showing an example procedure of a robot control method according to an embodiment.
- As illustrated in FIGS. 1 and 2, a robot control system 1 according to an embodiment includes a robot 40, a robot control device 10, and a spatial information acquisition unit 20.
- The robot 40 operates in a predetermined motion space.
- The spatial information acquisition unit 20 generates depth information of the motion space in which the robot 40 moves.
- The spatial information acquisition unit 20 calculates the distance to measurement points located on the surface of an object 50 existing in the motion space, as described below.
- The distance from the spatial information acquisition unit 20 to a measurement point is also called depth.
- Depth information is information about the depth measured for each measurement point. In other words, the depth information is information about the distance to the measurement points located on the surface of the object 50 existing in the motion space.
- The depth information may be expressed as a depth map that associates each direction viewed from the spatial information acquisition unit 20 with the depth in that direction.
- The spatial information acquisition unit 20 generates the depth information of the motion space based on the (X, Y, Z) coordinate system.
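A depth map of this kind can be turned into (X, Y, Z) points by standard pinhole back-projection. The following Python sketch is illustrative only: the patent does not specify a camera model, and the intrinsics fx, fy, cx, cy and the function name are assumptions.

```python
import numpy as np

def depth_map_to_points(depth, fx, fy, cx, cy):
    """Back-project an HxW depth map into (X, Y, Z) points in the frame
    of the spatial information acquisition unit 20 (pinhole model)."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1).reshape(-1, 3)

# Example with a synthetic 4x4 depth map at a constant 1 m depth.
points = depth_map_to_points(np.ones((4, 4)), fx=500.0, fy=500.0, cx=2.0, cy=2.0)
print(points.shape)  # (16, 3)
```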
- The robot control device 10 operates the robot 40 based on the depth information generated by the spatial information acquisition unit 20.
- The robot control device 10 controls and operates the robot 40 based on the (X_RB, Y_RB, Z_RB) coordinate system.
- The (X_RB, Y_RB, Z_RB) coordinate system is also called the coordinate system of the robot 40.
- The (X, Y, Z) coordinate system is also called the coordinate system of the spatial information acquisition unit 20.
- The coordinate system of the robot 40 may be set to be the same as the coordinate system of the spatial information acquisition unit 20, or may be set as a different coordinate system.
- When the coordinate systems differ, the robot control device 10 converts the depth information generated in the coordinate system of the spatial information acquisition unit 20 into the coordinate system of the robot 40 before using it.
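In the simplest case this conversion is a rigid transform. A minimal sketch, assuming the transform is already known from a prior calibration (the values of R and t here are placeholders, not from the patent):

```python
import numpy as np

# Placeholder extrinsics (assumed, not from the patent): rotation R and
# translation t that map points from the (X, Y, Z) frame of the spatial
# information acquisition unit 20 into the (X_RB, Y_RB, Z_RB) robot frame.
R = np.eye(3)
t = np.array([0.5, 0.0, 0.3])

def to_robot_frame(points_cam):
    """Apply p_rb = R @ p_cam + t to an Nx3 array of points."""
    return points_cam @ R.T + t

print(to_robot_frame(np.array([[0.1, 0.2, 1.0]])))  # [[0.6 0.2 1.3]]
```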
- The number of robots 40 and robot control devices 10 is not limited to one as illustrated and may be two or more. The number of spatial information acquisition units 20 per motion space may likewise be one, as illustrated, or two or more. Each component is described in detail below.
- The robot control device 10 includes a control unit 11 and a storage unit 12.
- The storage unit 12 may include an electromagnetic storage medium such as a magnetic disk, or a memory such as a semiconductor memory or a magnetic memory.
- The storage unit 12 may be configured as an HDD (Hard Disk Drive) or an SSD (Solid State Drive).
- The storage unit 12 stores various kinds of information, programs executed by the control unit 11, and the like.
- The storage unit 12 may function as a work memory for the control unit 11.
- The control unit 11 may include at least part of the storage unit 12.
- The robot control device 10 may further include a communication device configured to communicate with the spatial information acquisition unit 20 and the robot 40 by wire or wirelessly.
- The communication device may be configured to communicate using communication schemes based on various communication standards.
- The communication device may be configured according to known communication technologies. A detailed description of the hardware of the communication device and the like is omitted.
- The functions of the communication device may be realized by a single interface, or by separate interfaces for each connection destination.
- The control unit 11 may be configured to communicate with the spatial information acquisition unit 20 and the robot 40.
- The control unit 11 may include the communication device.
- As illustrated in FIG. 2, the robot 40 includes an arm 42, an end effector 44 attached to the arm 42, and a mark 46 placed on the end effector 44. The mark 46 may be placed on the arm 42 instead of the end effector 44.
- The arm 42 may be configured as, for example, a 6-axis or 7-axis vertical articulated robot.
- The arm 42 may be configured as a 3-axis or 4-axis horizontal articulated (SCARA) robot.
- The arm 42 may be configured as a 2-axis or 3-axis Cartesian robot.
- The arm 42 may be configured as a parallel link robot or the like.
- The number of axes forming the arm 42 is not limited to these examples.
- The end effector 44 may include, for example, a gripping hand configured to grip a work object.
- The gripping hand may have multiple fingers. The number of fingers of the gripping hand may be two or more. Each finger of the gripping hand may have one or more joints.
- The end effector 44 may include a suction hand configured to hold a work object by suction.
- The end effector 44 may include a scooping hand configured to scoop up a work object.
- The end effector 44 may include a tool such as a drill and may be configured to perform various machining operations such as drilling holes in a work object.
- The end effector 44 is not limited to these examples and may be configured to perform various other operations.
- The robot 40 may further include sensors that detect the state of each component of the robot 40.
- The sensors may detect information about the actual position or orientation of each component of the robot 40, or the velocity or acceleration of each component.
- The sensors may detect forces acting on each component of the robot 40.
- The sensors may detect the current flowing through, or the torque of, the motors that drive each component of the robot 40.
- The sensors can detect information resulting from the actual movement of the robot 40.
- By acquiring the detection results of the sensors, the robot control device 10 can grasp the actual operation result of the robot 40. That is, the robot control device 10 can acquire the state of the robot 40 based on the detection results of the sensors.
- The spatial information acquisition unit 20 acquires spatial information about the motion space of the robot 40.
- The spatial information acquisition unit 20 may photograph the motion space and acquire an image of the motion space as the spatial information.
- The spatial information acquisition unit 20 may photograph an object 50 existing in the motion space, as illustrated in FIG. 2.
- The spatial information acquisition unit 20 may be configured as a camera.
- The spatial information acquisition unit 20 may be configured as a 3D stereo camera. The 3D stereo camera photographs an object 50 existing in the motion space, calculates the distance to measurement points located on the surface of the object 50 as depth, and generates depth information.
- The spatial information acquisition unit 20 may be configured as a LiDAR (light detection and ranging) device. The LiDAR measures the distance to measurement points located on the surface of the object 50 existing in the motion space and generates depth information. That is, the spatial information acquisition unit 20 may acquire depth information of the motion space as the spatial information.
- The spatial information acquisition unit 20 is not limited to these examples and may be configured as various other devices, and the spatial information is not limited to images or depth information of the motion space and may include various other kinds of information.
- The spatial information acquisition unit 20 may include an image sensor and may further include an optical system. The spatial information acquisition unit 20 may output captured images of the motion space to the robot control device 10.
- The spatial information acquisition unit 20 may generate point cloud information of the motion space of the robot 40 and output it to the robot control device 10. That is, the spatial information may be output in the form of point cloud data. In other words, the point cloud information may carry the spatial information.
- The point cloud information is information on the set of measurement points located on the surface of the object 50 existing in the motion space, and includes coordinate information or color information for each measurement point.
- The point cloud information can also be described as data representing the object 50 in the measurement space with a plurality of points. Because the spatial information is in point cloud form, its data density can be made lower than that of spatial information based on the raw data initially acquired by the spatial information acquisition unit 20.
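One common way a point-cloud representation reduces data density relative to the raw sensor output is spatial downsampling. This is a hedged illustration; the patent does not prescribe any particular reduction method:

```python
import numpy as np

def voxel_downsample(points, voxel=0.05):
    """Keep one measurement point per voxel, lowering point-cloud density."""
    keys = np.floor(points / voxel).astype(np.int64)
    _, idx = np.unique(keys, axis=0, return_index=True)
    return points[np.sort(idx)]

cloud = np.random.rand(10000, 3)       # dense raw measurement points
sparse = voxel_downsample(cloud)
print(len(cloud), "->", len(sparse))   # density is reduced
```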
- The robot control device 10 operates the robot 40 so as to act on a work target such as the object 50 existing in the motion space, or so as to avoid the object 50. The robot control device 10 operates the robot 40 to act on or avoid the object 50 based on the captured image of the object 50 taken by the spatial information acquisition unit 20.
- The control unit 11 of the robot control device 10 can acquire the state of the robot 40 based on the position and orientation of the mark 46 captured in the image taken by the spatial information acquisition unit 20, and can thereby acquire the positional relationship between the robot 40 and the object 50.
- On the other hand, the control unit 11 acquires the state of the robot 40 based on the sensors of the robot 40, such as encoders installed on the arm 42.
- The state based on the sensors of the robot 40 expresses the position and orientation of the robot 40 with higher accuracy than the state based on the captured image of the spatial information acquisition unit 20.
- Therefore, the control unit 11 can control the robot 40 in the motion space with high accuracy by matching the state of the robot 40 based on the captured image of the spatial information acquisition unit 20 with the state of the robot 40 based on the sensors of the robot 40.
- The work of matching the image-based state of the robot 40 with the sensor-based state of the robot 40 is also called calibration.
- Specifically, the control unit 11 performs calibration so that the (X, Y, Z) coordinate system of the spatial information acquisition unit 20 matches the (X_RB, Y_RB, Z_RB) coordinate system of the robot 40. The control unit 11 may estimate the relative positional relationship between the coordinate system of the spatial information acquisition unit 20 and the coordinate system of the robot 40, and align the former with the latter based on the estimated relative positional relationship.
- The control unit 11 sets the calibration range 60 in advance, before executing calibration, and may use at least part of the field of view (FOV) of the spatial information acquisition unit 20 as the calibration range 60.
- The control unit 11 performs calibration within the calibration range 60 shown in FIG. 2, where it is drawn as the region surrounded by a two-dot chain line.
- The calibration range 60 corresponds to the range in which the robot 40 is calibrated.
- The calibration range 60 may include the work area of the robot 40.
- The calibration range 60 may be the range where the work area of the robot 40 and the FOV overlap.
- The control unit 11 sets points for performing calibration within the calibration range 60.
- The points for performing calibration are also referred to as calibration positions.
- The control unit 11 moves the mark 46 of the robot 40 to a calibration position and causes the spatial information acquisition unit 20 to photograph the mark 46.
- The control unit 11 calculates the position and orientation of the mark 46 based on the image of the mark 46.
- The control unit 11 corrects the image-based position and orientation of the mark 46 so that the position and orientation calculated from the image match the position and orientation determined from the detection results of the sensors of the robot 40. This correction of the image-based position and orientation of the mark 46 corresponds to calibration.
- The position and orientation of the mark 46 are also referred to as the tip position and orientation.
- Calibration corresponds to correction of the tip position and orientation.
- A calibration position corresponds to a position at which the tip position and orientation are corrected.
- The control unit 11 may perform calibration as described below.
- The control unit 11 generates control information for the robot 40 for moving the mark 46 of the robot 40 to the calibration position.
- The control unit 11 operates the robot 40 based on the control information to move the mark 46 of the robot 40 to the calibration position.
- The control unit 11 acquires an image of the mark 46 from the spatial information acquisition unit 20.
- The control unit 11 calculates the position and orientation of the mark 46 based on the image.
- The position and orientation of the mark 46 calculated from the image are also referred to as the image-based tip position and orientation.
- The control unit 11 also calculates the position and orientation of the mark 46 determined from the detection results of the sensors of the robot 40.
- The position and orientation of the mark 46 calculated from the sensor detection results are also referred to as the sensor-based tip position and orientation.
- The control unit 11 compares the image-based tip position and orientation with the sensor-based tip position and orientation.
- The control unit 11 corrects the image-based tip position and orientation so that it matches the sensor-based tip position and orientation.
- The control unit 11 may correct the algorithm for calculating the image-based tip position and orientation.
- The control unit 11 may correct parameters included in the algorithm, or may correct a formula, table, or program. When a plurality of calibration positions are set, the control unit 11 moves the robot 40 to each calibration position, acquires an image of the mark 46 at each calibration position, and corrects the image-based tip position and orientation accordingly.
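A least-squares rigid fit such as the Kabsch algorithm is one plausible way to realize this correction, mapping image-based tip positions collected at several calibration positions onto the corresponding sensor-based positions. The patent leaves the correction algorithm open, so the following sketch is an assumption:

```python
import numpy as np

def fit_correction(image_pts, sensor_pts):
    """Least-squares rotation R and translation t (Kabsch) such that
    R @ image_pt + t approximates the matching sensor_pt."""
    ci, cs = image_pts.mean(axis=0), sensor_pts.mean(axis=0)
    H = (image_pts - ci).T @ (sensor_pts - cs)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cs - R @ ci
    return R, t

# Synthetic check: sensor-based positions are a rotated and shifted copy
# of the image-based positions.
rng = np.random.default_rng(0)
img = rng.random((5, 3))
a = np.deg2rad(10)
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0,        0.0,       1.0]])
sen = img @ R_true.T + np.array([0.1, -0.2, 0.05])
R, t = fit_correction(img, sen)
print(np.allclose(img @ R.T + t, sen))  # True
```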
- The control unit 11 generates control information for the robot 40 so as to move the robot 40 to the calibration position.
- The control unit 11 generates, as calibration items, information specifying the tip position and orientation when the robot 40 has been moved to the calibration position and the recognition result of the mark 46 of the robot 40.
- A calibration item is, for example, information about coordinates.
- A calibration item is, for example, coordinate information indicating the tip position and orientation based on the sensor detection results of the robot 40 when the robot 40 has been moved to the calibration position, or coordinate information indicating the tip position and orientation based on the recognition result of the mark 46 recognized by the spatial information acquisition unit 20.
- The control unit 11 acquires, for example, information on the actual field-of-view size of the spatial information acquisition unit 20, or information on its FOV, from the spatial information acquisition unit 20.
- The control unit 11 sets the calibration range 60 based on the actual field-of-view size or FOV of the spatial information acquisition unit 20 and the work area of the robot 40.
- The control unit 11 may set the calibration range 60 based on the position of the object 50 in the motion space of the robot 40.
- The control unit 11 may set the calibration range 60 based on the depth information or point cloud information of the object 50 detected by the spatial information acquisition unit 20.
- The shape of the calibration range 60 is set, for example, to a truncated quadrangular pyramid.
- The shape of the calibration range 60 is not limited to this and may be set to various other shapes.
- The control unit 11 matches the sensor-based tip position and orientation of the robot 40 with the image-based tip position and orientation from the spatial information acquisition unit 20. Specifically, the control unit 11 moves the robot 40 to a first position.
- The control unit 11 generates control information for operating the robot 40 so that the mark 46 of the robot 40 assumes a predetermined position and orientation, and controls the robot 40 based on the control information to move it to the first position.
- The first position may be a predetermined position included in the FOV of the spatial information acquisition unit 20.
- The first position may be, for example, the center of the FOV of the spatial information acquisition unit 20.
- The control unit 11 acquires an image of the mark 46 when the robot 40 has moved to the first position, and calculates the position and orientation of the mark 46 as the image-based tip position and orientation. The control unit 11 also calculates the sensor-based tip position and orientation. Based on a comparison of the image-based and sensor-based tip positions and orientations, the control unit 11 corrects the control information so that the position of the robot 40 in the image matches the first position determined from the sensor detection results. The control unit 11 moves the robot 40 based on the corrected control information and updates the state of the robot 40 so that the position of the robot 40 in the coordinate system of the robot 40 matches the position of the robot 40 in the coordinate system of the spatial information acquisition unit 20. In other words, the control unit 11 updates the state of the robot 40 so that the position of the robot 40 coincides with the first position in the image.
- When the robot 40 has moved to a second position, the control unit 11 registers that position as a calibration position. When registering the second position as a calibration position, the control unit 11 may generate a plurality of calibration items specifying the tip position and orientation based on the sensor detection results of the robot 40 when moved to the second position and the tip position and orientation based on the recognition result of the mark 46 of the robot 40. If the second position cannot be registered as a calibration position, the control unit 11 may generate a new, different second position and determine whether the new second position can be registered as a calibration position.
- A singular point corresponds to a posture in which the robot 40 is structurally uncontrollable. If the trajectory for operating the robot 40 includes a singular point, the robot 40 moves (runs away) at high speed near the singular point and stops at the singular point.
- The singular points of the robot 40 are of the following three types:
(1) Points outside the work area when the robot 40 is controlled to near the outer limits of the work area (the work area is the area corresponding to the motion space of the robot 40).
(2) Points directly above or below the robot base, even within the work area.
(3) Points where the joint angle of the joint one before the tip joint of the arm 42 of the robot 40 becomes zero or 180 degrees (wrist alignment singular point).
- The control unit 11 may determine that the robot 40 is in a singular state when a numerical value representing the state of the robot 40 matches a numerical value representing a singular state.
- The control unit 11 may determine that the robot 40 is in a singular state when the difference between the numerical value representing the state of the robot 40 and the numerical value representing a singular state is less than a predetermined value.
- The numerical value representing the state of the robot 40 may include, for example, the joint angles of the arm 42 or the torques of the motors that drive the robot 40.
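The proximity test for singular states can be as simple as comparing a joint angle against its singular values with a tolerance. A minimal sketch for type (3) above, assuming a 6-axis arm where joint index 4 is the joint one before the tip; both the index and the tolerance are assumptions:

```python
import numpy as np

WRIST_JOINT = 4          # assumed index of the joint one before the tip joint
EPS = np.deg2rad(2.0)    # assumed "predetermined value" for closeness

def near_wrist_singularity(joint_angles_rad):
    """True when the wrist-alignment joint is within EPS of 0 or 180 degrees,
    i.e. the robot should be treated as in (or near) a singular state."""
    q = abs(joint_angles_rad[WRIST_JOINT]) % np.pi   # fold onto [0, pi)
    return min(q, np.pi - q) < EPS

print(near_wrist_singularity(np.deg2rad([0, 45, 90, 0, 1.0, 30])))   # True
print(near_wrist_singularity(np.deg2rad([0, 45, 90, 0, 90.0, 30])))  # False
```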
- The control unit 11 performs calibration so that the tip position and orientation calibration item for the recognition result of the mark 46 matches the tip position and orientation calibration item for the sensor detection results of the robot 40. Specifically, the control unit 11 moves the robot 40 to the calibration position. The control unit 11 acquires, via the spatial information acquisition unit 20, the recognition result of the mark 46 of the robot 40 when the robot 40 has moved to the calibration position. The control unit 11 calculates the relative positional relationship of the calibration item acquired as the recognition result of the mark 46 with respect to the calibration item based on the sensors of the robot 40. The relative positional relationship corresponds to the coordinate difference and angle difference between the two calibration items.
- The control unit 11 corrects the coordinate system of the spatial information acquisition unit 20 to match the coordinate system of the robot 40 so that the coordinate error and angle error corresponding to the relative positional relationship between the two calibration items become zero or close to zero (that is, so that the error is less than a predetermined value).
- In this way, the control unit 11 can calculate the relative positional relationship by matching the recognition result of the mark 46 when the robot 40 has moved to the calibration position with the tip position and orientation specified by the sensors of the robot 40.
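The coordinate difference and angle difference making up the relative positional relationship can be computed directly from the two poses. A sketch, assuming poses are given as a position vector plus a rotation matrix (the patent does not fix an orientation encoding):

```python
import numpy as np

def relative_pose_error(p_img, R_img, p_sen, R_sen):
    """Coordinate difference (Euclidean) and angle difference (radians)
    between the image-based and sensor-based tip poses."""
    dp = np.linalg.norm(p_img - p_sen)
    R_rel = R_img.T @ R_sen                            # residual rotation
    c = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    return dp, np.arccos(c)

dp, dth = relative_pose_error(np.zeros(3), np.eye(3),
                              np.array([0.002, 0.0, 0.0]), np.eye(3))
print(dp, np.rad2deg(dth))  # 0.002 0.0
```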
- As described above, calibration specifies the relationship between the coordinate system of the spatial information acquisition unit 20 and the coordinate system of the robot 40, and in this embodiment the two coordinate systems coincide after calibration. This relationship may change for various reasons, for example when an abnormality occurs in the robot 40 or the robot control system 1, or when the robot 40 is stopped or started. When the robot 40 is activated, the control unit 11 therefore determines whether the coordinate-system relationship specified by the calibration has changed. If the relationship has not changed, the control unit 11 need not correct the relationship or perform calibration. If the relationship has changed when the robot 40 is activated, the control unit 11 determines whether the relationship can be corrected. If it can be corrected, the control unit 11 corrects the relationship between the coordinate systems and does not perform calibration. If it cannot be corrected, the control unit 11 re-specifies the relationship between the coordinate systems by performing recalibration.
- The need for calibration may be determined when the robot 40 is stopped or when the robot 40 is started.
- The time when the robot 40 stops is not limited to an abnormal stop and may be the completion of a designated task.
- The time when the robot 40 is started is not limited to restarting after an abnormal stop and may be the start of a designated task.
- The control unit 11 determines whether the relationship between the coordinate system of the robot 40 and the coordinate system of the spatial information acquisition unit 20 has changed, and whether recalibration is necessary, as described below.
- The control unit 11 acquires the recognition result of the mark 46 when the robot 40 has been moved to the measurement position.
- The control unit 11 calculates the position of the robot 40 as the measurement result based on the recognition result of the mark 46.
- The control unit 11 calculates the difference between the initial value of the measurement position and the measurement result. Because the control unit 11 moves the robot 40 to the measurement position based on the sensor detection results, it may calculate the difference between the set measurement position itself and the measurement result without calculating the sensor-based position of the robot 40.
- Alternatively, the control unit 11 may acquire the sensor detection results when the robot 40 has been moved to the measurement position, calculate the sensor-based position of the robot 40 from those detection results, and use that calculated position as the initial value of the measurement position when computing the difference from the measurement result.
- The control unit 11 determines whether to correct the relationship between the coordinate systems based on the difference between the initial value of the measurement position and the measurement result. For example, when the difference between the initial value of the measurement position and the measurement result is greater than a predetermined threshold, the control unit 11 determines to correct the relationship between the coordinate systems. When measurement results are acquired at a plurality of measurement positions, the control unit 11 determines to correct the relationship between the coordinate systems when the difference between the initial value and the measurement result at at least one measurement position is greater than the predetermined threshold. When the difference between the initial value of the measurement position and the measurement result is equal to or less than the predetermined threshold, the control unit 11 determines that neither correction of the coordinate-system relationship nor recalibration is necessary.
- The control unit 11 may set the predetermined threshold as appropriate.
- The control unit 11 may set the predetermined threshold based on, for example, the specification of the positional accuracy required during operation of the robot 40.
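The threshold test itself is straightforward. A sketch, with an illustrative threshold standing in for a value that would in practice come from the robot's positional-accuracy specification:

```python
import numpy as np

THRESHOLD = 0.001   # metres; illustrative stand-in for the accuracy spec

def needs_correction(initial_positions, measurement_results):
    """True when any measurement position deviates from its initial value
    by more than the predetermined threshold."""
    diff = np.linalg.norm(np.asarray(initial_positions)
                          - np.asarray(measurement_results), axis=1)
    return bool(np.any(diff > THRESHOLD))

print(needs_correction([[0.0, 0.0, 0.0]], [[0.0005, 0.0, 0.0]]))  # False
print(needs_correction([[0.0, 0.0, 0.0]], [[0.0050, 0.0, 0.0]]))  # True
```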
- The control unit 11 may correct the coordinate system of the spatial information acquisition unit 20 so that it matches the coordinate system of the robot 40.
- The control unit 11 may instead correct the coordinate system of the robot 40 so that it matches the coordinate system of the spatial information acquisition unit 20.
- The control unit 11 may correct the coordinate system of the spatial information acquisition unit 20 or the coordinate system of the robot 40 by rotating or translating it.
- The control unit 11 may correct the coordinate system of the spatial information acquisition unit 20 or the coordinate system of the robot 40 by enlarging or reducing it.
- The control unit 11 may correct distortion of the coordinate system of the spatial information acquisition unit 20 or the coordinate system of the robot 40.
- The control unit 11 may calculate the correction value of the measurement position based on the correction of the coordinate system.
- The control unit 11 may acquire the tip position and orientation of the robot 40 at a single measurement position and correct the coordinate system not only in the translational direction but also in the rotational direction, based on information such as the rotation angle representing the orientation of the mark 46.
- A measurement position model 70 is defined whose vertices are the coordinates of the measurement positions.
- The control unit 11 moves the robot 40 to each vertex (measurement position) of the measurement position model 70 and acquires the recognition result of the mark 46.
- When the recognition results of the mark 46, acquired in the (X, Y, Z) coordinate system of the spatial information acquisition unit 20, are converted into the (X_RB, Y_RB, Z_RB) coordinate system of the robot 40, the calculated measurement results are represented by a measurement result model 72 whose vertices are the coordinates of the measurement results.
- The measurement position model 70 corresponds to the set measurement positions themselves, or to the initial values of the measurement positions when the robot 40 is moved to the measurement positions based on the sensor detection results.
- The measurement result model 72 corresponds to the measurement results obtained when the recognition results of the mark 46, taken when the robot 40 has been moved to the measurement positions, are expressed in the (X_RB, Y_RB, Z_RB) coordinate system of the robot 40.
- The control unit 11 determines whether the relationship between the coordinate systems has changed. In this embodiment, it is assumed that the coordinate system of the robot 40 and the coordinate system of the spatial information acquisition unit 20 coincide as a result of calibration performed in advance. In the state illustrated in FIG. 3, the coordinate system of the spatial information acquisition unit 20 has changed in the rotational and translational directions with respect to the coordinate system of the robot 40.
- The control unit 11 may correct the coordinate system of the spatial information acquisition unit 20 in the translational and rotational directions so that it matches the coordinate system of the robot 40.
- If the correction does not return the coordinate-system relationship to the relationship that held when the previous calibration was executed, the control unit 11 determines that recalibration of the robot 40 is required.
- The correction of the coordinate system described above may be executed when an abnormality occurs. Specifically, the correction may be performed when the robot 40 is restarted after an abnormal stop. The correction may also be executed when the (X, Y, Z) coordinate system of the spatial information acquisition unit 20 changes because the spatial information acquisition unit 20 has moved. Correction of the coordinate system is not limited to abnormal situations and may be performed whenever the robot 40 is activated.
- The control unit 11 of the robot control device 10 may execute a robot control method including the procedure of the flowchart illustrated in FIG. 4, for example when the robot 40 is activated or recovers from an abnormality.
- The robot control method may be implemented as a robot control program executed by the processor constituting the control unit 11.
- The robot control program may be stored on a non-transitory computer-readable medium.
- The control unit 11 acquires the measurement positions (step S1).
- The control unit 11 moves the robot 40 to a measurement position (step S2).
- The control unit 11 acquires the recognition result of the mark 46 at the measurement position to which the robot 40 has moved (step S3).
- The control unit 11 determines whether recognition of the mark 46 has been completed at all measurement positions (step S4). If recognition of the mark 46 has not been completed at all measurement positions (step S4: NO), the control unit 11 returns to step S2.
- If recognition has been completed at all measurement positions (step S4: YES), the control unit 11 determines whether the difference between the initial values of the measurement positions and the measurement results is equal to or less than a predetermined threshold (step S5). If the difference is equal to or less than the predetermined threshold (step S5: YES), the control unit 11 ends execution of the procedure of the flowchart of FIG. 4.
- If the difference between the initial values of the measurement positions and the measurement results is not equal to or less than the predetermined threshold (step S5: NO), that is, if the difference is greater than the predetermined threshold, the control unit 11 corrects the coordinate system and calculates the corrected positions (step S6).
- The control unit 11 moves the robot 40 to a corrected position (step S7).
- The control unit 11 acquires the recognition result of the mark 46 at the corrected position to which the robot 40 has moved, and acquires the post-correction measurement result (step S8).
- The control unit 11 determines whether the difference between the corrected positions and the post-correction measurement results is equal to or less than the predetermined threshold (step S9). If the difference is equal to or less than the predetermined threshold (step S9: YES), the control unit 11 calculates a correction value based on the difference between the initial values of the measurement positions and the measurement results, and updates the calibration values (step S10). A calibration value represents the relative positional relationship. After executing step S10, the control unit 11 ends execution of the procedure of the flowchart of FIG. 4.
- If the difference between the corrected positions and the post-correction measurement results is not equal to or less than the predetermined threshold (step S9: NO), that is, if the difference is greater than the predetermined threshold, the control unit 11 determines that recalibration of the robot 40 is required (step S11). After executing step S11, the control unit 11 ends execution of the procedure of the flowchart of FIG. 4. The control unit 11 may perform recalibration of the robot 40 after executing step S11.
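The S1-S11 flow can be summarized as a driver routine. In this sketch, move_to, recognise_mark, and fit_correction are injected placeholders for robot motion, mark recognition via the spatial information acquisition unit 20, and a rigid-fit correction such as the one sketched earlier; none of these names come from the patent:

```python
import numpy as np

def check_calibration(measurement_positions, move_to, recognise_mark,
                      fit_correction, threshold=1e-3):
    """Driver for steps S1-S11: measure, correct the coordinate system if
    needed, re-measure, and report whether recalibration is required."""
    init = np.asarray(measurement_positions)           # S1
    measured = []
    for p in init:                                     # S2-S4
        move_to(p)
        measured.append(recognise_mark())              # S3
    measured = np.asarray(measured)
    if np.all(np.linalg.norm(init - measured, axis=1) <= threshold):
        return "no correction needed"                  # S5: YES
    R, t = fit_correction(measured, init)              # S6: correct system
    corrected = []
    for p in init:                                     # S7-S8
        move_to(p)
        corrected.append(R @ recognise_mark() + t)     # post-correction result
    corrected = np.asarray(corrected)
    if np.all(np.linalg.norm(init - corrected, axis=1) <= threshold):
        return "correction applied, calibration values updated"  # S9-S10
    return "recalibration required"                    # S11

# Toy usage with perfect recognition (no drift): no correction is needed.
pts = [np.array([0.1, 0.2, 0.3]), np.array([0.4, 0.1, 0.2])]
it = iter(pts * 2)
print(check_calibration(pts, move_to=lambda p: None,
                        recognise_mark=lambda: next(it),
                        fit_correction=None))          # "no correction needed"
```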
- As described above, according to the robot control device 10 and the robot control method of this embodiment, when the robot 40 is activated or recovers from an abnormality, the coordinate system is corrected if the difference between the measurement position and the measurement result is greater than a predetermined threshold. If the difference between the measurement position and the measurement result after correction of the coordinate system is still greater than the predetermined threshold, it can be determined that recalibration is necessary. Conversely, if the difference after correction is equal to or less than the predetermined threshold, it can be determined that recalibration is unnecessary. This reduces the workload when an abnormality occurs.
- Embodiments according to the present disclosure are not limited to the specific configurations of the embodiments described above. Embodiments of the present disclosure can extend to any novel feature, or combination of features, described in the present disclosure, and to any novel method or process step, or combination thereof, described herein.
- 1 robot control system; 10 robot control device (11: control unit, 12: storage unit); 20 spatial information acquisition unit; 40 robot (42: arm, 44: end effector, 46: mark); 50 object; 60 calibration range; 70 measurement position model; 72 measurement result model
Claims (7)
- 1. A robot control device comprising a control unit that controls a robot, wherein the control unit acquires a measurement position that is a destination of the robot, acquires a measurement result of the position of the robot calculated based on spatial information about the motion space of the robot acquired by a spatial information acquisition unit, and determines whether to perform calibration of the robot based on the measurement position and the measurement result.
- 2. The robot control device according to claim 1, wherein the control unit corrects the coordinate system of the spatial information acquisition unit or the coordinate system controlling the robot based on the difference between the measurement position and the measurement result.
- 3. The robot control device according to claim 2, wherein the control unit corrects the coordinate system of the spatial information acquisition unit or the coordinate system controlling the robot with respect to rotation, translation, enlargement or reduction, or distortion.
- 4. The robot control device according to claim 2 or 3, wherein the control unit moves the robot to a measurement position in the corrected coordinate system, calculates the position of the robot as a post-correction measurement result, and performs calibration of the robot when the difference between the measurement position and the post-correction measurement result is equal to or greater than a predetermined value.
- 5. The robot control device according to any one of claims 1 to 4, wherein the control unit is capable of determining whether to re-execute the calibration for a robot for which the calibration has been executed at least once.
- 6. A robot control system comprising the robot control device according to any one of claims 1 to 5 and the robot.
- 7. A robot control method comprising: acquiring, by a control unit that controls a robot, a measurement position that is a destination of the robot; acquiring a measurement result of the position of the robot calculated based on spatial information about the motion space of the robot acquired by a spatial information acquisition unit; and determining whether to perform calibration of the robot based on the measurement position and the measurement result.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/681,013 US20240246237A1 (en) | 2021-08-03 | 2022-08-03 | Robot control device, robot control system, and robot control method |
CN202280053883.5A CN117813182A (zh) | 2021-08-03 | 2022-08-03 | Robot control device, robot control system, and robot control method |
EP22853116.6A EP4382258A1 (en) | 2021-08-03 | 2022-08-03 | Robot control device, robot control system, and robot control method |
JP2023540394A JPWO2023013698A1 (ja) | 2021-08-03 | 2022-08-03 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021127715 | 2021-08-03 | ||
JP2021-127715 | 2021-08-03 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023013698A1 (ja) | 2023-02-09 |
Family
ID=85156017
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/029851 WO2023013698A1 (ja) | 2021-08-03 | 2022-08-03 | ロボット制御装置、ロボット制御システム、及びロボット制御方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240246237A1 (ja) |
EP (1) | EP4382258A1 (ja) |
JP (1) | JPWO2023013698A1 (ja) |
CN (1) | CN117813182A (ja) |
WO (1) | WO2023013698A1 (ja) |
2022
- 2022-08-03: EP application EP22853116.6A filed as EP4382258A1 (en), active, pending
- 2022-08-03: JP application JP2023540394A filed as JPWO2023013698A1 (ja), active, pending
- 2022-08-03: WO application PCT/JP2022/029851 filed as WO2023013698A1 (ja), active, application filing
- 2022-08-03: US application US18/681,013 filed as US20240246237A1 (en), active, pending
- 2022-08-03: CN application CN202280053883.5A filed as CN117813182A (zh), active, pending
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009125857A (ja) * | 2007-11-22 | 2009-06-11 | Mitsubishi Electric Corp | ロボットのキャリブレーション装置及び方法 |
JP2020116717A (ja) | 2019-01-28 | 2020-08-06 | 株式会社Fuji | ロボット制御システム |
Also Published As
Publication number | Publication date |
---|---|
US20240246237A1 (en) | 2024-07-25 |
CN117813182A (zh) | 2024-04-02 |
EP4382258A1 (en) | 2024-06-12 |
JPWO2023013698A1 (ja) | 2023-02-09 |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22853116; Country of ref document: EP; Kind code of ref document: A1
 | WWE | Wipo information: entry into national phase | Ref document number: 202280053883.5; Country of ref document: CN; Ref document number: 2023540394; Country of ref document: JP
 | WWE | Wipo information: entry into national phase | Ref document number: 18681013; Country of ref document: US
 | NENP | Non-entry into the national phase | Ref country code: DE
 | ENP | Entry into the national phase | Ref document number: 2022853116; Country of ref document: EP; Effective date: 20240304