WO2010079564A1 - Control device and control method for a robot arm, robot, robot arm control program, and integrated electronic circuit - Google Patents
Control device and control method for a robot arm, robot, robot arm control program, and integrated electronic circuit
- Publication number
- WO2010079564A1 (PCT/JP2009/007155, priority application JP2009007155W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robot arm
- force
- unit
- information
- control method
- Prior art date
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/42—Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/0003—Home robots, i.e. small robots for domestic use
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J13/085—Force or torque sensors
Definitions
- The present invention relates to a robot arm control device and control method for generating and teaching robot motion, a robot having such a control device, a robot arm control program, and an integrated electronic circuit.
- Home robots such as care robots and housework support robots have been actively developed. Unlike an industrial robot, a home robot is operated by a layperson at home, so its motions must be easy to teach. Furthermore, since the operating environment differs from home to home, the robot must cope flexibly with each home environment.
- In one conventional approach, a force sensor is attached to the wrist of the robot; the teacher directly grasps a handle attached to the tip of the force sensor, guides the robot to a teaching point, and thereby teaches the robot's position (see Patent Document 1).
- In Patent Document 1, the teacher must teach every teaching point, so teaching is time-consuming and burdensome. Furthermore, in the industrial field, correcting part of a taught motion requires either programming with a remote device called a teaching pendant or re-teaching all operations from the beginning, which is inefficient.
- Because the home environment changes from moment to moment, it is difficult to predict all environmental changes at the time of teaching, and even if many sensors could be installed, their detection accuracy would not be 100%, so malfunctions may occur.
- One solution is for a human to recognize the situation of the operating robot and convey it to the robot on each occasion; even when environmental fluctuations occur, the robot can then be operated by teaching it each time.
- A wiping operation rubs off dirt by applying a certain force to the soiled surface.
- While the robot is wiping, if the person notices a heavily soiled area and instructs the robot, by directly gripping it and wiping harder, the robot's motion is controlled so that it performs the wiping operation with a stronger force.
- For work such as wiping, the robot can perform the task more accurately by controlling force to a target value rather than controlling position to a target value.
- However, because the work is performed with force as the target value, when a person directly grips the robot and instructs it to wipe harder by applying force, there is a problem: the detected force cannot be distinguished between the force applied by the person and a disturbance such as the reaction force (drag) from the contact surface (the soiled surface).
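The ambiguity described above can be illustrated with a minimal sketch (not from the patent text): a single wrist force sensor measures the sum of the human's applied force and the drag from the contact surface, so one reading cannot separate the two contributions. All names and numbers are illustrative.

```python
def sensor_reading(human_force: float, drag_force: float) -> float:
    """Model a wrist force sensor that sees only the total external force [N]."""
    return human_force + drag_force

# Two very different situations produce the same reading:
case_a = sensor_reading(human_force=5.0, drag_force=0.0)  # person pushes
case_b = sensor_reading(human_force=0.0, drag_force=5.0)  # surface drag only
print(case_a == case_b)  # True: the controller cannot tell them apart
```

This is why the invention needs extra information (gripping position and characteristic information) to decide how to interpret the detected force.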
- An industrial robot has a predetermined handle portion, and a force sensor is mounted on the handle portion to measure the force applied by the person.
- As shown in FIG. 18, however, in the home the force applied by the person cannot be detected correctly when (a) an obstacle 99 or the like blocks the handle portion and the person grips and operates a part other than the handle, (b) multiple people operate the robot at the same time, or (c) a person grips multiple places using both hands.
- An object of the present invention is to provide a robot arm control device and control method, a robot, a robot arm control program, and an integrated electronic circuit that realize robot control allowing an operator to teach the robot easily and in a short time even when unpredictable environmental changes occur.
- the present invention is configured as follows.
- a robot arm control device that controls the operation of a robot arm to perform work with the robot arm, comprising:
- an operation information acquisition unit that acquires operation information related to the operation of the robot arm;
- a gripping position detection unit that detects the position at which a person grips the robot arm when the person grips the robot arm;
- a characteristic information acquisition unit that acquires characteristic information containing information on whether a force can be detected when the person grips at the detected gripping position, and whether the detected force is affected by drag from the contact surface when the person grips and operates the robot arm at that position;
- a control method switching unit that switches the control method of the robot arm according to the gripping position detected by the gripping position detection unit and the characteristic information acquired by the characteristic information acquisition unit; and
- an operation correction unit that corrects force-related information in the operation information acquired by the operation information acquisition unit in accordance with the person's operation.
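The units above form a pipeline: grip position in, control method out. The following is a hypothetical sketch of that decision flow; the class names, the position labels, the example characteristic values, and the mapping from characteristics to control method are all illustrative assumptions, not the patent's actual logic.

```python
from dataclasses import dataclass

@dataclass
class Characteristic:
    force_detectable: bool  # can a grip at this position be sensed at all?
    drag_affects: bool      # does surface drag contaminate the sensed force?

# Characteristic information per gripping position (assumed example values).
CHARACTERISTICS = {
    "wrist": Characteristic(force_detectable=True, drag_affects=True),
    "elbow": Characteristic(force_detectable=True, drag_affects=False),
    "base":  Characteristic(force_detectable=False, drag_affects=False),
}

def switch_control_method(grip_position: str) -> str:
    """One possible mapping from grip position + characteristics to a method."""
    c = CHARACTERISTICS[grip_position]
    if not c.force_detectable:
        return "hold_position"        # arm should not yield to undetected force
    if c.drag_affects:
        return "keep_current_method"  # reading is ambiguous; keep prior method
    return "follow_human_force"       # clean reading; arm can follow the hand
```

For example, `switch_control_method("elbow")` returns `"follow_human_force"` under these assumed characteristics.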
- a robot arm control method for controlling the operation of a robot arm to perform work with the robot arm, in which:
- a movement information acquisition unit acquires movement information related to the movement of the robot arm;
- a gripping position detection unit detects the gripping position on the robot arm when a person grips the robot arm;
- a characteristic information acquisition unit acquires characteristic information containing information on whether a force can be detected when the person grips at the detected gripping position, and whether the detected force is affected by drag from the contact surface when the person grips and operates the robot arm at that position;
- a control method switching unit switches the control method of the robot arm according to the gripping position detected by the gripping position detection unit and the characteristic information acquired by the characteristic information acquisition unit; and
- after the control method has been switched according to the gripping position and the characteristic information while the robot arm moves based on the acquired movement information, a motion correction unit corrects force-related information in the movement information in accordance with the person's operation.
- a control program for a robot arm that controls the operation of the robot arm to perform work with the robot arm, the program causing a computer to execute the steps of: acquiring operation information related to the operation of the robot arm with an operation information acquisition unit; detecting, with a gripping position detection unit, the position at which a person grips the robot arm; acquiring, with a characteristic information acquisition unit, characteristic information containing information on whether a force can be detected when the person grips at the detected gripping position and whether the detected force is affected by drag from the contact surface when the person grips and operates the robot arm at that position; switching the control method of the robot arm with a control method switching unit according to the detected gripping position and the acquired characteristic information; and, after the control method has been switched according to the gripping position and the characteristic information while the robot arm moves based on the acquired operation information, correcting force-related information in the operation information with an operation correction unit in accordance with the person's operation.
- an integrated electronic circuit for controlling a robot arm that performs work by controlling the operation of the robot arm, in which: a movement information acquisition unit acquires movement information related to the movement of the robot arm; a gripping position detection unit detects the gripping position on the robot arm when a person grips the robot arm; a characteristic information acquisition unit acquires characteristic information containing information on whether a force can be detected when the person grips at the detected gripping position and whether the detected force is affected by drag from the contact surface when the person grips and operates the robot arm at that position; a control method switching unit switches the control method of the robot arm according to the detected gripping position and the acquired characteristic information; and a motion correction unit corrects force-related information in the movement information in accordance with the person's operation.
- With this configuration, the robot arm control device has the motion information acquisition unit, the gripping position detection unit, the characteristic information acquisition unit, the control method switching unit, and the motion correction unit. This makes it possible to perform robot control in which the robot motion described by the motion information can easily be corrected regardless of which part of the robot arm the person holds.
- The control method of the robot arm is switched based on the position at which the person grips the robot arm and the characteristic information containing information on the presence or absence of force detection and of drag influence; while the robot arm operates based on the operation information, after the control method switching unit has switched the control method according to the gripping position and the characteristic information, the motion correction unit can correct the force-related information in the motion information in accordance with the person's operation.
- As a result, robot control can be performed in which the force-related information in the motion information is corrected by the motion correction unit.
- FIG. 1 is a diagram showing an outline of the configuration of the robot control apparatus according to the first embodiment of the present invention.
- FIG. 2 is a diagram illustrating a detailed configuration of the control device and the robot arm that is a control target that configure the robot system according to the first embodiment of the present invention;
- FIG. 3 is a block diagram showing a configuration of a control unit of the control device in the first embodiment of the present invention,
- FIG. 4A is a diagram relating to a coordinate system of the robot arm in the control device according to the first embodiment of the present invention;
- FIG. 4B is a diagram relating to a coordinate system of the robot arm in the control device according to the first embodiment of the present invention
- FIG. 4C is a diagram relating to a coordinate system of the robot arm in the control device according to the first embodiment of the present invention
- FIG. 5 is a diagram for explaining a list of operation information in the operation information database of the robot control device according to the first embodiment.
- FIG. 6 is a diagram for explaining flag information in the operation information database of the robot control device in the first embodiment.
- FIG. 7 is a diagram for explaining information about the correction parameter flag in the operation information database of the robot control apparatus in the first embodiment.
- FIG. 8 is a diagram showing the operation of the robot control device and the state of the person's operation of the robot arm in the first embodiment of the present invention,
- FIG. 9A is a diagram illustrating a list of a force detection unit characteristic database of the robot control device according to the first embodiment of the present invention
- FIG. 9B is a diagram illustrating a list of a force detection unit characteristic database of the robot control device according to the first embodiment of the present invention
- FIG. 10 is a diagram illustrating an operation of the robot control device and an operation state of a human robot arm in the first embodiment of the present invention
- FIG. 11 is a flowchart showing the operation steps of the control method switching unit of the robot control apparatus in the first embodiment of the present invention
- FIG. 12A is a diagram showing an operation of the robot control device and an operation state of a human robot arm in the first embodiment of the present invention
- FIG. 12B is a diagram illustrating an operation of the robot control device and an operation state of a human robot arm in the first embodiment of the present invention
- FIG. 12C is a diagram illustrating an operation of the robot control device and an operation state of a human robot arm in the first embodiment of the present invention
- FIG. 13 is a diagram illustrating an operation state of the robot control device according to the first embodiment of the present invention
- FIG. 14 is a diagram showing an operation state of the robot control device in the first embodiment of the present invention
- FIG. 15 is a flowchart showing the operation steps of the control unit of the robot control apparatus in the first embodiment of the present invention
- FIG. 16A is a diagram showing an operation of the robot control device and an operation state of a human robot arm in the first embodiment of the present invention
- FIG. 16B is a diagram illustrating an operation of the robot control device and an operation state of a human robot arm in the first embodiment of the present invention
- FIG. 17 illustrates operation steps of the motion correction unit, the motion command unit, the motion storage unit, the control method switching unit, the gripping position detection unit, and the control parameter management unit of the robot control apparatus according to the first embodiment of the present invention.
- FIG. 18 is a diagram showing the operation of the robot control device and the operation state of a human robot arm in the second embodiment of the present invention
- FIG. 19 is a diagram illustrating a detailed configuration of the control device and a robot arm that is a control target that configure the robot system according to the second embodiment of the present invention
- FIG. 20A is a diagram illustrating a list of force calculation method tables used in the force calculation unit of the robot control device according to the second embodiment of the present invention
- FIG. 20B is a diagram illustrating a list of force calculation method tables used in the force calculation unit of the robot control device according to the second embodiment of the present invention
- FIG. 20C is a diagram illustrating a list of force calculation method tables used in the force calculation unit of the robot control device according to the second embodiment of the present invention
- FIG. 21A is a diagram showing an operation state of the robot control device and an operation state of a human robot arm in the first embodiment of the present invention
- FIG. 21B is a diagram showing an operation state of the robot control device and an operation state of a human robot arm in the first embodiment of the present invention
- FIG. 22A is a diagram showing the operation of the robot control device and the operation state of the human robot arm in the third embodiment of the present invention
- FIG. 22B is a diagram illustrating an operation of the robot control device and an operation state of a human robot arm in the third embodiment of the present invention
- FIG. 22C is a diagram showing an operation state of the robot control device and an operation state of a human robot arm in the third embodiment of the present invention
- FIG. 22D is a diagram showing an operation state of the robot control device and an operation state of a human robot arm in the third embodiment of the present invention
- FIG. 23 is a diagram illustrating a detailed configuration of the control device and the robot arm that is a control target that configure the robot system according to the third embodiment of the present invention
- FIG. 24 is a diagram illustrating a list of operation information in the operation information database of the robot control device according to the third embodiment of the present invention
- FIG. 25A is a diagram illustrating a list of force detection unit characteristic databases of the robot control device according to the third embodiment of the present invention
- FIG. 25B is a diagram illustrating a list of force detection unit characteristic databases of the robot control device according to the third embodiment of the present invention
- FIG. 26A is a diagram illustrating a list of force calculation method tables used in the force calculation unit of the robot control device according to the third embodiment of the present invention
- FIG. 26B is a diagram illustrating a list of force calculation method tables used in the force calculation unit of the robot control device according to the third embodiment of the present invention
- FIG. 26C is a diagram illustrating a list of force calculation method tables used in the force calculation unit of the robot control device according to the third embodiment of the present invention.
- FIG. 27A is a diagram illustrating one configuration example of a force detection unit of the robot control device according to the first embodiment of the present invention
- FIG. 27B is a diagram illustrating another configuration example of the force detection unit of the robot control device according to the first embodiment of the present invention
- FIG. 28 is a diagram illustrating an operation state of a human robot arm with respect to the robot arm in the robot control apparatus according to the first embodiment of the present invention
- FIG. 29 is a diagram showing the state of the person's operation of the robot arm in the first embodiment of the present invention,
- FIG. 30A is a diagram illustrating a list of force detection unit characteristic databases of the robot control device according to the first embodiment of the present invention
- FIG. 30B is a diagram illustrating a list of force detection unit characteristic databases of the robot control device according to the first embodiment of the present invention
- FIG. 31A is a diagram illustrating a list of a force detection unit characteristic database of the robot control device according to the first embodiment of the present invention
- FIG. 31B is a diagram illustrating a list of force detection unit characteristic databases of the robot control device according to the first embodiment of the present invention.
- a robot arm control apparatus that performs operations by the robot arm by controlling the operation of the robot arm,
- An operation information acquisition unit that acquires operation information related to the operation of the robot arm;
- a gripping position detection unit that detects the position at which a person grips the robot arm when the person grips the robot arm;
- a characteristic information acquisition unit that acquires characteristic information containing information on whether a force can be detected when the person grips at the gripping position detected by the gripping position detection unit, and whether the detected force is affected by drag from the contact surface when the person grips and operates the robot arm at that gripping position;
- a control method switching unit that switches a control method of the robot arm according to the gripping position detected by the gripping position detection unit and the characteristic information acquired by the characteristic information acquisition unit;
- An operation correction unit that corrects information related to the force of the operation information acquired by the operation information acquisition unit according to a human operation,
- a robot arm control device is provided that controls the movement of the robot arm based on the movement information corrected by the movement correction unit.
- According to a second aspect, there is provided the robot arm control device according to the first aspect, wherein the robot arm further includes a force detection unit that detects a force applied to the robot arm from the outside,
- the characteristic information acquisition unit acquires, as the characteristic information, information on whether the force detected by the force detection unit is affected by drag from the contact surface when the person grips the robot arm, and
- the motion correction unit corrects the force-related information in the motion information in accordance with the force detected by the force detection unit as the person's operation, after the control method switching unit has switched the control method while the robot arm operates based on the motion information.
- According to a third aspect, there is provided the robot arm control device according to the second aspect, wherein the control method switching unit switches to any one of: (I) a control method that controls the robot arm so that it moves with the force the person applies to it; (II) a control method that controls the robot arm so that it does not move even when the person applies force to it; and (III) a control method that controls the robot arm by the control method in use before the switching; and wherein, in the case of control method (I), the force detection unit detects the force while the robot arm is moving, or detects the force the person applies to the robot arm after the robot arm has moved to the contact surface.
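The three control methods (I)-(III) can be sketched as one update step of a toy controller. This is an assumption-laden illustration, not the patent's controller: the damping, gain, and time-step values are invented, and method (III) is modeled here as plain position tracking for concreteness.

```python
def update_position(method: str, x: float, x_target: float,
                    f_human: float, damping: float = 10.0,
                    gain: float = 0.5, dt: float = 0.01) -> float:
    """Return the next commanded position for one control step."""
    if method == "follow_human_force":  # (I) arm yields to the human's force
        return x + (f_human / damping) * dt
    if method == "hold_position":       # (II) arm ignores the human's force
        return x
    # (III) pre-switch behaviour: here, plain position tracking (assumed)
    return x + gain * (x_target - x)

x = update_position("hold_position", x=0.2, x_target=0.5, f_human=8.0)
print(x)  # 0.2: the arm does not move even though a force is applied
```

Under method (II) the commanded position is unchanged, which is exactly the situation in which a sensed force can be attributed to the person rather than to motion-induced drag.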
- According to another aspect, when the characteristic information indicates that there is no influence of drag from the contact surface and that the force applied to the robot arm at the person's gripping position is within the detectable range of the force detection unit, the control method switching unit controls the robot arm by the control method in use before the switching, or controls the robot arm so that it does not move with the force the person applies to it.
- According to another aspect, there is provided the robot arm control device according to the third aspect, wherein the control method switching unit sequentially switches among: (I) a control method that controls the robot arm so that it moves with the force the person applies; (II) a control method that controls the robot arm so that it does not move even when the person applies force; and (III) a control method that controls the robot arm by the control method in use before the switching; the force detection unit detects the force under each control method; a force calculation unit calculates the value of the force the person applies to the robot arm from the plural values detected by the force detection unit at each gripping position; and the motion correction unit corrects the motion information in the motion information database with the force value calculated by the force calculation unit.
- The force calculation unit calculates the value of the force the person applies to the robot arm by any one of: (I) calculating the sum of the plural values detected by the force detection unit; (II) calculating the minimum of the plural detected values; (III) calculating the maximum of the plural detected values; and (IV) multiplying each detected value by a weighting factor and combining the results; and the motion correction unit corrects the force-related information in the motion information acquired by the motion information acquisition unit based on the value calculated by the force calculation unit. A robot arm control device so configured is provided.
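The four candidate calculation methods (I)-(IV) named above can be sketched directly; the function name and the example readings are illustrative, not from the patent.

```python
from typing import Optional, Sequence

def calc_force(values: Sequence[float], method: str,
               weights: Optional[Sequence[float]] = None) -> float:
    """Combine force values detected under the different control methods."""
    if method == "sum":       # (I) sum of the detected values
        return sum(values)
    if method == "min":       # (II) minimum of the detected values
        return min(values)
    if method == "max":       # (III) maximum of the detected values
        return max(values)
    if method == "weighted":  # (IV) weighted combination of the values
        return sum(w * v for w, v in zip(weights, values))
    raise ValueError(f"unknown method: {method}")

readings = [2.0, 5.0, 3.0]
print(calc_force(readings, "sum"))                        # 10.0
print(calc_force(readings, "weighted", [0.5, 0.3, 0.2]))  # 3.1
```

Which method is appropriate would depend on how the readings were obtained, e.g. the minimum may be a conservative estimate when drag can only add to the measured force.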
- According to another aspect, there is provided the robot arm control device according to the second or third aspect, wherein the robot comprises a plurality of robot arms; the gripping position detection unit detects which of the plurality of robot arms the person is gripping; when the person is gripping one of the plurality of robot arms, the force is detected by the force detection unit provided in that robot arm; a force calculation unit calculates, from the value detected by the force detection unit, a value for correcting the other robot arm that the person is not gripping; and the motion correction unit corrects the motion information acquired by the motion information acquisition unit with the value calculated by the force calculation unit.
- The force calculation unit calculates the value of the force the person applies to the robot arm by any one of: (I) calculating the sum of the plural values detected by the force detection unit; (II) calculating the minimum of the plural detected values; (III) calculating the maximum of the plural detected values; and (IV) multiplying each detected value by a weighting factor and combining the results; and the motion correction unit corrects the motion information of all the robot arms acquired by the motion information acquisition unit based on the calculated value. A robot arm control device so configured is provided.
- According to another aspect, there is provided the robot arm control device according to the third aspect, wherein, when the control method switching unit switches to the control method that controls the robot arm so that it does not move even if the person applies force to it, the control method before the switching and the control method after the switching are applied alternately, and the force detection unit detects the force during the periods in which the post-switching control method is active.
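The alternating scheme above amounts to time-multiplexing: the sensor is trusted only during the "hold" phases, when the arm is not moving and drag is absent, so the reading reflects the person's force alone. The sketch below is a toy model; the phase period and the readings are invented example values.

```python
def sample_human_force(readings, phase_period: int = 2):
    """Keep only readings taken while the 'hold' (post-switch) phase is active."""
    samples = []
    for t, f in enumerate(readings):
        hold_phase = (t // phase_period) % 2 == 1  # phases alternate each period
        if hold_phase:
            samples.append(f)
    return samples

# High values while the pre-switch method runs (motion + drag), low clean
# values during the hold phases: only the latter are kept.
readings = [4.1, 4.0, 1.2, 1.1, 4.2, 4.3, 1.0, 1.3]
print(sample_human_force(readings))  # [1.2, 1.1, 1.0, 1.3]
```

The retained samples could then be combined with one of the calculation methods (I)-(IV) of the earlier aspect.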
- According to another aspect, there is provided a robot arm control method for controlling the operation of a robot arm to perform work with the robot arm, in which: a movement information acquisition unit acquires movement information related to the movement of the robot arm; a gripping position detection unit detects the gripping position on the robot arm when a person grips the robot arm; a characteristic information acquisition unit acquires characteristic information containing information on whether a force can be detected when the person grips at the detected gripping position, and whether the detected force is affected by drag from the contact surface when the person grips and operates the robot arm at that position; a control method switching unit switches the control method of the robot arm according to the detected gripping position and the acquired characteristic information; and, after the control method has been switched while the robot arm moves based on the acquired movement information, a motion correction unit corrects force-related information in the movement information in accordance with the person's operation.
- According to another aspect, there is provided a robot comprising the robot arm and the robot arm control device according to any one of the first to eighth aspects for controlling the operation of the robot arm.
- According to another aspect, there is provided a control program for a robot arm that controls the operation of the robot arm to perform work with the robot arm, the program causing a computer to execute the steps of: acquiring operation information related to the operation of the robot arm with an operation information acquisition unit; detecting, with a gripping position detection unit, the position at which a person grips the robot arm; acquiring, with a characteristic information acquisition unit, characteristic information containing information on whether a force can be detected when the person grips at the detected gripping position and whether the detected force is affected by drag from the contact surface when the person grips and operates the robot arm at that position; switching the control method of the robot arm with a control method switching unit according to the detected gripping position and the acquired characteristic information; and, after the control method has been switched according to the gripping position and the characteristic information while the robot arm moves based on the acquired operation information, correcting force-related information in the operation information with an operation correction unit in accordance with the person's operation.
- According to another aspect, there is provided an integrated electronic circuit for controlling a robot arm that performs work by controlling the operation of the robot arm, in which: a movement information acquisition unit acquires movement information related to the movement of the robot arm; a gripping position detection unit detects the gripping position on the robot arm when a person grips the robot arm; a characteristic information acquisition unit acquires characteristic information containing information on whether a force can be detected when the person grips at the detected gripping position and whether the detected force is affected by drag from the contact surface when the person grips and operates the robot arm at that position; a control method switching unit switches the control method of the robot arm according to the detected gripping position and the acquired characteristic information; and a motion correction unit corrects force-related information in the movement information in accordance with the person's operation.
- FIGS. 1 and 2 are diagrams showing an outline of a robot system 1 including the robot arm 5 and its control device 70 in the first embodiment of the present invention.
- the robot arm 5 of the robot system 1 is installed on a wall surface 7a of a work table 7 such as a home kitchen or table.
- The base end 5a of the robot arm 5 is supported so as to be movable along a rail 8 fixed to the wall surface 7a, so that the robot arm 5 can move laterally (for example, horizontally) along the rail 8.
- the robot system 1 performs household tasks in which the robot arm 5 and the person 4 cooperate, such as wiping off kitchen dirt 91 using the robot arm 5 or stirring the bottom of a pan
- FIG. 1 shows an example of the operating procedure of the robot system 1, taking wiping and cleaning work as an example
- first, the person 4 directly grips the robot arm 5 and applies a force to it
- the robot arm 5 is moved along the rail 8 by the force applied from the person 4, and is guided to the vicinity of the cooking device 6 such as an IH cooking heater or a gas stove
- the person 4 attaches a sponge 46 as an example of a cleaning tool for wiping and cleaning work to the hand 30 at the tip of the robot arm 5.
- when the person 4 presses the button 13a of the operation panel 13 of the robot system 1, arranged for example on the side surface of the cooking device 6, using the data input IF 26, the robot arm 5 is activated and starts a preselected task (in this case, a wiping operation)
- suppose the person 4 finds a heavily soiled portion 91a in a place different from where the wiping is being performed. The person 4 then directly grips the robot arm 5 with the hand 4a and applies a force to move the robot arm 5 to the portion 91a (the direction of arrow (1) in FIG. 29). Further, by applying a force with the hand 4a of the person 4 in the direction to be corrected (the direction of arrow (2) in FIG. 29) at the heavily soiled portion 91a, a correction command is input, and the operation of the robot arm 5 is corrected so as to perform a stronger wiping and cleaning operation (see FIG. 29)
- although the rail 8 is arranged here on the wall surface 7a of the work table 7, it can instead be installed at any place suitable for the work, such as a ceiling surface or a side surface of an island kitchen counter
- although the operation panel 13 is fixed to the side surface of the cooking device 6, a remote control capable of remote operation may be used instead of the operation panel 13
- FIG. 2 is a diagram showing a detailed configuration of the robot arm 5 to be controlled and the control device 70 of the robot arm 5 constituting the robot system 1.
- the control device 70 of the robot arm 5 includes a gripping position detection unit 23, a motion generation device 12 that generates motions of the robot arm 5, a control device main body 11, and a peripheral device 14
- the robot arm 5 of the first embodiment is an articulated robot arm composed of a multi-link manipulator having 6 degrees of freedom.
- the robot arm 5 includes a hand 30, a forearm link 32 having at its distal end 32a a wrist portion 31 to which the hand 30 is attached, an upper arm link 33 whose distal end 33a is rotatably connected to the proximal end 32b of the forearm link 32, and a base portion 34 to which the proximal end 33b of the upper arm link 33 is rotatably connected and supported
- although the base portion 34 is connected to the movable rail 8, it may instead be fixed at a fixed position
- the wrist portion 31 has three rotation axes (the fourth joint portion 38, the fifth joint portion 39, and the sixth joint portion 40) and can change the relative posture (orientation) of the hand 30 with respect to the forearm link 32
- the fifth joint portion 39 can change the relative posture of the hand 30 with respect to the wrist portion 31 around a vertical axis orthogonal to the horizontal axis of the fourth joint portion 38
- the sixth joint portion 40 can change the relative posture of the hand 30 with respect to the wrist portion 31 around a horizontal axis orthogonal to both the horizontal axis of the fourth joint portion 38 and the vertical axis of the fifth joint portion 39
- the proximal end of the forearm link 32 can rotate about the third joint portion 37 relative to the distal end of the upper arm link 33, that is, about a horizontal axis parallel to the horizontal axis of the fourth joint portion 38
- the proximal end of the upper arm link 33 can rotate about the second joint portion 36 relative to the base portion 34, that is, about a horizontal axis parallel to the horizontal axis of the fourth joint portion 38
- the upper movable portion 34a of the base portion 34 can rotate about the first joint portion 35 relative to the lower fixed portion 34b of the base portion 34, that is, about a vertical axis parallel to the vertical axis of the fifth joint portion 39
- in this way, the robot arm 5 constitutes a multi-link manipulator with 6 degrees of freedom that can rotate around a total of six axes
- Each joint that constitutes the rotating portion of each axis includes a rotation drive device such as a motor 43 and an encoder 44 that detects the rotation phase angle (ie, the joint angle) of the rotation axis of the motor 43.
- the motor 43 of the first embodiment is disposed inside each joint portion of the robot arm 5.
- the motor 43 is driven and controlled by a motor driver 25 (described later) provided on one of the two link members constituting each joint.
- the rotation shaft of the motor 43 provided in each joint portion is connected to the other of the two link members, so that rotating the shaft forward or backward rotates the other link member around the joint axis with respect to the first link member
- Reference numeral 41 denotes an absolute coordinate system in which the relative positional relationship is fixed with respect to the lower fixed portion 34 b of the base portion 34
- reference numeral 42 denotes a hand coordinate system in which the relative positional relationship with respect to the hand 30 is fixed.
- the origin position Oe (x, y, z) of the hand coordinate system 42 viewed from the absolute coordinate system 41 is taken as the hand position of the robot arm 5 (the position of the hand 30), and the orientation of the hand coordinate system 42 viewed from the absolute coordinate system 41 is expressed by roll, pitch, and yaw angles as the hand posture (φ, θ, ψ). First, consider a coordinate system rotated by an angle φ with the Z axis of the absolute coordinate system 41 as the rotation axis (see FIG. 4A); the coordinate axes at this time are denoted [X′, Y′, Z]. Next, this coordinate system is rotated by an angle θ with Y′ as the rotation axis (see FIG. 4B), giving the axes [X″, Y′, Z″]. Finally, this coordinate system is rotated by an angle ψ with X″ as the rotation axis (see FIG. 4C); the posture vector at this time is (φ, θ, ψ). When a coordinate system of posture (φ, θ, ψ), translated so that its origin coincides with the origin position Oe (x, y, z) of the hand coordinate system 42, coincides with the hand coordinate system 42, the posture vector of the hand coordinate system 42 is taken to be (φ, θ, ψ)
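The Z–Y′–X″ composition of rotations described above can be sketched numerically. The following is a minimal illustration (not part of the patent), assuming NumPy and angles in radians; the function names are hypothetical:

```python
import numpy as np

def rot_z(phi):
    """Rotation about the Z axis by angle phi (rad)."""
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_y(theta):
    """Rotation about the Y axis by angle theta (rad)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def rot_x(psi):
    """Rotation about the X axis by angle psi (rad)."""
    c, s = np.cos(psi), np.sin(psi)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def posture_to_matrix(phi, theta, psi):
    """Compose the hand posture (phi, theta, psi): first phi about Z,
    then theta about the rotated Y', then psi about the rotated X''."""
    return rot_z(phi) @ rot_y(theta) @ rot_x(psi)
```

Under this convention, `posture_to_matrix(phi, theta, psi)` maps vectors expressed in the hand coordinate system 42 into the absolute coordinate system 41.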
- a force detection unit 53 such as a force sensor detects the force applied to the robot arm 5 by the person 4 or the like; specifically, the force applied to the force detection unit 53 is detected in six directions, namely the three translational directions (x, y, z) and the three posture directions (φ, θ, ψ)
- as one example of the configuration of the force detection unit 53, as shown in FIG. 27A, an operation force sensor 53a and a force-control force sensor 53b (described later) may be mounted, and the forces applied to the sensors 53a and 53b are detected by the force detection unit 53
- as another example, when torque sensors 53c, 53d, and 53e are mounted on the respective joint portions, the force detection unit 53 uses the current sensor of the motor driver 25: the current values i = [i1, i2, i3, i4, i5, i6]^T flowing through the motors 43 that drive the joint portions of the robot arm 5, measured by the current sensor, are taken in via the input/output IF 24
- the current values q of the joint angles measured by the encoders 44 are also taken in via the input/output IF 24, together with the joint angle error compensation output uqe from an approximate inverse kinematics calculation unit 57 described later
- the force detection unit 53 then functions as an observer and calculates the torque τext generated at each joint by the external force applied to the robot arm 5, based on the current values i, the joint angle values q, and the joint angle error compensation output uqe
- the equivalent hand external force Fext is calculated from the joint torque τext using the Jacobian matrix Jv(q) that relates the joint angular velocities to the hand velocity v = [vx, vy, vz, ωx, ωy, ωz]^T, where (vx, vy, vz) is the translational velocity of the hand of the robot arm 5 in the hand coordinate system 42 and (ωx, ωy, ωz) is the angular velocity of the hand in the hand coordinate system 42; specifically, Fext = Jv(q)^(−T) τext − [0, 0, mg, 0, 0, 0]^T, where m is the weight of the object gripped by the hand 30 and g is the gravitational acceleration, so that the gravity load of the gripped object is removed from the estimate
- the value of the weight m of the gripped object can be input to the force detection unit 53 by the person 4 via the input/output IF 24 before gripping the object. Alternatively, the object can actually be gripped by the hand 30 of the robot arm 5, and the value of m can be calculated from the estimation result of the equivalent hand external force Fext of the force detection unit 53 at that time
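As a hedged illustration of the second approach (estimating m from Fext while the arm holds the object still), the helper below is a sketch only; the function name and the assumption that gravity is the only load acting on the stationary object are illustrative, not taken from the patent:

```python
G = 9.81  # gravitational acceleration [m/s^2]

def estimate_grasped_mass(f_ext_z_newton):
    """Estimate the mass m [kg] of a grasped object from the z component
    of the equivalent hand external force estimated while the arm is held
    still: the object's gravity appears as a force of magnitude m*g."""
    return abs(f_ext_z_newton) / G

# e.g. a measured downward force of 4.905 N corresponds to 0.5 kg
mass = estimate_grasped_mass(-4.905)
```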
- by executing the operations of the motion generation device 12, the control device main body 11, and the peripheral device 14, each joint portion of the robot arm 5 operates as described later
- the joint angle information output from the encoders 44 is taken into the control device main body 11 through the counter board of the input/output IF 24, and the control device main body 11 calculates control command values for the rotational motion of the joints based on the acquired joint angle information
- the calculated control command values are given, through the D/A board of the input/output IF 24, to the motor driver 25 that drives and controls the joints of the robot arm 5, and the motor 43 of each joint portion of the robot arm 5 is driven according to the control command values sent from the motor driver 25
- a hand drive motor 62 and an encoder 61 that detects the rotation phase angle of the rotation shaft of the hand drive motor 62 are further provided in the hand 30.
- the rotation angle information detected by the encoder 61 is taken into the control device main body 11 through the counter board of the input/output IF 24, and the hand control unit 54 (shown in FIG. 3) of the control unit 22 of the control device main body 11 calculates a control command value for the opening/closing operation of the hand 30 based on the acquired rotation angle information
- the calculated control command value is given, through the D/A board of the input/output IF 24, to the motor driver 25, which also controls the opening and closing of the hand 30; the motor driver 25 drives the hand drive motor 62 according to the control command value, rotating its shaft forward and backward to open and close the hand 30
- the control device main body 11, the motion generation device 12, the peripheral device 14, and the grip position detection unit 23 are each configured by a general personal computer as an example.
- -Gripping position detection unit- Reference numeral 23 denotes the gripping position detection unit, which detects which part (position) of the robot arm 5 the person 4 grips and operates when the person 4 grips the robot arm 5. Specifically, the gripping position detection unit 23 performs image recognition of the hand 4a of the operating person 4 from the image data of an image capturing device 19 such as a camera, and detects whether the hand 4a is gripping and operating the forearm link 32, the upper arm link 33, or the hand 30 of the robot arm 5
- the motion generation device 12 includes a motion information database 17 that functions as an example of a motion information acquisition unit, a force detection unit characteristic database 18 that functions as an example of a characteristic information acquisition unit, a motion command unit 27, a motion correction unit 20, an operation storage unit 15, and a control method switching unit 16. Between the motion correction unit 20 and the control parameter management unit 21, information on the hand position and posture of the robot arm 5, information on the force applied by the person 4, and operation commands are input and output. The control parameter management unit 21 outputs to the operation storage unit 15 the hand position and posture of the robot arm 5 and information on the force applied by the person 4 to the robot arm 5. Details of the control modes in the motion generation device 12 ((i) position control mode, (ii) impedance control mode, and (iii) force control mode) will be described later together with the control parameter management unit 21 of the control device main body 11
- the motion information database 17 inputs / outputs motion information to / from the motion command unit 27, inputs / outputs motion information to / from the motion correction unit 20, and receives various motion information from the motion storage unit 15.
- the motion information database 17 stores, for example, information (motion information) related to the motion of the robot arm 5 shown in FIG. 5. Specific examples of the motion information are shown below
- the work ID is a code used to identify information related to the work
- the action ID is a code used to identify information related to the action.
- the “position/posture” in the motion information database 17 of FIG. 5, that is, the information on the hand position and posture of the robot arm 5, represents the hand position and posture of the robot arm 5 described above, and is expressed from the coordinates of the origin position Oe and the posture as (x, y, z, φ, θ, ψ)
- the information on force indicates the force applied to the target object when the robot arm 5 performs work, and the components of the force in the x, y, z, φ, θ, and ψ directions are expressed as (fx, fy, fz, fφ, fθ, fψ)
- for example, fz = 5 [N] is used when wiping and cleaning a top plate such as that of the IH cooking heater 6 while rubbing the surface of the top plate with a force of 5 N
- the “flag” in the motion information database 17 of FIG. 5 is a value indicating which of the hand position, posture, and force information of the motion indicated by each “motion ID” is valid. Specifically, it is represented by the 32-bit numerical value shown in FIG. 6, in which each bit is set to “1” when the corresponding hand position, posture, or force value is valid and to “0” when it is invalid. For example, the 0th bit is “1” when the x-coordinate value of the hand position of the robot arm 5 is valid and “0” when it is invalid
- similarly, the 1st bit indicates the validity of the y-coordinate value and the 2nd bit the validity of the z-coordinate value of the hand position (“1” when valid, “0” when invalid); the 3rd, 4th, and 5th bits sequentially indicate the validity of the postures φ, θ, and ψ; and the 6th to 11th bits indicate the validity of the respective force components (“1” when valid, “0” when invalid)
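The bit layout of FIG. 6 (and the identically laid-out correction parameter flag of FIG. 7) can be sketched as follows. This is an illustrative encoding with hypothetical helper names, assuming bits 0–2 for the hand position, 3–5 for the posture, and 6–11 for the force components as described above:

```python
def flag_bits(x=False, y=False, z=False, phi=False, theta=False, psi=False,
              fx=False, fy=False, fz=False, fphi=False, ftheta=False, fpsi=False):
    """Pack per-component validity into a 32-bit flag:
    bits 0-2 = position x, y, z; bits 3-5 = posture phi, theta, psi;
    bits 6-11 = force components; bits 12-31 unused (always 0)."""
    components = [x, y, z, phi, theta, psi, fx, fy, fz, fphi, ftheta, fpsi]
    flag = 0
    for bit, valid in enumerate(components):
        if valid:
            flag |= 1 << bit
    return flag

def is_valid(flag, bit):
    """Return True if the given bit of the flag is set to 1."""
    return bool(flag >> bit & 1)
```

For instance, a flag in which only the z-axis force component is valid has only the 8th bit set, i.e. the value 256.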
- the information on “time” in the motion information database 17 of FIG. 5 is the time taken to execute each motion of the robot arm 5: the motion stored under the “motion ID” is performed over the time stored here. This is a relative time measured from the previous motion, not an absolute time, that is, the time taken for the hand 30 of the robot arm 5 to reach the “position and posture”, or to attain the “force”, indicated by the “motion ID”
- the “correction parameter flag” in the motion information database 17 of FIG. 5 is information indicating which parameters can be corrected by the motion correction unit 20 described later. Specifically, it is represented by the 32-bit numerical value shown in FIG. 7, in which each bit is set to “1” when the corresponding hand position, posture, or force value can be corrected and to “0” when it cannot. For example, the 0th bit is “1” when the x-coordinate value of the hand position can be corrected and “0” when it cannot
- similarly, the 1st and 2nd bits indicate whether the y- and z-coordinate values of the hand position can be corrected, the 3rd, 4th, and 5th bits indicate whether the postures φ, θ, and ψ can be corrected, and the 6th to 11th bits indicate whether the respective force components can be corrected (“1” when correctable, “0” when not)
- since the 12th to 31st bits are not used, “0” is stored there; alternatively, a variable that can store only the necessary bits may be used
- the “progress information” in the motion information database 17 of FIG. 5 indicates whether a motion is currently being executed: the operation storage unit 15 stores “1” in the motion information database 17 for a motion that is being executed and “0” for a motion that is not. Specifically, when a task is started by the motion command unit 27, the operation storage unit 15 stores “1” for the motion of the task currently being executed and “0” for the motions not currently being executed
- for example, the motion command unit 27 receives an instruction to start the task of the “work ID” designated by the person 4 through the data input IF 26, and starts the task of the designated “work ID”
- while an operation is being executed, the motion command unit 27 sets “1” in the “progress information” of its “motion ID”, and the operation storage unit 15 stores it in the motion information database 17; when the operation ends, the motion command unit 27 sets “0”, and the operation storage unit 15 stores it in the motion information database 17
- to create this motion information, the person 4 grips the robot arm 5 and moves it in the impedance control mode described later
- information on the hand position and posture of the robot arm 5 is acquired at certain time intervals (for example, every 0.2 msec), and is stored and created in the operation information database 17 by the operation storage unit 15 along with the time.
- the force information in the motion information database 17 is created by inputting, through the data input IF 26, the force values to be applied
- reference numeral 3 denotes a pan as an example of a cooking utensil, and reference numeral 9 denotes a ladle, as an example of a cooking utensil, that is held by the hand 30 and stirs the contents of the pan 3
- the force detection unit characteristic database 18 stores information (characteristic information) indicating the characteristics of the force detection unit 53 (shown in FIG. 3), and an example is shown in FIG. 9A.
- the characteristic information includes information on the gripping position of the person 4 on the robot arm 5 (see the “grip position ID” column in FIG. 9A), information on whether a force can be detected at the position where the person 4 grips the robot arm 5 (see the “presence/absence of detection” column in FIG. 9A), and information on whether the force detection may be affected by the drag from the contact surface (see the “presence/absence of the possibility of the influence of drag” column in FIG. 9A)
- FIG. 9A shows information regarding which part of the robot arm 5 the person 4 is holding based on the table shown in FIG. 9B stored in the force detection unit characteristic database 18.
- information on each “grip position ID” is determined by the table shown in FIG. 9B. Specifically, when the upper arm link 33 in FIG. 2 is gripped (operation A in FIG. 28), the “grip position ID” of “upper arm” is “1” according to FIG. 9B. When the forearm link 32 is gripped (operation B in FIG. 28), the “grip position ID” of “forearm” is “2” according to FIG. 9B, and this is conveyed to the control method switching unit 16. Similarly, when the wrist portion 31 is gripped, the “grip position ID” of “wrist” is “3” according to FIG. 9B, and when the hand 30 is gripped (operation C in FIG. 28), the “grip position ID” of “hand” is “4” according to FIG. 9B; these are detected by the control method switching unit 16
- for the “presence/absence of detection” in FIG. 9A, “1” is set when the force applied by the person 4 at the gripping position indicated by each “grip position ID” can be detected by the force detection unit 53, and “0” is set when it cannot be detected by the force detection unit 53
- the information on the “presence/absence of the possibility of the influence of drag” in FIG. 9A indicates whether the force detection unit 53 may be affected by a drag from the contact surface when the robot arm 5 performs a predetermined operation, for example, when the robot arm 5 contacts the contact surface directly with the hand 30 or indirectly via an object or the like gripped by the hand 30
- consider a case where the robot arm 5 grips an object, for example a sponge 46 as an example of a wiping and cleaning tool, and wipes and cleans by pressing it against the cooking device 6 such as an IH cooking heater or a gas stove
- for example, when the force detection unit 53 is provided only on the wrist portion 31 and cleaning is performed with the sponge 46 gripped by the hand 30, the hand 30 contacts the cooking device 6 directly or indirectly, and the force detection unit 53 is affected by the drag from the top plate of the cooking device 6. In this case, when the hand 4a of the person 4 grips the hand 30 or the wrist portion 31, the force detection unit 53 can detect the applied force; when the hand 4a of the person 4 grips a part other than the hand 30 or the wrist portion 31, the force detection unit 53 cannot detect it. In such a case, the force detection unit characteristic database 18 is as shown in FIG. 9A
- since the force detection unit 53 installed on the wrist portion 31 may be affected by the drag, all entries for the “presence/absence of the possibility of the influence of drag” in FIG. 9A are “1”, regardless of the gripping position. Furthermore, since the force detection unit 53 can detect a force when the hand 30 or the wrist portion 31 is gripped, the “presence/absence of detection” is “1” both when the “grip position ID” indicating the wrist portion 31 is “3” and when the “grip position ID” indicating the hand 30 is “4”. Conversely, when the upper arm link 33 or the forearm link 32 is gripped, the force applied by the person 4 cannot be directly detected, so the “presence/absence of detection” is “0” both when the “grip position ID” indicating the upper arm link 33 is “1” and when the “grip position ID” indicating the forearm link 32 is “2”
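The lookup just described — grip position IDs per FIG. 9B paired with the “presence/absence of detection” and drag columns of FIG. 9A for the wrist-mounted sensor case — might be represented as a small table. The dictionary layout and function name below are illustrative assumptions, not the patent's data format:

```python
# Grip position IDs corresponding to FIG. 9B
GRIP_POSITION_IDS = {"upper arm": 1, "forearm": 2, "wrist": 3, "hand": 4}

# Characteristic table corresponding to FIG. 9A (sensor on the wrist only):
# grip position ID -> (force detectable?, drag influence possible?)
FORCE_DETECTOR_CHARACTERISTICS = {
    1: (0, 1),  # upper arm: not detectable, drag possible
    2: (0, 1),  # forearm:   not detectable, drag possible
    3: (1, 1),  # wrist:     detectable,     drag possible
    4: (1, 1),  # hand:      detectable,     drag possible
}

def characteristics_for(part_name):
    """Look up the force-detector characteristics for a gripped part."""
    gid = GRIP_POSITION_IDS[part_name]
    detectable, drag = FORCE_DETECTOR_CHARACTERISTICS[gid]
    return {"grip_position_id": gid, "detectable": detectable, "drag": drag}
```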
- the force detection unit characteristic database 18 in the case where the operation force sensor 53a on the forearm link 32 and the force control force sensor 53b are mounted is as shown in FIG. 30A
- when the forearm link 32 is gripped, the force from the person 4 can be detected directly by the operation force sensor 53a in FIG. 27A, so the “presence/absence of detection” is “1”. Moreover, since the operation force sensor 53a is not affected by the drag even if the robot arm 5 contacts the cooking device 6 directly or indirectly, the “presence/absence of the possibility of the influence of drag” for “grip position ID” “2” is “0”
- the other gripping positions are the same as in the example of FIG.
- the control method can be switched by the control method switching unit 16 described later.
- the motion correction unit 20 causes the robot arm 5 to operate in (i) the position control mode, (ii) the impedance control mode, or (iii) the force control mode, according to direction, based on the position, posture, force, and time information in the motion information database 17
- the motion correction unit 20 receives a motion correction start command from the data input IF 26 via the motion command unit 27. Then, after the control method switching unit 16 has switched the control method, the motion correction unit 20 corrects the motion information of the robot arm 5 in accordance with the force applied to the robot arm 5 by the person 4, based on the motion correction information in the motion correction information database 18. Details of the control modes (i), (ii), and (iii) will be described later together with the operation of the control parameter management unit 21
- the person 4 selects the work to be executed by the robot arm 5 from the work related to “work ID” in the work information database 17 by the data input IF 26 and inputs the selection information to the motion command unit 27.
- the operation command unit 27 instructs the operation correction unit 20 to select a work.
- the motion correction unit 20 instructs the control parameter management unit 21 to set the control mode according to the flags and to operate based on the motion information (specifically, the position, posture, time, and force information) of the task of the “work ID” selected from the motion information database 17
- the robot arm 5 wipes the top plate such as the IH cooking heater 6 and starts the cleaning operation.
- while the robot arm 5 is operating, assume that the person 4, wishing to correct the motion of the robot arm 5, inputs a correction start command to the motion command unit 27 using the data input IF 26
- the operation command unit 27 receives a correction start command from the data input IF 26 and outputs a correction start to the operation correction unit 20.
- the motion correction unit 20 issues a command to the control parameter management unit 21 to set the control mode and operate according to the correction parameter flag in the motion information database 17. Specifically, the correction parameter flag of the “motion ID” in operation (the motion whose progress information is “1”), namely “motion ID” “1” in FIG. 5, has only the 8th bit of FIG. 7 set to “1” and the others set to “0”, indicating that only the z-axis force component of the motion of the robot arm 5 can be corrected
- in this case, the motion correction unit 20 sets the force hybrid impedance control mode so that the force component can be corrected by the force applied by the person 4, and issues a command for switching the control method in the z-axis direction to the control method switching unit 16
- the gripping position detection unit 23 detects where the person 4 is gripping the robot arm 5, and the detection result is input from the gripping position detection unit 23 to the control method switching unit 16. The control method switching unit 16 then switches the control method according to the gripping position and issues a command to the control parameter management unit 21 to operate with the switched control method
- in order to have the robot arm 5 clean the vicinity of the dirt 91 with a stronger force, the person 4 grips the robot arm 5 and applies a force toward the heavily soiled portion 91a
- Information on the applied force is detected by the force detection unit 53 of the control unit 22 and input to the operation correction unit 20 via the control parameter management unit 21.
- the motion correction unit 20 corrects the value of the z-axis force component, which is operating under force control, among the information of the “work ID” and “motion ID” currently in operation, to the value of the input force component
- in this way, the motion correction unit 20 corrects the motion of the robot arm 5 according to the force applied by the person 4 to the robot arm 5, and the robot arm 5 performs the wiping and cleaning operation with an increased force rubbing the IH cooking heater 6
- as described above, the motion correction unit 20 can correct the motion information when the person 4 applies a force to the robot arm 5 while it is operating according to the motion information in the motion information database 17. Further, the control method switching unit 16 allows the force applied by the person 4 to be correctly detected regardless of which part of the robot arm 5 the person 4 grips
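The z-axis force correction described above can be sketched as follows. The record layout and helper name are hypothetical, but the bit test matches the convention above in which the 8th bit of the correction parameter flag marks the z-axis force component as correctable:

```python
Z_FORCE_BIT = 8  # bit position of the z-axis force component in the flag

def correct_motion(motion, correction_flag, detected_force_z):
    """If the correction parameter flag marks the z-axis force component as
    correctable, overwrite f_z of the current motion record with the force
    applied by the person (as detected by the force detection unit)."""
    if correction_flag >> Z_FORCE_BIT & 1:
        motion = dict(motion)                    # leave the input record intact
        motion["force"] = list(motion["force"])  # (fx, fy, fz, fphi, ftheta, fpsi)
        motion["force"][2] = detected_force_z    # fz is index 2
    return motion

# e.g. the person pushes harder, so the stored 5 N is corrected to 8 N
motion = {"force": [0.0, 0.0, 5.0, 0.0, 0.0, 0.0]}
corrected = correct_motion(motion, 1 << Z_FORCE_BIT, 8.0)
```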
- when a command is received from the motion correction unit 20 during operation in any one of (i) the position control mode, (ii) the impedance control mode, (iii) the force control mode, or (iv) a control mode combining these according to direction, the control method switching unit 16 switches the control method used when the person 4 grips the robot arm 5 and corrects a force parameter, based on the gripping position detected by the gripping position detection unit 23 and the information stored in the force detection unit characteristic database 18
- note that when the control method switching unit 16 determines whether to switch the control method and switches to the same control method that was controlling the operation of the robot arm 5 before the determination (so that, as a result, the control method does not change), this is also referred to as “switching the control method”
- FIG. 11 is a flowchart showing the switching of the control method upon receiving a command to operate in the force hybrid impedance control mode, as an example of control mode (iv)
- when the control method switching unit 16 receives from the motion correction unit 20 a command to perform control in the force hybrid impedance control mode (iv), the gripping position detection unit 23 detects the gripping position of the person 4 on the robot arm 5, and information on the detection result is input to the control method switching unit 16 (step S1)
- next, using the force detection unit characteristic database 18, the control method switching unit 16 checks the characteristic relating to the presence or absence of the influence of drag on the force detection unit 53 at the gripping position detected by the gripping position detection unit 23 (step S2)
- for example, when the person 4 grips the forearm link 32, the “grip position ID” representing “forearm” in the force detection unit characteristic database 18 of FIG. 9B is “2”, so the control method switching unit 16 detects from FIG. 9A that the “presence/absence of the possibility of the influence of drag” for “grip position ID” “2” is “1”
- similarly, when the hand 30 is gripped, the “grip position ID” representing “hand” in FIG. 9B is “4”, so the “presence/absence of the possibility of the influence of drag” for “grip position ID” “4” in FIG. 9A is detected by the control method switching unit 16
- if the “presence/absence of the possibility of the influence of drag” is “1” (affected by the drag), the result of step S2 is Yes and the process proceeds to step S3; if it is “0” (not affected by the drag), the result of step S2 is No and the process proceeds to step S4
- step S3 and step S4 the characteristics relating to the presence / absence of detection of the gripping position force detection unit 53 detected by the gripping position detection unit 23, respectively, using the force detection unit characteristic database 18, the control method switching unit 16 (step S3, step S4).
- In step S3 and step S4, if the “presence / absence of detection” is “1” (the force can be detected by the force detection unit 53), the result is Yes in step S3 or step S4, respectively, and the process proceeds to step S5 or step S7. If the “presence / absence of detection” is “0” (the force cannot be detected by the force detection unit 53), the result is No in step S3 or step S4, and the process proceeds to step S6 or step S8.
- For example, when the person 4 is gripping the forearm link 32, the “grip position ID” of “forearm” in the force detection unit characteristic database 18 of FIG. 9B is “2”, so the control method switching unit 16 looks up the “presence / absence of detection” for “grip position ID” “2” in FIG. 9A and detects that it is “0” (that is, a force applied to the forearm cannot be detected by the force detection unit 53). Similarly, when the person 4 is gripping the hand 30, the “grip position ID” of “hand” in FIG. 9B is “4”, so the control method switching unit 16 looks up the “presence / absence of detection” for “grip position ID” “4” in FIG. 9A and detects that it is “1” (that is, a force applied to the hand 30 can be detected by the force detection unit 53).
- In step S5, when the “presence / absence of the possibility of the influence of the drag” is “1” in step S2 (the detection is affected by the drag) and the “presence / absence of detection” is “1” in step S3 (for example, when the hand 30 is gripped, so that the force can be detected by the force detection unit 53 but is affected by the drag), the control method switching unit 16 moves the robot arm 5 to a position where it does not directly or indirectly contact the contact surface and switches to the high-rigidity position control mode described later (a control method in which the robot arm 5 cannot be moved by the force of the person 4).
- Specifically, the current hand position of the robot arm 5, which is performing an operation such as wiping and cleaning on the contact surface, is acquired, and the robot arm 5 is moved to a position separated from the contact surface by a height h above that hand position. The switching operation is executed once the robot arm 5, gripped by the person, has reached a “position that does not contact the contact surface”.
- In step S6, when the “presence / absence of the possibility of the influence of the drag” is “1” in step S2 and the “presence / absence of detection” is “0” in step S3 (for example, when the forearm link 32 is gripped, so that the drag must be used for detection), the control method switching unit 16 switches to the low-rigidity position control mode, a control method in which the robot arm 5 can be moved by the force of the person 4.
- In this case, the person 4 grips the robot arm 5 and pushes it in a direction approaching the contact surface in order to apply, via the robot arm 5, the force that the person wishes to apply to the contact surface. Because the low-rigidity position control mode is in effect, the robot arm 5 can easily be moved by the force applied by the person 4 and can be brought into direct or indirect contact with the contact surface. Since the “presence / absence of detection” is “0”, the force applied by the person 4 to the robot arm 5 cannot be detected directly by the force detection unit 53; however, because the drag force exerted by the contact surface on the robot arm 5 is equal in magnitude to the applied force, the force detection unit 53 can determine the force applied by the person 4 by detecting the drag force from the contact surface.
- In step S7, the control method switching unit 16 does not switch the control method. That is, when the “presence / absence of the possibility of the influence of the drag” is “0” in step S2 and the “presence / absence of detection” is “1” in step S4 (for example, when the force sensors 53a and 53b are arranged as shown in FIG. 27A and the forearm link 32 is gripped (see FIG. 30A)), the control method is not switched.
- In this example the control method switching unit 16 does not switch the control method; however, since the force can be detected correctly even in the position control mode, the control method switching unit 16 may instead switch to the position control mode.
- In step S8, when the “presence / absence of the possibility of the influence of the drag” is “0” in step S2 and the “presence / absence of detection” is “0” in step S4, the drag from the contact surface cannot be detected, so the control method switching unit 16 determines that the force applied by the person 4 cannot be detected, and the process proceeds to step S9. In step S9, the control method switching unit 16 notifies the person 4 that the detection has failed, for example by displaying a warning to the person 4.
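- The decision flow of steps S1 to S9 can be sketched as a lookup against the force detection unit characteristic database 18. The sketch below is illustrative only: the database contents mirror the forearm and hand examples in the text, and all function, mode, and table names are assumptions, not the patent's implementation.

```python
# Illustrative sketch of the step S1-S9 decision flow (FIG. 11).
# The table values mirror the forearm/hand examples described in the
# text for FIG. 9A/9B; names are hypothetical, not from the patent.
FORCE_DB = {
    # grip_position_id: (drag_influence, detectable)
    2: (1, 0),  # "forearm": affected by drag, not directly detectable
    4: (1, 1),  # "hand": affected by drag, directly detectable
}

def switch_control_method(grip_position_id):
    drag, detectable = FORCE_DB[grip_position_id]       # steps S1-S2
    if drag == 1:                                       # step S2 -> Yes
        if detectable == 1:                             # step S3 -> Yes
            return "high_rigidity_position_control"     # step S5
        return "low_rigidity_position_control"          # step S6
    if detectable == 1:                                 # step S4 -> Yes
        return "keep_current_mode"                      # step S7
    return "warn_person"                                # steps S8-S9
```

A forearm grip (ID 2) thus selects the low-rigidity mode, while a hand grip (ID 4) selects the high-rigidity mode, matching the flowchart branches above.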
- the motion storage unit 15 stores the motion information corrected by the motion correction unit 20 in the motion information database 17. Furthermore, information on the hand position (position of the hand 30) and posture of the robot arm 5 and the force applied by the person 4 to the robot arm 5 is input from the control parameter management unit 21 to the operation storage unit 15.
- the control device main body 11 is configured to include a control parameter management unit 21 and a control unit 22. Between the control unit 22 and the control parameter management unit 21, information on the hand position or force of the robot arm 5 is input / output.
- Details of the control parameter management unit 21 will now be described.
- Based on an instruction from the motion correction unit 20 or the control method switching unit 16, the control parameter management unit 21 makes settings for switching the control mode of the robot arm 5 among six modes: the position control mode, the impedance control mode, the hybrid impedance control mode, the (iv) force hybrid impedance control mode, the high-rigidity position control mode, and the low-rigidity position control mode. Further, the control parameter management unit 21 sets the mechanical impedance setting values used in the impedance control mode and the force hybrid impedance control mode. The control parameter management unit 21 also sets the hand position and posture target correction output r dΔ to be output by the impedance calculation unit 51 described later, and sets the operation information supplied to the target trajectory generation unit 55 described later.
- Based on an instruction from the motion correction unit 20 or the control method switching unit 16, the control parameter management unit 21 issues a command to the control unit 22 so that the robot arm 5 operates in the set control mode, and the robot arm 5 operates under that control. Further, the control parameter management unit 21 notifies the motion correction unit 20 of the hand position and force information of the robot arm 5 received from the control unit 22.
- (i) Position control mode: The position control mode is a mode of a control method in which the robot arm 5 is operated based on the hand position and posture target vector command of the target trajectory generation unit 55 described later, and in which the operation of the robot arm 5 is controlled so that the robot arm 5 does not move even when the person 4 applies a force to the robot arm 5.
- For example, the position control mode is the mode used while the robot arm 5 is moving during work such as a stirring operation or a wiping and cleaning operation.
- (ii) Impedance control mode: The impedance control mode is a mode of a control method in which the operation of the robot arm 5 is controlled so that the robot arm 5 moves according to the force applied to it by the person 4 or the like, as detected by the force detection unit 53. For example, as shown in FIG. 8, the mode used when the person 4 directly holds the robot arm 5 and guides it to the work place (the position of the pan 3 in FIG. 8) is the impedance control mode.
- (iii) Force control mode: The force control mode is a mode of a control method in which the operation of the robot arm 5 is controlled so that the robot arm 5 operates while pressing an object with the force set from the motion correction unit 20 to the control parameter management unit 21. For example, the force control mode is used in a direction in which a force is to be applied and controlled, such as when wiping and cleaning a top plate such as that of the IH cooking heater 6 as shown in FIG. 13, where the surface of the top plate is rubbed with force, or when performing a stirring operation in which force is applied to the bottom of the pan while rubbing, as shown in FIG. 14.
- (iv) Hybrid impedance control mode: The hybrid impedance control mode is a mode of a control method in which, while the robot arm 5 is operating in the position control mode, the force applied to the robot arm 5 is detected by the force detection unit 53 and the operation of the robot arm 5 is controlled so that the robot arm 5 moves according to the detected force. Specifically, as shown in FIG. 12A, when the robot arm 5 is performing the stirring work in the position control mode and it is desired to correct the operation of the robot arm 5 so that it stirs the bottom portion of the pot 3, the control parameter management unit 21 outputs a command to switch to the hybrid impedance control mode to the control unit 22. As a result, the person 4 applies a downward force while holding the robot arm 5 (see the downward arrow in FIG. 12B); the horizontal directions remain in the position control mode and continue the stirring, while the operation is corrected in the vertical direction, that is, into an operation of stirring the bottom of the pan, as indicated by the downward arrow and the downward rotation direction arrow in FIG. 12C. Such a control method is the hybrid impedance control mode.
- (v) Force hybrid impedance control mode: The force hybrid impedance control mode is a mode of a control method in which, while the robot arm 5 is operating in the force control mode, the operation of the robot arm 5 is controlled so that the robot arm 5 moves according to the force applied to it by the person 4. Specifically, when the top plate of the IH cooking heater 6 or the like is wiped and cleaned as shown in FIG. 16A and the person 4 finds the heavily soiled portion 91a, this mode is used to move the robot arm 5 to the heavily soiled portion 91a and to correct the force that the robot arm 5 applies to the top plate. Note that the specific control mode switched to by the control method switching unit 16 is the force hybrid impedance control mode.
- the above control modes can be set separately for each of the six axis directions.
- For example, in the wiping operation of FIG. 16A, the control method switching unit 16 switches each of the six axes among the hybrid impedance control mode, the impedance control mode, and the position control mode, and the axis in which a force is to be applied is operated in the force control mode with the force specified by the control method switching unit 16.
- Note that the impedance control mode cannot be set in a direction in which the force control mode is set (the force control mode and the impedance control mode are mutually exclusive).
- In the wiping operation, the robot arm 5 moves in a circular pattern parallel to the cleaning surface while wiping the cleaning surface by applying a specified force vertically downward.
- Therefore, the following control modes are set for each of the six axes (x, y, z, φ, θ, ψ): the (x, y) components are set to the hybrid impedance control mode, the (φ, θ, ψ) components to the impedance control mode, and the z-axis component to the force control mode. With these settings combined, operation is switched to the force hybrid impedance control mode.
- Specifically, since the directions parallel to the cleaning surface are set to the hybrid impedance control mode, the robot arm 5 can be moved according to the force applied to it by the person 4 or the like while operating in the position control mode. Since the posture components are set to the impedance control mode, the posture of the robot arm 5 can be changed according to the force applied by the person 4 or the like even while the robot arm 5 is stopped. Further, by setting the z-axis component to the force control mode, the robot arm 5 can operate while pressing with the designated force.
- In this way, the operation of the robot arm 5 can be corrected so as to wipe with the force applied by the person 4.
- Note that, if correction by the person 4 is unnecessary, the z-axis component may be operated in the force control mode and the other axes in the position control mode. In that case, even if an inadvertent force such as a collision is applied to the robot arm 5, the position-controlled components are not moved by mistake.
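- The per-axis assignment described above can be represented as a mapping from each of the six axes to a control mode. The sketch below illustrates the wiping example; the axis labels, mode names, and validity check are illustrative assumptions, not the patent's data structures.

```python
# Illustrative per-axis control mode assignment for the wiping example.
AXES = ("x", "y", "z", "phi", "theta", "psi")

wiping_modes = {
    "x": "hybrid_impedance",   # move with the person's force while position-controlled
    "y": "hybrid_impedance",
    "z": "force",              # press with the designated force
    "phi": "impedance",        # posture follows the person's force
    "theta": "impedance",
    "psi": "impedance",
}

def valid_assignment(modes):
    # Each axis has exactly one mode, so the exclusivity of the force
    # control mode and the impedance control mode holds by construction.
    return all(modes[a] in ("position", "impedance", "force",
                            "hybrid_impedance") for a in AXES)
```

Taken together, such an assignment corresponds to the force hybrid impedance control mode described above.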
- The high-rigidity position control mode is a control mode in which the rigidity of the position control mode during operation of the robot arm 5 is increased. Specifically, it is realized by increasing the gain in the position error compensation unit 56 described later, so that the robot arm 5 cannot easily be moved even if the person 4 applies a force. Therefore, by switching to the high-rigidity position control mode at a position where the robot arm 5 does not directly or indirectly contact the contact surface, the detection is not affected by the drag from the contact surface, and the force applied by the person 4 can be detected correctly.
- The low-rigidity position control mode is a control mode in which the rigidity of the position control mode during operation of the robot arm 5 is reduced. Specifically, it is realized by reducing the gain in the position error compensation unit 56 described later, so that the robot arm 5 can easily be moved when the person 4 applies a force. Therefore, when the force cannot be detected by the force detection unit 53 at the position held by the person 4, the person 4 moves the robot arm 5 until it directly or indirectly contacts (collides with) the contact surface, stops it at the contact position, and the force detection unit 53 detects the drag from the contact surface, whereby the force applied by the person 4 can be detected correctly.
- These control modes are used by setting an appropriate control mode for each direction and posture component of the robot arm 5 when the robot arm 5 is operated.
- When the robot arm 5 is operating in the hybrid impedance control mode or the impedance control mode, the control parameter management unit 21 changes the setting of the mechanical impedance parameters, or of the hand position and posture target correction output r dΔ output by the impedance calculation unit 51, according to the parameter that the person 4 wishes to correct.
- The control parameter management unit 21 outputs the inertia M, viscosity D, and stiffness K of the mechanical impedance parameters, calculated based on equations (3) to (5), to the control unit 22.
- For example, suppose the person 4 uses the ladle 9 held by the hand 30 of the robot arm 5 to stir the upper part of the pan 3.
- In this case, the control parameter management unit 21 sets the above correction value high (specifically, for example, about 10 times the normal value) only for the position components and the posture components of the robot arm 5 other than the z-axis. As a result, the viscosity D and the rigidity K increase, resistance or stiffness arises in the movement of the robot arm 5, and the robot arm 5 becomes difficult to move in those components.
- The control parameter management unit 21 needs to notify the motion storage unit 15 and the motion correction unit 20 of the hand position and posture of the robot arm 5 and of information on the force applied by the person 4. Therefore, when the control parameter management unit 21 receives information on the hand position and force of the robot arm 5 from the control unit 22, it notifies the motion storage unit 15 and the motion correction unit 20. Further, the control parameter management unit 21 notifies the control unit 22 of operation information such as the position, posture, and time input from the motion correction unit 20.
- The control unit 22 includes a target trajectory generation unit 55, a hand control unit 54, a force detection unit 53, an impedance calculation unit 51, a position control system 59 (comprising a position error compensation unit 56, an approximate inverse kinematics calculation unit 57, and a forward kinematics calculation unit 58), and a position error calculation unit 80.
- Note that the force detection unit 53 is illustrated as a part of the control unit 22 in FIG. 3, but may be configured separately from the control unit 22.
- From the robot arm 5, the current value of the joint angles measured by the encoders 44 of the joint axes, the joint angle vector q = [q 1 , q 2 , q 3 , q 4 , q 5 , q 6 ] T , is output and taken into the control unit 22 via the input / output IF 24.
- Here, q 1 , q 2 , q 3 , q 4 , q 5 , and q 6 are the joint angles of the first joint portion 35, the second joint portion 36, the third joint portion 37, the fourth joint portion 38, the fifth joint portion 39, and the sixth joint portion 40, respectively.
- Reference numeral 55 denotes a target trajectory generation unit. From the operation information generated by the motion correction unit 20 and input to the target trajectory generation unit 55 through the control parameter management unit 21, the target trajectory generation unit 55 generates the target hand position and posture target vector r d and the target force vector f d for the hand (the hand 30), so as to operate the robot arm 5 in the position control mode, in the force control mode, or in the position control mode or the force control mode selected for each direction.
- the hand position and posture for realizing the target motion of the robot arm 5 are output from the target trajectory generation unit 55 to the position error calculation unit 80.
- The target trajectory generation unit 55 uses polynomial interpolation to interpolate the trajectory and force between the respective points, and generates the hand position and posture target vector r d and the target force vector f d .
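- The interpolation between points can be sketched per component as follows. The patent does not specify the polynomial order, so the cubic (third-order) polynomial with zero boundary velocity used here is an assumption.

```python
def cubic_interpolate(p0, p1, t0, t1, t):
    """Cubic polynomial from value p0 at time t0 to p1 at t1 with zero
    start/end velocity. Applicable per component of the hand position and
    posture target r_d or of the target force f_d (illustrative; the
    polynomial order is an assumption, not the patent's choice)."""
    s = (t - t0) / (t1 - t0)       # normalized time in [0, 1]
    h = 3 * s**2 - 2 * s**3        # smoothstep: h(0)=0, h(1)=1, h'(0)=h'(1)=0
    return p0 + (p1 - p0) * h
```

Applying this to each component of position, posture, and force between consecutive taught points yields a trajectory that starts and stops smoothly at every point.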
- In the impedance control mode, the target trajectory generation unit 55 outputs the hand position of the robot arm 5 at the time of switching to the impedance control mode as the target hand position and posture target vector r d . Further, an opening / closing command for the hand 30 is issued to the hand control unit 54 described later by means of the opening / closing flag of the operation information in the operation information database 17.
- Reference numeral 54 denotes a hand control unit, which issues a command to the robot arm 5 via the input / output IF 24 to open and close the hand 30 according to the open / close flag input from the target trajectory generation unit 55.
- Reference numeral 53 denotes a force detection unit that detects the external force F ext applied to the robot arm 5 by contact between the person 4 and the robot arm 5. However, when working with an object of mass m held at the hand, mg is subtracted in advance from the detected F ext , where g is the gravitational acceleration. The value of the mass m of the grasped object can be input to the force detection unit 53 by the person 4 via the data input IF 26 before the object is grasped.
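- The gravity compensation described above (subtracting mg for a grasped object of mass m) can be sketched as follows; the six-component vector layout [fx, fy, fz, mx, my, mz] and the assignment of z to the vertical direction are assumptions, not stated in the patent.

```python
G = 9.81  # gravitational acceleration [m/s^2]

def compensate_gravity(f_ext_raw, m):
    """Subtract the grasped object's weight m*g from the raw detected force.
    f_ext_raw is assumed to be [fx, fy, fz, mx, my, mz] with z vertical;
    the sign convention depends on the sensor frame and is an assumption."""
    f = list(f_ext_raw)
    f[2] -= m * G   # remove the object's weight from the vertical component
    return f
```

The mass m entered via the data input IF 26 before grasping would be passed as the second argument.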
- The impedance calculation unit 51 is the part that fulfills the function of controlling the mechanical impedance value of the robot arm 5 to the mechanical impedance set value. When the control parameter management unit 21 has switched to the position control mode, the impedance calculation unit 51 outputs 0.
- In the impedance control mode or the hybrid impedance control mode, the hand position and posture target correction output r dΔ for controlling the mechanical impedance value of the robot arm 5 to the mechanical impedance set value is calculated by the impedance calculation unit 51 according to equation (6) and output to the position error calculation unit 80.
- In equation (6), the inertia M, the viscosity D, and the stiffness K are the mechanical impedance parameters set by the control parameter management unit 21.
- In the force hybrid impedance control mode, the hand position and posture target correction output r dΔ for controlling the mechanical impedance value of the robot arm 5 is calculated by the impedance calculation unit 51 according to the following equation (10), and the calculated hand position and posture target correction output r dΔ is output to the position error calculation unit 80.
- The hand position and posture target correction output r dΔ is added by the position error calculation unit 80 to the hand position and posture target vector r d output by the target trajectory generation unit 55, and the hand position and posture correction target vector r dm is thereby generated.
- In this case, the impedance calculation unit 51 sets all components of the hand position and posture target correction output r dΔ to 0 except the z component.
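- Equations (6) and (10) are not reproduced in this text, so the sketch below shows only a generic discrete impedance law that produces a target correction from F ext and the parameters M, D, and K, together with the addition r dm = r d + r dΔ performed by the position error calculation unit 80. The first-order discretization is an assumption, not the patent's equations.

```python
def impedance_correction_step(f_ext, dr, dr_vel, M, D, K, dt):
    """One Euler step of the generic impedance law, per component:
        M * ddr + D * dr_vel + K * dr = f_ext
    Returns the updated (dr, dr_vel); an assumed form, not equation (6)."""
    acc = (f_ext - D * dr_vel - K * dr) / M
    dr_vel += acc * dt
    dr += dr_vel * dt
    return dr, dr_vel

def corrected_target(r_d, r_d_delta):
    # Position error calculation unit 80: r_dm = r_d + r_dΔ, per component.
    return [a + b for a, b in zip(r_d, r_d_delta)]
```

For the force hybrid impedance control mode, all components of the correction except z would simply be held at 0 before the addition.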
- Reference numeral 58 denotes a forward kinematics calculation unit, to which the joint angle vector q, the current value of the joint angles measured by the encoders 44 of the joint axes of the robot arm 5, is input via the input / output IF 24.
- The forward kinematics calculation unit 58 performs the geometric calculation that converts the joint angle vector q of the robot arm 5 into the hand position and posture vector r.
- the hand position and orientation vector r calculated by the forward kinematics calculator 58 is output to the position error calculator 80, the impedance calculator 51, and the target trajectory generator 55.
- Reference numeral 56 denotes a position error compensation unit. The error r e between the hand position and posture vector r, calculated by the forward kinematics calculation unit 58 from the joint angle vector q measured in the robot arm 5, and the hand position and posture correction target vector r dm is determined by the position error calculation unit 80, and the error r e is input to the position error compensation unit 56. The position error compensation unit 56 outputs the position error compensation output u re toward the approximate inverse kinematics calculation unit 57.
- In the high-rigidity position control mode, the three gains of the proportional, derivative, and integral terms, which are constant diagonal matrices, are set to preset values that are large compared with those of the normal position control mode; specifically, to values about twice those of the normal position control mode. Here, high rigidity means that the rigidity is higher than in normal position control. With values about twice those of the normal position control mode, the rigidity can be increased to about twice that of normal position control. In this way, position control with high rigidity can be realized.
- By changing the value of the gain for each component, it is possible, for example, to perform control so that only the z-axis direction is highly rigid while the other directions operate under normal position control.
- In the low-rigidity position control mode, the three gains of the proportional, derivative, and integral terms, which are constant diagonal matrices, are set to preset values that are small compared with those of the normal position control mode; specifically, to about half the values of the normal position control mode. Here, low rigidity means that the rigidity is lower than in normal position control. With values about half those of the normal position control mode, the rigidity can be reduced to about half that of normal position control. In this way, position control with low rigidity can be realized. Note that by changing the value of the gain for each component, it is possible, for example, to perform control so that the rigidity is low only in the z-axis direction while the other directions operate under normal position control.
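- The gain scaling for the high-rigidity and low-rigidity position control modes (about twice and about half the normal position-control gains, optionally per axis) can be sketched as follows; the base gain values and all names are illustrative assumptions.

```python
BASE_GAINS = {"kp": 100.0, "kd": 10.0, "ki": 1.0}   # illustrative values

MODE_SCALE = {
    "normal": 1.0,
    "high_rigidity": 2.0,   # about twice the normal gains
    "low_rigidity": 0.5,    # about half the normal gains
}

def gains_for(mode, axes=("x", "y", "z"), stiff_axes=None):
    """Per-axis PID gains. Optionally apply the scaling only to stiff_axes
    (e.g. only z), leaving the other axes under normal position control."""
    stiff_axes = set(stiff_axes if stiff_axes is not None else axes)
    scale = MODE_SCALE[mode]
    return {ax: {k: v * (scale if ax in stiff_axes else 1.0)
                 for k, v in BASE_GAINS.items()} for ax in axes}
```

For example, `gains_for("high_rigidity", stiff_axes=("z",))` doubles only the z-axis gains while the other axes keep the normal values.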
- The joint angle error compensation output u qe is given as a voltage command value to the motor driver 25 of the robot arm 5 via the D/A board of the input / output IF 24, each joint axis is driven to rotate forward or backward by each motor 43, and the robot arm 5 operates.
- The basic impedance control operation (as well as the hybrid impedance control operation) is the feedback control (position control) of the hand position and posture error r e by the position error compensation unit 56; the portion enclosed by the dotted line in FIG. 3 is the position control system 59.
- By using, for example, a PID compensator as the position error compensation unit 56, the position control system 59 acts so that the hand position and posture error r e converges to 0, and the targeted impedance control operation of the robot arm 5 can be realized.
- When the impedance control mode is entered, the hand position and posture target correction output r dΔ is output by the impedance calculation unit 51 to the position control system 59 described above and added by the position error calculation unit 80, so that the hand position and posture target values are corrected. As a result, in the position control system 59 the hand position and posture target values deviate slightly from their original values, whereby an operation that controls the mechanical impedance value of the robot arm 5 to the appropriately set value is realized, and the position control operation of the position control system 59 is corrected.
- The hand position and posture target correction output r dΔ is calculated by equation (6) in the impedance control mode or the hybrid impedance control mode, and by equation (10) in the force hybrid impedance control mode.
- the peripheral device 14 is configured to include a data input IF (interface) 26, an input / output IF (interface) 24, a motor driver 25, and the display unit 2.
- Control information such as a control signal is output from the control unit 22 to the input / output IF 24.
- Correction information such as correction parameters stored in the motion information database 17, and a video, photograph, or text corresponding to the motion ID, are output to the display unit 2, and an image, photograph, or text of the operation of the robot arm 5 described in the operation information is displayed on the display unit 2.
- the input / output IF 24 is configured to have, for example, a D / A board, an A / D board, a counter board, and the like connected to an expansion slot such as a PCI bus of a personal computer.
- The input / output IF 24 receives the joint angle information output from the encoders 44 described later of the joint portions of the robot arm 5 and the angle information output from the encoder 61 of the hand 30, and inputs them to the control unit 22.
- control information such as a control signal is input from the control unit 22 to the input / output IF 24, and control information such as a control command value is output to the motor driver 25.
- the motor driver 25 outputs control information such as control command values to the motor 43 described later and the motor 62 of the hand 30 of each joint portion of the robot arm 5.
- The data input IF (interface) 26 is an interface for the person 4 to input or change operation information, described later, using an input device such as a keyboard, a mouse, or a microphone. The data input IF 26 may also receive, from the person 4, commands to start and end the control operation and to start and end the operation correction, which are given to the operation command unit 27 using an input device such as the button 13a of the operation panel 13 of FIG. 1.
- As the button 13a, for example, a single button may serve as a toggle switch for inputting both the start and the end of the control operation, or separate buttons may be provided for starting and ending the control operation.
- the display unit 2 is a display device installed on the side surface of the robot arm 5 or the workbench 7, for example, and displays operation information and the like.
- Joint angle data (the joint variable vector or joint angle vector q) measured by the encoders 44 of the joint portions of the robot arm 5 is taken from the encoders 44 into the control unit 22 of the control device main body 11 via the input / output IF 24 (step S101).
- Next, the approximate inverse kinematics calculation unit 57 performs the calculation of the Jacobian matrix J r and the like necessary for the kinematics calculations of the robot arm 5 (step S102).
- Next, the forward kinematics calculation unit 58 calculates the current hand position and posture vector r of the robot arm 5 from the joint angle data (joint variable vector or joint angle vector q) from the encoders 44 of the robot arm 5, and outputs it to the position error calculation unit 80, the target trajectory generation unit 55, and the impedance calculation unit 51 (step S103).
- Based on the operation information transmitted from the motion correction unit 20 through the control parameter management unit 21, the target trajectory generation unit 55 calculates the hand position and posture target vector r d and the target force vector f d of the robot arm 5; in the impedance control mode, it outputs the hand position of the robot arm 5 to the position error calculation unit 80 as the target hand position and posture target vector r d (step S104).
- The force detection unit 53 calculates the equivalent hand external force F ext at the hand of the robot arm 5 from the drive current value i of the motor 43, the joint angle data (joint variable vector or joint angle vector q), and the joint angle error compensation output u qe , and outputs it to the impedance calculation unit 51 (step S105).
- In step S106, when the control parameter management unit 21 is instructed that there is a correction by the motion correction unit 20 described later and corrects a force component of the six axes with the correction parameters, and the low-rigidity position control mode is set in the control method switching unit 16, the control mode is switched to the low-rigidity position control mode for the components in which a force component is set. Thereafter, the process proceeds to step S107.
- Likewise, in step S106, when the control parameter management unit 21 is instructed that there is a correction by the motion correction unit 20 described later and corrects a force component of the six axes with the correction parameters, and the high-rigidity position control mode is set in the control method switching unit 16, the control mode is switched to the high-rigidity position control mode for the components in which a force component is set. Thereafter, the process proceeds to step S108.
- In step S106, when the control parameter management unit 21 corrects a position component of the six axes, the control method switching unit 16 changes the position component to be changed to the impedance control mode. Then, the process proceeds to step S110.
- In step S106, if there is no correction, the control parameter management unit 21 proceeds to step S109 and sets the position control mode.
- In step S106, when the control parameter management unit 21 is instructed that there is a correction by the motion correction unit 20 described later and corrects a force component of the six axes with the correction parameters, but no switching is set in the control method switching unit 16, the mode is switched to the control method before switching (the force control mode), that is, the control method is left unchanged. Then, the process proceeds to step S110.
- In step S107 (processing in the impedance calculation unit 51), when the low-rigidity position control mode is set in the control parameter management unit 21, the impedance calculation unit 51 sets the hand position and posture target correction output r dΔ to a 0 vector. Then the process proceeds to step S111.
- In step S108 (processing in the impedance calculation unit 51), when the high-rigidity position control mode is set, the impedance calculation unit 51 sets the hand position and posture target correction output r dΔ to a 0 vector. Thereafter, the process proceeds to step S112.
- In step S109 (processing in the impedance calculation unit 51), when the position control mode is set in the control parameter management unit 21, the impedance calculation unit 51 sets the hand position and posture target correction output r dΔ to a 0 vector. Thereafter, the process proceeds to step S113.
- In step S110, when the impedance control mode or the force control mode is set in the control parameter management unit 21, the impedance calculation unit 51 calculates the hand position and posture target correction output r dΔ from the inertia M, viscosity D, and stiffness K of the mechanical impedance parameters set in the control parameter management unit 21, the joint angle data (joint angle vector q), and the equivalent hand external force F ext applied to the robot arm 5 as calculated by the force detection unit 53. Further, based on the correction parameters, selected components of the hand position and posture target correction output r dΔ are set to zero.
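The impedance calculation of step S110 corresponds to solving the mechanical impedance dynamics M·r̈ + D·ṙ + K·r = F ext for the correction output r dΔ. The following is a minimal discrete-time sketch under assumptions not stated in the patent: diagonal M, D, K with illustrative values, a 1 ms control period, and semi-implicit Euler integration:

```python
import numpy as np

# Hedged sketch: one integration step of  M*ddr + D*dr + K*r = f_ext,
# which yields the hand position/posture target correction output r_dD.
# Parameter values, time step, and integrator are illustrative assumptions.

def impedance_step(r, dr, f_ext, M, D, K, dt=0.001):
    """Advance the correction output r (= r_dD) by one control period."""
    ddr = (f_ext - D * dr - K * r) / M   # per-component acceleration
    dr_new = dr + dt * ddr
    r_new = r + dt * dr_new              # semi-implicit Euler update
    return r_new, dr_new

M = np.full(6, 2.0)                      # inertia
D = np.full(6, 10.0)                     # viscosity
K = np.full(6, 100.0)                    # stiffness
r = np.zeros(6)
dr = np.zeros(6)
f_ext = np.array([1.0, 0, 0, 0, 0, 0])   # equivalent hand external force
for _ in range(1000):                    # ~1 s of simulated settling
    r, dr = impedance_step(r, dr, f_ext, M, D, K)
# r[0] approaches the static deflection F/K = 1.0/100 = 0.01

mask = np.array([1, 1, 0, 1, 1, 1])      # zero out a component per the
r_masked = r * mask                      # correction parameter (assumed form)
```

The masking at the end illustrates how selected components of r dΔ can be forced to zero based on the correction parameters.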
- In step S113, the position error compensation unit 56 calculates the hand position and posture correction target vector r dm, which is the sum of the hand position and posture target vector r d and the hand position and posture target correction output r dΔ, and then the hand position error r e, which is the difference between r dm and the current hand position and posture vector r.
- A specific example of the position error compensation unit 56 is a PID compensator. By appropriately adjusting the three gains (proportional, derivative, and integral), which are constant diagonal matrices, the position error compensation unit 56 works to make the position error converge to 0. Thereafter, the process proceeds to step S114.
- In step S111, the position error compensation unit 56 works to make the position error converge to 0 by appropriately adjusting its three gains (proportional, derivative, and integral), which are constant diagonal matrices. By reducing the gains to small values, low-rigidity position control is realized. Thereafter, the process proceeds to step S114.
- In step S112, the position error compensation unit 56 works to make the position error converge to 0 by appropriately adjusting its three gains (proportional, derivative, and integral), which are constant diagonal matrices. By increasing the gains to large values, high-rigidity position control is realized. Thereafter, the process proceeds to step S114.
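Steps S111 to S113 differ only in the gain scale of the PID position error compensation unit 56. A hedged sketch follows; the class name, gain values, and per-component scalar gains (in place of diagonal matrices) are illustrative assumptions:

```python
import numpy as np

# Hedged sketch of the PID position error compensator described above.
# Low- and high-rigidity position control differ only in the gain scale.

class PIDCompensator:
    def __init__(self, kp, kd, ki, dt=0.001):
        self.kp, self.kd, self.ki, self.dt = kp, kd, ki, dt
        self.integral = np.zeros(6)
        self.prev_err = np.zeros(6)

    def step(self, r_dm, r):
        """Return the position error compensating output u_re."""
        err = r_dm - r                       # hand position/posture error r_e
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# gains as constant diagonal matrices (scalars per component here)
normal = PIDCompensator(kp=200.0, kd=20.0, ki=50.0)
low_rigidity = PIDCompensator(kp=20.0, kd=2.0, ki=5.0)   # reduced gains

u_re = normal.step(np.full(6, 0.01), np.zeros(6))        # compensation output
```

A larger gain set drives the error down more stiffly (high-rigidity control); the reduced set lets the arm yield (low-rigidity control).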
- In step S114, the approximate inverse kinematics calculation unit 57 multiplies the position error compensation output u re by the inverse of the Jacobian matrix J r calculated in step S102, thereby converting u re from a value concerning the hand position and posture error into the joint angle error compensation output u qe, a value concerning the joint angle error.
- In step S115, the joint angle error compensation output u qe is passed from the approximate inverse kinematics calculation unit 57 to the motor driver 25 through the input/output IF 24.
- The motor driver 25 changes the amount of current flowing through each joint motor 43 based on the joint angle error compensation output u qe. This change in current produces rotational motion at each joint of the robot arm 5, and the robot arm 5 operates.
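Steps S114 and S115 reduce to one linear-algebra operation: multiplying the hand-space compensation output u re by the inverse of the Jacobian J r. A minimal sketch with a placeholder diagonal Jacobian; using a pseudo-inverse here is a common robustness choice and an assumption beyond the text:

```python
import numpy as np

# Hedged sketch of step S114: u_qe = J_r^{-1} * u_re, mapping the
# hand position/posture error output into joint-angle space.

def to_joint_space(J_r, u_re):
    """Map the hand-space output u_re to the joint-space output u_qe."""
    return np.linalg.pinv(J_r) @ u_re    # pseudo-inverse for robustness

J_r = np.diag([2.0, 1.0, 1.0, 1.0, 1.0, 1.0])  # placeholder Jacobian
u_re = np.array([0.1, 0.0, 0.0, 0.0, 0.0, 0.0])
u_qe = to_joint_space(J_r, u_re)   # then drives the motor currents (S115)
```

In the patent the actual J r comes from the kinematics computed in step S102; the diagonal matrix above only makes the mapping visible.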
- The person 4 uses the data input IF 26 to input to the operation command unit 27 a selection command for selecting the task to be executed by the robot arm 5 from the operations in the motion information database 17, that is, a selection command for the selected (designated) “work ID” (step S50).
- In step S51, based on the selection command input to the motion command unit 27, the motion correction unit 20 sets a control mode in accordance with the “flag” of the motion information, stored in the motion information database 17, that relates to the designated “work ID” (step S51).
- In step S52, when an operation start command for the operation selected by the person 4 using the data input IF 26 is input to the operation command unit 27, the operation command unit 27 receives the start command and, via the operation correction unit 20, instructs the control parameter management unit 21 to operate in the set control mode (step S52). The control parameter management unit 21 then issues a command to the control unit 22 so that the robot arm 5 operates in the set control mode under the control of the control unit 22.
- the person 4 uses the data input IF 26 to input a correction start command to the operation command unit 27 (step S53).
- When the motion command unit 27 receives the correction start command, it inputs a motion correction start command to the motion correction unit 20.
- the operation correction unit 20 instructs the control method switching unit 16 to switch the control method.
- The control method switching unit 16 determines the control method based on the gripping position detected by the gripping position detection unit 23 (step S54) and the information stored in the force detection unit characteristic database 18, and issues a command to the control parameter management unit 21 to operate in the determined control method (step S55).
- Next, the motion correction unit 20 corrects the motion information. Specifically, the force applied by the person 4 for the correction is detected by the force detection unit 53 (step S56), and the detection result is input to the motion correction unit 20 via the control parameter management unit 21.
- The motion correction unit 20 then corrects, in the motion information of the “work ID” and “motion ID” currently in operation, the value of the z-axis force component operating under force control, replacing it with the input force component value (step S57).
- The motion information corrected by the motion correction unit 20 is stored in the motion information database 17 by the motion storage unit 15 (step S58).
- Described above is the case where, when the person 4 applies a strong force to the robot arm 5 to correct the degree of force of the wiping and cleaning work, the control method is switched by the control method switching unit 16 so as to correct the wiping and cleaning operation of the robot arm 5.
- The control method is switched to the high-rigidity position control mode, and, in order to eliminate the influence of drag from the contact surface, the operation is corrected after moving to a state slightly lifted off the contact surface (see FIG. 21A). While the force is being corrected, the mode remains the high-rigidity position control mode and the arm stays lifted off the contact surface.
- Alternatively, the force control mode before correction and the control mode used during correction may be switched alternately (specifically, every 0.02 seconds). Force control with the corrected force and the correction itself in the correcting control mode (the low-rigidity or high-rigidity position control mode) are repeated alternately, so that the person 4 can perform the correction through the operation correction unit 20 while confirming whether the work is being performed accurately with the corrected value.
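The 0.02-second alternation described above can be sketched as a simple time-slot schedule. The 0.02 s period is taken from the text; the function name and mode strings are illustrative assumptions:

```python
# Hedged sketch of the alternating verification described above: the
# pre-correction force control mode and the mode used while correcting
# exchange every 0.02 s, so the person can watch the corrected motion
# while still applying the correction.

PERIOD = 0.02  # seconds per mode, as stated in the text

def mode_at(t, correcting_mode="high-rigidity position control"):
    """Return which mode is active t seconds after correction starts."""
    slot = int(t / PERIOD)
    return "force control" if slot % 2 == 0 else correcting_mode
```

Even-numbered slots replay the task with the corrected force; odd-numbered slots accept the person's correction in the position-control mode chosen for the gripping position.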
- The wiping and cleaning work has been described as an example, but the operation correction unit 20 can likewise correct, in the force control mode, a stirring operation performed while rubbing the bottom of a pot.
- The case where the force detection unit 53 is mounted on the wrist (hand 30) has been described as an example, but when the force sensor 53a for operation and the force sensor 53b for force control are used, or when the torque sensors 53c, 53d, and 53e are mounted on the respective joints as shown in FIG. 27B, a force detection unit characteristic database 18 such as the one shown in FIG. 30A or FIG. 30B may be used, and additional gripping positions at which the hand 4a of the person 4 grips may be detected.
- FIG. 19 is a diagram showing a detailed configuration of the robot arm 5 to be controlled and the control device 70 of the robot arm 5 that constitute the robot system 1 in the second embodiment.
- A significant difference from the first embodiment is the provision of a force calculation unit 28, which calculates the single force used for operation correction from the plural pieces of force information detected by the force detection unit 53, using a force calculation method table 81 described later.
- The gripping position detection unit 23 detects which part of the robot arm 5 is gripped and operated, and when a plurality of persons 4A and 4B are gripping, it detects each gripping position.
- Specifically, the image recognition device 19, such as a camera, recognizes an image of the hands of the operating persons 4, 4A, and 4B, and the gripping position detection unit 23 detects whether each hand is gripping the forearm link 32, the upper arm link 33, or the hand 30 of the robot arm 5.
- The control method switching unit 16 switches the control method used when the person 4 grips the robot arm and corrects the force parameter, based on the gripping position detected by the gripping position detection unit 23 and the information stored in the force detection unit characteristic database 18.
- The control method switching unit 16 obtains a control method for each gripping position by the same method as in the first embodiment. For example, when one person 4 grips the forearm link 32 and the hand 30 and operates the robot arm 5, the “gripping position ID” of the forearm link 32 is “2” from the force detection unit characteristic database 18 of FIG. 9B, and the control method switching unit 16 confirms from the force detection unit characteristic database 18 of FIG. 9A that, for gripping position ID “2”, “presence/absence of detection” is “0” and “presence/absence of the possibility of the influence of drag” is “1”.
- Likewise, the “gripping position ID” of the hand 30 is “4” from the force detection unit characteristic database 18 of FIG. 9B, and it is confirmed from the force detection unit characteristic database 18 of FIG. 9A that, for gripping position ID “4”, “presence/absence of detection” is “1” and “possibility of the influence of drag” is “1”.
- The control method switching unit 16 then determines the control method at each gripping position using the flowchart of FIG. 11. For gripping position ID “2” (the forearm link), since “presence/absence of detection” is “0”, the low-rigidity position control mode is set (see step S6 of FIG. 11).
- For gripping position ID “4” (the “hand”), since “presence/absence of detection” is “1” and the “possibility of the influence of drag” is “1” in the flowchart of FIG. 11, the high-rigidity position control mode is set (see step S5 of FIG. 11).
- The control modes thus obtained (in this example, the low-rigidity position control mode and the high-rigidity position control mode) are set for the respective gripping positions.
- the force detection unit 53 detects forces at all gripping positions in the same manner as in the first embodiment, and the detection result is input to the operation correction unit 20 via the control parameter management unit 21.
- FIG. 20A is an example of the force calculation method table 81 stored in a database built into the force calculation unit 28 or in the memory 28a.
- The “calculation method” describes how the force calculation unit 28 of FIG. 19 calculates one piece of force information from the plural pieces of force information detected by the force detection unit 53. Specifically, “maximum” is described when the maximum value is calculated from the plural force information; “minimum” when the minimum value is calculated; “average” when the average value is calculated; “summing” when the sum is calculated; and “weighting” when the plural force information values are multiplied by weighting coefficients and summed.
- When the person 4 grips the robot arm 5 at a plurality of gripping positions, the force calculation unit 28 calculates, from the force information detected by the force detection unit 53 at each gripping position, the single force used for motion correction, using the force calculation method table 81.
- The force calculation unit 28 refers to the force calculation method table 81 and calculates according to the “calculation method” whose “flag” is “1”. In the case of the force calculation method table 81 shown in FIG. 20A, since the flag of “summing” is “1”, the force calculation unit 28 sums all the output values of the force detection unit 53 at the respective gripping positions.
- The “flag” indicates which of the plural calculation methods is effective. Here the flag shows that the “summing” calculation method is effective, so the force calculation unit 28 adds together all the force information detected by the force detection unit 53 at the respective gripping positions, and outputs the summed force value to the motion correction unit 20.
- The “coefficient ID” indicates a weighting coefficient shown in FIG. 20C.
- When the “calculation method” is “weighting” and the coefficient ID is “4” as shown in FIG. 20A, the force calculation unit 28 obtains the coefficients (0.6, 0.4) in advance from FIG. 20C. Based on these coefficients, when the person 4 grips the robot arm 5 at two positions, the force calculation unit 28 multiplies the first force value detected by the force detection unit 53 by 0.6 and the second force value by 0.4, and sums the two products. If the robot arm 5 is gripped at three positions, the calculation can be performed using coefficient ID “3” of FIG. 20C.
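The “calculation method” entries of the force calculation method table 81 (maximum, minimum, average, summing, weighting) can be sketched as follows. The function itself is an illustrative assumption; the coefficients (0.6, 0.4) come from the example in the text:

```python
# Hedged sketch of the force calculation method table 81: combining the
# forces detected at several gripping positions into the single value
# used for motion correction. Method names mirror the table entries.

def combine_forces(values, method="summing", weights=None):
    """Combine per-gripping-position force values into one value."""
    if method == "maximum":
        return max(values)
    if method == "minimum":
        return min(values)
    if method == "average":
        return sum(values) / len(values)
    if method == "summing":
        return sum(values)
    if method == "weighting":        # e.g. coefficient ID 4 -> (0.6, 0.4)
        return sum(w * v for w, v in zip(weights, values))
    raise ValueError(method)

# two gripping positions, coefficients from the example in the text
combined = combine_forces([10.0, 5.0], "weighting", weights=(0.6, 0.4))
# combined is approximately 0.6*10.0 + 0.4*5.0 = 8.0
```

Choosing “maximum” lets an adult's force dominate a child's; “summing” lets several weak forces add up, as discussed below.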
- the motion correction unit 20 corrects the motion information in the motion information database 17 by the same method as in the first embodiment.
- With the above configuration, the force detection unit 53 and the force calculation unit 28 can correctly detect the force. Furthermore, by providing the force calculation unit 28, even when the person 4 is an elderly person or a child and the correcting force is insufficient, a larger force can be applied by setting the calculation method to “summing”. Also, when a child and an adult are each gripping, selecting “maximum” as the “calculation method” in the force calculation unit 28 allows the force applied by the adult to be given priority over the force applied by the child.
- In this example the “calculation method” is “summing”, but the person 4 may specify which calculation method to select using the data input IF 26 or the like; in that case, “1” is set in the flag of the “calculation method” input at the data input IF 26.
- Alternatively, as shown for example in FIG. 20B, IDs 1 to 3 may be switched appropriately for calculation depending on the state of the operating person 4 (for example, an elderly person or a child). Specifically, the data input IF 26 is used to input an ID identifying “elderly” or “child”, or to input the age; if the force calculation unit 28 determines, for example, that the person is 15 years old or younger, it can detect that the person is a child (and similarly for an elderly person).
- FIG. 22A shows an operation in which a plurality of robot arms 5A and 5B cooperate to apply force to an object 47, for example rounding or stretching bread dough during cooking.
- FIG. 22B shows an operation of simultaneously applying force to a contact surface with the plurality of robot arms 5A and 5B, such as kneading bread dough or wiping and cleaning during cooking.
- An example will be described in which the person 4 operates one of the robot arms 5A and 5B to correct the force applied by both robot arms 5A and 5B.
- FIG. 23 shows the first robot arm 5A and the second robot arm 5B to be controlled, and the control device for the first robot arm 5A and the second robot arm 5B, which constitute the robot system 1 in the third embodiment.
- FIG. 24 is an example of the operation information database 17 in the third embodiment.
- “Work ID” in FIG. 24 is an ID number for identifying work performed by the plurality of robot arms 5A and 5B.
- “Action ID” is an action ID number that identifies the actions of the plurality of robot arms 5A and 5B in the work.
- “Robot ID” is an ID number for identifying the plurality of robot arms 5A and 5B. For example, “1” is the first robot arm 5A, and “2” is the second robot arm 5B.
- “Position and orientation”, “force”, “flag”, “hand”, “time”, and “correction parameter flag” are information relating to the operation of the robot arms 5A and 5B identified by the “robot ID”. Since each description is the same as that of the first embodiment, a description thereof will be omitted.
- “Progress information” is information indicating whether or not each of the robot arms 5A and 5B is operating.
- FIG. 25A is a force detection unit characteristic database 18 according to the third embodiment.
- “Robot ID” is an ID number for identifying a plurality of robot arms 5A and 5B.
- “Gripping position ID”, “Presence / absence of detection”, and “Presence / absence of influence of drag” represent the characteristics of the force detection unit of each robot arm 5A, 5B identified by “Robot ID”.
- Each description is the same as that of the first embodiment, and will be omitted.
- the motion storage unit 15 identifies the motion information corrected by the motion correction unit 20 described later by “robot ID” for each robot arm, and stores the motion information in the motion information database 17.
- When the person 4 grips the robot arm 5A or 5B, the gripping position detection unit 23 detects which part of the gripped robot arm is being operated. Specifically, the gripping position detection unit 23 performs image recognition of the hand 4a of the operating person 4 from the image data of the image pickup device 19, such as a camera, and detects whether the hand 4a is gripping and operating the forearm link 32, the upper arm link 33, or the hand 30 of the robot arm 5A or the robot arm 5B.
- The control parameter management unit 21 switches the control modes of the robot arm 5A and the robot arm 5B, sets the hand position and posture target correction output r dΔ to be output to the impedance calculation unit 51 of each robot arm 5A, 5B, and sets the operation information for the target trajectory generation unit 55.
- The control parameter management unit 21 receives a command for switching the operation information or the control method, together with the “robot ID” identifying the robot arm 5A or 5B, from the operation correction unit 20 or the control method switching unit 16, and issues a command to the control unit 22. The contents of the commands to the respective robot arms 5A and 5B and the operation of the control unit 22 after receiving the commands are the same as those in the first embodiment, and their description is therefore omitted.
- For each robot arm identified by the “robot ID”, the control method switching unit 16 switches the control method used when the person 4 grips the robot arm 5A or 5B and corrects the force parameter, based on the gripping position detected by the gripping position detection unit 23 (i.e. which of the robot arms 5A and 5B is gripped and where) and the information stored in the force detection unit characteristic database 18.
- Specifically, using the identification ID (“robot ID”) of the robot arm 5A or 5B detected by the gripping position detection unit 23 and the “gripping position ID” (obtained from the “gripping position” in FIG. 25B), the control method is determined according to the flowchart of FIG. 11.
- the force detection unit 53 detects forces at all gripping positions in the same manner as in the first embodiment, and inputs the detected force to the operation correction unit 20 via the control parameter management unit 21.
- The force calculation unit 28 calculates the force used for correcting the operation of the robot arms 5A and 5B from the force information detected at the gripping positions of the plurality of robot arms 5A and 5B, using the force calculation method table 81.
- FIG. 26A is an example of the force calculation method table 81.
- The “calculation method” describes how the force calculation unit 28 calculates the force information used for correcting the operation of each of the robot arms 5A and 5B from the force information of the plurality of robot arms 5A and 5B detected by the force detection unit 53. Specifically, “maximum” is described when the maximum value is calculated from the plural force information of the respective robot arms 5A and 5B and used as the correction value for all the robot arms 5A and 5B; “minimum” is described when the minimum value is calculated and used as the correction value for all the robot arms 5A and 5B.
- The force calculation unit 28 refers to the force calculation method table 81 and calculates according to the “calculation method” whose “flag” is “1”. In the case of the force calculation method table 81 shown in FIG. 26A, since the flag of “summing” is “1”, the force calculation unit 28 sums all the output values of the force detection unit 53 at the gripping positions of the respective robot arms 5A and 5B.
- The “flag” indicates which of the plural calculation methods is effective. Here the flag shows that the “summing” calculation method is effective, so the force calculation unit 28 adds together the force information detected at the respective gripping positions, and outputs the summed force value to the motion correction unit 20.
- When the calculation method in FIG. 26A is “weighting”, it is described as (“flag”, “coefficient ID”) as shown in FIG. 26A; the “flag” indicates which calculation method is used, and the “coefficient ID” indicates the weighting coefficient shown in FIG. 26C.
- When the coefficient ID is “4”, the force calculation unit 28 obtains the coefficients (0.6, 0.4) in advance from FIG. 26C. Based on these coefficients, for the force values from the two robot arms 5A and 5B, the force calculation unit 28 multiplies the force value detected by the force detection unit 53 in the first robot arm 5A by 0.6 and the force value detected by the force detection unit 53 in the second robot arm 5B by 0.4, and sums the two products. In the case of three robot arms, the calculation can be performed using coefficient ID “3” of FIG. 26C.
- The motion correction unit 20 corrects the motion information in the motion information database 17 using the force correction value calculated by the force calculation unit 28, so that, as in the first embodiment, the operation can be corrected to the value corrected by the person 4.
- With the above configuration, the force can be correctly detected by the force detection unit 53 and the force calculation unit 28. Furthermore, by providing the force calculation unit 28, the operation of the robot arm 5B or 5A that is not gripped can be corrected simply by gripping and operating the other robot arm 5A or 5B. Furthermore, when the person 4 is an elderly person or a child, even if the correcting force is insufficient, a larger force can be applied by setting the calculation method to “summing”. Also, when a child and an adult are each gripping, selecting “maximum” as the “calculation method” in the force calculation unit 28 allows the force applied by the adult to be given priority over the force applied by the child.
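In this third embodiment, a single corrected force value is written back into the motion information of every robot arm, keyed by its “robot ID”. The following is a hedged sketch with an assumed, simplified data layout, not the patent's actual database schema:

```python
# Hedged sketch: one corrected z-axis force value (e.g. a value produced
# by the force calculation unit) is applied to the motion information of
# all robot arms identified by robot ID. The dict layout is assumed.

def correct_all_arms(motion_info, corrected_fz):
    """Overwrite the z-axis force component for every robot ID."""
    for robot_id, entry in motion_info.items():
        entry["force"][2] = corrected_fz   # z component, as in step S57
    return motion_info

motion_info = {
    1: {"force": [0.0, 0.0, 5.0]},   # first robot arm 5A
    2: {"force": [0.0, 0.0, 5.0]},   # second robot arm 5B
}
correct_all_arms(motion_info, 8.0)
```

This makes concrete how gripping only one arm can still correct the other: the combined value is propagated to every robot ID's entry.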
- In this example the “calculation method” is “summing”, but the person 4 may specify which calculation method to select using the data input IF 26 or the like; in that case, “1” is set in the flag of the “calculation method” input at the data input IF 26.
- Alternatively, as shown for example in FIG. 26B, IDs 1 to 3 may be switched appropriately for calculation depending on the state of the operating person 4 (for example, an elderly person or a child). Specifically, the data input IF 26 is used to input an ID identifying “elderly” or “child”, or to input the age; if the force calculation unit 28 determines, for example, that the person is 15 years old or younger, it can detect that the person is a child (and similarly for an elderly person).
- In each of the above embodiments, the robot arm 5 has been described as an example, but the present invention is not limited to an arm: it can also be applied to a mobile robot that moves by wheels, a biped walking robot, a multi-legged walking robot, and the like, in which case the same effect is exhibited with respect to contact between the mobile robot and a human.
- The motion information database 17 has been described as an example of the motion information acquisition unit, but the motion information acquisition unit is not limited to a database: as long as it has a function of acquiring motion information relating to the motion of the robot arm, it may be configured by any means capable of acquiring the information, for example via a network, from a database stored in another device such as a server.
- The force detection unit characteristic database 18 has been described as an example of the characteristic information acquisition unit, but the characteristic information acquisition unit is likewise not limited to a database.
- The present invention is useful as a robot arm control device and control method for controlling the operation of a robot arm when a human and a robot, such as a home robot, work in cooperation, as a robot having the robot arm control device, as a robot arm control program, and as an integrated electronic circuit. It can also be applied not only to home robots but also to industrial robots and to control devices and control methods for movable mechanisms in production facilities, to robots having such a control device, to robot arm control programs, and to integrated electronic circuits.
Description
a motion information acquisition unit that acquires motion information relating to the motion of the robot arm;
a gripping position detection unit that detects the gripping position at which a person grips the robot arm when the person grips the robot arm;
a characteristic information acquisition unit that acquires characteristic information including information on whether or not a force applied when the person grips at the gripping position detected by the gripping position detection unit can be detected, and information on whether or not there is an influence of drag from a contact surface when the person grips at the gripping position and operates the robot arm;
a control method switching unit that switches the control method of the robot arm in accordance with the gripping position detected by the gripping position detection unit and the characteristic information acquired by the characteristic information acquisition unit; and
a motion correction unit that corrects information relating to force in the motion information acquired by the motion information acquisition unit in accordance with an operation by the person, during the motion of the robot arm based on the motion information acquired by the motion information acquisition unit, after the control method switching unit has switched the control method in accordance with the gripping position and the characteristic information,
wherein the motion of the robot arm is controlled based on the motion information corrected by the motion correction unit; a robot arm control device characterized by the above is thereby provided.
motion information relating to the motion of the robot arm is acquired by a motion information acquisition unit;
the gripping position at which a person grips the robot arm when the person grips the robot arm is detected by a gripping position detection unit;
characteristic information, including information on whether or not a force applied when the person grips at the detected gripping position can be detected and information on whether or not there is an influence of drag from a contact surface when the person grips at the gripping position and operates the robot arm, is acquired by a characteristic information acquisition unit;
the control method of the robot arm is switched by a control method switching unit in accordance with the gripping position detected by the gripping position detection unit and the characteristic information acquired by the characteristic information acquisition unit;
during the motion of the robot arm based on the motion information acquired by the motion information acquisition unit, after the control method switching unit has switched the control method in accordance with the gripping position and the characteristic information, information relating to force in the motion information acquired by the motion information acquisition unit is corrected by a motion correction unit in accordance with an operation by the person; and
the motion of the robot arm is controlled based on the motion information corrected by the motion correction unit; a robot arm control method characterized by the above is thereby provided.
a step of acquiring, with a motion information acquisition unit, motion information relating to the motion of the robot arm;
a step of detecting, with a gripping position detection unit, the gripping position at which a person grips the robot arm when the person grips the robot arm;
a step of acquiring, with a characteristic information acquisition unit, characteristic information including information on whether or not a force applied when the person grips at the detected gripping position can be detected and information on whether or not there is an influence of drag from a contact surface when the person grips at the gripping position and operates the robot arm;
a step of switching, with a control method switching unit, the control method of the robot arm in accordance with the gripping position detected by the gripping position detection unit and the characteristic information acquired by the characteristic information acquisition unit; and
a step of correcting, with a motion correction unit, information relating to force in the motion information acquired by the motion information acquisition unit in accordance with an operation by the person, during the motion of the robot arm based on the motion information acquired by the motion information acquisition unit, after the control method switching unit has switched the control method in accordance with the gripping position and the characteristic information, and a step of controlling the motion of the robot arm based on the motion information corrected by the motion correction unit; a robot arm control program for causing a computer to execute these steps is thereby provided.
motion information, which is information relating to the motion of the robot arm, is acquired by a motion information acquisition unit;
the gripping position at which a person grips the robot arm when the person grips the robot arm is detected by a gripping position detection unit;
characteristic information, including information on whether or not a force applied when the person grips at the detected gripping position can be detected and information on whether or not there is an influence of drag from a contact surface when the person grips at the gripping position and operates the robot arm, is acquired by a characteristic information acquisition unit;
the control method of the robot arm is switched by a control method switching unit in accordance with the gripping position detected by the gripping position detection unit and the characteristic information acquired by the characteristic information acquisition unit;
during the motion of the robot arm based on the motion information acquired by the motion information acquisition unit, after the control method switching unit has switched the control method in accordance with the gripping position and the characteristic information, information relating to force in the motion information acquired by the motion information acquisition unit is corrected by a motion correction unit in accordance with an operation by the person; and
the motion of the robot arm is controlled based on the motion information corrected by the motion correction unit; an integrated electronic circuit for a robot arm characterized by the above is thereby provided.
a motion information acquisition unit that acquires motion information relating to the motion of the robot arm;
a gripping position detection unit that detects the gripping position at which a person grips the robot arm when the person grips the robot arm;
a characteristic information acquisition unit that acquires characteristic information including information on whether or not a force applied when the person grips at the gripping position detected by the gripping position detection unit can be detected, and information on whether or not there is an influence of drag from a contact surface when the person grips at the gripping position and operates the robot arm;
a control method switching unit that switches the control method of the robot arm in accordance with the gripping position detected by the gripping position detection unit and the characteristic information acquired by the characteristic information acquisition unit; and
a motion correction unit that corrects information relating to force in the motion information acquired by the motion information acquisition unit in accordance with an operation by the person, during the motion of the robot arm based on the motion information acquired by the motion information acquisition unit, after the control method switching unit has switched the control method in accordance with the gripping position and the characteristic information,
wherein the motion of the robot arm is controlled based on the motion information corrected by the motion correction unit; a robot arm control device characterized by the above is thereby provided.
The characteristic information acquisition unit acquires, when the force applied to the robot arm from outside is detected by the force detection unit, the information, among the characteristic information, on whether or not there is an influence of drag from the contact surface when the person grips, and
the motion correction unit corrects the information relating to force in the motion information in accordance with the force detected by the force detection unit as the operation by the person, after the control method switching unit has switched the control method during the motion of the robot arm according to the motion information; the robot arm control device according to the first aspect, characterized by the above, is thereby provided.
(I) a control method that controls the motion of the robot arm so that the robot arm moves by the force applied to the robot arm by the person;
(II) a control method that controls the motion of the robot arm so that the robot arm does not move even when the person applies a force to the robot arm; and
(III) a control method that controls the motion of the robot arm by the control method before the switching;
the control method switching unit switches to any one of these control methods, and the force detection unit detects the force, in the case of control method (I), either while the robot arm is moving, or after the robot arm has moved and has stopped moving upon directly or indirectly colliding with the contact surface; and, in the cases of control method (II) and control method (III), at the stage when the person applies the force to the robot arm; the robot arm control device according to the second aspect, characterized by the above, is thereby provided.
(I) when the characteristic information includes information that there is no influence of drag from the contact surface and includes information that the force applied to the robot arm at the person's gripping position is within the detectable range of the force detection unit, the motion of the robot arm is controlled by the control method before the switching, or the control method is switched to one that controls the motion of the robot arm so that it does not move by the force applied by the person;
(II) when the characteristic information includes information that there is an influence of drag from the contact surface and includes information that the force applied to the robot arm at the person's gripping position is within the detectable range of the force detection unit, the control method is switched to one that controls the motion of the robot arm so that, even when the person applies a force to the robot arm, the robot arm does not move into direct or indirect contact with the contact surface; and
(III) when the characteristic information includes information that there is no influence of drag from the contact surface and includes information that the person's gripping position is outside the detectable range of the force detection unit, the control method is switched to one that controls the motion of the robot arm so that the robot arm moves by the force applied to it by the person; the robot arm control device according to the third aspect, characterized by the above, is thereby provided.
The control method switching unit sequentially switches among:
(I) a control method that controls the motion of the robot arm so that the robot arm moves by the force applied by the person;
(II) a control method that controls the motion of the robot arm so that the robot arm does not move even when the person applies a force to the robot arm; and
(III) a control method that controls the motion of the robot arm by the control method before the switching;
the force detection unit detects the force under each control method;
a force calculation unit is provided that calculates the value of the force applied to the robot arm by the person, based on the plurality of values detected by the force detection unit at the respective gripping positions; and
the motion correction unit corrects the motion information in the motion information database with the force value calculated by the force calculation unit; the robot arm control device according to the third aspect, characterized by the above, is thereby provided.
According to a sixth aspect, there is provided the robot arm control apparatus according to the fifth aspect, wherein the force calculating unit calculates the value of the force the person applied to the robot arm by any one of:
(I) a method of summing the plurality of values detected by the force detecting unit;
(II) a method of taking the minimum of the plurality of values detected by the force detecting unit;
(III) a method of taking the maximum of the plurality of values detected by the force detecting unit; and
(IV) a method of multiplying each of the plurality of values detected by the force detecting unit by a weighting coefficient and summing the results,
and the motion correcting unit corrects the information relating to force in the motion information acquired by the motion information acquiring unit, based on the value calculated by the force calculating unit.
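The four aggregation methods (I) through (IV) above can be sketched in Python as follows. This is an illustrative sketch only; the function name and argument conventions are invented for the example and do not come from the patent.

```python
# Hypothetical sketch of the four methods the sixth aspect lists for
# combining force values measured at several grip positions.

def combine_forces(values, method="sum", weights=None):
    """Combine force readings into one applied-force value.

    method: "sum", "min", "max", or "weighted", matching (I)-(IV).
    weights: per-reading coefficients, used only by "weighted".
    """
    if method == "sum":        # (I) sum all detected values
        return sum(values)
    if method == "min":        # (II) take the smallest detected value
        return min(values)
    if method == "max":        # (III) take the largest detected value
        return max(values)
    if method == "weighted":   # (IV) weighted sum with per-reading coefficients
        return sum(w * v for w, v in zip(weights, values))
    raise ValueError(f"unknown method: {method!r}")
```

The weighted variant (IV) generalizes the plain sum (I): choosing all weights equal to 1 reproduces it.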
According to a seventh aspect, there is provided the robot arm control apparatus according to the second or third aspect, comprising a plurality of robot arms, wherein the grip position detecting unit detects which of the plurality of robot arms the person is gripping,
when the person is gripping one of the plurality of robot arms, the force is detected by the force detecting unit provided on that robot arm,
the apparatus further comprises a force calculating unit that calculates, from the value detected by the force detecting unit, a value for correcting the other robot arm that the person is not gripping, and
the motion correcting unit corrects the motion information acquired by the motion information acquiring unit with the value calculated by the force calculating unit.
According to an eighth aspect, there is provided the robot arm control apparatus according to the seventh aspect, wherein the force calculating unit calculates the value of the force the person applied to the robot arm by any one of:
(I) a method of summing the plurality of values detected by the force detecting unit;
(II) a method of taking the minimum of the plurality of values detected by the force detecting unit;
(III) a method of taking the maximum of the plurality of values detected by the force detecting unit; and
(IV) a method of multiplying each of the plurality of values detected by the force detecting unit by a weighting coefficient and summing the results,
and the motion correcting unit corrects, based on the value calculated by the force calculating unit, the motion information of all the robot arms acquired by the motion information acquiring unit.
According to a ninth aspect, there is provided the robot arm control apparatus according to the third aspect, wherein the force detecting unit detects the force when the control method has been switched to the post-switching control method.
According to another aspect, there is provided a robot arm control method in which:
motion information on the motion of the robot arm is acquired by a motion information acquiring unit;
the grip position at which a person grips the robot arm is detected by a grip position detecting unit;
characteristic information, including information on whether a force can be detected when the person grips the robot arm at the detected grip position and information on whether a drag force from a contact surface affects the detection when the person grips the robot arm at the grip position and operates it, is acquired by a characteristic information acquiring unit;
the control method of the robot arm is switched by a control method switching unit in accordance with the detected grip position and the acquired characteristic information;
during the motion of the robot arm based on the acquired motion information, after the control method switching unit has switched the control method in accordance with the grip position and the characteristic information, the information relating to force in the acquired motion information is corrected by a motion correcting unit in accordance with the person's operation; and
the motion of the robot arm is controlled based on the motion information corrected by the motion correcting unit.
According to still another aspect, there is provided a robot comprising the robot arm and the robot arm control apparatus according to any one of the first to eighth aspects, which controls the motion of the robot arm.
According to still another aspect, there is provided a robot arm control program for causing a computer to execute the steps of:
acquiring, with a motion information acquiring unit, motion information on the motion of the robot arm;
detecting, with a grip position detecting unit, the grip position at which a person grips the robot arm;
acquiring, with a characteristic information acquiring unit, characteristic information including information on whether a force can be detected when the person grips the robot arm at the detected grip position and information on whether a drag force from a contact surface affects the detection when the person grips the robot arm at the grip position and operates it;
switching, with a control method switching unit, the control method of the robot arm in accordance with the detected grip position and the acquired characteristic information;
correcting, with a motion correcting unit, the information relating to force in the acquired motion information in accordance with the person's operation, during the motion of the robot arm based on the acquired motion information and after the control method switching unit has switched the control method in accordance with the grip position and the characteristic information; and
controlling the motion of the robot arm based on the motion information corrected by the motion correcting unit.
According to still another aspect, there is provided an integrated electronic circuit for robot arm control in which:
motion information, which is information on the motion of the robot arm, is acquired by a motion information acquiring unit;
the grip position at which a person grips the robot arm is detected by a grip position detecting unit;
characteristic information, including information on whether a force can be detected when the person grips the robot arm at the detected grip position and information on whether a drag force from a contact surface affects the detection when the person grips the robot arm at the grip position and operates it, is acquired by a characteristic information acquiring unit;
the control method of the robot arm is switched by a control method switching unit in accordance with the detected grip position and the acquired characteristic information;
during the motion of the robot arm based on the acquired motion information, after the control method switching unit has switched the control method in accordance with the grip position and the characteristic information, the information relating to force in the acquired motion information is corrected by a motion correcting unit in accordance with the person's operation; and
the motion of the robot arm is controlled based on the motion information corrected by the motion correcting unit.
First, the configuration of a robot system 1 including a robot arm control apparatus according to the first embodiment of the present invention will be described. FIGS. 1 and 2 give an overview of the robot system 1 including the robot arm 5 and its control apparatus 70 according to the first embodiment of the present invention.
The robot arm 5 of the first embodiment is, as an example, an articulated robot arm configured as a multi-link manipulator with six degrees of freedom. The robot arm 5 includes a hand 30; a forearm link 32 having, at its distal end 32a, a wrist portion 31 to which the hand 30 is attached; an upper arm link 33 whose distal end 33a is rotatably connected to the proximal end 32b of the forearm link 32; and a base portion 34 that rotatably connects and supports the proximal end 33b of the upper arm link 33. The base portion 34 is connected to a movable rail 8, but may instead be fixed at a fixed position. The wrist portion 31 has three rotation axes, namely a fourth joint portion 38, a fifth joint portion 39, and a sixth joint portion 40, and can change the relative orientation of the hand 30 with respect to the forearm link 32. That is, in FIG. 2, the fourth joint portion 38 can change the relative orientation of the hand 30 with respect to the wrist portion 31 about a lateral axis (ψ). The fifth joint portion 39 can change the relative orientation of the hand 30 with respect to the wrist portion 31 about a vertical axis (Φ) orthogonal to the lateral axis of the fourth joint portion 38. The sixth joint portion 40 can change the relative orientation of the hand 30 with respect to the wrist portion 31 about a lateral axis (θ) orthogonal to both the lateral axis (ψ) of the fourth joint portion 38 and the vertical axis (Φ) of the fifth joint portion 39. The other end of the forearm link 32 is rotatable relative to the distal end of the upper arm link 33 about the third joint portion 37, that is, about a lateral axis parallel to the lateral axis of the fourth joint portion 38. The other end of the upper arm link 33 is rotatable relative to the base portion 34 about the second joint portion 36, that is, about a lateral axis parallel to the lateral axis of the fourth joint portion 38. Further, an upper movable portion 34a of the base portion 34 is rotatable relative to a lower fixed portion 34b of the base portion 34 about the first joint portion 35, that is, about a vertical axis parallel to the vertical axis of the fifth joint portion 39.
Reference numeral 23 denotes a grip position detecting unit which, when the person 4 grips the robot arm 5, detects which part (position) of the robot arm 5 the person 4 is gripping and operating, that is, detects the grip position of the person 4 on the robot arm 5. Specifically, the grip position detecting unit 23 performs image recognition of the hand 4a of the operating person 4 from the image data of an image capturing device 19 such as a camera, and detects whether the hand 4a is gripping and operating the forearm link 32, the upper arm link 33, or the hand 30 of the robot arm 5.
The motion generation device 12 includes a motion information database 17 functioning as an example of the motion information acquiring unit, a force detection unit characteristic database 18 functioning as an example of the characteristic information acquiring unit, a motion command unit 27, a motion correcting unit 20, a motion storage unit 15, and a control method switching unit 16. Between the motion correcting unit 20 and a control parameter management unit 21, information on the hand position and orientation of the robot arm 5, information on the force applied by the person 4, motion commands, and the like are input and output. The control parameter management unit 21 outputs the hand position and orientation of the robot arm 5, information on the force applied by the person 4 to the robot arm 5, and the like to the motion storage unit 15. Details of the control modes in the motion generation device 12 ((i) the position control mode, (ii) the impedance control mode, and (iii) the force control mode) will be described later in connection with the control parameter management unit 21 of the control device main body 11.
Motion information is input to and output from the motion information database 17 by the motion command unit 27 and by the motion correcting unit 20, and various pieces of motion information are input and stored by the motion storage unit 15.
The motion command unit 27 receives a command to start the motion of the task with the "task ID" specified by the person 4 via the data input IF 26. On receiving this command, the motion command unit 27 starts the motion of the task with the specified "task ID". Specifically, the motion command unit 27 sets "1" in the "progress information" of the corresponding "motion ID", and the motion storage unit 15 stores it in the motion information database 17. For the other "motion IDs", the motion command unit 27 sets "0" in the "progress information", which the motion storage unit 15 likewise stores in the motion information database 17. All motions of the task with the task ID commanded by the motion command unit 27 are executed in order, starting from the smallest "motion ID" number; when the last motion has been executed, the process returns to the motion with the first "motion ID" of that "task ID" and the series of motions is executed repeatedly.
The force detection unit characteristic database 18 stores information (characteristic information) representing the characteristics of the force detecting unit 53 (shown in FIG. 3); an example is shown in FIG. 9A. As an example, the characteristic information consists of information on the grip position of the person 4 on the robot arm 5 (see the "grip position ID" column in FIG. 9A), information on whether a force can be detected at the position where the person 4 grips the robot arm 5 (see the "detectability" column in FIG. 9A), and information on whether the detection may be affected by a drag force from the contact surface (see the "possibility of drag influence" column in FIG. 9A).
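A per-grip-position characteristic table of this kind can be modeled as a simple lookup. The sketch below is illustrative only: the grip-position IDs and flag values are invented stand-ins for the entries of FIG. 9A, not data from the patent.

```python
# Hypothetical stand-in for the characteristic information of FIG. 9A:
# for each grip-position ID, whether a force can be detected there and
# whether a drag force from the contact surface may affect the reading.
FORCE_SENSOR_CHARACTERISTICS = {
    "hand":           {"detectable": True,  "drag_possible": True},
    "forearm_link":   {"detectable": True,  "drag_possible": False},
    "upper_arm_link": {"detectable": False, "drag_possible": False},
}

def get_characteristics(grip_position_id):
    """Return the characteristic record for a detected grip position."""
    return FORCE_SENSOR_CHARACTERISTICS[grip_position_id]
```

A control method switching step would query this table with the grip position reported by the grip-position detector and branch on the two flags.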
While the robot arm 5 is operating based on the position, orientation, force, and time information in the motion information database 17, in any of (i) the position control mode, (ii) the impedance control mode, (iii) the force control mode, or a control mode combining these per direction, the motion correcting unit 20 receives a motion correction start command from the data input IF 26 via the motion command unit 27. The motion correcting unit 20 then has the function of correcting the motion information of the robot arm 5 in the motion information database 17 in response to the person 4 applying a force to the robot arm 5, based on the motion correction information in the motion correction information database 18, after the control method switching unit 16 has switched the control method. Details of the control modes (i), (ii), and (iii) will be described later together with the operation of the control parameter management unit 21.
When the control method switching unit 16 receives a command from the motion correcting unit 20 during operation in any of
(i) the position control mode,
(ii) the impedance control mode,
(iii) the force control mode, or
(iv) a control mode combining these per direction,
it switches, based on the grip position detected by the grip position detecting unit 23 and the information stored in the force detection unit characteristic database 18, the control method used while the person 4 grips the robot arm 5 to correct the force parameter. In the present invention, as an example, when the control method switching unit 16 judges whether or not to switch the control method, the case of switching to the control method that was in use before the switching (before the judgment), i.e., the case where the control method is effectively left unchanged, is also referred to as "switching to a control method".
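The kind of decision the control method switching unit makes from the grip-position characteristics can be sketched as a small branch. This is a simplified illustration of cases (I) through (III) of the fourth aspect; the function name and returned mode labels are invented for the example.

```python
def select_control_method(detectable, drag_possible, current_method):
    """Pick a control method from grip-position characteristics.

    detectable: force at this grip position is within the sensor's range.
    drag_possible: a drag force from the contact surface may affect it.
    current_method: the control method in use before switching.
    """
    if detectable and not drag_possible:
        # (I) force detectable, no drag influence: the current method
        # can simply be kept (switching that changes nothing).
        return current_method
    if detectable and drag_possible:
        # (II) drag possible: hold position so the arm does not move
        # into direct or indirect contact with the surface.
        return "position_hold"
    # (III) force not detectable at this grip position: let the person's
    # applied force move the arm instead.
    return "free_move"
```

Calling `select_control_method(True, False, "force")` leaves the arm in its current mode, while an undetectable grip position yields the free-movement behavior.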
The motion storage unit 15 stores the motion information corrected by the motion correcting unit 20 in the motion information database 17. In addition, the hand position of the robot arm 5 (the position of the hand 30), its orientation, and information on the force the person 4 applied to the robot arm 5 are input from the control parameter management unit 21 to the motion storage unit 15 and stored there.
The control device main body 11 includes the control parameter management unit 21 and a control unit 22. Information such as the hand position of the robot arm 5 or force information is input and output between the control unit 22 and the control parameter management unit 21.
Next, the control parameter management unit 21 will be described in detail.
The position control mode is a mode in which the robot arm 5 operates based on hand position and orientation target vector commands from a target trajectory generation unit 55 described later; it is a mode of the control method that controls the motion of the robot arm 5 so that the robot arm 5 does not move even when the person 4 applies a force to it. Specifically, the position control mode is the mode in which the robot arm 5 operates with movement during tasks such as stirring or wiping.
The impedance control mode is a mode of the control method that controls the motion of the robot arm 5 so that the robot arm 5 operates in response to the force detected by the force detecting unit 53 and applied to the robot arm 5 by the person 4 or the like. For example, as shown in FIG. 8, the impedance control mode is the mode that operates when the person 4 directly holds the robot arm 5 and guides it to the work location (the position of the pot 3 in FIG. 8).
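A one-axis sketch of the general impedance relation often used for this kind of hand guiding is shown below: the measured human force F drives a virtual mass-damper, m·a + d·v = F, integrated with an explicit Euler step. The gains and time step are illustrative values chosen for the example, not parameters from the patent.

```python
def impedance_step(f, v, m=2.0, d=10.0, dt=0.01):
    """One control period of a virtual mass-damper driven by force f.

    f: measured external force [N], v: current virtual velocity [m/s].
    Returns (new_velocity, displacement) for this period.
    """
    a = (f - d * v) / m   # acceleration from the virtual dynamics m*a + d*v = f
    v_new = v + a * dt    # integrate velocity (explicit Euler)
    dx = v_new * dt       # hand displacement commanded this period
    return v_new, dx
```

With zero applied force the virtual velocity decays and the arm stays put; a sustained push settles toward the steady-state velocity f/d, which is what makes the arm feel like a damped object being led by the hand.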
The force control mode is a control mode of the control method that controls the motion of the robot arm 5 so that the robot arm 5 operates while pressing an object with the force set by the motion correcting unit 20 in the control parameter management unit 21. For example, the force control mode is used in directions controlled by applying a force, such as a wiping task in which the top plate of an IH cooking heater 6 or the like is rubbed while a force is applied to its surface as in FIG. 13, or a stirring task in which the bottom of the pot 3 is rubbed while a force is applied as in FIG. 14.
The hybrid impedance control mode is a mode of the control method in which, while the robot arm 5 is operating in the position control mode, a force applied to the robot arm 5 is detected by the force detecting unit 53 and the robot arm 5 is controlled to operate in response to the detected force. Specifically, when the robot arm 5 is performing a stirring task in the position control mode as shown in FIG. 12A and it is desired to correct the motion of the robot arm 5 so that it stirs the bottom portion of the pot 3 for the person 4, the control parameter management unit 21 outputs to the control unit 22 a command to switch to the hybrid impedance control mode. As a result, as shown in FIG. 12B, by the person 4 gripping the robot arm 5 and applying a downward force in the hybrid impedance control mode (see the downward arrow in FIG. 12B), the motion can be corrected, while stirring horizontally in the position control mode, into a motion that also stirs vertically, that is, along the pot bottom, as indicated by the downward arrow and the lower rotational arrow in FIG. 12C. Such a control method is the hybrid impedance control mode.
The force hybrid impedance control mode is a mode of the control method in which, while the robot arm 5 is operating in the force control mode, the robot arm 5 is controlled to operate in response to the force applied to the robot arm 5 by the person 4. Specifically, it is the mode used when, during a wiping task on the top plate of the IH cooking heater 6 or the like as in FIG. 16A, the person 4 finds a heavily soiled portion 91a as in FIG. 16B, grips the robot arm 5, moves it to the heavily soiled portion 91a, and corrects the force that the robot arm 5 applies to the top plate. Concretely, the force hybrid impedance control mode is the mode obtained as a result of switching by the control method switching unit 16.
For example, the wiping task of FIG. 16A is a control mode in which, for each of the six axis directions, the control method switching unit 16 selects the hybrid impedance control mode, the impedance control mode, or the position control mode, and the arm further operates in the force control mode, applying the force specified via the control method switching unit 16. Note that the impedance control mode cannot be set in a direction in which the force control mode is set (the force control mode and the impedance control mode are mutually exclusive).
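This per-axis assignment with the exclusivity rule stated above can be sketched as follows. The mode labels and function names are invented for the example; only the rule itself (an axis in force control cannot also be put in impedance control) comes from the text above.

```python
# Illustrative per-axis mode bookkeeping for a direction-wise combined
# control mode, enforcing the force/impedance exclusivity rule.
ALLOWED_MODES = {"position", "impedance", "hybrid_impedance", "force"}

def assign_modes(axis_modes):
    """Validate a per-axis mode map, e.g. {"x": "position", "z": "force"}."""
    for axis, mode in axis_modes.items():
        if mode not in ALLOWED_MODES:
            raise ValueError(f"unknown mode {mode!r} on axis {axis}")
    return axis_modes

def set_impedance(axis_modes, axis):
    """Switch one axis to impedance control, rejecting force-controlled axes."""
    if axis_modes.get(axis) == "force":
        raise ValueError("force and impedance control are exclusive on an axis")
    axis_modes[axis] = "impedance"
    return axis_modes
```

For instance, with the vertical axis in force control for pressing during wiping, a request to put that same axis into impedance control is rejected, while the horizontal axes can be switched freely.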
Next, the control unit 22 will be described in detail with reference to FIG. 3. The control unit 22 includes a target trajectory generation unit 55, a hand control unit 54, the force detecting unit 53, an impedance calculation unit 51, a position control system 59 (comprising a position error compensation unit 56, an approximate inverse kinematics calculation unit 57, and a forward kinematics calculation unit 58), and a position error calculation unit 80. Although the force detecting unit 53 is shown in FIG. 3 as part of the control unit 22, it may be configured separately from the control unit 22.
The peripheral device 14 includes the data input IF (interface) 26, an input/output IF (interface) 24, a motor driver 25, and a display unit 2. Control information such as control signals is output from the control unit 22 to the input/output IF 24. From the motion correcting unit 20, correction information such as correction parameters stored in the motion information database 17, together with the video, photograph, or text corresponding to the motion ID, is output to the display unit 2, and the display unit 2 displays the video, photograph, or text of the motion of the robot arm 5 described by the motion information.
Since the basic configuration of the robot arm control apparatus according to the second embodiment of the present invention is the same as in the first embodiment, the description of the common parts is omitted, and only the differing parts are described in detail below.
Since the basic configuration of the robot arm control apparatus according to the third embodiment of the present invention is likewise the same as in the first embodiment, the description of the common parts is omitted, and only the differing parts are described in detail below.
Claims (13)
- A robot arm control apparatus that controls a motion of a robot arm to perform a task with the robot arm, comprising:
a motion information acquiring unit that acquires motion information on the motion of the robot arm;
a grip position detecting unit that detects a grip position at which a person grips the robot arm;
a characteristic information acquiring unit that acquires characteristic information including information on whether a force can be detected when the person grips the robot arm at the grip position detected by the grip position detecting unit, and information on whether a drag force from a contact surface affects the detection when the person grips the robot arm at the grip position and operates the robot arm;
a control method switching unit that switches a control method of the robot arm in accordance with the grip position detected by the grip position detecting unit and the characteristic information acquired by the characteristic information acquiring unit; and
a motion correcting unit that, during the motion of the robot arm based on the motion information acquired by the motion information acquiring unit, corrects information relating to force in the motion information in accordance with an operation by the person, after the control method switching unit has switched the control method in accordance with the grip position and the characteristic information,
wherein the motion of the robot arm is controlled based on the motion information corrected by the motion correcting unit.
- The robot arm control apparatus according to claim 1, further comprising a force detecting unit that detects a force externally applied to the robot arm,
wherein the characteristic information acquiring unit acquires, among the characteristic information, the information on whether a drag force from the contact surface affects the detection when the force detecting unit detects the force externally applied to the robot arm, and
the motion correcting unit, during the motion of the robot arm according to the motion information and after the control method switching unit has switched the control method, corrects the information relating to force in the motion information in accordance with the force detected by the force detecting unit as the person's operation.
- The robot arm control apparatus according to claim 2, wherein the control method switching unit switches to any one of:
(I) a control method that controls the motion of the robot arm so that the robot arm moves by the force the person applies to the robot arm;
(II) a control method that controls the motion of the robot arm so that the robot arm does not move even when the person applies a force to the robot arm; and
(III) a control method that controls the motion of the robot arm by the control method used before the switching,
and the force detecting unit detects the force, in the case of the control method (I), either while the robot arm is moving or in a state in which the robot arm has stopped moving after colliding, directly or indirectly, with the contact surface, and, in the cases of the control methods (II) and (III), at the stage when the person applies the force to the robot arm.
- The robot arm control apparatus according to claim 3, wherein the control method switching unit:
(I) when the characteristic information includes information that there is no influence of a drag force from the contact surface and information that the force applied to the robot arm at the person's grip position is within the detectable range of the force detecting unit, controls the motion of the robot arm by the control method used before the switching, or switches to a control method that controls the motion of the robot arm so that the robot arm does not move by the force the person applies to it;
(II) when the characteristic information includes information that there is an influence of a drag force from the contact surface and information that the force applied to the robot arm at the person's grip position is within the detectable range of the force detecting unit, switches to a control method that controls the motion of the robot arm so that, even when the person applies a force to the robot arm, the robot arm does not move into direct or indirect contact with the contact surface; and
(III) when the characteristic information includes information that there is an influence of a drag force from the contact surface and information that the person's grip position is outside the detectable range of the force detecting unit, switches to a control method that controls the motion of the robot arm so that the robot arm moves by the force the person applies to it.
- The robot arm control apparatus according to claim 3, wherein, when the grip position detecting unit detects a plurality of grip positions, in accordance with each grip position and the characteristics of the force detecting unit, the control method switching unit sequentially switches among:
(I) a control method that controls the motion of the robot arm so that the robot arm moves by the force the person applies;
(II) a control method that controls the motion of the robot arm so that the robot arm does not move even when the person applies a force to the robot arm; and
(III) a control method that controls the motion of the robot arm by the control method used before the switching,
the force detecting unit detects the force under each control method,
the apparatus further comprises a force calculating unit that calculates a value of the force the person applied to the robot arm based on the plurality of values detected by the force detecting unit for the respective grip positions, and
the motion correcting unit corrects the motion information in the motion information database with the force value calculated by the force calculating unit.
- The robot arm control apparatus according to claim 5, wherein the force calculating unit calculates the value of the force the person applied to the robot arm by any one of:
(I) a method of summing the plurality of values detected by the force detecting unit;
(II) a method of taking the minimum of the plurality of values detected by the force detecting unit;
(III) a method of taking the maximum of the plurality of values detected by the force detecting unit; and
(IV) a method of multiplying each of the plurality of values detected by the force detecting unit by a weighting coefficient and summing the results,
and the motion correcting unit corrects the information relating to force in the motion information acquired by the motion information acquiring unit, based on the value calculated by the force calculating unit.
- The robot arm control apparatus according to claim 2 or 3, comprising a plurality of robot arms, wherein the grip position detecting unit detects which of the plurality of robot arms the person is gripping,
when the person is gripping one of the plurality of robot arms, the force is detected by the force detecting unit provided on that robot arm,
the apparatus further comprises a force calculating unit that calculates, from the value detected by the force detecting unit, a value for correcting the other robot arm that the person is not gripping, and
the motion correcting unit corrects the motion information acquired by the motion information acquiring unit with the value calculated by the force calculating unit.
- The robot arm control apparatus according to claim 7, wherein the force calculating unit calculates the value of the force the person applied to the robot arm by any one of:
(I) a method of summing the plurality of values detected by the force detecting unit;
(II) a method of taking the minimum of the plurality of values detected by the force detecting unit;
(III) a method of taking the maximum of the plurality of values detected by the force detecting unit; and
(IV) a method of multiplying each of the plurality of values detected by the force detecting unit by a weighting coefficient and summing the results,
and the motion correcting unit corrects, based on the value calculated by the force calculating unit, the motion information of all the robot arms acquired by the motion information acquiring unit.
- The robot arm control apparatus according to claim 3, wherein, when the control method switching unit has switched to the control method that controls the motion of the robot arm so that the robot arm does not move even when the person applies a force to the robot arm, the control method switching unit alternately switches between the pre-switching control method and the post-switching control method, and
the force detecting unit detects the force when the control method has been switched to the post-switching control method.
- A robot arm control method for controlling a motion of a robot arm to perform a task with the robot arm, comprising:
acquiring, with a motion information acquiring unit, motion information on the motion of the robot arm;
detecting, with a grip position detecting unit, a grip position at which a person grips the robot arm;
acquiring, with a characteristic information acquiring unit, characteristic information including information on whether a force can be detected when the person grips the robot arm at the detected grip position and information on whether a drag force from a contact surface affects the detection when the person grips the robot arm at the grip position and operates it;
switching, with a control method switching unit, the control method of the robot arm in accordance with the detected grip position and the acquired characteristic information;
during the motion of the robot arm based on the acquired motion information, after the control method switching unit has switched the control method in accordance with the grip position and the characteristic information, correcting, with a motion correcting unit, the information relating to force in the acquired motion information in accordance with the person's operation; and
controlling the motion of the robot arm based on the motion information corrected by the motion correcting unit.
- A robot comprising:
the robot arm; and
the robot arm control apparatus according to any one of claims 1 to 8 that controls the motion of the robot arm.
- A robot arm control program for controlling a motion of a robot arm to perform a task with the robot arm, the program causing a computer to execute the steps of:
acquiring, with a motion information acquiring unit, motion information on the motion of the robot arm;
detecting, with a grip position detecting unit, a grip position at which a person grips the robot arm;
acquiring, with a characteristic information acquiring unit, characteristic information including information on whether a force can be detected when the person grips the robot arm at the detected grip position and information on whether a drag force from a contact surface affects the detection when the person grips the robot arm at the grip position and operates it;
switching, with a control method switching unit, the control method of the robot arm in accordance with the detected grip position and the acquired characteristic information;
correcting, with a motion correcting unit, the information relating to force in the acquired motion information in accordance with the person's operation, during the motion of the robot arm based on the acquired motion information and after the control method switching unit has switched the control method in accordance with the grip position and the characteristic information; and
controlling the motion of the robot arm based on the motion information corrected by the motion correcting unit.
- An integrated electronic circuit that controls a motion of a robot arm to perform a task with the robot arm, wherein:
motion information, which is information on the motion of the robot arm, is acquired by a motion information acquiring unit;
a grip position at which a person grips the robot arm is detected by a grip position detecting unit;
characteristic information, including information on whether a force can be detected when the person grips the robot arm at the detected grip position and information on whether a drag force from a contact surface affects the detection when the person grips the robot arm at the grip position and operates it, is acquired by a characteristic information acquiring unit;
the control method of the robot arm is switched by a control method switching unit in accordance with the detected grip position and the acquired characteristic information;
during the motion of the robot arm based on the acquired motion information, after the control method switching unit has switched the control method in accordance with the grip position and the characteristic information, the information relating to force in the acquired motion information is corrected by a motion correcting unit in accordance with the person's operation; and
the motion of the robot arm is controlled based on the motion information corrected by the motion correcting unit.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2009801215012A CN102056715B (zh) | 2009-01-09 | 2009-12-24 | 机器人手臂的控制装置及控制方法、机器人、机器人手臂的控制程序及集成电子电路 |
EP09837454.9A EP2431138A4 (en) | 2009-01-09 | 2009-12-24 | CONTROL APPARATUS AND CONTROL METHOD FOR ROBOTIC ARM, ROBOT, CONTROL PROGRAM FOR ROBOTIC ARM, AND INTEGRATED ELECTRONIC CIRCUIT |
US12/918,897 US8423188B2 (en) | 2009-01-09 | 2009-12-24 | Control apparatus and control method for robot arm, robot, control program for robot arm, and integrated electronic circuit |
JP2010518656A JP4568795B2 (ja) | 2009-01-09 | 2009-12-24 | ロボットアームの制御装置及び制御方法、ロボット、ロボットアームの制御プログラム、並びに、集積電子回路 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009-003166 | 2009-01-09 | ||
JP2009003166 | 2009-01-09 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010079564A1 true WO2010079564A1 (ja) | 2010-07-15 |
Family
ID=42316343
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/007155 WO2010079564A1 (ja) | 2009-01-09 | 2009-12-24 | ロボットアームの制御装置及び制御方法、ロボット、ロボットアームの制御プログラム、並びに、集積電子回路 |
Country Status (5)
Country | Link |
---|---|
US (1) | US8423188B2 (ja) |
EP (1) | EP2431138A4 (ja) |
JP (1) | JP4568795B2 (ja) |
CN (1) | CN102056715B (ja) |
WO (1) | WO2010079564A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102686366A (zh) * | 2010-08-31 | 2012-09-19 | 松下电器产业株式会社 | 主从机器人的控制装置及控制方法、主从机器人、控制程序、以及集成电子电路 |
WO2019102746A1 (ja) * | 2017-11-27 | 2019-05-31 | アズビル株式会社 | ロボットの直接教示装置及びその方法 |
Families Citing this family (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101737603B (zh) * | 2008-11-10 | 2011-11-30 | 鸿富锦精密工业(深圳)有限公司 | 万向关节 |
US20110015787A1 (en) * | 2009-01-22 | 2011-01-20 | Yuko Tsusaka | Control apparatus and control method for robot arm, robot, control program for robot arm, and integrated electronic circuit |
US9052710B1 (en) * | 2009-03-20 | 2015-06-09 | Exelis Inc. | Manipulation control based upon mimic of human gestures |
CN102686371B (zh) * | 2010-01-25 | 2015-01-14 | 松下电器产业株式会社 | 危险提示装置、危险提示系统以及危险提示方法 |
JP5743495B2 (ja) * | 2010-11-05 | 2015-07-01 | キヤノン株式会社 | ロボット制御装置 |
CN103118842A (zh) * | 2011-03-17 | 2013-05-22 | 松下电器产业株式会社 | 机器人、机器人的控制装置、控制方法以及控制程序 |
CN103185569B (zh) * | 2011-12-30 | 2017-05-17 | Ge医疗系统环球技术有限公司 | 旋转臂倾角仪以及显示旋转臂转角的方法 |
JP5516610B2 (ja) * | 2012-01-19 | 2014-06-11 | 株式会社安川電機 | ロボット、ロボットハンドおよびロボットハンドの保持位置調整方法 |
JP6111562B2 (ja) * | 2012-08-31 | 2017-04-12 | セイコーエプソン株式会社 | ロボット |
JP5785284B2 (ja) * | 2014-02-17 | 2015-09-24 | ファナック株式会社 | 搬送対象物の落下事故を防止するロボットシステム |
US9676097B1 (en) * | 2014-11-11 | 2017-06-13 | X Development Llc | Systems and methods for robotic device authentication |
US9623560B1 (en) * | 2014-11-26 | 2017-04-18 | Daniel Theobald | Methods of operating a mechanism and systems related therewith |
US9804593B1 (en) * | 2014-12-12 | 2017-10-31 | X Development Llc | Methods and systems for teaching positions to components of devices |
DE102014226933B3 (de) * | 2014-12-23 | 2016-03-24 | Kuka Roboter Gmbh | Vorrichtung und Verfahren zum Aufnehmen von Positionen |
JP6068429B2 (ja) * | 2014-12-25 | 2017-01-25 | ファナック株式会社 | ロボット識別システム |
US10201901B2 (en) * | 2015-01-29 | 2019-02-12 | Canon Kabushiki Kaisha | Robot apparatus, method for controlling robot, program, and recording medium |
DE102015205176B3 (de) | 2015-03-23 | 2016-05-12 | Kuka Roboter Gmbh | Robustes intuitives Bedienverfahren durch Berührung eines Manipulators |
JP2016190292A (ja) * | 2015-03-31 | 2016-11-10 | セイコーエプソン株式会社 | ロボット制御装置、ロボットシステムおよびロボット制御方法 |
CN104914766A (zh) * | 2015-06-04 | 2015-09-16 | 芜湖固高自动化技术有限公司 | 一种工业机器人教学用微处理器的控制电路 |
JP2017001170A (ja) * | 2015-06-16 | 2017-01-05 | セイコーエプソン株式会社 | ロボット、制御装置および制御方法 |
JP6240689B2 (ja) * | 2015-07-31 | 2017-11-29 | ファナック株式会社 | 人の行動パターンを学習する機械学習装置、ロボット制御装置、ロボットシステム、および機械学習方法 |
DE102016009030B4 (de) | 2015-07-31 | 2019-05-09 | Fanuc Corporation | Vorrichtung für maschinelles Lernen, Robotersystem und maschinelles Lernsystem zum Lernen eines Werkstückaufnahmevorgangs |
JP6754364B2 (ja) * | 2015-08-25 | 2020-09-09 | 川崎重工業株式会社 | ロボットシステム |
JP6661925B2 (ja) * | 2015-09-07 | 2020-03-11 | セイコーエプソン株式会社 | 制御装置、ロボットおよびロボットシステム |
US10456910B2 (en) * | 2016-01-14 | 2019-10-29 | Purdue Research Foundation | Educational systems comprising programmable controllers and methods of teaching therewith |
US10427305B2 (en) * | 2016-07-21 | 2019-10-01 | Autodesk, Inc. | Robotic camera control via motion capture |
JP6517762B2 (ja) * | 2016-08-23 | 2019-05-22 | ファナック株式会社 | 人とロボットが協働して作業を行うロボットの動作を学習するロボットシステム |
JP6724831B2 (ja) * | 2017-03-16 | 2020-07-15 | 株式会社安川電機 | コントロールシステム、コントローラ及び制御方法 |
JP2018171668A (ja) * | 2017-03-31 | 2018-11-08 | セイコーエプソン株式会社 | 制御装置、ロボット、およびロボットシステム |
JP6870433B2 (ja) * | 2017-03-31 | 2021-05-12 | セイコーエプソン株式会社 | 制御装置、およびロボットシステム |
JP6514258B2 (ja) * | 2017-03-31 | 2019-05-15 | ファナック株式会社 | ロボットシステム |
JP6946057B2 (ja) | 2017-05-30 | 2021-10-06 | キヤノン株式会社 | ロボットハンド、ロボットハンドの制御方法、ロボット装置 |
JP6796557B2 (ja) * | 2017-06-30 | 2020-12-09 | 株式会社神戸製鋼所 | 溶接ロボットのトーチケーブル干渉評価情報出力装置、評価情報出力方法及びプログラム |
JP6633580B2 (ja) | 2017-08-02 | 2020-01-22 | ファナック株式会社 | ロボットシステム及びロボット制御装置 |
JP6687573B2 (ja) * | 2017-09-07 | 2020-04-22 | ファナック株式会社 | ロボットシステム |
JP7155660B2 (ja) * | 2018-06-26 | 2022-10-19 | セイコーエプソン株式会社 | ロボット制御装置およびロボットシステム |
JP7136729B2 (ja) * | 2019-03-20 | 2022-09-13 | ファナック株式会社 | ロボットを用いて負荷の重量及び重心位置を推定するための装置、方法、プログラム、制御装置及びロボットシステム |
EP3747604B1 (en) * | 2019-06-07 | 2022-01-26 | Robert Bosch GmbH | Robot device controller, robot device arrangement and method for controlling a robot device |
US11992945B2 (en) * | 2020-11-10 | 2024-05-28 | Google Llc | System and methods for training robot policies in the real world |
CN113707488B (zh) * | 2021-08-17 | 2024-01-30 | 广东控银实业有限公司 | 摇杆校正方法及摇杆装置 |
CN115478691B (zh) * | 2022-08-29 | 2024-09-20 | 中联重科股份有限公司 | 臂架控制方法、系统、工程机械和机器可读存储介质 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS59157715A (ja) | 1983-02-25 | 1984-09-07 | Hitachi Ltd | ロボツトの直接教示方法 |
JPH10249769A (ja) * | 1997-01-13 | 1998-09-22 | Sankyo Seiki Mfg Co Ltd | 力補助装置の操作装置 |
JP2007168000A (ja) * | 2005-12-21 | 2007-07-05 | Yaskawa Electric Corp | ロボットのダイレクト操作装置 |
WO2009004772A1 (ja) * | 2007-07-05 | 2009-01-08 | Panasonic Corporation | ロボットアームの制御装置及び制御方法、ロボット、及び制御プログラム |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS52129155A (en) * | 1976-04-19 | 1977-10-29 | Tokico Ltd | Information reader of playback robbot |
US4918981A (en) * | 1988-10-28 | 1990-04-24 | W. L. Gore & Associates, Inc. | Method for measuring moisture vapor transmission rate in wearing apparel |
EP0465743A1 (en) * | 1990-07-12 | 1992-01-15 | British Aerospace Public Limited Company | Teach and report probe for a robot arm |
JPH0784632A (ja) * | 1993-06-30 | 1995-03-31 | Hitachi Constr Mach Co Ltd | ロボットの位置と姿勢の教示方法 |
EP0850730B1 (en) * | 1995-09-14 | 2002-07-24 | Kabushiki Kaisha Yaskawa Denki | Teaching unit for robots |
IL120889A (en) * | 1997-05-22 | 1998-10-30 | Eshed Robotec 1982 Ltd | Method and facility for direct learning of vending machines |
US7443115B2 (en) * | 2002-10-29 | 2008-10-28 | Matsushita Electric Industrial Co., Ltd. | Apparatus and method for robot handling control |
JP3923053B2 (ja) * | 2004-03-31 | 2007-05-30 | ファナック株式会社 | ロボット教示装置 |
JP4470058B2 (ja) | 2004-04-21 | 2010-06-02 | 株式会社ジェイテクト | パワーアシスト装置 |
FR2871363B1 (fr) * | 2004-06-15 | 2006-09-01 | Medtech Sa | Dispositif robotise de guidage pour outil chirurgical |
JP2007136588A (ja) | 2005-11-16 | 2007-06-07 | Yaskawa Electric Corp | プログラミングペンダント |
CN101870108B (zh) * | 2006-01-13 | 2011-09-28 | 松下电器产业株式会社 | 机械手臂的控制装置 |
-
2009
- 2009-12-24 WO PCT/JP2009/007155 patent/WO2010079564A1/ja active Application Filing
- 2009-12-24 JP JP2010518656A patent/JP4568795B2/ja active Active
- 2009-12-24 CN CN2009801215012A patent/CN102056715B/zh not_active Expired - Fee Related
- 2009-12-24 EP EP09837454.9A patent/EP2431138A4/en not_active Withdrawn
- 2009-12-24 US US12/918,897 patent/US8423188B2/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS59157715A (ja) | 1983-02-25 | 1984-09-07 | Hitachi Ltd | ロボツトの直接教示方法 |
JPH10249769A (ja) * | 1997-01-13 | 1998-09-22 | Sankyo Seiki Mfg Co Ltd | 力補助装置の操作装置 |
JP2007168000A (ja) * | 2005-12-21 | 2007-07-05 | Yaskawa Electric Corp | ロボットのダイレクト操作装置 |
WO2009004772A1 (ja) * | 2007-07-05 | 2009-01-08 | Panasonic Corporation | ロボットアームの制御装置及び制御方法、ロボット、及び制御プログラム |
Non-Patent Citations (1)
Title |
---|
See also references of EP2431138A4 |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102686366A (zh) * | 2010-08-31 | 2012-09-19 | 松下电器产业株式会社 | 主从机器人的控制装置及控制方法、主从机器人、控制程序、以及集成电子电路 |
US9089967B2 (en) | 2010-08-31 | 2015-07-28 | Panasonic Intellectual Property Management Co., Ltd. | Control apparatus and method for master-slave robot, master-slave robot, control program, and integrated electronic circuit |
WO2019102746A1 (ja) * | 2017-11-27 | 2019-05-31 | アズビル株式会社 | ロボットの直接教示装置及びその方法 |
JP2019093526A (ja) * | 2017-11-27 | 2019-06-20 | アズビル株式会社 | ロボットの直接教示装置及びその方法 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2010079564A1 (ja) | 2012-06-21 |
CN102056715B (zh) | 2012-12-26 |
CN102056715A (zh) | 2011-05-11 |
EP2431138A4 (en) | 2014-06-04 |
EP2431138A1 (en) | 2012-03-21 |
JP4568795B2 (ja) | 2010-10-27 |
US20110015785A1 (en) | 2011-01-20 |
US8423188B2 (en) | 2013-04-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4568795B2 (ja) | ロボットアームの制御装置及び制御方法、ロボット、ロボットアームの制御プログラム、並びに、集積電子回路 | |
JP5325843B2 (ja) | ロボットアームの制御装置及び制御方法、ロボット、ロボットアームの制御プログラム、並びに、集積電子回路 | |
JP5740554B2 (ja) | ロボットアームの制御装置及び制御方法、ロボット、ロボットアームの制御プログラム、及び、ロボットアーム制御用集積電子回路 | |
JP4361132B2 (ja) | ロボットアームの制御装置及び制御方法、ロボット、及び制御プログラム | |
JP4759660B2 (ja) | ロボットアーム制御用の装置、方法、プログラム及び集積電子回路、並びに、組立ロボット | |
CN101646534B (zh) | 机器手控制装置及控制方法、机器人 | |
JP4531126B2 (ja) | ロボットアームの制御装置及び制御方法、ロボット、ロボットアームの制御プログラム、及びロボットアーム制御用集積電子回路 | |
JP5512048B2 (ja) | ロボットアームの制御装置及び制御方法、ロボット、制御プログラム、並びに、集積電子回路 | |
WO2011021376A1 (ja) | ロボットアームの制御装置及び制御方法、家事ロボット、ロボットアームの制御プログラム、及び、ロボットアーム制御用集積電子回路 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200980121501.2 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010518656 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 12918897 Country of ref document: US |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09837454 Country of ref document: EP Kind code of ref document: A1 |
|
REEP | Request for entry into the european phase |
Ref document number: 2009837454 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2009837454 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |