
WO2024024036A1 - Robot parameter setting device - Google Patents

Robot parameter setting device

Info

Publication number
WO2024024036A1
WO2024024036A1 (PCT/JP2022/029131)
Authority
WO
WIPO (PCT)
Prior art keywords
setting
control unit
robot
arm
parameter setting
Prior art date
Application number
PCT/JP2022/029131
Other languages
French (fr)
Japanese (ja)
Inventor
恭平 小窪
Original Assignee
FANUC CORPORATION (ファナック株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by FANUC CORPORATION
Priority to PCT/JP2022/029131 (WO2024024036A1)
Priority to TW112127224A (TW202419238A)
Publication of WO2024024036A1

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 13/00 - Controls for manipulators
    • B25J 13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices

Definitions

  • the present invention relates to a robot parameter setting device.
  • A device is known that measures the position of a tool tip with respect to the tip of a robot arm. A device is also known that associates position information in an image obtained by a visual sensor with position information of a robot. Another known device uses a master workpiece to make a camera coordinate system correspond to a robot coordinate system. Furthermore, a robot system is known in which a robot mounted on a cart performs work on a fixed target device; in this robot system, a plurality of marks are provided on the target device, and the position of the target device relative to the robot on the cart is grasped using the marks as landmarks. See, for example, Patent Documents 1 to 4.
  • Patent Document 1: Japanese Patent No. 4191080; Patent Document 2: JP 2015-174191 A; Patent Document 3: JP 2018-034271 A; Patent Document 4: JP 2020-163518 A
  • When setting such a coordinate-system correspondence, the robot arm holds one of the object (such as a mark) to be imaged by the visual sensor and the visual sensor itself, while the other is supported by a predetermined support part. The tip of the arm then moves to various positions based on a coordinate system setting program, the visual sensor images the object at each position, and the coordinate systems are associated based on the obtained image data.
  • This setting is performed on the premise that the positions of the target and visual sensor do not change.
  • However, unintended tension, inertial force, or the like may act on the cable of the visual sensor, in which case the visual sensor may be displaced during the above setting. A robot parameter setting device that makes it possible to guarantee that no such unintended situation has occurred is therefore desirable.
  • A parameter setting device for a robot according to one aspect of the present invention includes: a first setting element that is one of an object and a sensor and is attached to an arm of the robot; a second setting element that is the other of the object and the sensor; and a control unit that causes the arm to perform a parameter setting operation for arranging the first setting element at a plurality of positions. In the first half of the setting operation or before the setting operation, the control unit causes the sensor to acquire first check data regarding the object with the first setting element placed at a setting check position. In the second half of the setting operation or after the setting operation, the control unit causes the arm to place the first setting element at the setting check position again and causes the sensor to acquire second check data regarding the object. Based on the first check data and the second check data, the control unit determines whether the position of the first setting element or the second setting element has changed.
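The claimed sequence can be summarized in a short sketch. The following Python is illustrative only: the `arm` and `sensor` interfaces and the caller-supplied `fit_parameters` function are assumed names, not FANUC's implementation.

```python
import math

def run_parameter_setting(arm, sensor, check_pose, calibration_poses,
                          fit_parameters, threshold_px=3.0):
    """Wrap a calibration run with before/after checks at one pose.

    `arm.move_to(pose)` places the first setting element and
    `sensor.detect()` returns the detected target position (x, y) in
    pixels; both are assumed interfaces, not a real robot API.
    """
    arm.move_to(check_pose)
    x0, y0 = sensor.detect()                  # first check data (23e)

    samples = []
    for pose in calibration_poses:            # the setting operation proper
        arm.move_to(pose)
        samples.append((pose, sensor.detect()))

    arm.move_to(check_pose)                   # revisit the same check pose
    x1, y1 = sensor.detect()                  # second check data (23f)

    shift = math.hypot(x1 - x0, y1 - y0)
    if shift >= threshold_px:                 # position change detected
        raise RuntimeError(f"sensor or target shifted by {shift:.1f} px")
    return fit_parameters(samples)
```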
  • FIG. 1 is a schematic perspective view of a robot parameter setting device according to a first embodiment and of a robot system in which the setting device is used.
  • FIG. 2 is a schematic side view of the robot system according to the first embodiment.
  • FIG. 3 is a block diagram of the control device of the robot parameter setting device according to the first embodiment.
  • FIG. 4 is a block diagram of the image processing device of the robot parameter setting device according to the first embodiment.
  • FIG. 5 is a flowchart of an example of processing performed by the control unit of the robot parameter setting device according to the first embodiment.
  • FIG. 6 is a flowchart of an example of processing performed by the control unit of the robot parameter setting device according to the first embodiment.
  • FIG. 7 is a display example of the display device of the robot parameter setting device according to the first embodiment.
  • FIG. 8 is a schematic perspective view of a robot parameter setting device according to a second embodiment and of a robot system in which the setting device is used.
  • FIG. 9 is a flowchart of an example of processing performed by the control unit of the robot parameter setting device according to the second embodiment.
  • FIG. 10 is a flowchart of an example of processing performed by the control unit of the robot parameter setting device according to the second embodiment.
  • As shown in FIG. 1, the setting device includes, for example, a control device 20 for a robot 10, and a visual sensor (sensor) 50 and an image processing device 40 connected to the control device 20.
  • the control device 20 may take on the functions of the image processing device 40, and in this case, the image processing device 40 can be omitted.
  • another computer may be responsible for part of the processing described below.
  • the setting device includes the other computer.
  • Although the robot 10 is not limited to a specific type, the robot 10 of the first embodiment is a six-axis articulated robot.
  • the robot 10 may be a horizontal articulated robot, a multi-link robot, or the like.
  • the arm 10a includes a plurality of servo motors 11 that respectively drive a plurality of movable parts 12 (see FIGS. 1 and 3).
  • Each servo motor 11 has an operating position detecting device 11a for detecting its operating position, and the operating position detecting device 11a is an encoder, for example.
  • the control device 20 receives the detection value of the operating position detection device 11a.
  • As shown in FIG. 2, for example, a tool 30 is attached to the tip of the arm 10a, and the arm 10a is part of a robot system that performs work on a work object W on a transport device 2.
  • a camera 3 separate from the visual sensor 50 is provided on the upstream side of the transport device 2 in order to confirm the position and posture of the work object W.
  • A motor (drive device) 2a that drives the transport device 2 is provided with an encoder (operating position detection device) 2b that detects the amount of conveyance of the transport device 2.
  • Using the position of the work object W detected from the output of the camera 3 and the conveyance amount of the transport device 2, the control device 20 causes the tip of the arm 10a to follow the work object W during work, as sketched below.
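A hedged sketch of this tracking idea follows. The 1-D conveyor model and the `MM_PER_COUNT` scale factor are illustrative assumptions, not values from the patent.

```python
MM_PER_COUNT = 0.05  # assumed conveyor travel per encoder count, in mm

def current_workpiece_x(x_at_detection_mm, counts_at_detection, counts_now):
    """Advance the camera-measured workpiece position along the conveying
    direction by the travel reported by the encoder (2b)."""
    travel_mm = (counts_now - counts_at_detection) * MM_PER_COUNT
    return x_at_detection_mm + travel_mm

# Example: detected at x = 120.0 mm when the encoder read 1000 counts;
# by 4000 counts the workpiece has advanced 150 mm, to x = 270.0 mm.
print(current_workpiece_x(120.0, 1000, 4000))
```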
  • the above operations are known operations such as taking out the work object W, processing the work object W, and attaching parts to the work object W.
  • the processing for the work object W is known processing such as machining, painting, and cleaning.
  • the transport device 2 may be any device that can move the work object W, such as a conveyor, an automatic guided vehicle (AGV), or a car under manufacture. In the case of a car being manufactured, the chassis, tires, motor, etc. function as a transport device 2, and a work object W such as a body on the chassis is transported.
  • the tool 30 is a hand, and the tool 30 is equipped with a servo motor 31 that drives a claw (FIG. 3).
  • the servo motor 31 has an operating position detection device for detecting its operating position, and the operating position detection device is an encoder, for example. The detection value of the operating position detection device is transmitted to the control device 20.
  • As each of the servo motors 11 and 31, various servo motors such as rotary motors and linear motors can be used.
  • the visual sensor 50 is supported on the top surface of the transport device 2.
  • the visual sensor 50 is supported by a predetermined support part 70 (FIG. 1), such as a plate placed on the top surface of the transport device 2.
  • the visual sensor 50 is, for example, a camera such as a two-dimensional camera. Other types of sensors capable of acquiring imaging data or detection data of the object 60 can also be used as the visual sensor 50.
  • the control device 20 includes a processor 21 having one or more processor elements such as a CPU or a microcomputer, and a display device 22.
  • the control device 20 has a storage unit 23 including nonvolatile storage, ROM, RAM, and the like.
  • the control device 20 includes a servo controller 24 corresponding to the servo motor 11 of the robot 10 and a servo controller 25 corresponding to the servo motor 31 of the tool 30.
  • the control device 20 also includes an input section 26 connected to the control device 20 by wire or wirelessly.
  • the input unit 26 is an input device such as an operation panel that the user can carry.
  • In another example, the input unit 26 is a tablet computer, in which case input is performed using a touch-screen function.
  • The operating panel or tablet computer may also have the display device 22.
  • the storage unit 23 stores a system program 23a, and the system program 23a is responsible for the basic functions of the control device 20. Furthermore, the storage unit 23 stores an operation program 23b. The storage unit 23 also stores a parameter setting program 23c and a setting check program 23d. The control device 20 controls the arm 10a to perform a predetermined work on the work object W based on the operation program 23b.
  • the image processing device 40 includes a processor 41 having one or more processor elements such as a CPU, a microcomputer, and an image processing processor.
  • the image processing device 40 is connected to the control device 20 and the visual sensor 50, and includes an input section 42 and a storage section 43 having nonvolatile storage, ROM, RAM, etc.
  • a known image processing program 43a is stored in the storage unit 43.
  • the image processing device 40 performs known image processing on the image data obtained by the visual sensor 50 based on the image processing program 43a. For example, the image processing device 40 performs known image processing such as binarization processing, blob analysis processing, and edge detection processing on the image data of the target 60.
  • the image processing device 40 may obtain position detection data such as the feature position and center of gravity position of the object 60. It is also possible for the image processing device 40 to obtain posture detection data of the object 60.
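As one concrete illustration of this kind of position detection, the following OpenCV (4.x) sketch binarizes an image and returns the centroid of the largest blob. It is a generic example, not the image processing program 43a itself.

```python
import cv2

def mark_centroid(image_bgr):
    """Return the (x, y) pixel centroid of the largest bright blob,
    standing in for the mark-position detection described above."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    blob = max(contours, key=cv2.contourArea)  # assume the mark is largest
    m = cv2.moments(blob)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```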
  • the image processing device 40 sequentially transmits image data, position detection data, posture detection data, etc. that have been subjected to image processing to the control device 20.
  • the control device 20 may be responsible for some or all of the functions of the image processing device 40, or the image processing device 40 may be responsible for some or all of the functions of the control device 20. Further, another computer may perform part or all of the functions of the image processing device 40 and the control device 20.
  • In the following description, the image processing device 40, the control device 20, and any other such computer are collectively referred to as the control unit, and the computer processing described below is performed by one or more processors of the control unit.
  • the one or more processors are processor 21, processor 41, etc.
  • the control unit performs the following processing for parameter setting based on the parameter setting program 23c and the setting check program 23d.
  • An example of parameter setting will be described with reference to flowcharts in FIGS. 5 and 6.
  • the reference coordinate system 101 of the arm 10a of the robot 10 is set in advance, for example, based on the base 13 that supports the arm 10a.
  • the control unit can calculate the position and orientation of the tip end portion of the arm 10a such as the flange 10b in the tip coordinate system 102 based on the detection result of the operating position detection device 11a.
  • an object (first setting element) 60 such as a calibration jig is attached to the tip of the arm 10a for setting the parameters.
  • an object 60 is held by tool 30, as shown in FIG.
  • the object 60 may be attached to the wrist flange 10b by a magnet such as an electromagnet, or by attachment means such as a bolt.
  • the center position of the mark 61 of the target 60 is arranged at a position corresponding to the tool center point (TCP) of the tool 30.
  • the visual sensor (second setting element) 50 is supported by the support section 70 for the parameter setting.
  • the support section 70 may be the upper surface of the transport device 2, or may be a support device such as a tripod.
  • the position and orientation of the visual sensor 50 on the support portion 70 correspond to the work reference coordinate system for the work object W.
  • the work reference coordinate system corresponds to the position and orientation of the work object W, the position and orientation of the mounting part of the work object W, and the like.
  • the work reference coordinate system and the sensor coordinate system 103 of the visual sensor 50 match. This allows the arm 10a to accurately work on the work object W by setting parameters to be described later.
  • In the first embodiment, the optical axis of the visual sensor 50 corresponds to one axis of the sensor coordinate system 103, such as the Z axis, and extends vertically upward; however, the optical axis of the visual sensor 50 may be directed in another direction.
  • the target 60 in the first embodiment has a mark 61, and the control unit recognizes the size, shape, arrangement of each element, etc. of the mark 61 in advance. Thereby, the control unit can calculate the position and orientation of the mark 61 and the object 60 with respect to the visual sensor 50 based on the image data of the mark 61. For example, the center position of the mark 61 is calculated as the position of the mark 61, and the orientation of the mark 61 is calculated based on the relative positions of three or more feature shapes or feature points in the mark 61.
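One common way to compute such a pose from known mark geometry is a perspective-n-point solve, sketched below with placeholder numbers. The mark dimensions, pixel coordinates, and camera intrinsics K are all assumptions for illustration; nothing here is specific to the patent.

```python
import cv2
import numpy as np

# Four corners of a 30 mm square mark in the mark frame (mm) and their
# detected pixel positions; all numbers are placeholders.
object_pts = np.array([[0, 0, 0], [30, 0, 0], [30, 30, 0], [0, 30, 0]],
                      dtype=np.float64)
image_pts = np.array([[512, 384], [620, 380], [624, 488], [516, 492]],
                     dtype=np.float64)
K = np.array([[900.0, 0.0, 640.0],
              [0.0, 900.0, 480.0],
              [0.0, 0.0, 1.0]])

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, None)
# rvec/tvec give the mark's orientation and position in the camera frame,
# i.e. in the sensor coordinate system 103.
```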
  • the mark 61 may be a dot pattern, a single or multiple three-dimensional shapes, or the like.
  • When the object 60 is the work object W itself, one or more characteristic shapes of the work object W function as the mark 61.
  • the target 60 may be provided with a thin tip, and this tip may be used instead of the mark 61.
  • Upon receiving an input to start parameter setting (step S1-1), the control unit performs the following processing based on the setting check program 23d.
  • The start input is based on a user input to the input unit 26.
  • the user moves the tip of the arm 10a so that the object 60 is within the field of view of the visual sensor 50.
  • a parameter setting screen 23g shown in FIG. 7 is displayed on the display device 22.
  • the control unit displays on the display device 22 a preparation image or text 23h that prompts the distal end of the arm 10a to be placed at the setting check position (step S1-2).
  • the control unit may display on the display device 22 a preparation image or text 23h that prompts the user to place the target 60 at the setting check position.
  • the term setting check position means the position and attitude of the tip of the arm 10a, and the setting check position is the position and attitude of the tip where the object 60 is placed within the field of view of the visual sensor 50.
  • the preparation image or text 23h in step S1-2 may be a screen that informs the user that the setting check position will be used for determination in step S1-17, which will be described later. In this case, the user can appropriately set the position and orientation of the tip in step S1-1 while considering various conditions.
  • the various conditions include the configuration including the wiring of the robot 10, the configuration including the wiring of the visual sensor 50, the size of the robot 10, the type of the tool 30, the size of the tool 30, and the like. Note that if the content regarding step S1-2 is described in an instruction manual or the like, the control unit may be configured not to perform step S1-2.
  • When the control unit receives the parameter setting process start input (step S1-3), it causes the visual sensor 50 to acquire the first check data 23e (FIG. 3) at the setting check position (step S1-4). The control unit then stores the first check data 23e, which is the imaging data or detection data obtained by the visual sensor 50, in the storage unit 23 or the like (step S1-5). Before step S1-4, the control unit may cause the arm 10a to perform an operation to eliminate the influence of backlash on the arm 10a; in one example, the control unit rotates each servo motor 11 slightly and then returns it to its original position, as sketched below.
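A minimal sketch of that optional backlash-relief move, assuming a hypothetical `arm.jog_joint` interface:

```python
def relieve_backlash(arm, delta_deg=0.5):
    """Jog each joint slightly and return it, so the gear train settles
    on a consistent flank before the check image is captured."""
    for joint in range(arm.joint_count):
        arm.jog_joint(joint, +delta_deg)
        arm.jog_joint(joint, -delta_deg)
```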
  • Next, the control unit performs the following processing based on the parameter setting program 23c.
  • the control unit controls the arm 10a to place the object 60 at a plurality of positions, for example, three or more calibration positions, and causes the visual sensor 50 to take an image at each calibration position.
  • the postures of the target 60 at each calibration position may be different from each other.
  • For example, the control unit places the object 60 at the first calibration position (step S1-6) and causes the visual sensor 50 to acquire data at the first calibration position (step S1-7). The control unit then places the object 60 at the second calibration position (step S1-8) and causes the visual sensor 50 to acquire data there (step S1-9). Further, the control unit places the object 60 at the third calibration position (step S1-10) and causes the visual sensor 50 to acquire data there (step S1-11).
  • The control unit calculates the position and orientation of the object 60 with respect to the visual sensor 50 at each calibration position from the imaging data of steps S1-7, S1-9, and S1-11 (step S1-12). These positions and orientations are expressed in the sensor coordinate system 103 of the visual sensor 50. The control unit also recognizes the position and orientation of at least one of the tip of the arm 10a, the tool 30, and the tip coordinate system 102 at each calibration position. The control unit sets the parameters using the positions and orientations in the sensor coordinate system 103 and the positions and orientations of the calibration positions, for example in the tip coordinate system 102 (step S1-13).
  • the parameter is, for example, one that associates the sensor coordinate system 103 with the tip coordinate system 102 and the reference coordinate system 101.
  • the parameters are, for example, those that associate the work reference coordinate system with the positions and postures of the tip of the arm 10a, the tool 30, and the tip coordinate system 102. If the sensor coordinate system 103 corresponds to the coordinate system of the transport device 2, the work by the arm 10a can be performed accurately. Alternatively, the parameter associates the tip coordinate system 102 with the tip (TCP) of the tool 30.
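The patent leaves the exact computation open ("other calculations may be employed"). One standard way to associate two frames, given the same points expressed in both, is a least-squares rigid fit (the Kabsch algorithm); the sketch below shows that generic technique, not the patented procedure.

```python
import numpy as np

def rigid_transform(sensor_pts, robot_pts):
    """Least-squares R, t with robot_pts ~ R @ sensor_pts + t.

    Rows of each array are matching 3-D points, e.g. mark positions in
    the sensor coordinate system 103 and in the reference frame 101.
    """
    ps = np.asarray(sensor_pts, dtype=float)
    pr = np.asarray(robot_pts, dtype=float)
    cs, cr = ps.mean(axis=0), pr.mean(axis=0)
    H = (ps - cs).T @ (pr - cr)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:        # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cr - R @ cs
    return R, t
```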
  • the parameters may have other uses, and other calculations may be employed to set the parameters.
  • the position and orientation of the visual sensor 50 correspond to the work reference coordinate system for the work object W.
  • Alternatively, the control device 20 may recognize the position and orientation of the TCP without such an association.
  • Next, the control unit performs the following processing based on the setting check program 23d.
  • the control unit controls the arm 10a so that the distal end of the arm 10a is placed at the setting check position (step S1-14). Further, the control unit causes the visual sensor 50 to acquire the second check data 23f (FIG. 3) at the setting check position (step S1-15). Then, the control unit stores the second check data 23f, which is the imaging data or detection data obtained by the visual sensor 50, in the storage unit 23 or the like (step S1-16).
  • The control unit determines whether there is a change in the position of the visual sensor 50 or the object 60 by comparing the two sets of check data 23e and 23f (step S1-17). For example, when the amount of change in the position of the object 60 in the second check data 23f relative to the first check data 23e is less than a threshold value, it is determined that there is no position change. In this case, the control unit displays a message indicating that parameter setting has been completed on the display device 22 (step S1-21) and normally ends the parameter setting process. Conversely, when the amount of position change is greater than or equal to the threshold value, it is determined that there is a position change.
  • The threshold value is, for example, 3 pixels of the imaging data of the visual sensor 50; if a position change is instead judged at 2 pixels or more, the parameter setting becomes more accurate.
  • In another example, the threshold is applied to the positional change of the portion of the mark 61 on the object 60 where the largest shift has occurred, and is, for example, 0.5 mm; judging a position change at 0.3 mm or 0.2 mm or more instead makes the parameter setting more accurate. For example, the position of the visual sensor 50 may shift slightly during parameter setting due to tension on its cable. Because such an event occurs within the relatively short parameter setting period, even a slight change in the position of the visual sensor 50 or the object 60 can be accurately determined. Depending on the situation or conditions, a threshold larger than the above values may also be used. A minimal sketch of this determination follows.
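This sketch assumes matched feature lists and an assumed pixel scale (`MM_PER_PIXEL`; see the worked example near the end of this section). Feature matching itself is taken as given.

```python
import math

MM_PER_PIXEL = 0.075  # assumed scale at the target distance

def max_feature_shift_mm(features_before, features_after):
    """Largest displacement among matched mark features, in mm."""
    return max(math.hypot(xa - xb, ya - yb) * MM_PER_PIXEL
               for (xb, yb), (xa, ya) in zip(features_before, features_after))

def position_changed(features_before, features_after, threshold_mm=0.5):
    """True if the largest feature shift reaches the mm threshold."""
    return max_feature_shift_mm(features_before, features_after) >= threshold_mm
```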
  • When it is determined in step S1-17 that there is a position change, the control unit displays the amount of the change on the display device 22 (step S1-18).
  • The number of pixels, a numerical value in millimeters, or the like is used as the amount of positional change.
  • In step S1-18, the control unit may also display the determination itself on the display device 22, for example as an error display or a display indicating that the parameters cannot be set. This allows the user to know of the position change in a timely manner.
  • The control unit may display on the display device 22 an indication that there is no position change.
  • This display allows the user to know in a timely manner that the set parameters are accurate.
  • The amount of positional change may also be displayed as the indication that there is no position change. By making the characters and screen color different from those of step S1-18 in this case, the user can easily recognize that no position change has occurred.
  • the tool 30 may be a long and thin welding gun, a hand with long claws, etc., and the object 60 may be attached to the tip thereof.
  • the work performed by the tool 30 may not require much positional accuracy.
  • the control unit may display the direction of the position change on the display device 22.
  • the direction of change in the position of the visual sensor 50, the direction of change in the position of the object 60, etc. are displayed on the display device 22.
  • the direction may be an approximate direction. This display makes it easier for the user to estimate the cause of the position change, which leads to improved efficiency in setting work.
  • When it is determined in step S1-17 that the position has changed, the control unit may display a selection screen on the display device 22 that allows the user to select whether or not to use the parameters (step S1-19). The user can thereby decide whether to use the parameters based on the conditions, experience, and the like. The control unit may be configured not to perform step S1-19 when such a selection is unnecessary.
  • When the user selects to use the parameters in step S1-19, the control unit may store data linking the content of the selection, the amount of position change, and the parameters in the storage unit 23.
  • The control unit may also store data in which the amount of position change and the parameters are linked in the storage unit 23, and may transmit such data to an external computer.
  • the external computer is a server for production control, quality control, etc. The data can be used for later quality control operations, the design and improvement of the robot system, the design and improvement of the work object W, and the like.
  • When the control unit receives an input to use the parameters through the input unit 26 or the like (step S1-20), it sets the parameters and displays an indication that parameter setting is complete (step S1-21). If the control unit does not receive an input to use the parameters (step S1-20), the process ends without setting the parameters (step S1-22).
  • The control unit may automatically place the tip of the arm 10a at the setting check position.
  • The control unit may display a plurality of setting check positions as options on the display device 22, for example as illustrations.
  • the control unit displays a screen on the display device 22 that allows the user to select one of the plurality of setting check positions.
  • the control unit receives a user's selection input via the input unit 26, and automatically places the tip of the arm 10a at a setting check position corresponding to the selection input.
  • The control unit may display on the display device 22 a screen for setting the setting check position. For example, a screen showing at least the position of the visual sensor 50 is displayed, on which at least the position of the tip of the arm 10a, the tool 30, or the object 60 can be specified.
  • the control unit receives a user's setting input via the input unit 26, and automatically places the tip of the arm 10a at a setting check position corresponding to the setting input.
  • a second embodiment of a robot parameter setting device will be described with reference to the drawings.
  • In the first embodiment, the object (first setting element) 60 is attached to the tip of the arm 10a, and the visual sensor (second setting element) 50 is supported by the support section 70.
  • In the second embodiment, by contrast, the visual sensor (first setting element) 50 is attached to the tip of the arm 10a, and the target (second setting element) 60 is supported by the support section 70 (FIG. 8).
  • the same components as those in the first embodiment are denoted by the same reference numerals, and descriptions of the configurations and similar effects obtained by the configurations are omitted. Note that the modifications described in the first embodiment can also be applied to the second embodiment as appropriate.
  • the visual sensor 50 is removably attached to the tip of the arm 10a.
  • In one example, the visual sensor 50 is attached to the wrist flange 10b of the arm 10a, similarly to the tool 30.
  • In one example, the object 60 is supported on the top surface of the transport device 2.
  • In the second embodiment, the object 60 is supported by a support section 70 placed on the top surface of the transport device 2.
  • the object 60 may be the work object W, or may be an article other than the work object W, a part of a structure, or the like. A part of a conveyance device or the like can be used as the structure part.
  • a visual sensor 50 is attached to the tip of the arm 10a for setting the parameters.
  • the visual sensor 50 is attached to the wrist flange 10b using a magnet such as an electromagnet, a bolt, or other attachment means.
  • the visual sensor 50 may be attached to the tip of the arm 10a by holding the visual sensor 50 with the tool 30.
  • the object 60 is supported by the support section 70 for the parameter setting.
  • the support section 70 may be the upper surface of the transport device 2, or may be a support device such as a tripod.
  • the position and orientation of the object 60 on the support part 70 correspond to the work reference coordinate system for the work object W.
  • the parameter settings can be made suitable for the work.
  • The sensor coordinate system 103, the reference coordinate system 101, and the tip coordinate system 102 are associated with each other by the parameter setting described later. This parameter setting also enables the arm 10a to accurately perform work on the work object W, for example.
  • the control unit performs the following processing based on the setting check program 23d.
  • the user moves the tip of the arm 10a so that the object 60 is within the field of view of the visual sensor 50 according to the instructions in the instruction manual.
  • In the second embodiment, the control unit does not display on the display device 22 the preparation image or text 23h that prompts placing the distal end of the arm 10a at the setting check position. The first embodiment can also be configured in this way.
  • Upon receiving an input to start the parameter setting process (step S2-1), the control unit causes the visual sensor 50 to acquire the first check data 23e at the setting check position (step S2-2). The control unit then stores the first check data 23e, which is the imaging data or detection data obtained by the visual sensor 50, in the storage unit 23 or the like (step S2-3). Before step S2-2, the control unit may cause the arm 10a to perform an operation to eliminate the influence of backlash on the arm 10a.
  • Next, the control unit performs the following processing based on the parameter setting program 23c, as in the first embodiment. For example, the control unit places the visual sensor 50 at the first calibration position (step S2-4) and causes the visual sensor 50 to acquire data there (step S2-5). The control unit then places the visual sensor 50 at the second calibration position (step S2-6) and causes it to acquire data there (step S2-7). Further, the control unit places the visual sensor 50 at the third calibration position (step S2-8) and causes it to acquire data there (step S2-9).
  • The control unit calculates the position and orientation of the object 60 with respect to the visual sensor 50 at each calibration position from the imaging data of steps S2-5, S2-7, and S2-9 (step S2-10). These positions and orientations are expressed in the sensor coordinate system 103 of the visual sensor 50. The control unit also recognizes the position and orientation of at least one of the tip of the arm 10a, the tool 30, and the tip coordinate system 102 at each calibration position. The control unit sets the parameters using the positions and orientations in the sensor coordinate system 103 and the positions and orientations of the calibration positions, for example in the tip coordinate system 102 (step S2-11).
  • the parameters are, for example, those that associate the sensor coordinate system 103 with the tip coordinate system 102 and the reference coordinate system 101.
  • the parameters may be for other purposes, and other calculations may be employed to set the parameters.
  • Next, the control unit performs the following processing based on the setting check program 23d.
  • the control unit controls the arm 10a so that the distal end of the arm 10a is placed at the setting check position (step S2-12). Further, the control unit causes the visual sensor 50 to acquire the second check data 23f at the setting check position (step S2-13). Then, the control unit stores the second check data 23f, which is the imaging data or detection data obtained by the visual sensor 50, in the storage unit 23 or the like (step S2-14).
  • The control unit determines whether there is a change in the position of the visual sensor 50 or the object 60, as in step S1-17 of the first embodiment (step S2-15). If there is no position change, the control unit displays a message indicating that parameter setting has been completed on the display device 22 (step S2-19) and normally ends the parameter setting process. If there is a position change, the control unit displays the amount of the change on the display device 22 (step S2-16), as in step S1-18 of the first embodiment. The display of the amount of change allows the user to estimate the accuracy of the parameter setting, as in the first embodiment.
  • The control unit may display a selection screen on the display device 22 in the same manner as step S1-19 of the first embodiment (step S2-17). The user can thereby decide whether to use the parameters based on the conditions, experience, and the like, as in the first embodiment. When the control unit receives an input to use the parameters through the input unit 26 or the like (step S2-18), it sets the parameters and displays the completion of parameter setting (step S2-19). If the control unit does not receive an input to use the parameters (step S2-18), the process ends without setting the parameters (step S2-20).
  • During the parameter setting process, the tip of the robot 10 sequentially moves to a plurality of positions and postures, for example as in steps S1-6 to S1-11 of the first embodiment. Various objects such as other devices and work objects W exist around the robot, and a cable for the robot 10 also moves as the robot moves. The user therefore needs to watch for interference between the robot 10, its cables, and surrounding objects, and often does not notice the state of the visual sensor 50 or its cable.
  • The visual sensor 50 is attached to the robot 10 or the support part by a user or the like for parameter setting, and its cable is also placed temporarily by the user. Depending on the temporary placement of the cable, a slight tension may therefore be applied to the cable of the visual sensor 50 during the parameter setting operation of the robot 10. The visual sensor 50 may or may not move due to this slight tension, and even if it is slightly displaced, the user often does not notice the displacement. The position of the object 60 may likewise shift slightly due to vibration or the like, and the user similarly often does not notice.
  • Even with such a deviation, the parameter setting calculation itself is still performed, so conventionally the parameter setting process ends normally and the user does not notice the deviation.
  • Depending on the accuracy required for the robot 10 to perform the work on the work object W, there may be work in which the user never notices the slight deviation at all.
  • When a problem does arise later, its cause may be related to the slight deviation.
  • In the above embodiments, by contrast, the first and second check data 23e and 23f are acquired, and position changes are determined, within the series of controls for setting the parameters of the robot 10.
  • Because this configuration acquires the first and second check data 23e and 23f within the relatively short parameter setting period, it can accurately determine a slight positional deviation of the visual sensor 50 or the object 60 that occurs during parameter setting.
  • In one example of step S1-17, if the target positions in the first check data 23e and the second check data 23f differ by 3 pixels or more, it is determined that there is a position change.
  • the visual sensor 50 has several million pixels, and one side of the angle of view of the visual sensor 50 at the position of the object 60 is about 10 cm to 20 cm.
  • In this case, a position change of 3 pixels may correspond to a position change of about 0.2 mm to 0.3 mm, so the above configuration can accurately determine a slight positional shift of the visual sensor 50 or the object 60, as the worked example below shows.
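A worked example of that pixel-to-millimetre claim, using illustrative values in the stated ranges (a roughly 3-megapixel sensor and a 15 cm field of view):

```python
view_width_mm = 150.0   # one side of the field of view (10-20 cm range)
pixels_across = 2000    # one side of a multi-megapixel image
mm_per_pixel = view_width_mm / pixels_across     # 0.075 mm per pixel
shift_mm = 3 * mm_per_pixel                      # 3 px ~ 0.23 mm
print(f"{mm_per_pixel:.3f} mm/px; a 3-pixel change is {shift_mm:.2f} mm")
```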
  • Even if the visual sensor 50 deviates slightly in the optical axis direction between the captures at the setting check position, the deviation will almost never occur only in the optical axis direction. In other words, when the visual sensor 50 is displaced in the optical axis direction by some unintended cause, it is unlikely that it is not also displaced in a direction intersecting the optical axis. The same applies to the target 60. Therefore, the above configuration can accurately determine a slight positional shift of the visual sensor 50 or the object 60.
  • the user moves the tip of the arm 10a before steps S1-4 and S2-2, and this position is used as the setting check position. Therefore, the user can easily and reliably determine the setting check position, and there is no need to separately set the setting check position. This contributes to improving user convenience. Further, the user can reliably place the visual sensor 50 in a focused position as the setting check position. This configuration is useful in view of the fact that many visual sensors 50 for robots do not have an autofocus function, and the in-focus range is known from the instruction manual or the like.
  • a screen for selecting a setting check position is displayed. Therefore, the user can easily use a setting check position suitable for conditions such as cable wiring and position of the visual sensor 50.
  • a screen for setting a setting check position is displayed. This is useful for the user to arbitrarily set the setting check position without much effort.
  • For example, the acquisition of the first check data is preferably performed before the setting operation of steps S1-6 to S1-11 of the first embodiment or in its first half, and the acquisition of the second check data is performed in the latter half of the setting operation of steps S1-6 to S1-11 or after the setting operation.
  • other setting check positions may be further set.
  • Other setting check positions may be set, for example, at positions where tension is likely to be applied to the cable of the visual sensor 50.
  • the other setting check positions may be any of the calibration positions.
  • the control unit may be configured to display the setting check position on the display device 22.
  • For example, the control unit displays that the setting check position is a position where the arm 10a stops before the first calibration position, a position between the first calibration position and the second calibration position, or the like. In this case, the user can understand where the setting check position is, which helps in identifying the cause of an error during setting work.
  • As the visual sensor 50, a three-dimensional camera, a three-dimensional distance sensor, a LiDAR (Light Detection and Ranging) sensor, a PSD (Position Sensing Detector), or the like may also be used. In these cases, the configuration of the image processing device 40 is changed as appropriate. Further, a visual sensor that remains attached to the tip of the arm 10a while the work is performed on the work object W may be used as the visual sensor 50.

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

This robot parameter setting device comprises: a first setting element which is one of an object (60) and a sensor (50) and which is attached to an arm (10a); a second setting element which is the other of the object (60) and the sensor (50); and a control unit which causes the arm (10a) to execute a parameter setting operation for disposing the first setting element at multiple positions. In the first half of the parameter setting operation or prior to the parameter setting operation, the control unit causes the sensor (50) to acquire first check data (23e) concerning the object in a state where the first setting element is disposed at a setting check position. In the second half of the parameter setting operation or after the parameter setting operation, the control unit disposes the first setting element at the setting check position, causes the sensor (50) to acquire second check data (23f) concerning the object, and then determines whether or not there has been a positional change on the basis of the two sets of check data.

Description

Robot parameter setting device
The present invention relates to a robot parameter setting device.
A device is known that measures the position of a tool tip with respect to the tip of a robot arm. A device is also known that associates position information in an image obtained by a visual sensor with position information of a robot. Another known device uses a master workpiece to make a camera coordinate system correspond to a robot coordinate system. Furthermore, a robot system is known in which a robot mounted on a cart performs work on a fixed target device; in this robot system, a plurality of marks are provided on the target device, and the position of the target device relative to the robot on the cart is grasped using the marks as landmarks. See, for example, Patent Documents 1 to 4.
Patent Document 1: Japanese Patent No. 4191080; Patent Document 2: JP 2015-174191 A; Patent Document 3: JP 2018-034271 A; Patent Document 4: JP 2020-163518 A
In order for a robot to work on a work object, it is necessary to associate the coordinate system of the robot with the coordinate system of the tool at the tip of the robot, and to associate the coordinate system of the robot with the coordinate system of the work object. For example, it is necessary to associate the coordinate system of the tool at the tip of the robot with the coordinate system of a work object carried by a conveyor.
When setting such a coordinate-system correspondence, the robot arm holds one of the object (such as a mark) to be imaged by the visual sensor and the visual sensor itself, while the other is supported by a predetermined support part. The tip of the arm then moves to various positions based on a coordinate system setting program, the visual sensor images the object at each position, and the coordinate systems are associated based on the obtained image data.
This setting is performed on the premise that the positions of the target and the visual sensor do not change. However, unintended tension, inertial force, or the like may act on the cable of the visual sensor, in which case the visual sensor may be displaced during the above setting. A robot parameter setting device that makes it possible to guarantee that no such unintended situation has occurred is therefore desirable.
A parameter setting device for a robot according to one aspect of the present invention includes: a first setting element that is one of an object and a sensor and is attached to an arm of the robot; a second setting element that is the other of the object and the sensor; and a control unit that causes the arm to perform a parameter setting operation for arranging the first setting element at a plurality of positions. In the first half of the setting operation or before the setting operation, the control unit causes the sensor to acquire first check data regarding the object with the first setting element placed at a setting check position. In the second half of the setting operation or after the setting operation, the control unit causes the arm to place the first setting element at the setting check position again and causes the sensor to acquire second check data regarding the object. Based on the first check data and the second check data, the control unit determines whether the position of the first setting element or the second setting element has changed.
FIG. 1 is a schematic perspective view of a robot parameter setting device according to a first embodiment and of a robot system in which the setting device is used.
FIG. 2 is a schematic side view of the robot system according to the first embodiment.
FIG. 3 is a block diagram of the control device of the robot parameter setting device according to the first embodiment.
FIG. 4 is a block diagram of the image processing device of the robot parameter setting device according to the first embodiment.
FIG. 5 is a flowchart of an example of processing performed by the control unit of the robot parameter setting device according to the first embodiment.
FIG. 6 is a flowchart of an example of processing performed by the control unit of the robot parameter setting device according to the first embodiment.
FIG. 7 is a display example of the display device of the robot parameter setting device according to the first embodiment.
FIG. 8 is a schematic perspective view of a robot parameter setting device according to a second embodiment and of a robot system in which the setting device is used.
FIG. 9 is a flowchart of an example of processing performed by the control unit of the robot parameter setting device according to the second embodiment.
FIG. 10 is a flowchart of an example of processing performed by the control unit of the robot parameter setting device according to the second embodiment.
A robot parameter setting device according to a first embodiment will be described with reference to the drawings. As shown in FIG. 1, the setting device includes, for example, a control device 20 for a robot 10, and a visual sensor (sensor) 50 and an image processing device 40 connected to the control device 20. The control device 20 may take on the functions of the image processing device 40, in which case the image processing device 40 can be omitted. Another computer may also be responsible for part of the processing described below; in that case, the setting device includes that computer.
Although the robot 10 is not limited to a specific type, the robot 10 of the first embodiment is a six-axis articulated robot. The robot 10 may be a horizontal articulated robot, a multi-link robot, or the like. The arm 10a includes a plurality of servo motors 11 that respectively drive a plurality of movable parts 12 (see FIGS. 1 and 3). Each servo motor 11 has an operating position detection device 11a, for example an encoder, for detecting its operating position. The control device 20 receives the detection values of the operating position detection devices 11a.
As shown in FIG. 2, for example, a tool 30 is attached to the tip of the arm 10a, and the arm 10a is part of a robot system that performs work on a work object W on a transport device 2. In the example of FIG. 2, a camera 3 separate from the visual sensor 50 is provided on the upstream side of the transport device 2 in order to confirm the position and posture of the work object W. A motor (drive device) 2a that drives the transport device 2 is provided with an encoder (operating position detection device) 2b that detects the amount of conveyance of the transport device 2. Using the position of the work object W detected from the output of the camera 3 and the conveyance amount of the transport device 2, the control device 20 causes the tip of the arm 10a to follow the work object W during work.
The above operations are known operations such as taking out the work object W, processing the work object W, and attaching parts to the work object W. The processing for the work object W is known processing such as machining, painting, and cleaning. The transport device 2 may be any device that can move the work object W, such as a conveyor, an automatic guided vehicle (AGV), or a car under manufacture. In the case of a car being manufactured, the chassis, tires, motor, etc. function as the transport device 2, and a work object W such as the body on the chassis is transported.
In one example, the tool 30 is a hand, and the tool 30 is equipped with a servo motor 31 that drives a claw (FIG. 3). The servo motor 31 has an operating position detection device, for example an encoder, for detecting its operating position, and its detection value is transmitted to the control device 20. As each of the servo motors 11 and 31, various servo motors such as rotary motors and linear motors can be used.
In one example, the visual sensor 50 is supported on the top surface of the transport device 2. In the first embodiment, the visual sensor 50 is supported by a predetermined support part 70 (FIG. 1), such as a plate placed on the top surface of the transport device 2. The visual sensor 50 is, for example, a camera such as a two-dimensional camera. Other types of sensors capable of acquiring imaging data or detection data of the object 60 can also be used as the visual sensor 50.
 制御装置20は、図3に示すように、CPU、マイクロコンピュータ等の1つ又は複数のプロセッサ素子を有するプロセッサ21と、表示装置22と、を有する。制御装置20は、不揮発性ストレージ、ROM、RAM等を有する記憶部23を有する。制御装置20は、ロボット10のサーボモータ11にそれぞれ対応しているサーボ制御器24と、ツール30のサーボモータ31に対応しているサーボ制御器25と、を有する。制御装置20は、制御装置20に有線又は無線によって接続された入力部26も備えている。一例では、入力部26はユーザが持ち運べる操作盤等の入力装置である。他の例では入力部26はタブレットコンピュータである。タブレットコンピュータの場合は前記入力がタッチスクリーン機能を用いて行われる。操作盤又はタブレットコンピュータが表示装置22を有する場合もある。 As shown in FIG. 3, the control device 20 includes a processor 21 having one or more processor elements such as a CPU or a microcomputer, and a display device 22. The control device 20 has a storage unit 23 including nonvolatile storage, ROM, RAM, and the like. The control device 20 includes a servo controller 24 corresponding to the servo motor 11 of the robot 10 and a servo controller 25 corresponding to the servo motor 31 of the tool 30. The control device 20 also includes an input section 26 connected to the control device 20 by wire or wirelessly. In one example, the input unit 26 is an input device such as an operation panel that the user can carry. In other examples, input unit 26 is a tablet computer. In the case of a tablet computer, the input is performed using a touch screen function. The operating panel or tablet computer may also have a display device 22 .
 記憶部23はシステムプログラム23aを格納しており、システムプログラム23aは制御装置20の基本機能を担っている。また、記憶部23は動作プログラム23bを格納している。記憶部23は、パラメータ設定プログラム23cおよび設定チェックプログラム23dも格納している。制御装置20は、動作プログラム23bに基づき、作業対象Wに対する所定の作業を行うためにアーム10aを制御する。 The storage unit 23 stores a system program 23a, and the system program 23a is responsible for the basic functions of the control device 20. Furthermore, the storage unit 23 stores an operation program 23b. The storage unit 23 also stores a parameter setting program 23c and a setting check program 23d. The control device 20 controls the arm 10a to perform a predetermined work on the work object W based on the operation program 23b.
 画像処理装置40は、図4に示すように、CPU、マイクロコンピュータ、画像処理プロセッサ等の1つ又は複数のプロセッサ素子を有するプロセッサ41を有する。画像処理装置40は、制御装置20および視覚センサ50に接続され、入力部42と、不揮発性ストレージ、ROM、RAM等を有する記憶部43とを備える。記憶部43には公知の画像処理プログラム43aが格納されている。 As shown in FIG. 4, the image processing device 40 includes a processor 41 having one or more processor elements such as a CPU, a microcomputer, and an image processing processor. The image processing device 40 is connected to the control device 20 and the visual sensor 50, and includes an input section 42 and a storage section 43 having nonvolatile storage, ROM, RAM, etc. A known image processing program 43a is stored in the storage unit 43.
 画像処理装置40は、画像処理プログラム43aに基づき、視覚センサ50によって得られる画像データに公知の画像処理を行う。例えば、画像処理装置40は対象60の画像データに公知の二値化処理、ブロブ解析処理、エッジ検出処理等の画像処理を行う。画像処理装置40が対象60の特徴位置、重心位置等の位置検出データを得てもよい。画像処理装置40が対象60の姿勢検出データを得ることも可能である。画像処理装置40は画像処理が行われた画像データ、位置検出データ、姿勢検出データ等を逐次制御装置20に送信する。 The image processing device 40 performs known image processing on the image data obtained by the visual sensor 50 based on the image processing program 43a. For example, the image processing device 40 performs known image processing such as binarization processing, blob analysis processing, and edge detection processing on the image data of the target 60. The image processing device 40 may obtain position detection data such as the feature position and center of gravity position of the object 60. It is also possible for the image processing device 40 to obtain posture detection data of the object 60. The image processing device 40 sequentially transmits image data, position detection data, posture detection data, etc. that have been subjected to image processing to the control device 20.
 The control device 20 may perform some or all of the functions of the image processing device 40, and the image processing device 40 may perform some or all of the functions of the control device 20. Another computer may also perform some or all of the functions of the image processing device 40 and the control device 20. In the following description, the image processing device 40, the control device 20, and such other computers are collectively referred to as the control unit, and the computer processing described below is performed by one or more processors of the control unit, such as the processor 21 and the processor 41.
 Based on the parameter setting program 23c and the setting check program 23d, the control unit performs the following processing for parameter setting. An example of parameter setting will be described with reference to the flowcharts of FIGS. 5 and 6.
 Here, as shown in FIG. 1, the reference coordinate system 101 of the arm 10a of the robot 10 is set in advance with reference to, for example, the base 13 that supports the arm 10a. The control unit can also calculate the position and orientation of the tip coordinate system 102 of the tip of the arm 10a, such as the flange 10b, based on the detection results of the operating position detection devices 11a.
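 The patent does not give the computation of the tip pose; a minimal forward-kinematics sketch of how such a pose could be derived from joint readings is shown below. The Denavit-Hartenberg parameterization is an assumption for illustration; a real controller would use the robot's calibrated kinematic model.

import numpy as np

def joint_transform(theta, d, a, alpha):
    """Homogeneous transform for one joint, Denavit-Hartenberg convention."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def tip_pose(joint_angles, dh_params):
    """Pose of the tip coordinate system 102 in the reference system 101."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_params):
        T = T @ joint_transform(theta, d, a, alpha)
    return T  # 4x4 homogeneous transform, base -> flange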
 In the first embodiment, an object (first setting element) 60 such as a calibration jig is attached to the tip of the arm 10a for this parameter setting. In one example, the object 60 is held by the tool 30, as shown in FIG. 1. The object 60 may instead be attached to the wrist flange 10b by a magnet such as an electromagnet, or by attachment means such as bolts. In the first embodiment, the center position of the mark 61 of the object 60 is arranged at a position corresponding to the tool center point (TCP) of the tool 30.
 Also in the first embodiment, the visual sensor (second setting element) 50 is supported by the support part 70 for this parameter setting. The support part 70 may be the top surface of the transport device 2, or may be a support device such as a tripod. For example, the position and orientation of the visual sensor 50 on the support part 70 correspond to the work reference coordinate system for the work object W. The work reference coordinate system corresponds to the position and orientation of the work object W, the position and orientation of the part on which the work object W is placed, and the like. In the first embodiment, the work reference coordinate system coincides with the sensor coordinate system 103 of the visual sensor 50. The parameter setting described below therefore allows the arm 10a to work accurately on the work object W. In the first embodiment, the optical axis of the visual sensor 50 corresponds to one axis of the sensor coordinate system 103, such as its Z axis, and extends vertically upward, but the optical axis of the visual sensor 50 may be directed in another direction.
 The object 60 of the first embodiment has a mark 61, and the control unit knows the size and shape of the mark 61 and the arrangement of its elements in advance. This allows the control unit to calculate the position and orientation of the mark 61 and the object 60 with respect to the visual sensor 50 based on the image data of the mark 61. For example, the center position of the mark 61 is calculated as the position of the mark 61, and the orientation of the mark 61 is calculated based on the relative positions of three or more feature shapes or feature points in the mark 61. The mark 61 may be a dot pattern, one or more three-dimensional shapes, or the like. When the object 60 is a work object W, one or more characteristic shapes of the work object W function as the mark 61. The object 60 may also be provided with a thin tip, and this tip may be used instead of the mark 61.
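 One standard way to recover a pose from known mark geometry is a perspective-n-point (PnP) solve; the sketch below uses OpenCV's solver. The 3-D feature coordinates, camera matrix, and distortion values are placeholders, not values from the patent.

import cv2
import numpy as np

# Known layout of >= 4 feature points on the mark, in the mark's own frame (mm).
MARK_POINTS = np.array([[0, 0, 0], [20, 0, 0], [0, 20, 0], [20, 20, 0]],
                       dtype=np.float64)

def mark_pose(image_points, camera_matrix, dist_coeffs):
    """image_points: Nx2 detected pixel positions matching MARK_POINTS order."""
    ok, rvec, tvec = cv2.solvePnP(MARK_POINTS, image_points,
                                  camera_matrix, dist_coeffs)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)  # rotation of the mark frame in the camera frame
    return R, tvec              # together: mark pose in sensor coordinate system 103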
 First, upon receiving an input to start parameter setting (step S1-1), the control unit performs the following processing based on the setting check program 23d. In one example, the start input is based on a user input to the input unit 26.
 Here, the user moves the tip of the arm 10a so that the object 60 falls within the field of view of the visual sensor 50. At this time, the parameter setting screen 23g shown in FIG. 7 is displayed on the display device 22. Preferably, the control unit displays on the display device 22 a preparation image or text 23h that prompts the user to place the tip of the arm 10a at the setting check position (step S1-2). The control unit may instead display on the display device 22 a preparation image or text 23h that prompts the user to place the object 60 at the setting check position.
 The term "setting check position" means a position and orientation of the tip of the arm 10a, specifically one at which the object 60 is placed within the field of view of the visual sensor 50. The preparation image or text 23h of step S1-2 may be a screen that informs the user that the setting check position will be used for the determination in step S1-17, described later. In this case, the user can appropriately set the position and orientation of the tip in step S1-1 while considering various conditions, such as the configuration of the robot 10 including its wiring, the configuration of the visual sensor 50 including its wiring, the size of the robot 10, and the type and size of the tool 30. Note that when the content of step S1-2 is described in an instruction manual or the like, the control unit may be configured not to perform step S1-2.
 Upon receiving an input to start the parameter setting process (step S1-3), the control unit causes the visual sensor 50 to acquire first check data 23e (FIG. 3) at the setting check position (step S1-4). The control unit then stores the first check data 23e, which is the imaging data or detection data obtained by the visual sensor 50, in the storage unit 23 or the like (step S1-5). Before step S1-4, the control unit may cause the arm 10a to perform an operation for eliminating the influence of backlash in the arm 10a. In one example, for this operation, the control unit rotates each servo motor 11 slightly and then returns it to its original position.
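 A hedged sketch of that backlash-clearing move follows: jog each joint slightly and return it, so gear play settles before measurement. The robot object and its methods are hypothetical stand-ins for a real controller API, and the 0.5-degree jog amount is an assumption.

def clear_backlash(robot, jog_deg=0.5):
    home = robot.get_joint_positions()      # current joint angles (deg)
    for axis in range(robot.num_joints):
        target = list(home)
        target[axis] += jog_deg             # jog one joint slightly...
        robot.move_joints(target)
        robot.move_joints(home)             # ...then return it to its original position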
 Next, the control unit performs the following processing based on the parameter setting program 23c. The control unit controls the arm 10a so as to place the object 60 at a plurality of positions, for example three or more calibration positions, and causes the visual sensor 50 to capture an image at each calibration position. The orientations of the object 60 at the calibration positions may differ from one another.
 For example, the control unit places the object 60 at a first calibration position (step S1-6) and causes the visual sensor 50 to acquire data at the first calibration position (step S1-7). The control unit then places the object 60 at a second calibration position (step S1-8) and causes the visual sensor 50 to acquire data there (step S1-9). The control unit further places the object 60 at a third calibration position (step S1-10) and causes the visual sensor 50 to acquire data there (step S1-11).
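 The measurement loop of steps S1-6 to S1-11 can be summarized as below; this is a sketch only, in which robot, camera, and the calibration poses are hypothetical stand-ins for the arm control, the visual sensor 50, and the positions chosen by the program.

def collect_calibration_data(robot, camera, calibration_poses):
    samples = []
    for pose in calibration_poses:          # e.g. three or more poses
        robot.move_to(pose)                 # place object 60 (steps S1-6/8/10)
        image = camera.capture()            # acquire data (steps S1-7/9/11)
        tip = robot.get_tip_pose()          # tip pose in reference system 101
        samples.append((tip, image))
    return samples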
 From the imaging data of steps S1-7, S1-9, and S1-11, the control unit calculates the position and orientation of the object 60 with respect to the visual sensor 50 at each calibration position (step S1-12). These positions and orientations are expressed in the sensor coordinate system 103 of the visual sensor 50. The control unit also knows the position and orientation of at least one of the tip of the arm 10a, the tool 30, and the tip coordinate system 102 at each calibration position. The control unit sets the parameters using the positions and orientations in the sensor coordinate system 103 and the positions and orientations of the calibration positions in, for example, the tip coordinate system 102 (step S1-13).
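 The patent leaves the computation of step S1-13 open. One standard technique this kind of coordinate association can reduce to is hand-eye calibration; the sketch below uses OpenCV's solver for the arrangement in which the sensor rides on the arm (as in the second embodiment), and the fixed-sensor case of the first embodiment can use the same solver with the arm poses inverted. This is a plausible instance under those assumptions, not the patent's own algorithm.

import cv2
import numpy as np

def hand_eye(tip_poses, mark_poses):
    """tip_poses: list of 4x4 base->flange transforms (from joint data).
    mark_poses: list of (R, t) of the mark in the sensor coordinate system."""
    R_g2b = [T[:3, :3] for T in tip_poses]
    t_g2b = [T[:3, 3] for T in tip_poses]
    R_t2c = [R for R, _ in mark_poses]
    t_t2c = [t for _, t in mark_poses]
    R_c2g, t_c2g = cv2.calibrateHandEye(R_g2b, t_g2b, R_t2c, t_t2c)
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R_c2g, t_c2g.ravel()
    return T  # sensor pose in the tip coordinate system 102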
 These parameters, for example, associate the sensor coordinate system 103 with the tip coordinate system 102 and the reference coordinate system 101. Alternatively, they associate the work reference coordinate system with the positions and orientations of the tip of the arm 10a, the tool 30, and the tip coordinate system 102. If the sensor coordinate system 103 corresponds to the coordinate system of the transport device 2, the work by the arm 10a can be performed accurately. The parameters may instead associate the tip coordinate system 102 with the tip (TCP) of the tool 30.
 The parameters may serve other purposes, and other calculations may be employed to set them. Also, while in the first embodiment the position and orientation of the visual sensor 50 correspond to the work reference coordinate system for the work object W, the control device 20 may instead recognize the position and direction of the TCP without such a correspondence.
 Subsequently, the control unit performs the following processing based on the setting check program 23d. The control unit controls the arm 10a so that the tip of the arm 10a is placed at the setting check position (step S1-14). The control unit then causes the visual sensor 50 to acquire second check data 23f (FIG. 3) at the setting check position (step S1-15), and stores the second check data 23f, which is the imaging data or detection data obtained by the visual sensor 50, in the storage unit 23 or the like (step S1-16).
 Subsequently, the control unit determines whether the position of the visual sensor 50 or the object 60 has changed by comparing the two check data 23e and 23f (step S1-17). For example, when the amount of positional change of the object 60 in the second check data 23f relative to the first check data 23e is less than a threshold, the control unit determines that there is no positional change. In this case, the control unit displays on the display device 22 that the parameter setting is complete (step S1-21) and ends the parameter setting process normally. When the amount of positional change is equal to or greater than the threshold, the control unit determines that there is a positional change. The threshold is, for example, 3 pixels of the imaging data of the visual sensor 50; determining that there is a positional change at 2 pixels or more makes the parameter setting more accurate.
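 A minimal sketch of that drift check: compare the mark centers detected in the first and second check data and test the pixel displacement against a threshold. It reuses the hypothetical detect_mark_center() detector sketched earlier, and the 3-pixel default follows the example above.

import math

def position_changed(check1_img, check2_img, threshold_px=3.0):
    p1 = detect_mark_center(check1_img)   # from first check data 23e
    p2 = detect_mark_center(check2_img)   # from second check data 23f
    if p1 is None or p2 is None:
        raise RuntimeError("mark not found in check data")
    shift = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    return shift >= threshold_px          # True -> positional change detected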
 It is also possible to set 0.5 mm as the threshold for determining whether there is a positional change. For example, the amount of positional change is the change at the part of the mark 61 on the object 60 where the largest positional shift has occurred. Determining that there is a positional change at 0.3 mm or 0.2 mm or more, instead of 0.5 mm or more, makes the parameter setting more accurate. For example, tension acting on the cable of the visual sensor 50 may shift the position of the visual sensor 50 slightly during parameter setting. Because such an event occurs within the relatively short parameter setting, even a slight positional change of the visual sensor 50 or the object 60 can be determined accurately. Depending on the situation or conditions, a threshold larger than the above values may also be used.
 Next, when it is determined in step S1-17 that there is a positional change, the control unit displays the amount of positional change on the display device 22 (step S1-18). The number of pixels, a numerical value in millimeters, or the like is used as the amount of positional change. In step S1-18, the control unit may also display on the display device 22 the determination that there is a positional change, for example as an error display or a display indicating that the parameters cannot be set. This allows the user to learn of the positional change in a timely manner.
 When it is determined in step S1-17 that there is no positional change, the control unit may display on the display device 22 an indication that there was no positional change. This display lets the user know in a timely manner that the set parameters are accurate. The amount of positional change may also be displayed on the display device 22 as the indication that there was no positional change. If the characters or screen color used when displaying the amount of positional change in this way differ from those of step S1-18, the user can easily recognize that there was no positional change.
 Displaying the amount of positional change also allows the user to estimate the accuracy of the parameter setting. For example, the tool 30 may be a long, thin welding gun, a hand with long claws, or the like, with the object 60 attached to its tip. The work performed by the tool 30 may not demand high positional accuracy, and the various conditions described above may also apply. Under such conditions, the user can judge whether a change of, say, 4 pixels is acceptable based on those conditions, experience, and the like, and may then be able to estimate the accuracy of the parameter setting from that judgment.
 The control unit may also display the direction of the positional change on the display device 22. In this case, the direction of the positional change of the visual sensor 50, of the object 60, or the like is displayed on the display device 22; the direction may be approximate. This display makes it easier for the user to identify the cause of the positional change, which leads to more efficient setting work.
 With such cases in mind, when it is determined in step S1-17 that there is a positional change, the control unit may display on the display device 22 a selection screen that lets the user choose whether or not to use the parameters (step S1-19). This allows the user to decide to use or not use the parameters based on the above conditions, experience, and the like. When the selection is unnecessary, the control unit is configured not to perform step S1-19.
 When the user selects in step S1-19 to use the parameters, the control unit may store in the storage unit 23 data linking the content of the selection, the amount of positional change, and the parameters.
 Alternatively, when it is determined in step S1-17 that there is a positional change, the control unit may store in the storage unit 23 data linking the amount of positional change and the parameters.
 The control unit may also transmit this data to an external computer, such as a server for production management or quality control. The data can be used for later quality control work, for the design and improvement of the robot system, for the design and improvement of the work object W, and the like.
 Then, when the control unit receives an input to use the parameters via the input unit 26 or the like (step S1-20), it sets the parameters and displays that the parameter setting is complete (step S1-21). When the control unit does not receive an input to use the parameters (step S1-20), it ends the process without setting the parameters (step S1-22).
 Instead of step S1-2, the control unit may automatically place the tip of the arm 10a at the setting check position. In that case, the control unit may display a plurality of candidate setting check positions on the display device 22, for example as illustrations, on a screen that lets the user select one of them. The control unit receives the user's selection input via the input unit 26 and automatically places the tip of the arm 10a at the setting check position corresponding to that selection.
 The control unit may also display on the display device 22 a screen for setting the setting check position, for example a screen that shows at least the position of the visual sensor 50 and on which at least the position of the tip of the arm 10a, the tool 30, or the object 60 can be specified. The control unit receives the user's setting input via the input unit 26 and automatically places the tip of the arm 10a at the setting check position corresponding to that input.
 A robot parameter setting device according to a second embodiment will now be described with reference to the drawings. In the first embodiment, the object (first setting element) 60 is attached to the tip of the arm 10a, and the visual sensor (second setting element) 50 is supported by the support part 70. In the second embodiment, the visual sensor (first setting element) 50 is attached to the tip of the arm 10a, and the object (second setting element) 60 is supported by the support part 70 (FIG. 8). In the second embodiment, the same reference signs are given to the same configurations as in the first embodiment, and descriptions of those configurations and of the similar effects they provide are omitted. The modifications described for the first embodiment can also be applied to the second embodiment as appropriate.
 In the second embodiment, the visual sensor 50 is removably attached to the tip of the arm 10a. In one example, the visual sensor 50 is attached to the wrist flange 10b of the arm 10a in the same manner as the tool 30. In one example, the object 60 is supported on the top surface of the transport device 2; in the second embodiment, the object 60 is supported by a support part 70 placed on the top surface of the transport device 2. The object 60 may be the work object W, or may be another article, a part of a structure, or the like. A part of the transport device, for example, can be used as the part of a structure.
 In the second embodiment as well, the following processing for parameter setting is performed based on the parameter setting program 23c and the setting check program 23d. An example of the parameter setting will be described with reference to the flowcharts of FIGS. 9 and 10.
 In the second embodiment, the visual sensor 50 is attached to the tip of the arm 10a for this parameter setting. In one example, the visual sensor 50 is attached to the wrist flange 10b by a magnet such as an electromagnet, or by attachment means such as bolts. The visual sensor 50 may also be attached to the tip of the arm 10a by having the tool 30 hold it.
 Also in the second embodiment, the object 60 is supported by the support part 70 for this parameter setting. The support part 70 may be the top surface of the transport device 2, or may be a support device such as a tripod. For example, the position and orientation of the object 60 on the support part 70 correspond to the work reference coordinate system for the work object W. In this sense, using one or more work objects W as the object 60 makes the parameter setting well suited to the work. The parameter setting described below associates, for example, the sensor coordinate system 103 with the reference coordinate system 101 and the tip coordinate system 102, and allows the arm 10a, for example, to work accurately on the work object W.
 First, the control unit performs the following processing based on the setting check program 23d.
 In the second embodiment, the user moves the tip of the arm 10a, following the instruction manual, so that the object 60 falls within the field of view of the visual sensor 50. In the second embodiment, the control unit does not display on the display device 22 a preparation image or text 23h prompting the user to place the tip of the arm 10a at the setting check position. The first embodiment can also be configured in this way.
 Upon receiving an input to start the parameter setting process (step S2-1), the control unit causes the visual sensor 50 to acquire the first check data 23e at the setting check position (step S2-2). The control unit then stores the first check data 23e, which is the imaging data or detection data obtained by the visual sensor 50, in the storage unit 23 or the like (step S2-3). Before step S2-2, the control unit may cause the arm 10a to perform an operation for eliminating the influence of backlash in the arm 10a.
 Next, the control unit performs the following processing based on the parameter setting program 23c, as in the first embodiment.
 For example, the control unit places the visual sensor 50 at a first calibration position (step S2-4) and causes the visual sensor 50 to acquire data there (step S2-5). The control unit then places the visual sensor 50 at a second calibration position (step S2-6) and causes it to acquire data there (step S2-7). The control unit further places the visual sensor 50 at a third calibration position (step S2-8) and causes it to acquire data there (step S2-9).
 From the imaging data of steps S2-5, S2-7, and S2-9, the control unit calculates the position and orientation of the object 60 with respect to the visual sensor 50 at each calibration position (step S2-10). These positions and orientations are expressed in the sensor coordinate system 103 of the visual sensor 50. The control unit also knows the position and orientation of at least one of the tip of the arm 10a, the tool 30, and the tip coordinate system 102 at each calibration position. The control unit sets the parameters using the positions and orientations in the sensor coordinate system 103 and the positions and orientations of the calibration positions in, for example, the tip coordinate system 102 (step S2-11).
 These parameters, for example, associate the sensor coordinate system 103 with the tip coordinate system 102 and the reference coordinate system 101.
 The parameters may serve other purposes, and other calculations may be employed to set them.
 Subsequently, the control unit performs the following processing based on the setting check program 23d. The control unit controls the arm 10a so that the tip of the arm 10a is placed at the setting check position (step S2-12). The control unit then causes the visual sensor 50 to acquire the second check data 23f at the setting check position (step S2-13), and stores the second check data 23f, which is the imaging data or detection data obtained by the visual sensor 50, in the storage unit 23 or the like (step S2-14).
 Subsequently, the control unit determines whether the position of the visual sensor 50 or the object 60 has changed, as in step S1-17 of the first embodiment (step S2-15). When there is no positional change, the control unit displays on the display device 22 that the parameter setting is complete (step S2-19) and ends the parameter setting process normally.
 The control unit also displays the amount of positional change on the display device 22 (step S2-16), as in step S1-18 of the first embodiment. As in the first embodiment, displaying the amount of positional change allows the user to estimate the accuracy of the parameter setting.
 The control unit may also display a selection screen on the display device 22 (step S2-17), as in step S1-19 of the first embodiment. As in the first embodiment, this allows the user to decide to use or not use the parameters based on the above conditions, experience, and the like.
 When the control unit receives an input to use the parameters via the input unit 26 or the like (step S2-18), it sets the parameters and displays that the parameter setting is complete (step S2-19). When the control unit does not receive an input to use the parameters (step S2-18), it ends the process without setting the parameters (step S2-20).
 Here, in the parameter setting process, the tip of the robot 10 moves sequentially to a plurality of positions and orientations, for example as in steps S1-6 to S1-11 of the first embodiment. Various objects, such as other devices and work objects W, exist around the robot. Cables for the robot 10 also run around it and move as the robot moves. During the parameter setting process the user therefore needs to take care that the robot 10 and its cables do not interfere with surrounding objects, and as a result the user often fails to notice the state of the visual sensor 50 and its cable.
 Also, the visual sensor 50 is attached to the robot 10 or the support part by the user or the like for parameter setting, and its cable is likewise placed only temporarily by the user. Depending on where the cable is temporarily placed, a slight tension may therefore act on the cable of the visual sensor 50 during the parameter setting operation of the robot 10. The visual sensor 50 may or may not move under that slight tension, and when it does shift slightly, the user often does not notice the shift. The object 60 may likewise shift slightly in position due to vibration or the like, and here too the user often does not notice the shift.
 In one example, even with such a slight shift, the parameter setting calculation itself is still performed, so conventionally the parameter processing ended normally and the user remained unaware of the shift. Depending on the accuracy required of the work the robot 10 performs on the work object W, there is also work in which the user never notices the slight shift at all. Even in such work, however, when some error does occur, its cause may be related to that slight shift.
 In the first and second embodiments, the first and second check data 23e and 23f are measured, and the positional change is judged, during the series of controls for setting the parameters of the robot 10. Because this configuration measures the first and second check data 23e and 23f within the relatively short parameter setting, it can accurately detect a slight positional shift of the visual sensor 50 or the object 60 that occurs during parameter setting.
 For example, in step S1-17 it is determined that there is a positional change when the position differs by 3 pixels or more between the first check data 23e and the second check data 23f. In one example, the visual sensor 50 has several million pixels, and one side of the field of view of the visual sensor 50 at the position of the object 60 is about 10 cm to 20 cm. In this case, a 3-pixel change can correspond to a positional change of 0.2 mm, 0.3 mm, or the like. The above configuration can therefore accurately detect a slight positional shift of the visual sensor 50 or the object 60.
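 A quick check of that pixel-to-millimeter claim, under assumed values: a 2000-pixel-wide image covering a 150 mm field of view (both assumptions consistent with "several million pixels" and a 10-20 cm field side, not figures from the patent).

field_of_view_mm = 150.0
image_width_px = 2000.0

mm_per_px = field_of_view_mm / image_width_px   # 0.075 mm per pixel
shift_3px_mm = 3 * mm_per_px                    # ~0.23 mm for a 3-pixel shift
print(f"{mm_per_px=:.3f}, {shift_3px_mm=:.2f}")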
 If the visual sensor 50 were slightly displaced in its optical-axis direction at the time of the second image capture at the setting check position, that displacement would almost never occur only along the optical axis of the visual sensor 50. That is, when the visual sensor 50 shifts along its optical axis for some unintended reason, a situation in which it does not also shift in a direction crossing the optical axis is unlikely. The same applies to the object 60. For this reason, the above configuration can accurately detect a slight positional shift of the visual sensor 50 or the object 60.
 Also, in the first and second embodiments, the user moves the tip of the arm 10a before steps S1-4 and S2-2, and that position is used as the setting check position. The user can therefore determine the setting check position easily and reliably, without having to set it separately, which improves user convenience. The user can also reliably place the visual sensor 50 at an in-focus position as the setting check position. Considering that many visual sensors 50 for robots have no autofocus function and that their in-focus range is known from the instruction manual or the like, this configuration is useful.
 The first and second embodiments also display a screen for selecting the setting check position. The user can therefore easily use a setting check position suited to conditions such as the routing and position of the cable of the visual sensor 50.
 The first and second embodiments also display a screen for setting the setting check position. This is useful for letting the user set an arbitrary setting check position without much effort.
 In the first and second embodiments, at least one of the calibration positions can also be used as the setting check position. For example, when the first calibration position is used as the setting check position, the tip of the arm 10a is placed at the first calibration position in steps S1-14 and S2-12. Even in this case, the same effects as described above can be obtained. The setting check position is preferably in the first half of the setting operation, for example of steps S1-6 to S1-11 of the first embodiment; that is, it is preferably the first or second calibration position. Also, preferably, steps S1-12 and S1-13 of the first embodiment, for example, are performed in the latter half of the setting operation of steps S1-6 to S1-11 or after that setting operation.
 In the first and second embodiments, additional setting check positions may also be set, for example at positions where tension is likely to be applied to the cable of the visual sensor 50. Such an additional setting check position may be one of the calibration positions. The setting check position may also be set automatically by the control unit, or may be preset in the control unit. In these cases, the control unit may be configured to display the setting check position on the display device 22. For example, the control unit displays that the setting check position is the position where the arm 10a stopped before the first calibration position, a position between the first and second calibration positions, or the like. The user can then see where the setting check position is, which helps, for example, in identifying the cause of an error during setting work.
 In the first and second embodiments, the visual sensor 50 may be a three-dimensional camera, a three-dimensional distance sensor, a LiDAR (Light Detection and Ranging) sensor, a PSD (Position Sensing Detector), or the like. In these cases, the configuration of the image processing device 40 is changed as appropriate. A visual sensor that remains attached to the tip of the arm 10a while the work is performed on the work object W may also be used as the visual sensor 50.
 Although the embodiments of the present disclosure have been described in detail, the present disclosure is not limited to the individual embodiments described above. Various additions, replacements, changes, partial deletions, and the like are possible without departing from the gist of the invention, or from the idea and spirit of the present invention derived from the content of the claims and their equivalents. For example, in the embodiments described above, the order of operations or processes may be changed, and some operations or processes may be omitted or added depending on conditions, without being bound by the examples given. The same applies where numerical values or formulas are used in the description of the above embodiments.
2 Transport device
10 Robot
10a Arm
11 Servo motor
20 Control device
21 Processor
22 Display device
23 Storage unit
23a System program
23b Operation program
23c Parameter setting program
23d Setting check program
23e First check data
23f Second check data
26 Input unit
30 Tool
31 Servo motor
40 Image processing device
41 Processor
43 Storage unit
43a Image processing program
50 Visual sensor (sensor)
60 Object
61 Mark
70 Support part
101 Reference coordinate system
102 Tip coordinate system
103 Sensor coordinate system
W Work object

Claims (9)

  1.  A robot parameter setting device comprising:
     a first setting element that is one of an object and a sensor and is attached to an arm of a robot;
     a second setting element that is the other of the object and the sensor; and
     a control unit that causes the arm to perform a setting operation for parameters, the setting operation arranging the first setting element at a plurality of positions,
     wherein, in a first half of the setting operation or before the setting operation, the control unit causes the sensor to acquire first check data regarding the object in a state where the first setting element is placed at a setting check position,
     wherein, in a latter half of the setting operation or after the setting operation, the control unit causes the arm to perform an operation of placing the first setting element at the setting check position and causes the sensor to acquire second check data regarding the object, and
     wherein the control unit determines, based on the first check data and the second check data, whether there is a positional change of the first setting element or the second setting element.
  2.  The robot parameter setting device according to claim 1, wherein the control unit uses, as the setting check position, a position at which a user placed the first setting element by moving the arm, before the first setting element is arranged at the plurality of positions for setting the parameters.
  3.  The robot parameter setting device according to claim 1 or 2, further comprising a display device that displays a determination that there is the positional change.
  4.  The robot parameter setting device according to claim 1 or 2, further comprising a display device that displays an amount of the positional change when there is the positional change.
  5.  The robot parameter setting device according to claim 1 or 2, further comprising a display device that displays a direction of the positional change when there is the positional change.
  6.  The robot parameter setting device according to claim 1 or 2, further comprising a display device that displays a screen for allowing a user to set the setting check position or a screen for allowing the user to select the setting check position.
  7.  The robot parameter setting device according to claim 1 or 2, wherein, after the determination by the control unit that there is the positional change, a screen that allows a user to select whether or not to use the set parameters is displayed on the display device.
  8.  The robot parameter setting device according to claim 1 or 2, wherein the second setting element is supported by a predetermined support part.
  9.  A robot comprising the robot parameter setting device according to any one of claims 1 to 8.

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2022/029131 WO2024024036A1 (en) 2022-07-28 2022-07-28 Robot parameter setting device
TW112127224A TW202419238A (en) 2022-07-28 2023-07-20 Robot parameter setting device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/029131 WO2024024036A1 (en) 2022-07-28 2022-07-28 Robot parameter setting device

Publications (1)

Publication Number Publication Date
WO2024024036A1 2024-02-01

Family

ID=89705774

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/029131 WO2024024036A1 (en) 2022-07-28 2022-07-28 Robot parameter setting device

Country Status (2)

Country Link
TW (1) TW202419238A (en)
WO (1) WO2024024036A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4191080B2 (en) * 2004-04-07 2008-12-03 ファナック株式会社 Measuring device
JP2019084649A (en) * 2017-11-09 2019-06-06 オムロン株式会社 Interference determination method, interference determination system, and computer program


Also Published As

Publication number Publication date
TW202419238A (en) 2024-05-16


Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 22953125

Country of ref document: EP

Kind code of ref document: A1