WO2022163590A1 - Teaching device - Google Patents
- Publication number
- WO2022163590A1 (PCT application PCT/JP2022/002464)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- icon
- information
- instruction
- execution
- teaching device
- Prior art date
Classifications
- B25J9/0081—Programme-controlled manipulators with master teach-in means
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- G05B19/42—Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine
- G05B2219/39438—Direct programming at the console
- G05B2219/40099—Graphical user interface for robotics, visual robot user interface
- G05B2219/40392—Programming, visual robot programming language
Definitions
- the present invention relates to a teaching device.
- Patent Document 2 discloses a teaching device for inputting (setting) various teaching data, such as selection of an operation method of a robot body and teaching of movement points, in which a setting menu is displayed on a touch panel.
- in that device, an item selection screen is displayed in which a plurality of teaching items are represented by icons, and a configuration is described in which each icon displays a moving image representing the content of the teaching item (abstract).
- in general, however, the setting information of an icon cannot be verified without selecting the icon on the program creation screen and opening a parameter setting screen for setting the parameters of that icon.
- in teaching (programming) using icons, it is often desired to check not only the setting information displayed on the parameter setting screen, but also sensor information related to the icon, the results of executing the icon, and the like.
- one aspect of the present disclosure is a teaching device for creating a control program for a robot, comprising: a screen generation unit that generates a program creation screen for creating the program using commands representing functions constituting the control program of the robot; and a related information display control unit that, in response to selection of or an instruction to execute a command placed on the program creation screen, displays information related to that command.
- FIG. 1 is an overall configuration diagram of a robot system including a teaching device according to a first embodiment.
- FIG. 2 is a diagram showing a state in which a force sensor is arranged between the wrist flange and the hand of the robot.
- FIG. 3 is a diagram showing a hardware configuration example of a visual sensor control device and a robot control device.
- FIG. 4 is a functional block diagram of a visual sensor control device and a robot control device.
- FIG. 5 is a diagram showing a hardware configuration example of a teaching device.
- FIG. 6 is a functional block diagram of a teaching device.
- FIG. 7 is an operation example of Example 1, showing a state in which a live image of the visual sensor is displayed by selecting a viewing icon arranged in the icon display area.
- FIG. 8 is an operation example of Example 1, showing a state in which a live image of the visual sensor is displayed by selecting a viewing icon arranged in the program creation area.
- FIG. 9 is an operation example of Example 2, showing a state in which detection-result information is displayed as the execution result of a viewing icon arranged in the program creation area.
- FIG. 10 is an operation example of Example 3, showing a state in which force sensor information is displayed in response to selection of an icon related to the function of the force sensor in the icon display area.
- FIG. 11 is an operation example of Example 4, showing an example of displaying information about a device connected to the robot in response to an execution instruction for an icon related to the device.
- FIG. 12 is an operation example of Example 5, showing an example of displaying setting information of a standby setting icon arranged in the icon display area in response to selection of the icon.
- FIG. 13 is an operation example of Example 5, showing an example of displaying setting information of a standby setting icon arranged in the program creation area in response to an execution instruction for the icon.
- FIG. 14 is an operation example of Example 5, showing an example of displaying setting information of a linear movement icon arranged in the icon display area in response to selection of the icon.
- FIG. 15 is an operation example of Example 6, showing an example of displaying internal variables related to a register icon arranged in the icon display area in response to selection of the icon.
- FIG. 16 is an operation example of Example 6, showing a display example when there is an error in the setting of the register icon.
- FIG. 17 is an operation example of Example 7, showing an example of displaying information related to an icon before execution of that icon in response to an execution instruction for a program of icons arranged in the program creation area.
- FIG. 18 is a functional block diagram of a teaching device according to a second embodiment.
- FIG. 19 is a diagram for explaining switching of display content based on display attribute information.
- FIG. 20 is a diagram for explaining switching of display content based on an execution state attribute.
- FIG. 1 is a diagram showing the overall configuration of a robot system 100 including a teaching device 10 according to one embodiment.
- the teaching device 10 is a teaching device for performing programming based on commands representing functions constituting a robot control program.
- the teaching device 10 is a teaching device that enables programming using icons representing functions constituting a robot control program (that is, representing robot control commands).
- a robot system including the teaching device 10 may have various configurations, but in this embodiment, the robot system 100 shown in FIG. 1 will be described as an example.
- the robot system 100 includes a robot 30 having a hand 33 mounted on the tip of its arm, a robot controller 50 that controls the robot 30, a teaching device 10 connected to the robot controller 50, a visual sensor 70 attached to the arm tip of the robot 30, and a visual sensor controller 40 that controls the visual sensor 70.
- the robot system 100 is configured to detect the object 1 on the workbench 2 with the visual sensor 70 and handle the object 1 with the hand 33 mounted on the robot 30.
- the visual sensor control device 40 has a function of controlling the visual sensor 70 and a function of performing image processing on the image captured by the visual sensor 70 .
- the visual sensor control device 40 detects the position of the object 1 from the image captured by the visual sensor 70 and provides the detected position of the object 1 to the robot control device 50 .
- using the provided detected position, the robot control device 50 can correct the taught position and take out the object 1, for example.
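- as an illustration of this correction, the following sketch (not from the patent; all names and values are hypothetical) offsets a taught pickup position by the difference between the object position detected by the visual sensor and the object position recorded at teaching time:

```python
# Hypothetical sketch of the position-correction idea: the robot controller
# shifts a taught position by the offset of the detected object from its
# position at teaching time. All names and values are illustrative.

def correct_taught_position(taught_xyz, detected_xyz, reference_xyz):
    """Shift a taught point by the detected object's offset from teach time."""
    offset = [d - r for d, r in zip(detected_xyz, reference_xyz)]
    return [t + o for t, o in zip(taught_xyz, offset)]

corrected = correct_taught_position(
    taught_xyz=[500.0, 200.0, 50.0],     # position taught for pickup (mm)
    detected_xyz=[310.0, 105.0, 0.0],    # object position from vision (mm)
    reference_xyz=[300.0, 100.0, 0.0],   # object position at teach time (mm)
)
print(corrected)  # [510.0, 205.0, 50.0]
```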
- FIG. 1 shows a configuration example in which the robot system 100 includes the visual sensor 70 as a sensor related to the functions of the robot system.
- there are, however, various other examples of sensors related to the functions of the robot system.
- for example, when a force sensor is mounted on the robot system, a force sensor 71 may be arranged between the wrist flange 31 of the robot 30 and the hand 33A, as shown in FIG. 2.
- note that FIG. 2 shows only the vicinity of the tip of the arm of the robot 30.
- by using the force sensor 71, the robot 30 can be caused to perform an operation of pressing the workpiece W1 against another workpiece with a constant force (for example, a precision fitting operation).
- FIG. 3 is a diagram showing a hardware configuration example of the visual sensor control device 40 and the robot control device 50.
- the visual sensor control device 40 may have a configuration as a general computer in which a memory 42 (ROM, RAM, non-volatile memory, etc.), an input/output interface 43, and the like are connected to a processor 41 via a bus. Likewise, the robot control device 50 may have a configuration as a general computer in which a memory 52 (ROM, RAM, non-volatile memory, etc.), an input/output interface 53, an operation unit 54 including various operation switches, and the like are connected to a processor 51 via a bus.
- the visual sensor 70 may be a camera that captures a grayscale image or a color image, or a stereo camera or a three-dimensional sensor that can acquire a range image or a three-dimensional point group.
- a plurality of visual sensors may be arranged in the robot system 100.
- the visual sensor control device 40 holds a model pattern of an object, and executes image processing for detecting the object by pattern matching between the image of the object in the captured image and the model pattern.
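- as a toy illustration of detection by pattern matching (the patent does not give an algorithm, and real systems use normalized correlation rather than this exact-match sketch), the following slides a small model pattern over a grayscale image and returns the offset at which it matches:

```python
# Toy sketch of pattern-matching detection as performed by an image
# processing unit: slide a small model pattern over a grayscale image
# and report the offset of the first exact match. Illustration only;
# production systems use normalized cross-correlation and tolerances.

def find_pattern(image, pattern):
    ih, iw = len(image), len(image[0])
    ph, pw = len(pattern), len(pattern[0])
    for y in range(ih - ph + 1):
        for x in range(iw - pw + 1):
            if all(image[y + dy][x + dx] == pattern[dy][dx]
                   for dy in range(ph) for dx in range(pw)):
                return (x, y)   # detected position in pixels
    return None                 # object not found

image = [[0, 0, 0, 0],
         [0, 9, 8, 0],
         [0, 7, 6, 0],
         [0, 0, 0, 0]]
pattern = [[9, 8],
           [7, 6]]
print(find_pattern(image, pattern))  # (1, 1)
```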
- although FIG. 1 shows an example in which the visual sensor 70 is attached to the tip of the arm of the robot 30, the visual sensor 70 may instead be fixed at a known position within the working space from which the object 1 on the workbench 2 can be photographed.
- in that case, the robot 30 may grasp the object 1 and perform an operation of showing it to the visual sensor fixedly installed in the work space, thereby enabling detection, inspection, and the like of the object 1.
- FIG. 4 is a functional block diagram showing main functional blocks of the visual sensor control device 40 and the robot control device 50.
- the visual sensor control device 40 includes an image processing unit 402 that performs image processing on an input image 401 captured by the visual sensor 70, and a calibration data storage unit 403 that stores calibration data determining the relative position of the visual sensor 70 with respect to the reference coordinate system set in the robot 30.
- the image processing unit 402 executes image processing for detecting the object 1 from the input image by pattern matching using a model pattern.
- the robot control device 50 includes a motion control unit 501 that controls the motion of the robot 30 according to the motion command input from the teaching device 10 or the control program held in the internal memory.
- the image processing unit 402 provides the detected position of the object 1 to the motion control unit 501, and the motion control unit 501 handles the object 1 using the detected position provided from the image processing unit 402.
- the teaching device 10 is used to teach the robot 30 (that is, create a control program).
- various information processing devices such as a teaching operation panel, a tablet terminal, a smartphone, and a personal computer can be used as the teaching device 10.
- FIG. 5 is a diagram showing a hardware configuration example of the teaching device 10. As shown in FIG. 5, the teaching device 10 may have a configuration as a general computer in which a memory 12 (ROM, RAM, non-volatile memory, etc.), a display unit 13, an operation input unit 14 such as a keyboard (or software keys), an input/output interface 15, and the like are connected to a processor 11 via a bus.
- FIG. 6 is a functional block diagram of the teaching device 10.
- the teaching device 10 includes a program creating unit 110 for creating/editing a control program using icons.
- the program creation unit 110 includes an icon data storage unit 111 that stores icon data, a screen generation unit 112 that generates various screens for creating control programs, an operation input reception unit 113 that receives various user operations for creating/editing control programs, a program generation unit 114, an execution unit 115, a parameter setting unit 116, and a related information display control unit 117.
- the icon data storage unit 111 is composed of, for example, a non-volatile memory, and stores various information related to icons.
- the information about the icon includes information about the design of the icon, parameters (setting information) set for the function of the icon, a character string simply representing the function of the icon, and the like. Default values may be set in the icon data storage unit 111 as parameter settings for each icon.
- the screen generation unit 112 generates a program creation screen 500 for creating a control program using icons.
- the program creation screen 500 includes a first area (icon display area 200) that displays a list of one or more icons representing functions constituting the control program of the robot 30, and a second area (program creation area 300) in which a control program is created by arranging icons selected from the first area.
- the user selects desired icons from the icon display area 200 and arranges them in the program creation area 300. The user then sets parameters relating to the function of each icon arranged in the program creation area 300 as necessary. This allows the user to create the control program without requiring knowledge of the grammar of a programming language.
- the operation input reception unit 113 receives various operation inputs on the program creation screen 500.
- for example, the operation input reception unit 113 supports an operation of selecting an icon by aligning the cursor of the input device with one of the icons arranged in the icon display area 200 or the program creation area 300, and an operation of dragging and dropping an icon from the icon display area 200 to place it in the program creation area 300.
- the program generation unit 114 generates a control program in a programming language from one or more icons arranged in the program creation area 300 in order of operation.
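- the mapping from arranged icons to program text can be sketched as follows; the icon names and statement syntax here are illustrative assumptions, not the patent's actual output:

```python
# Hedged sketch of a program generation unit: translate icons arranged
# in operation order into control-program statements. The icon names
# and target statement syntax are assumptions for illustration.

ICON_TO_STATEMENT = {
    "view":        "VISION_FIND('target')",
    "action":      "MOVE_TO(VISION_RESULT); PICK()",
    "linear_move": "LINEAR_MOVE(PR[2], speed=1000)",
}

def generate_program(icons):
    """Map an ordered list of icon identifiers to program lines."""
    return [ICON_TO_STATEMENT[name] for name in icons]

program = generate_program(["view", "action", "linear_move"])
print("\n".join(program))
```

a real implementation would also expand each icon's parameter settings (speed, position registers, and so on) into the generated statements.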
- the execution unit 115 executes the icons arranged in the program creation area 300 in accordance with a user operation instructing execution of the icons.
- the execution unit 115 can collectively execute the icons arranged in the program creation area 300, or execute one or more icons in the program creation area 300 individually.
- for example, the execution unit 115 may execute the icons arranged in the program creation area 300 sequentially as a program, or an individual icon may be executed in response to an operation of selecting the icon in the program creation area 300 and instructing its execution (an operation of placing the cursor on the icon and clicking (or tapping)).
- execution instructions for icons (programs) are not limited to user operations.
- an icon (program) may be executed by an external signal input to the teaching device (for example, a signal input to the robot 30 or the robot control device 50 from various devices).
- the parameter setting unit 116 provides a function of setting parameters for an icon selected in the program creation area 300 or the icon display area 200. For example, when the detail tab 652 (see FIG. 7) is selected while an icon is selected, the parameter setting unit 116 generates a screen for parameter input and accepts parameter setting input.
- parameter settings are, for example, as follows.
  - "Viewing icon" corresponding to the detection function using the visual sensor:
    - initial setting of the camera
    - teaching setting of the target object
    - setting of the shooting position
  - "Linear movement icon" corresponding to the function of linear movement:
    - operating speed
    - positioning setting
    - position register designation
- note that, as described above, default values may be set in advance for these parameters.
- the related information display control unit 117, in response to at least one of an operation of selecting an icon in the icon display area 200 or the program creation area 300 and an operation of instructing execution of an icon arranged in the program creation area 300, displays information related to the icon targeted by that operation on the display screen of the display unit 13.
- the icon-related information includes any of: parameter setting information of the icon, information from sensors and devices associated with the icon, execution results or execution states of the icon, internal program variables related to the icon, and the like.
- (Example 1) Example 1 will be described with reference to FIGS. 7 and 8.
- Example 1 is an example of displaying sensor information related to an icon in response to selection of the icon in the icon display area 200 or the program creation area 300.
- here, it is assumed that the "viewing icon 601" is selected.
- the viewing icon 601 provides a function of detecting an object with the visual sensor 70 and acquiring position information of the object.
- the program creation screen 500 includes an icon display area 200 that displays a list of the various icons usable for program creation, and a program creation area 300 in which those icons are arranged in order of operation to create a control program.
- a pop-up screen 700 displaying information related to an icon is displayed in response to an icon selection operation or an icon execution instruction operation.
- the pop-up screen 700 is preferably displayed as a screen (window) separate from the icon display area 200 and the program creation area 300 (so as not to overlap those areas). Note that the form of displaying the icon-related information is not limited to a pop-up screen; various display forms can be adopted.
- the program creation screen 500 may further include a status display area 450.
- note that Examples 1 to 7 show an example in which a robot model 30M representing the current position and orientation of the robot 30 is displayed in the status display area 450. Such an image of the robot model 30M can be realized by the teaching device 10 acquiring the current position and orientation information of the robot 30 from the robot control device 50 and operating 3D model data of the robot 30 within a virtual work space according to that information.
- the viewing icon 601 is selected by moving the cursor 671 onto it within the icon display area 200. Note that the icon may also be treated as selected during the operation of dragging and dropping it from the icon display area 200 into the program creation area 300.
- in response to this selection, the related information display control unit 117 displays a pop-up screen 700 showing the live image M1 of the imaging area of the visual sensor 70 as information of the visual sensor 70 associated with the viewing icon 601.
- since the live image M1 is displayed immediately when the user merely selects the viewing icon 601 for program creation, the user does not need to perform complicated operations to confirm the image in the field of view of the visual sensor 70.
- as shown in FIG. 8, the related information display control unit 117 may also display the pop-up screen 700 showing the live image M1 of the visual sensor 70 in response to selection of the viewing icon 601 arranged in the program creation area 300.
- the related information display control unit 117 may display information related to the icon (here, the live image M1) on the pop-up screen 700 while the target icon is selected.
- according to Example 1, the icon-related information is popped up separately from the program creation screen and the parameter setting screen, so the user can create a program while checking that information. In programming using icons, various types of information that support programming are thus provided with extremely simple operations, enabling more intuitive and efficient programming. Such effects of Example 1 are common to each of the examples described below.
- (Example 2) Example 2 will be described with reference to FIG. 9.
- the second embodiment is an example of displaying information about the execution result of an icon placed in the program creation area 300 when an operation is performed to instruct execution of the icon.
- on the program creation screen 500 shown in FIG. 9, a viewing icon 601, an action (vision) icon 602, and a linear movement icon 603 are arranged in the program creation area 300 by user operation.
- the action (vision) icon 602 provides a function of moving the robot 30 (control part) to the position detected by the viewing icon 601 and picking up the object.
- the linear movement icon 603 provides a function of linear movement of the robot 30 (control part).
- here, an example has been described in which the execution result of the viewing icon 601 is displayed in response to an operation of selecting and executing the viewing icon 601.
- alternatively, the execution result may be displayed upon completion of the execution of the viewing icon 601.
- in this way, the user can quickly confirm the execution result of an icon related to the visual sensor (that is, the detection result indicating the presence or absence of the taught object) simply by performing an operation on the program creation screen 500 that instructs execution of the icon or of a program including it.
- (Example 3) Example 3 will be described with reference to FIG. 10.
- the third embodiment is an example of displaying force sensor information related to an icon in response to selection or execution instruction of an icon related to a function using a force sensor in the icon display area 200 .
- in Example 3, it is assumed that the force sensor 71 is arranged in the robot system 100 as shown in FIG. 2.
- here, it is assumed that the "pressing action icon 604" related to force control using the force sensor 71 is selected in the icon display area 200.
- the force sensor 71 is assumed to be a six-axis force sensor.
- the related information display control unit 117 displays a graph 721 representing the detection values of the force sensor 71 on the pop-up screen 700 in response to the selection of the pressing action icon 604.
- the graph 721 shows, as the output values of the force sensor 71, the time transition of the load magnitudes (X, Y, Z) in the XYZ-axis directions and the moments (W, P, R) around the XYZ axes in the coordinate system set in the force sensor 71.
- the vertical axis represents the magnitude of load or moment
- the horizontal axis represents time.
- the user can immediately check the current detected value of the force sensor simply by selecting the icon related to the force sensor on the program creation screen 500.
- in response to the selection/execution instruction operation for the pressing action icon 604, the graph 721 of the detection values of the force sensor 71 may also be displayed on the pop-up screen 700 as information about the execution state while the pressing action icon 604 is being executed, or as information about the execution result after execution.
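- the data behind such a graph can be sketched as a rolling buffer of six-axis samples from which each curve of graph 721 would be drawn; the class and field names are assumptions for illustration:

```python
# Illustrative sketch of the data behind a force-sensor graph: a rolling
# buffer of six-axis samples (loads X/Y/Z and moments W/P/R) keyed by
# time, from which one curve per axis could be plotted. All names are
# assumptions, not from the patent.

from collections import deque

class ForceSampleBuffer:
    def __init__(self, max_samples=500):
        # bounded buffer: old samples fall off as new ones arrive
        self.samples = deque(maxlen=max_samples)

    def add(self, t, fx, fy, fz, mw, mp, mr):
        self.samples.append((t, fx, fy, fz, mw, mp, mr))

    def series(self, axis_index):
        """Return (times, values) for one of the six axes (0=X .. 5=R)."""
        times = [s[0] for s in self.samples]
        values = [s[1 + axis_index] for s in self.samples]
        return times, values

buf = ForceSampleBuffer()
buf.add(0.0, 1.0, 0.0, 9.8, 0.0, 0.0, 0.0)
buf.add(0.1, 1.2, 0.0, 9.7, 0.0, 0.0, 0.0)
print(buf.series(0))  # ([0.0, 0.1], [1.0, 1.2])
```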
- (Example 4) Example 4 will be described with reference to FIG. 11.
- Example 4 is an example of displaying information about a device (IO device) connected to the robot 30 (robot control device 50) in response to a selection or execution instruction for an icon related to the device. It is assumed that the "release icon 605" is selected from among the icons arranged in the program creation area 300 on the program creation screen 500 of FIG. 11 and an execution instruction is given. The release icon 605 corresponds to an instruction to open the hand 33.
- in response to the selection/execution instruction operation on the release icon 605, the related information display control unit 117 displays, on the pop-up screen 700, IO information 731 representing the IO control state of the hand 33 and a hand image 732 that graphically represents the current open/closed state of the hand 33.
- the IO information 731 and the hand image 732 representing the state of IO control may be displayed in real time during execution of the release icon 605, or may be displayed as information representing the execution result after execution of the release icon 605.
- for example, the related information display control unit 117 can acquire the IO control information from the robot control device 50.
- the teaching device 10 and the hand 33 may be directly connected so that the teaching device 10 directly acquires the control state of the hand 33 from the hand 33 (control section of the hand 33).
- a fifth embodiment will be described with reference to FIGS. 12-14.
- a fifth embodiment is an example of displaying parameters (setting information) of an icon according to the selection of the icon or an execution instruction.
- FIGS. 12 and 13 show an operation example of displaying information about the standby setting icon 606 in response to selection or execution of a command icon (standby setting icon 606) that causes the program to wait for a specified period of time.
- the standby setting icon 606 is selected in the icon display area 200 on the program creation screen 500 shown in FIG.
- in this case, the related information display control unit 117 displays the set time information 741 ("standby set time 6 sec") of the standby setting icon 606 on the pop-up screen 700.
- in this way, the user can confirm the setting information of a target icon simply by selecting it. It is assumed that the "standby set time 6 sec" displayed on the pop-up screen 700 of FIG. 12 is a value set in advance by the user via the parameter setting screen.
- when the user has not set the standby time, the related information display control unit 117 may display the default value of the standby set time (e.g., 5 sec) as the set time information 741. In the pop-up screen 700 of FIG. 12, since the standby setting icon 606 is not in operation, the elapsed time information 742 representing the actual elapsed standby time is displayed as 0.0 sec.
- the related information display control unit 117 causes the pop-up screen 700 to display information regarding the setting of the standby setting icon 606 and information regarding the state of the icon being executed, in response to the start of execution of the standby setting icon 606.
- specifically, the related information display control unit 117 displays real-time elapsed time information 742 ("3.3 sec") for the standby time together with the set time information 741 ("standby set time 6 sec") of the standby setting icon 606. This allows the user to confirm at a glance how long the robot 30 has been stationary.
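- the two lines of this pop-up can be sketched as a small formatting helper; the exact display wording is an assumption:

```python
# Small sketch of the standby-time display of FIGS. 12-13: the pop-up
# shows the configured wait time and, while the icon runs, the elapsed
# time. The function name and exact text are illustrative assumptions.

def standby_popup_text(set_time_sec, elapsed_sec):
    """Format the two lines shown on pop-up screen 700."""
    return (f"standby set time {set_time_sec:g} sec",
            f"elapsed {min(elapsed_sec, set_time_sec):.1f} sec")

print(standby_popup_text(6, 0.0))   # before execution (FIG. 12)
print(standby_popup_text(6, 3.3))   # during execution (FIG. 13)
```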
- FIG. 14 shows an example of displaying setting information about the linear movement icon 603 corresponding to the linear movement command by selecting the linear movement icon 603 in the icon display area 200 .
- the related information display control unit 117 displays information 751 of the setting details of the linear movement icon 603 on the pop-up screen 700.
- the information 751 of the linear movement icon 603 includes the operating speed of the linear movement (1000 mm/sec), designation of a "positioning" operation to stop at the target position after movement, use of position register [2] as the position designation, and a comment ("fetch location").
- the fifth embodiment supports the user who creates (teaches) the program and makes the programming more efficient.
- (Example 6) Example 6 will be described with reference to FIGS. 15 and 16.
- Example 6 relates to an operation example of displaying internal variables of a program related to a target icon. Here, information display when an icon related to a register (register icon 607) is selected will be described.
- the register icon 607 is specified and selected by the cursor 671 in the icon display area 200.
- the related information display control unit 117 may display that there is an error in the setting within the pop-up screen 700.
- in FIG. 16, the pop-up screen 700 shows a state in which the incorrect register number "0" is highlighted in a manner indicating that it is erroneous.
- according to Example 6, the internal variables of the program can be presented, and error information can also be presented when there is an error in the setting of a variable.
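- the error check behind this highlighting can be sketched as a range test on the register number; the valid range used here is an assumed example, not taken from the patent:

```python
# Hypothetical sketch of the register-number check behind FIG. 16: a
# register icon whose number falls outside the valid range is flagged
# so the pop-up can highlight it as an error. The valid range below is
# an assumed example.

VALID_REGISTER_RANGE = range(1, 201)   # e.g. R[1]..R[200]; assumption

def check_register_icon(register_number):
    """Return (is_valid, display_text) for the pop-up screen."""
    if register_number in VALID_REGISTER_RANGE:
        return True, f"R[{register_number}]"
    return False, f"R[{register_number}] : invalid register number"

print(check_register_icon(0))   # register "0" is out of range
```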
- (Example 7) Example 7 will be described with reference to FIG. 17.
- Example 7 is an example in which, when an operation instructing execution of a program of icons placed in the program creation area 300 is performed, information related to an icon is displayed in a pop-up before that icon is executed.
- FIG. 17 shows a situation in which information 751 of the setting content of the linear movement icon 603 is displayed while the action (vision) icon 602 is being executed.
- FIG. 17 describes an example of presenting information about one icon before execution of that icon.
- the information related to the icon is displayed before the icon is executed, so the user can grasp in advance the information related to the icon whose execution is scheduled to start. For example, if an error or the like is included in the settings of an icon scheduled to be executed, it is possible to take measures such as stopping the program before the icon is executed.
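One way to realize this pre-execution preview is to interpose a callback before each icon runs, giving the user a chance to stop the program. This is a hedged sketch — the callback protocol is an assumption, not the patent's implementation:

```python
def run_program(icons, show_info, confirm=lambda icon: True):
    """Execute icons in order, displaying each icon's related
    information *before* it runs; abort if the user declines."""
    executed = []
    for icon in icons:
        show_info(icon)            # pop-up with the icon's settings
        if not confirm(icon):      # e.g. the user spotted an error
            return executed, "stopped before " + icon["name"]
        executed.append(icon["name"])
    return executed, "completed"
```

Here `show_info` stands in for the related information display control unit's pop-up, and `confirm` for the user's decision to continue or stop.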
- the teaching device 10A has a configuration obtained by adding an attribute information setting unit 118 to the teaching device 10 according to the first embodiment.
- the teaching device 10A includes the functions of the teaching device 10 described above. It is assumed that the robot system to be programmed by the teaching device 10A is a welding robot system. A welding robot system can be realized by, in the robot system 100 shown in FIG. 1, attaching a welding torch to the arm tip of the robot 30 in place of the hand 33, connecting a welding power supply device that supplies welding current to the welding torch to the robot control device 50, and operating the welding power supply device under the control of the robot control device 50. This embodiment assumes such a welding robot system.
- the attribute information setting unit 118 provides a function of setting display attribute information defining display content to be displayed as icon-related information for each icon.
- the set display attribute information is stored in the icon data storage unit 111 in association with the icon.
- the related information display control unit 117 determines the display content according to the display attribute information set for the icon when displaying the information about the icon in response to the execution instruction for the icon.
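Determining the state-display content from an icon's display attribute amounts to a simple dispatch. A minimal sketch — the attribute names follow the 'motion'/'arc welding' example below, while the renderer table and context fields are hypothetical:

```python
# Map display attribute -> function producing the state display content.
RENDERERS = {
    "motion": lambda ctx: f"robot model at {ctx['position']}",
    "arc welding": lambda ctx: f"current {ctx['current']} A / voltage {ctx['voltage']} V",
}


def state_display_content(display_attribute: str, ctx: dict) -> str:
    """Pick what the state display area shows while an icon with the
    given display attribute is executing."""
    renderer = RENDERERS.get(display_attribute)
    if renderer is None:
        return ""                 # no attribute set: show nothing extra
    return renderer(ctx)
```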
- linear movement icons 621 and 622, a welding start icon 623, a welding point movement icon 624, and linear movement icons 625 and 626 are arranged in the program creation area 300 as a welding program.
- the welding start icon 623 represents an instruction to activate the welding power supply.
- the welding start icon 623 has a long strip shape, and represents that the icon (welding point movement icon 624) arranged therein is performed during welding.
- by this program, a welding operation is performed in which the robot moves from the initial position to the welding standby position, starts welding, and returns to the initial position after welding is completed.
- display attribute information is set for the icon as follows.
- Linear movement icon: 'motion'
- Welding start icon: 'arc welding'
- while a linear movement icon whose display attribute information is 'motion' is being executed, the related information display control unit 117 acquires the position information of the robot from the robot control device 50 in real time and displays the robot model 30M representing the motion of the robot in the state display area 450, as an example.
- while the welding start icon 623 whose display attribute information is 'arc welding' is being executed, the related information display control unit 117 switches the display content displayed in the state display area 450 to information indicating the welding state.
- indicator images 771 and 772 respectively indicating current and voltage during arc welding are displayed as information indicating the welding state.
- the related information display control section 117 acquires the welding current and voltage via the robot control device 50 .
- the timing of such information display may also be settable.
- the welding start icon 623 may be set so that the screen is not switched for three seconds after the welding command is executed.
- a priority may be set for each piece of display attribute information. For example, if a high priority is set for the display attribute 'arc welding', information with a high priority can be preferentially displayed when content to be displayed in the state display area 450 conflicts.
- These information display timings and priorities may also be set as attributes for each icon via the attribute information setting unit 118 .
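The display timing and priority attributes could combine as follows when two icons both want the state display area. This is a sketch under stated assumptions — the candidate record shape (`content`, `priority`, `start_time`, `delay`) is illustrative:

```python
def select_display(candidates, now: float):
    """candidates: list of dicts with 'content', 'priority', 'start_time'
    and optional 'delay' (seconds to wait after the command starts before
    switching the screen, as in the 3-second example above).
    Returns the content to show, preferring higher priority."""
    ready = [c for c in candidates
             if now - c["start_time"] >= c.get("delay", 0.0)]
    if not ready:
        return None
    return max(ready, key=lambda c: c["priority"])["content"]
```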
- an execution state attribute may be settable via the attribute information setting unit 118.
- a display example of information about icons based on execution state attributes will be described with reference to FIG. As an example, it is assumed that 'welding enabled' or 'welding disabled' can be set as an execution state attribute for the welding start icon 623.
- when the execution state attribute is 'welding enabled' (that is, when arc discharge is enabled), the related information display control unit 117 displays indicator images 771 and 772, indicating arc welding current and voltage respectively, in the state display area 450.
- when the execution state attribute is 'welding disabled' (that is, when arc discharge is disabled), the related information display control unit 117 displays a live image of the machined portion at the tip of the welding torch in the state display area 450.
- in this way, in the case of the welding-enabled setting, the current and voltage during arc welding are presented as the information desired by the user, and in the case of the welding-disabled setting, a live image M2 with which the position of the welding torch tip can be confirmed is presented as the information desired by the user.
- the setting regarding what kind of information is to be provided according to the execution state attribute may also be settable (that is, customizable) via the attribute information setting unit 118 .
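Branching the presented content on the execution state attribute can be sketched as follows. The state names follow the 'welding enabled'/'welding disabled' example; the data sources and return strings are hypothetical:

```python
def content_for_welding_icon(execution_state: str, ctx: dict) -> str:
    """Choose the state display for the welding start icon based on its
    execution state attribute ('welding enabled' / 'welding disabled')."""
    if execution_state == "welding enabled":
        # Arc discharge active: show the current/voltage indicators (771, 772).
        return f"indicators: {ctx['current']} A, {ctx['voltage']} V"
    if execution_state == "welding disabled":
        # Dry run: show the live image of the torch tip instead.
        return "live image M2 of welding torch tip"
    raise ValueError(f"unknown execution state: {execution_state}")
```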
- the attribute information setting unit 118 may receive the setting of the various attribute information described above through user operation.
- various attribute information may be input from an external device to the teaching device 10A, or may be stored in the icon data storage section 111 in advance.
- icon-related information may be displayed in a pop-up as in the first embodiment.
- the teaching device arranges, as a program creation screen, a first area (instruction list display area) for displaying a list of command sentences and a second area (program creation area) for creating the program by arranging command sentences selected from the first area.
- the first area may be configured as a pop-up menu screen that displays a list of command sentences such as 'CHOKUSEN', 'KAKUJIKU', 'YOBIDASHI HAND_OPEN', 'YOBIDASHI HAND_CLOSE', 'VISION KENSHUTSU', and 'VISION HOSEI DATA SHUTOKU'.
- here, the command sentence 'CHOKUSEN' corresponds to a command to move the control part of the robot along a straight trajectory, 'KAKUJIKU' corresponds to a command to move the robot by each-axis motion, 'YOBIDASHI HAND_OPEN' corresponds to a command to open the hand, 'YOBIDASHI HAND_CLOSE' corresponds to a command to close the hand and grasp the object, 'VISION KENSHUTSU' corresponds to a command to image the object with the visual sensor, and 'VISION HOSEI DATA SHUTOKU' corresponds to a command to detect the position of the object from the captured image.
- the user creates a program by selecting command sentences from the first area (pop-up menu screen) and arranging them in the second area (program creation area).
- An example of the program described in the second area (program creation area) is shown below.
1: CHOKUSEN ICHI[1] 2000mm/sec ICHIGIME ;
2: ;
3: VISION KENSHUTSU 'A' ;
4: VISION HOSEI DATA SHUTOKU 'A' VISION REJI[1] JUMP LABEL[100] ;
5: ;
6: !Handling ;
7: CHOKUSEN ICHI[2] 2000mm/sec NAMERAKA100 VISION HOSEI,VISION REJI[1] TOOL HOSEI,ICHI REJI[1] ;
8: CHOKUSEN ICHI[2] 500mm/sec ICHIGIME VISION HOSEI,VISION REJI[1] ;
9: YOBIDASHI HAND_CLOSE ;
10: CHOKUSEN ICHI[2] 2000mm/sec NAMERAKA100 VISION HOSEI,VISION REJI[1] TOOL HOSEI,ICHI REJI[1] ;
The above program detects the position of the object with the visual sensor (camera A) moved to a predetermined position (lines 1 to 5), and realizes an operation of grasping the object while correcting the robot position based on the detected position (lines 6 to 10).
- also when command sentences are used, the teaching device can have a functional block configuration equivalent to the functional blocks shown in FIG. 4, and the user creates a program by arranging command sentences in the second area (program creation area). Then, the teaching device (related information display control unit) displays information related to a command sentence, for example as a pop-up screen, in response to selection of the command sentence in the first area or the second area, or in response to an instruction to execute a command sentence placed in the second area.
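For command-sentence programs, looking up the related information for the selected statement could be as simple as keying on the leading keyword. A sketch — the table content paraphrases the command meanings given above, and the lookup function is hypothetical:

```python
COMMAND_INFO = {
    "CHOKUSEN": "move the control part along a straight trajectory",
    "KAKUJIKU": "move the robot by each-axis motion",
    "YOBIDASHI": "call a subprogram (e.g. HAND_OPEN / HAND_CLOSE)",
    "VISION": "visual-sensor operation (detection or position acquisition)",
}


def related_info(statement: str) -> str:
    """Return the pop-up text for a selected command sentence."""
    keyword = statement.split()[0].rstrip(";")
    return COMMAND_INFO.get(keyword, "no related information")
```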
- the program creation screen 500 is displayed in one rectangular frame, and the program creation area 300 and the icon display area 200 are included in the one rectangular frame.
- the program creation area 300 and the icon display area 200 may be displayed as separate windows.
- the program creation area 300, the icon display area 200, and the pop-up screen 700 are displayed as separate windows.
- the arrangement of the functional blocks of the visual sensor control device, the robot control device, and the teaching device shown in FIGS. 2 and 4 is an example, and the arrangement of these functional blocks can be changed as appropriate.
- the functional blocks of the teaching device shown in FIG. 4 may be realized by the processor of the teaching device executing various software stored in a storage device, or may be realized by a configuration mainly composed of hardware such as an ASIC (Application Specific Integrated Circuit).
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Numerical Control (AREA)
- Manipulator (AREA)
Abstract
Description
FIG. 1 is a diagram showing the overall configuration of a robot system 100 including a teaching device 10 according to one embodiment. As will be described later, the teaching device 10 is a teaching device for performing programming using commands representing the functions constituting the control program of a robot. In the following, as an example, the teaching device 10 is assumed to be a teaching device that enables programming using icons representing the functions constituting the robot control program (that is, representing robot control commands). Various configurations are possible for a robot system including the teaching device 10, but in the present embodiment, the robot system 100 shown in FIG. 1 will be described as an example. The robot system 100 includes a robot 30 having a hand 33 mounted on its arm tip, a robot control device 50 that controls the robot 30, the teaching device 10 connected to the robot control device 50, a visual sensor 70 attached to the arm tip of the robot 30, and a visual sensor control device 40 that controls the visual sensor 70. The robot system 100 is configured to detect an object 1 on a workbench 2 with the visual sensor 70 and to handle the object 1 with the hand 33 mounted on the robot 30.
"See icon" corresponding to the detection function using the visual sensor:
- initial settings of the camera
- teaching settings of the object
- setting of the imaging position
"Linear movement icon" corresponding to the linear movement function:
- operation speed
- positioning setting
- designation of a position register
As described above, default values may be set in advance for these parameters.
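The per-icon parameters with preset defaults might be modeled like this — a hypothetical sketch in which the field names and default values are assumptions, not values from the patent:

```python
from dataclasses import dataclass


@dataclass
class SeeIconParams:
    # Parameters of the "see icon" (visual-sensor detection); the defaults
    # are illustrative stand-ins for the preset values mentioned above.
    camera_setup: str = "default"
    taught_object: str = ""
    imaging_position: tuple = (0.0, 0.0, 0.0)


@dataclass
class LinearMoveParams:
    speed_mm_per_sec: int = 2000
    positioning: bool = True
    position_register: int = 1


def params_with_overrides(defaults, **overrides):
    """Start from the defaults and apply only the values the user edits."""
    p = defaults()
    for name, value in overrides.items():
        setattr(p, name, value)
    return p
```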
Example 1 will be described with reference to FIG. 7. Example 1 is an example in which, in response to an icon being selected in the icon display area 200 or the program creation area 300, sensor information related to that icon is displayed. Here, as an example, it is assumed that the "see icon 601" is selected. The see icon 601 provides a function of detecting an object with the visual sensor 70 and acquiring position information of the object.
Example 2 will be described with reference to FIG. 9. Example 2 is an example in which, when an operation instructing execution is performed on an icon placed in the program creation area 300, information on the execution result of that icon is displayed. On the program creation screen 500 shown in FIG. 9, the see icon 601, the action (vision) icon 602, and the linear movement icon 603 have been placed in the program creation area 300 by user operation. The action (vision) icon 602 provides a function of moving the robot 30 (control part) to the position detected by the see icon 601 and picking up the object. The linear movement icon 603 provides a function of moving the robot 30 (control part) linearly. Here, it is assumed that the user performs an operation of selecting and executing the see icon 601 in the program creation area 300 (for example, an operation of placing the cursor on the see icon 601 and clicking (or tapping) it).
Example 3 will be described with reference to FIG. 10. Example 3 is an example in which, in response to an icon related to a function using a force sensor being selected or instructed to execute in the icon display area 200, force sensor information related to that icon is displayed. In this example, it is assumed that a force sensor 71 is arranged in the robot system 100 as shown in FIG. 2. As shown in FIG. 10, it is assumed that the "pressing action icon 604" related to force control using the force sensor 71 has been selected in the icon display area 200. The force sensor 71 is assumed to be a six-axis force sensor.
Example 4 will be described with reference to FIG. 11. Example 4 is an example in which information about a device (IO device) connected to the robot 30 (robot control device 50) is displayed in response to selection of, or an execution instruction for, an icon related to that device. On the program creation screen 500 of FIG. 11, among the icons placed in the program creation area 300, the "release icon 605" is assumed to have been selected and instructed to execute. The release icon 605 corresponds to a command to open the hand 33.
Example 5 will be described with reference to FIGS. 12 to 14. Example 5 is an example in which parameters (setting information) of an icon are displayed in response to selection of, or an execution instruction for, that icon. FIGS. 12 and 13 show an operation example in which information about the standby setting icon 606 (the icon of a command that makes the program wait for a specified time) is displayed in response to the standby setting icon 606 being selected or instructed to execute.
Example 6 will be described with reference to FIGS. 15 and 16. Example 6 relates to an operation example of displaying internal variables of a program related to a target icon. Here, information display when an icon related to a register (register icon 607) is selected will be described.
Example 7 will be described with reference to FIG. 17. Example 7 is an example in which, when an operation instructing execution of the program of icons placed in the program creation area 300 is performed, information related to a given icon is displayed in a pop-up before that icon is executed.
Next, a teaching device 10A according to a second embodiment will be described. As shown in FIG. 18, the teaching device 10A has a configuration in which an attribute information setting unit 118 is added to the teaching device 10 according to the first embodiment. The teaching device 10A includes the functions of the teaching device 10 described above. It is assumed that the robot system to be programmed by the teaching device 10A is a welding robot system. A welding robot system can be realized by, in the robot system 100 shown in FIG. 1, attaching a welding torch to the arm tip of the robot 30 in place of the hand 33, connecting a welding power supply device that supplies welding current to the welding torch to the robot control device 50, and operating the welding power supply device under the control of the robot control device 50. This embodiment assumes such a welding robot system.
Linear movement icon: 'motion'
Welding start icon: 'arc welding'
While a linear movement icon whose display attribute information is 'motion' is being executed, the related information display control unit 117 acquires the position information of the robot from the robot control device 50 in real time and displays the robot model 30M representing the motion of the robot in the state display area 450, as an example. While the welding start icon 623 whose display attribute information is 'arc welding' is being executed, the related information display control unit 117 switches the display content displayed in the state display area 450 to information indicating the welding state. Here, an example is shown in which indicator images 771 and 772, indicating the current and voltage during arc welding respectively, are displayed as the information indicating the welding state. The related information display control unit 117 acquires the welding current and voltage via the robot control device 50.
'CHOKUSEN'
'KAKUJIKU'
'YOBIDASHI HAND_OPEN'
'YOBIDASHI HAND_CLOSE'
'VISION KENSHUTSU'
'VISION HOSEI DATA SHUTOKU'
The first area may be configured as a pop-up menu screen that displays a list of command sentences such as these. Here, the command sentence 'CHOKUSEN' corresponds to a command to move the control part of the robot along a straight trajectory, 'KAKUJIKU' corresponds to a command to move the robot by each-axis motion, 'YOBIDASHI HAND_OPEN' corresponds to a command to open the hand, 'YOBIDASHI HAND_CLOSE' corresponds to a command to close the hand and grasp the object, 'VISION KENSHUTSU' corresponds to a command to image the object with the visual sensor, and 'VISION HOSEI DATA SHUTOKU' corresponds to a command to detect the position of the object from the captured image.
1: CHOKUSEN ICHI[1] 2000mm/sec ICHIGIME ;
2: ;
3: VISION KENSHUTSU 'A' ;
4: VISION HOSEI DATA SHUTOKU 'A' VISION REJI[1] JUMP LABEL[100] ;
5: ;
6: !Handling ;
7: CHOKUSEN ICHI[2] 2000mm/sec NAMERAKA100 VISION HOSEI,VISION REJI[1] TOOL HOSEI,ICHI REJI[1] ;
8: CHOKUSEN ICHI[2] 500mm/sec ICHIGIME VISION HOSEI,VISION REJI[1] ;
9: YOBIDASHI HAND_CLOSE ;
10: CHOKUSEN ICHI[2] 2000mm/sec NAMERAKA100 VISION HOSEI,VISION REJI[1] TOOL HOSEI,ICHI REJI[1] ;
The above program detects the position of the object with the visual sensor (camera A) moved to a predetermined position (lines 1 to 5), and realizes an operation of grasping the object while correcting the robot position based on the detected position (lines 6 to 10).
2 workbench
10 teaching device
11 processor
12 memory
13 display unit
14 operation unit
15 input/output interface
30 robot
30M robot model
31 wrist flange
33, 33A hand
40 visual sensor control device
41 processor
42 memory
43 input/output interface
50 robot control device
51 processor
52 memory
53 input/output interface
54 operation unit
70 visual sensor
71 force sensor
100 robot system
110 program creation unit
111 icon data storage unit
112 screen generation unit
113 operation input reception unit
114 program generation unit
115 execution unit
116 parameter setting unit
117 related information display control unit
118 attribute information setting unit
200 icon display area
300 program creation area
401 input image
402 image processing unit
403 calibration data storage unit
450 robot model display area
501 operation control unit
601 see icon
602 action (vision) icon
603 linear movement icon
604 pressing action icon
605 release icon
606 standby setting icon
607 register icon
651 execution tab
652 details tab
671 cursor
700 pop-up screen
Claims (13)
- A teaching device for creating a control program of a robot, the teaching device comprising: a screen generation unit that generates a program creation screen for performing program creation using commands representing functions constituting the control program of the robot; and a related information display control unit that, in response to selection of or an execution instruction for a command placed on the program creation screen, displays information related to the command that is the target of the selection or the execution instruction.
- The teaching device according to claim 1, wherein the program creation screen has a first area that displays a list of one or more commands representing functions constituting the control program of the robot, and a second area for creating the control program by arranging commands selected from the first area, and the related information display control unit displays the information related to the target command in response to selection of a command in the first area or the second area, or an execution instruction for a command placed in the second area.
- The teaching device according to claim 1 or 2, wherein the related information display control unit displays information related to a command placed on the program creation screen while that command is being selected.
- The teaching device according to claim 1 or 2, wherein the related information display control unit displays, in response to an execution instruction for a command placed on the program creation screen, information related to the command that is the target of the execution instruction before, during, or after execution of that command.
- The teaching device according to claim 4, wherein the related information display control unit displays information on the executing state of the command during execution of the command that is the target of the execution instruction.
- The teaching device according to claim 4, wherein the related information display control unit displays information on the execution result of the command after execution of the command that is the target of the execution instruction.
- The teaching device according to any one of claims 1 to 6, wherein the information related to the target command includes any of parameter setting information of the command, information on a sensor or a device associated with the command, an execution result or executing state of the command, and an internal variable of a program related to the command.
- The teaching device according to claim 7, wherein the sensor is a visual sensor.
- The teaching device according to claim 7, wherein the sensor is a force sensor.
- The teaching device according to any one of claims 1 to 9, wherein the related information display control unit displays the information related to the target command as a pop-up.
- The teaching device according to any one of claims 1 to 10, further comprising an attribute information setting unit that sets, for the command, display attribute information defining display content to be displayed as the information related to the command, wherein the related information display control unit determines the display content according to the display attribute information set for the command when displaying the information about the command in response to an execution instruction for the command.
- The teaching device according to claim 11, wherein the attribute information setting unit is further configured to set an execution state attribute that is information on the execution state of the command, and the related information display control unit determines the display content further based on the execution state attribute when displaying the information about the command in response to an execution instruction for the command.
- The teaching device according to any one of claims 1 to 12, wherein the command is represented by an icon or a command sentence.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202280009306.6A CN116685916A (zh) | 2021-01-28 | 2022-01-24 | 示教装置 |
JP2022578374A JP7553612B2 (ja) | 2021-01-28 | 2022-01-24 | 教示装置 |
DE112022000322.7T DE112022000322T5 (de) | 2021-01-28 | 2022-01-24 | Einlernvorrichtung |
US18/254,880 US20240091927A1 (en) | 2021-01-28 | 2022-01-24 | Teaching device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-011999 | 2021-01-28 | ||
JP2021011999 | 2021-01-28 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022163590A1 true WO2022163590A1 (ja) | 2022-08-04 |
Family
ID=82654521
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/002464 WO2022163590A1 (ja) | 2021-01-28 | 2022-01-24 | 教示装置 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20240091927A1 (ja) |
JP (1) | JP7553612B2 (ja) |
CN (1) | CN116685916A (ja) |
DE (1) | DE112022000322T5 (ja) |
TW (1) | TW202228949A (ja) |
WO (1) | WO2022163590A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024116261A1 (ja) * | 2022-11-29 | 2024-06-06 | ファナック株式会社 | プログラム生成装置、方法およびプログラム |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08249026A (ja) * | 1995-03-10 | 1996-09-27 | Fanuc Ltd | ロボットを含むシステムのプログラミング方法 |
JP2011158981A (ja) * | 2010-01-29 | 2011-08-18 | Mori Seiki Co Ltd | 加工状況監視装置 |
JP2015182142A (ja) * | 2014-03-20 | 2015-10-22 | セイコーエプソン株式会社 | ロボット、ロボットシステム及び教示方法 |
JP6498366B1 (ja) * | 2018-07-10 | 2019-04-10 | 三菱電機株式会社 | 教示装置 |
-
2022
- 2022-01-17 TW TW111101833A patent/TW202228949A/zh unknown
- 2022-01-24 CN CN202280009306.6A patent/CN116685916A/zh active Pending
- 2022-01-24 WO PCT/JP2022/002464 patent/WO2022163590A1/ja active Application Filing
- 2022-01-24 JP JP2022578374A patent/JP7553612B2/ja active Active
- 2022-01-24 DE DE112022000322.7T patent/DE112022000322T5/de active Pending
- 2022-01-24 US US18/254,880 patent/US20240091927A1/en active Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08249026A (ja) * | 1995-03-10 | 1996-09-27 | Fanuc Ltd | ロボットを含むシステムのプログラミング方法 |
JP2011158981A (ja) * | 2010-01-29 | 2011-08-18 | Mori Seiki Co Ltd | 加工状況監視装置 |
JP2015182142A (ja) * | 2014-03-20 | 2015-10-22 | セイコーエプソン株式会社 | ロボット、ロボットシステム及び教示方法 |
JP6498366B1 (ja) * | 2018-07-10 | 2019-04-10 | 三菱電機株式会社 | 教示装置 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024116261A1 (ja) * | 2022-11-29 | 2024-06-06 | ファナック株式会社 | プログラム生成装置、方法およびプログラム |
Also Published As
Publication number | Publication date |
---|---|
US20240091927A1 (en) | 2024-03-21 |
DE112022000322T5 (de) | 2023-09-28 |
CN116685916A (zh) | 2023-09-01 |
JPWO2022163590A1 (ja) | 2022-08-04 |
JP7553612B2 (ja) | 2024-09-18 |
TW202228949A (zh) | 2022-08-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10882189B2 (en) | Control device and robot system | |
JP6343353B2 (ja) | ロボットの動作プログラム生成方法及びロボットの動作プログラム生成装置 | |
US20190202058A1 (en) | Method of programming an industrial robot | |
EP0792726A1 (en) | Teach pendant | |
US10095216B2 (en) | Selection of a device or object using a camera | |
JP2004351570A (ja) | ロボットシステム | |
JP5672326B2 (ja) | ロボットシステム | |
JP7553559B2 (ja) | プログラミング装置 | |
WO2022163590A1 (ja) | 教示装置 | |
JP7440620B2 (ja) | プログラム編集装置 | |
KR20180081773A (ko) | 산업적 시설을 제어하기 위한 응용프로그램들의 단순화된 변경을 위한 방법 | |
JP2011045937A (ja) | ロボットシステム | |
EP3411195B1 (en) | Controlling an industrial robot using interactive commands | |
JP7035555B2 (ja) | 教示装置、及びシステム | |
CN108369413A (zh) | 工业机器人和用于控制机器人自动选择接下来要执行的程序代码的方法 | |
KR102403021B1 (ko) | 로봇 교시 장치 및 이를 이용한 로봇 교시 방법 | |
TW202335810A (zh) | 教示操作盤及機器人控制系統 | |
WO2024028977A1 (ja) | 教示装置 | |
JP2011141584A (ja) | 入力制御機器 | |
JP2007207196A (ja) | プログラマブルロジックコントローラ、情報処理装置、制御プログラム、およびテーブル作成プログラム | |
JP2003039357A (ja) | ロボットの教示装置 | |
CN114800482B (zh) | 创建机器人的控制程序的方法、及其系统、以及记录介质 | |
WO2023026490A1 (ja) | 教示装置 | |
JP7436796B2 (ja) | ロボットのプログラム作成支援装置 | |
TWM410265U (en) | Human-machine interface system with automatic measurement |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22745811 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022578374 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18254880 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280009306.6 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112022000322 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 22745811 Country of ref document: EP Kind code of ref document: A1 |