WO2018221053A1 - Remote operation robot control system and remote operation robot control method - Google Patents
- Publication number
- WO2018221053A1 (application PCT/JP2018/016084)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robot
- work
- monitoring
- remote
- operator
- Prior art date
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/04—Viewing devices
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J3/00—Manipulators of master-slave type, i.e. both controlling unit and controlled unit perform corresponding spatial movements
Description
- The present invention relates to a remote work robot control system and a remote work robot control method.
- As a technique related to the control of an autonomous monitoring robot, Patent Document 1, for example, discloses a technique in which a number of operation variables that the operator can handle is selected from combinations of control modes, and the remaining operation variables are controlled indirectly according to changes in the operated variables.
- Patent Document 2 discloses a technique in which a computer infers the situation of a human wearing a wearable system, and a 3D robot system acts based on that understanding of the intention.
- Patent Document 3 discloses a technique for providing an operation detection unit in an autonomous mobile robot to realize movement and posture change intended by the operator.
- In order to improve conditions in harsh environments, typified by disaster sites that are difficult for humans to enter, remote operators must use remote work robots to perform tasks such as grasping, cutting, and drilling unknown work objects.
- In doing so, the remote operator performs the remote work while watching a monitoring video. To provide this monitoring video, a monitoring video robot operated by another operator has conventionally been used in addition to the remote work robot.
- Such a conventional method requires a separate monitoring operator, and communication between operators takes time; a robot that autonomously provides support such as monitoring is therefore desired.
- Patent Documents 1 to 3 described above address operation support based on intention estimation within a single robot, or work such as movement for which the operator's intention is easy to define.
- However, estimating the intention of the remote operator when monitoring contact work on an unknown object involves many uncertain factors and is difficult.
- The present invention therefore provides a remote work robot control system and a remote work robot control method capable of controlling a plurality of remote work robots with a small number of operators.
- The present invention includes a plurality of means for solving the above problems. As one example, the system includes: a work robot disposed in a remote place different from the place where the operator is located; an operation device with which the operator instructs the operation of the work robot; an operation control device that generates an operation signal for operating the work robot in response to the input of the operator's operation signal from the operation device; a work robot control device that converts the operation signal into a command signal for the work robot; a work robot sensor attached to each drive unit of the work robot; an autonomous monitoring robot that has a monitoring camera imaging the work robot and monitors the remote work performed by the work robot; an operator sensor that detects the behavior of the operator; a data processing device that generates a control operation signal for the autonomous monitoring operation of the autonomous monitoring robot based on sensor information from at least one of the operator sensor and the work robot sensor; an autonomous monitoring robot control device that converts the control operation signal generated by the data processing device into a command signal for the autonomous monitoring robot; and a display device that displays the monitoring video captured by the monitoring camera of the autonomous monitoring robot.
- According to the present invention, a plurality of remote work robots can be controlled by a small number of operators. Problems, configurations, and effects other than those described above will be clarified by the following description of the embodiments.
- FIG. 3 is a PAD (Problem Analysis Diagram) schematically showing the processing contents of the operation control of the work robot and the autonomous monitoring robot in the remote work robot control system of the present embodiment. FIG. 4 is a PAD showing the flow of conversion from the work robot joint angle sensor signals to the autonomous monitoring operation.
- Embodiments of a remote work robot control system and a remote work robot control method according to the present invention will be described with reference to FIGS. 1 to 6C.
- In the present embodiment, the remote work robot control system 1000 for controlling the remote work robots (the work robot 7 and the autonomous monitoring robot 10) is described, taking as an example a situation in which the object is unknown and the work device performs a carrying task in a high-radiation environment.
- First, an outline of the remote work robot control system 1000 is given with reference to FIGS. 1 and 2.
- FIG. 1 is a diagram showing a device configuration in a work environment according to the present embodiment.
- FIG. 2 is a functional block diagram showing the overall configuration of the remote work robot control system.
- In FIGS. 1 and 2, a work robot 7 is present in a work environment (remote place) 13 different from the place where the operator 1 is located, and performs work on the object 12 using an arm mechanism (gripping arm) 71 and a moving mechanism 72.
- In this embodiment, the work robot 7 uses a crawler-type moving mechanism 72 and a manipulator-type arm mechanism 71 as the work mechanism, but the configuration is not limited to these.
- The work robot 7 is connected to the work robot control device 6 by wire and/or wirelessly, and the work robot control device 6 is connected to the data processing device 5.
- A joint angle sensor (work robot sensor) 8 is attached to each drive unit of the arm mechanism 71 of the work robot 7 for work phase determination.
- The joint angle sensor 8 can be, for example, any one or more of a laser distance meter, an encoder, a potentiometer, an inclinometer, a geomagnetic sensor, and a gyro sensor, mounted in the work robot 7 as detection devices that detect the amount of movement and rotation of each drive unit.
- In addition to the joint angle sensor 8, any one or more of the following can be used: a camera that acquires image data of the surrounding environment as a device for detecting interaction with the object 12; an ultrasonic distance meter or a laser distance meter that measures the shape of the surrounding environment and the object 12; a force/torque sensor that measures the force and torque applied to the robot hand; a thermometer; a pressure sensor; and a current sensor as a detection device that detects the operation of each drive unit. Any one or more of these sensors can also be used instead of the joint angle sensor 8.
- The operator 1, who manages and monitors the work, operates the work robot 7 by inputting work instructions to the operation control device 4 with a joystick controller (operation device) 3, which instructs the operation of the work robot 7.
- In the environment where the operator 1 is located, a line-of-sight measuring device 2 is provided as an operator sensor; it is used to determine the work phase by detecting the behavior of the operator 1, specifically by detecting the gaze point of the operator 1 in the monitoring video displayed on the monitoring video display unit 45.
- In addition to the line-of-sight measuring device 2, one or more of the following can be used as the operator sensor: wearable sensors that detect the operator's operation, such as a force/torque sensor, a pressure-sensitive sensor, a geomagnetic sensor, a gyro sensor, and an acceleration sensor, or non-contact sensors such as motion capture.
- The autonomous monitoring robot 10 is a robot that monitors the remote work performed by the work robot 7; it includes a moving mechanism 102 that moves the autonomous monitoring robot 10, a monitoring camera that images the work robot 7, an arm mechanism 101 that holds the monitoring camera, and a joint angle sensor (autonomous monitoring robot sensor) 100 attached to each drive unit of the autonomous monitoring robot 10.
- The monitoring camera can be any one of an optical camera 11, an ultrasonic scanner, and a laser scanner; here, the optical camera 11 is used.
- The autonomous monitoring robot 10 is connected to the autonomous monitoring robot control device 9 by wire and/or wirelessly, and the autonomous monitoring robot control device 9 is connected to the data processing device 5.
- In operating the autonomous monitoring robot 10, the data processing device 5 determines the work phase based on the signals from the line-of-sight measuring device 2 worn by the operator 1 and the joint angle sensor 8 of the work robot 7, and the autonomous monitoring robot control device 9 generates the motion of the autonomous monitoring robot 10.
- During the work, the operator 1 can monitor the progress of the work, the state of the equipment, and the like through the monitoring video from the optical camera 11 attached to the autonomous monitoring robot 10.
- The monitoring video display unit (display device) 45 is a display, such as a liquid crystal display, that shows the monitoring video captured by the optical camera 11 of the autonomous monitoring robot 10.
- The operation control device 4 is a device that generates the operation signal for operating the work robot 7 in response to the operator 1's input on the joystick controller 3, and includes an operation input unit 40, a target value calculation unit 41, a gazing point position pattern detection unit (pattern detection unit) 42, a monitoring video output unit 43, and a data transmission/reception unit 44.
- The operation input unit 40 is the part that takes the operator 1's input from the joystick controller 3 into the operation control device 4.
- The target value calculation unit 41 is the part that calculates the target motion of the work robot 7 based on the operation input captured by the operation input unit 40.
- The gazing point position pattern detection unit 42 is the part that uses the line-of-sight measuring device 2 worn by the operator 1 to detect the pattern of work positions that the operator 1 gazes at in the monitoring video displayed on the monitoring video display unit 45.
- The monitoring video output unit 43 is the part that outputs the monitoring video for the operator 1 to the monitoring video display unit 45.
- The data transmission/reception unit 44 is the part that manages data transmission and reception with the data processing device 5.
- The data processing device 5 is a device that generates the control operation signal for the autonomous monitoring operation of the autonomous monitoring robot 10 based on the sensor information from the line-of-sight measuring device 2 and the joint angle sensor 8, and includes a data transmission/reception unit 50, a work phase determination unit 51, a storage unit 53, a camera position/posture generation unit 52, and an autonomous operation calculation unit 55.
- The data transmission/reception unit 50 is the part that manages data exchange with the operation control device 4, the work robot control device 6, and the autonomous monitoring robot control device 9.
- The work phase determination unit 51 is the part that determines the work phase.
- The work phase determination unit 51 obtains the operation speed of the arm mechanism 71 of the work robot 7 from the signals of the joint angle sensor 8 mounted on the work robot 7, determines the work phase from the relationship between the operation speed of the arm mechanism 71 and the distance between the arm mechanism 71 and the object 12, and generates a control operation signal suited to the determined work phase.
- The work phase determination unit 51 also determines the work phase based on the gaze point of the operator 1 acquired by the line-of-sight measuring device 2, and outputs the determined work phase to the camera position/posture generation unit 52.
- The storage unit 53 is a device, consisting of a hard disk, RAM, or the like, that stores the work training data 54 used by the work phase determination unit 51 to determine the work phase.
- The camera position/posture generation unit 52 is the part that generates the position of the optical camera 11 from the estimation result of the work phase and the geometric positional relationship between the work robot 7 and the object 12.
- The autonomous operation calculation unit 55 is the part that generates the operation signals necessary for the motion of the autonomous monitoring robot 10 to place the optical camera 11 at the camera target position.
- The work robot control device 6 is a device that converts the operation signal input with the joystick controller 3 and generated in the operation control device 4 into a command signal for the work robot 7, and includes a data transmission/reception unit 60, a command voltage calculation unit 61, and a robot control unit 62.
- The data transmission/reception unit 60 is the part that exchanges data with the data processing device 5.
- The command voltage calculation unit 61 is the part that calculates the target control amounts of the moving mechanism 72 and the arm mechanism 71 (the crawler rotation speed of the moving mechanism 72 and the joint angle of each joint of the arm mechanism 71) from the command signal of the work robot 7 transmitted from the data processing device 5.
- The robot control unit 62 is the part that generates the target command voltage for each drive unit of the work robot 7, using the calculated target values and the current angle (current posture) calculated from the signals of the joint angle sensor 8 built into each drive unit of the work robot 7.
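The patent does not specify the control law used to generate the target command voltage. A minimal sketch, assuming simple proportional feedback on the joint angle error with an assumed gain and voltage limit, might look like this:

```python
def command_voltage(target_angle: float, current_angle: float,
                    gain: float = 8.0, v_max: float = 24.0) -> float:
    """Drive a joint motor with a voltage proportional to the angle
    error (target minus current posture), saturated to the supply
    limit. The gain and limit are illustrative assumptions."""
    error = target_angle - current_angle      # [rad]
    voltage = gain * error                    # [V]
    return max(-v_max, min(v_max, voltage))   # clip to +/- v_max

print(command_voltage(target_angle=1.0, current_angle=0.8))
# -> approximately 1.6
```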
- The autonomous monitoring robot control device 9 is a device that converts the control operation signal generated by the data processing device 5 into a command signal for the autonomous monitoring robot 10, and includes a data transmission/reception unit 90, a command voltage calculation unit 91, a robot control unit 92, and a monitoring video acquisition unit 93.
- The data transmission/reception unit 90 is the part that exchanges data with the data processing device 5.
- The command voltage calculation unit 91 is the part that calculates the target control amounts of the moving mechanism 102 and the arm mechanism 101 of the autonomous monitoring robot 10 (the crawler rotation speed of the moving mechanism 102 and the joint angle of each joint of the arm mechanism 101) from the command signal of the autonomous monitoring robot 10 transmitted from the data processing device 5.
- The robot control unit 92 is the part that generates the target command voltage for each drive unit of the autonomous monitoring robot 10, using the calculated target values and the current angle (current posture) calculated from the signals of the joint angle sensor 100 built into each drive unit of the autonomous monitoring robot 10.
- The monitoring video acquisition unit 93 is the part that acquires the video from the optical camera 11 attached to the arm mechanism 101 of the autonomous monitoring robot 10.
- The operation control device 4, the data processing device 5, the work robot control device 6, and the autonomous monitoring robot control device 9 can be integrated or divided as appropriate, and are not limited to the configurations shown in FIGS. 1 and 2.
- For example, the operation control device 4 and the data processing device 5 may be integrated into a single device, or the work robot control device 6 may be divided into a part for the moving mechanism and a part for the work mechanism.
- In addition, the devices are connected here by wired cables, but a wireless system configuration may also be used.
- Next, the method of control processing for the remote work robot, consisting of the work robot 7 and the autonomous monitoring robot 10, is described with reference to FIGS. 3 to 6C.
- FIG. 3 is a PAD diagram schematically showing the operation control processing of the work robot and the autonomous monitoring robot control processing performed by the remote work robot control system of the present embodiment. FIG. 4 is a PAD diagram showing the flow of conversion from the work robot joint angle sensor signals to the autonomous monitoring operation.
- First, the overall flow of the control processing is described with reference to FIG. 3; the remote work robot control system 1000 continues to execute the processing shown in FIG. 3 while its power is on.
- In FIG. 3, the remote work robot control system 1000 first initializes the joint angle sensor 8 of the work robot 7 (step S1).
- Next, when it is recognized that an operation instruction has been input by the operator 1 (step S10), the following processing starts.
- First, the moving direction and moving speed of the moving mechanism 72 are calculated from the input signal (step S11), and the control amount of the moving mechanism 72 (voltage command values for the crawlers and the like) is generated (step S12).
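As a hedged illustration of steps S11 and S12, the commanded direction and speed can be converted into per-track speeds for a differential-drive (crawler) base. The function, the track width, and the kinematic model are assumptions for this sketch, not values from the patent:

```python
def crawler_commands(speed: float, turn_rate: float,
                     track_width: float = 0.6) -> tuple[float, float]:
    """Convert a forward speed [m/s] and a turn rate [rad/s] into
    left/right crawler speeds of a differential-drive base."""
    v_left = speed - turn_rate * track_width / 2.0
    v_right = speed + turn_rate * track_width / 2.0
    return v_left, v_right

print(crawler_commands(speed=0.3, turn_rate=0.5))
# -> approximately (0.15, 0.45)
```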
- In addition, the target joint positions and angles are calculated (step S13) using inverse kinematics, which obtains the joint coordinates of the arm mechanism 71 from the arm hand position commanded by the operator 1, and the control amount of the arm mechanism 71 (the voltage command value for each joint motor) is generated (step S14).
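Step S13 can be illustrated with the closed-form inverse kinematics of a two-link planar arm. This is a generic textbook formulation under assumed link lengths, not the actual kinematics of the arm mechanism 71:

```python
import math

def two_link_ik(x: float, y: float, l1: float = 0.5, l2: float = 0.4):
    """Joint angles (q1, q2) [rad] placing the hand of a two-link
    planar arm at (x, y); elbow-down closed-form solution."""
    cos_q2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if abs(cos_q2) > 1.0:
        raise ValueError("target position out of reach")
    q2 = math.acos(cos_q2)
    q1 = math.atan2(y, x) - math.atan2(l2 * math.sin(q2),
                                       l1 + l2 * math.cos(q2))
    return q1, q2

q1, q2 = two_link_ik(0.6, 0.3)  # joint targets for an assumed hand position
```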
- Next, the operation generation processing of the work robot 7 starts (step S20). Step S20 is a work robot control step that converts the operation signal based on the operator 1's operation instruction into a command signal for the work robot 7; based on the robot control amounts calculated from the work instructions in steps S12 and S14 described above, the target motion of the work robot 7 is generated.
- In step S20, first, the voltage command values for the moving mechanism 72 and the arm mechanism 71 calculated in steps S12 and S14 are read (step S21), and the voltage command values are output to each motor (step S22). Thereafter, the state of the work robot 7 is detected based on the information of the joint angle sensor 8 mounted on the work robot 7 (step S23), and the acquired data of the joint angle sensor 8 are transmitted to the operation control device 4 via the data processing device 5 (step S24).
- Next, the work situation is analyzed based on the information obtained from the work robot control device 6 (step S30). Step S30 is a data processing step that generates the control operation signal for the autonomous monitoring operation of the autonomous monitoring robot 10, based on sensor information from the line-of-sight measuring device 2 that detects the behavior of the operator 1 and the joint angle sensor 8 attached to each drive unit of the work robot 7.
- In step S30, first, the acquired data of the joint angle sensor 8 of the work robot 7 output from the data processing device 5 are received (step S31).
- Next, the posture of the work robot 7 is derived from the acquired data by forward kinematics (step S32).
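Step S32 runs in the opposite direction: forward kinematics recovers the hand position from the measured joint angles. A matching sketch for the same assumed two-link arm as above:

```python
import math

def two_link_fk(q1: float, q2: float, l1: float = 0.5, l2: float = 0.4):
    """Hand position (x, y) of a two-link planar arm, given the
    measured joint angles (the current posture)."""
    x = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
    return x, y

# Round trip against the IK sketch above: recovers the commanded hand
# position, e.g. two_link_fk(*two_link_ik(0.6, 0.3)) is approximately (0.6, 0.3).
```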
- Thereafter, the work phase is determined (step S33) from operation signals such as the positional relationship between the posture of the work arm and the object 12 (the distance between the arm mechanism 71 and the object 12) and the time variation of the posture of the work arm (the operation speed of the arm mechanism 71), and from the gazing point position pattern (the gaze point of the operator 1).
- Thereafter, the target position and posture of the optical camera 11 are generated from the positional relationship between the work robot 7 and the object 12, according to the work phase determined in step S33 (step S34).
- Next, the autonomous motion of the autonomous monitoring robot 10 (the control amount of the moving mechanism 102 and the like) that realizes the target position and posture of the optical camera 11 generated in step S34 is calculated (step S35).
- Thereafter, the control amount of the arm mechanism 101 of the autonomous monitoring robot 10 (the voltage command value for each joint motor) is generated (step S36).
- Next, based on the control amounts of the autonomous monitoring robot 10 calculated in step S30, motion generation processing that produces the target motion of the autonomous monitoring robot 10 starts (step S40). Step S40 is an autonomous monitoring robot control step that converts the control operation signal generated in step S35 of the data processing step into a command signal for the autonomous monitoring robot 10.
- In step S40, first, the voltage command values for the moving mechanism 102 and the arm mechanism 101 calculated in step S35 are read (step S41), and the voltage command values are output to each motor (step S42). Thereafter, a monitoring video is acquired by the optical camera 11 mounted on the autonomous monitoring robot 10 (step S43), and the acquired monitoring video is transmitted to the data processing device 5 (step S44).
- Next, presentation of the monitoring video to the operator 1 starts (step S50).
- This step S50 is a display step that displays the monitoring video captured by the optical camera 11 of the autonomous monitoring robot 10.
- In step S50, the monitoring video output unit 43 of the operation control device 4 first receives the monitoring video output in step S44 (step S51), and then outputs the monitoring video to the monitoring video display unit 45 (step S52).
- Here, the details of the work phase determination processing in step S33, based on the arm operation signals and the gazing point position pattern, are described with reference to FIG. 4.
- In FIG. 4, first, the data of the joint angle sensor 8 mounted on the work robot 7 are read, and the gazing point position pattern of the operator 1 in the monitoring video displayed on the monitoring video display unit 45 is read from the line-of-sight measuring device 2 (step S100).
- Thereafter, the position, posture, and movement speed of the work robot 7 are derived (step S101), and the geometric positional relationship between the object 12 and the arm mechanism 71 is detected (step S102).
- Thereafter, the work phase is determined (step S103) by comparing the derived and detected current position, posture, movement speed, and geometric positional relationship of the work robot 7 with the operation signals in the work training data 54, acquired in advance by training or the like and stored in the storage unit 53, such as the position, posture, and movement speed of the work robot 7, its positional relationship with the object 12, and the gazing point position of the operator 1.
- In addition, in order to keep the work training data 54 updated for the current work, the combination of the current operation signals and the determined work phase is recorded (step S104).
- The above is the basic flow of the work phase determination processing.
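One plausible reading of steps S103 and S104, sketched under assumptions: the work training data 54 is held as labeled feature vectors, a nearest-neighbor comparison yields the current phase, and each new determination is recorded so the data keep adapting. The feature layout and the classifier choice are illustrative; the patent does not fix a particular algorithm:

```python
import math

# Assumed feature layout: (arm speed, distance to object, gaze region id).
training_data = [
    ((0.6, 1.0, 0), "approach phase"),
    ((0.1, 0.1, 1), "position adjustment phase"),
    ((0.8, 0.1, 1), "abnormal operation phase"),
]

def determine_phase(signal) -> str:
    """Step S103: nearest-neighbor match of the current operation
    signal against the recorded work training data."""
    def dist(a, b):
        return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))
    _, phase = min(training_data, key=lambda rec: dist(rec[0], signal))
    return phase

def record(signal, phase) -> None:
    """Step S104: store the (signal, phase) combination so the
    training data keep adapting to the current work."""
    training_data.append((signal, phase))

phase = determine_phase((0.15, 0.12, 1))  # -> "position adjustment phase"
record((0.15, 0.12, 1), phase)
```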
- FIG. 5A and FIG. 5B are diagrams showing an example of work phase determination based on sensor signals in the remote work robot control system of this embodiment.
- FIG. 6A is a diagram illustrating an example of the target geometric arrangement relationships in the remote work robot control system of the present embodiment, and FIGS. 6B and 6C are diagrams illustrating examples of determining the target geometric arrangement.
- In this embodiment, the work phases up to the completion of gripping are set in advance in three stages: an “approach phase” in which the arm mechanism 71 approaches or moves away from the object 12; a “position adjustment phase” in which the fine position for gripping the object 12 with the arm mechanism 71 is adjusted; and an “abnormal operation phase” in which the speed of the arm mechanism 71 is too high for the proximity of the object 12 to the work robot 7, so that the operation is not a normal one.
- The boundaries between the work phases are obtained in advance from data recorded during remote training by the operator 1, and are stored in the storage unit 53 as the work training data 54, as shown in FIG. 5A.
- The work phase determination unit 51 determines online where the current state lies in FIG. 5A, and outputs the work phase to which that state belongs to the camera position/posture generation unit 52.
- Furthermore, it is desirable that the work phase determination unit 51 record the combination of the current operation signals and the determined work phase, as in step S104 of FIG. 4, update the boundary values of the work phases according to the operator 1 and the work environment 13 as shown in FIG. 5B, and store them in the storage unit 53.
- When the “abnormal operation phase” is determined, it is desirable to output, via the camera position/posture generation unit 52, a signal that stops the operation of the arm mechanism 71 and the moving mechanism 72 or invalidates the operation input signal received from the joystick controller 3 of the operator 1, in order to suppress the occurrence of unexpected situations.
- When determining the work phase from the gazing point, the work phase determination unit 51 divides the position of the gazing point in the monitoring video into three regions: the work robot body J1, the work robot hand J2, and the object J3.
- As shown in FIGS. 6B and 6C, the work phase determination unit 51 then obtains, for each region, how long the operator 1 has gazed at it, that is, the ratio (gaze ratio) at which each position is watched per unit time while operating the work robot 7, and determines the work phase from the obtained gaze ratios.
- As in step S104 of FIG. 4 and FIG. 5B, it is desirable that the work phase determination unit 51 also record the combination of the current gaze ratios and the determined work phase, update the boundary values of the work phases, and store them in the storage unit 53.
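The gaze-based determination can be sketched as binning gaze samples into the three regions J1 to J3, computing dwell (gaze) ratios per unit time, and mapping the dominant region to a phase. The region-to-phase rule below is an illustrative assumption, not a rule stated in the patent:

```python
from collections import Counter

REGIONS = ("J1_robot_body", "J2_robot_hand", "J3_object")

def gaze_ratios(samples):
    """Fraction of gaze samples falling in each region (per unit time)."""
    counts = Counter(samples)
    n = max(len(samples), 1)
    return {region: counts[region] / n for region in REGIONS}

def phase_from_gaze(samples) -> str:
    """Assumed rule: gazing mostly at the robot body suggests the
    approach phase; gazing at the hand or the object suggests fine
    position adjustment."""
    ratios = gaze_ratios(samples)
    dominant = max(ratios, key=ratios.get)
    return ("approach phase" if dominant == "J1_robot_body"
            else "position adjustment phase")

samples = ["J2_robot_hand"] * 6 + ["J3_object"] * 3 + ["J1_robot_body"]
print(phase_from_gaze(samples))  # -> position adjustment phase
```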
- When the work phase determined from the sensor signals and the work phase determined from the gazing point differ, the work phase determination unit 51 can set in advance which of the two determinations is given priority.
- When the work phase determination unit 51 determines that the work phase is the “approach phase”, a camera position is generated at which the entire work robot 7 and the object 12 are captured in the monitoring video from the optical camera 11 and their positional relationship is easy to understand.
- When the determined work phase is the “position adjustment phase” or the “work phase”, a position at which the optical camera 11 zooms in on the object 12 and the arm mechanism 71 of the work robot 7 is generated.
- When the determined work phase is the “abnormal operation phase”, a position signal is generated so as to maintain the current position; the position and posture of the optical camera 11 of the autonomous monitoring robot 10 are not recalculated, so that the current position and posture are maintained.
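Taken together, the camera position/posture generation amounts to a mapping from the determined work phase to a camera goal derived from the robot/object geometry. The sketch below assumes a 2D layout with illustrative offsets and zoom values; the actual generation unit 52 uses the full geometric positional relationship:

```python
def camera_goal(phase: str, robot_xy, object_xy):
    """Assumed mapping from the determined work phase to a camera
    target (2D sketch; offsets and zoom levels are illustrative)."""
    mid = ((robot_xy[0] + object_xy[0]) / 2.0,
           (robot_xy[1] + object_xy[1]) / 2.0)
    if phase == "approach phase":
        # Wide view: frame the whole robot and the object so their
        # positional relationship is easy to understand.
        return {"position": (mid[0], mid[1] - 3.0), "zoom": 1.0}
    if phase in ("position adjustment phase", "work phase"):
        # Close view: zoom in on the object and the robot hand.
        return {"position": (object_xy[0], object_xy[1] - 1.0), "zoom": 3.0}
    # Abnormal operation phase: hold the current camera position/posture.
    return None

goal = camera_goal("approach phase", robot_xy=(0.0, 0.0), object_xy=(2.0, 0.0))
```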
- As described above, in the present embodiment, the operation of the operator 1 is classified based on sensor information from at least one of the line-of-sight measuring device 2 and the joint angle sensor 8, and the operation signals necessary for the autonomous monitoring operation of the autonomous monitoring robot 10 are generated.
- As a result, the monitoring video necessary for the work can be acquired autonomously from the operations involved in operating the work robot 7, reflecting the intention of the operator 1 without assigning a monitoring operator, which was difficult with the conventional techniques, and it becomes possible to respond flexibly to changes in the work environment and work situation that arise during the work.
- Furthermore, since no monitoring operator is required, no time needs to be spent on communication between operators, and a plurality of robots can be operated by a small number of operators.
- In addition, since the data processing device 5 determines the work phase from the relationship between the operation speed of the arm mechanism 71 of the work robot 7 and the distance between the arm mechanism 71 and the object 12, and generates a control operation signal suited to the determined work phase, a monitoring video reflecting the intention of the operator 1 can be acquired with high accuracy, and changes in the work environment and work situation that occur during the work can be handled even more flexibly.
- Moreover, the operator sensor is the line-of-sight measuring device 2 that detects the gaze point of the operator 1 in the monitoring video displayed on the monitoring video display unit 45, and the data processing device 5 determines the work phase based on the gaze point of the operator 1 acquired by the line-of-sight measuring device 2 and generates a control operation signal suited to the determined work phase. This also makes it possible to obtain a monitoring video that reflects the intention of the operator 1 with high accuracy, and to respond even more flexibly to changes in the work environment and work situation that occur during the work.
- The present invention is not limited to the above embodiment and covers various kinds of work (cutting, drilling, and the like). The above embodiment has been described in detail for ease of understanding, and the invention is not necessarily limited to a configuration having all of the described functions.
- Each of the configurations, functions, and the like described above may be realized in software by having a processor interpret and execute a program that implements each function.
- Reference signs: 60 ... data transmission/reception unit, 61 ... command voltage calculation unit, 62 ... robot control unit, 71 ... arm mechanism, 72 ... moving mechanism, 90 ... data transmission/reception unit, 91 ... command voltage calculation unit, 92 ... robot control unit, 93 ... monitoring video acquisition unit, 100 ... joint angle sensor, 101 ... arm mechanism, 102 ... moving mechanism, 1000 ... remote work robot control system
Abstract
The present invention makes it possible to control multiple remote operation robots with a small number of operators. A system is provided with: a work robot 7; an operation control device 4 for generating an operation signal for the work robot 7 from an operation signal of an operator 1; joint angle sensors 8 installed on each drive part of the work robot 7; an autonomous monitoring robot 10 that has an optical camera 11 and monitors remote work by the work robot 7; a line-of-sight measuring device 2 for detecting the behavior of the operator 1; a data processing device 5 for generating the movement signals necessary for the autonomous monitoring movement of the autonomous monitoring robot 10 on the basis of sensor information from the line-of-sight measuring device 2 and the joint angle sensors 8; and a monitoring video display unit 45 for displaying the monitoring video taken by the optical camera 11 of the autonomous monitoring robot 10.
Description
本発明は、遠隔作業ロボット制御システムおよび遠隔作業ロボットの制御方法に係わる。
The present invention relates to a remote work robot control system and a remote work robot control method.
自律監視ロボットの制御に関する技術として、例えば、特許文献1には、オペレータにとって操作が可能な数の操作変数を選択して組み合わせた制御モードから選択し、操作された操作変数の変化に応じてその他の操作変数を間接的に制御する技術が開示されている。
As a technique related to the control of an autonomous monitoring robot, for example, in Patent Document 1, a number of operation variables that can be operated by an operator are selected from a combination of control modes, and other operations are performed according to changes in the operated operation variables. A technique for indirectly controlling the manipulated variable is disclosed.
また、特許文献2には、ウェアラブルシステムを装着した人間の置かれている状況をコンピュータが意図理解し、その意図理解に基づいて3次元ロボットシステムが遂行する技術が開示されている。
Patent Document 2 discloses a technique in which a computer intentionally understands a situation where a human wearing a wearable system is placed, and a 3D robot system performs based on the understanding of the intention.
また、特許文献3には、自律移動ロボットに操作検出部を設け、オペレータの意図する移動および姿勢の変化を実現する技術が開示されている。
Further, Patent Document 3 discloses a technique for providing an operation detection unit in an autonomous mobile robot to realize movement and posture change intended by the operator.
人間の立ち入りが困難な災害現場等に代表される苛酷環境の状況を改善していくためには、遠隔地にいるオペレータが遠隔作業ロボットを用いながら、未知な作業対象物の把持、切断、穿孔等の作業を行う必要がある。
In order to improve the conditions of harsh environments such as disaster sites where it is difficult for humans to enter, remote operators use remote work robots to grasp, cut, and drill unknown work objects. It is necessary to perform such work.
その際、遠隔作業オペレータは監視映像を見ながら遠隔作業を行うことになる。この監視画像を得るために、従来は、監視映像を提供するために、遠隔作業ロボットとは別に、別のオペレータによって操作される監視映像を提供するための監視映像ロボットを用いている。
At that time, the remote operator performs the remote operation while watching the monitoring video. In order to obtain this monitoring image, conventionally, in order to provide a monitoring image, a monitoring image robot for providing a monitoring image operated by another operator is used separately from the remote work robot.
このような従来の方法では、監視オペレータを別途必要とし、オペレータ間の意思疎通に時間を要していた。そのため、監視等の支援を自律的に提供するロボットが求められている。
In such a conventional method, a monitoring operator is required separately, and it takes time for communication between operators. Therefore, there is a demand for a robot that autonomously provides support such as monitoring.
ここで、上記したような特許文献1-3においては、ロボットが1台の中での意図推定に基づく動作支援や、移動等の意図の規定が容易な作業を対象としていた。しかしながら、未知な対象物の接触作業を監視するときの遠隔作業オペレータの意図を推定することは不確定要素が多く、困難であるとの課題がある。
Here, in Patent Documents 1 to 3 as described above, the operation support based on the intention estimation in one robot and the work that makes it easy to define the intention such as movement are targeted. However, there is a problem that it is difficult to estimate the intention of the remote operator when monitoring the contact operation of an unknown object because there are many uncertain factors.
本発明は、少数のオペレータで複数の遠隔作業ロボットを制御することを可能とする遠隔作業ロボット制御システムおよび遠隔作業ロボットの制御方法を提供する。
The present invention provides a remote work robot control system and a remote work robot control method capable of controlling a plurality of remote work robots with a small number of operators.
本発明は、上記課題を解決する手段を複数含んでいるが、その一例を挙げるならば、オペレータの居る場所とは異なる遠隔地に配置された作業ロボットと、前記オペレータによる前記作業ロボットの操作を指示する操作装置と、前記操作装置による前記オペレータの操作信号の入力を受けて前記作業ロボットを操作する操作信号を生成する操作制御装置と、前記操作信号を前記作業ロボットの指令信号に変換する作業ロボット制御装置と、前記作業ロボットの各駆動部に取り付けられた作業ロボットセンサと、前記作業ロボットを撮像する監視カメラを有し、前記作業ロボットによる遠隔作業を監視する自律監視ロボットと、前記オペレータの挙動を検出するオペレータセンサと、前記オペレータセンサおよび前記作業ロボットセンサとのうち少なくともいずれか一方からのセンサ情報に基づいて前記自律監視ロボットによる自律監視動作のための制御動作信号を生成するデータ処理装置と、前記データ処理装置で生成された前記制御動作信号を前記自律監視ロボットの指令信号に変換する自律監視ロボット制御装置と、前記自律監視ロボットの前記監視カメラが撮像した監視映像を表示する表示装置と、を備えたことを特徴とする。
The present invention includes a plurality of means for solving the above-described problems. For example, a work robot disposed in a remote place different from the place where the operator is located, and an operation of the work robot by the operator are provided. An operation device for instructing, an operation control device for generating an operation signal for operating the work robot in response to an input of the operation signal of the operator by the operation device, and work for converting the operation signal into a command signal for the work robot A robot control device; a work robot sensor attached to each drive unit of the work robot; a monitoring camera that images the work robot; an autonomous monitoring robot that monitors remote work by the work robot; An operator sensor for detecting behavior, and the operator sensor and the work robot sensor. A data processing device that generates a control operation signal for an autonomous monitoring operation by the autonomous monitoring robot based on sensor information from at least one of the sensors, and the autonomous monitoring of the control operation signal generated by the data processing device An autonomous monitoring robot control device for converting into a robot command signal and a display device for displaying a monitoring image captured by the monitoring camera of the autonomous monitoring robot are provided.
本発明によれば、少数のオペレータで複数の遠隔作業ロボットを制御することができる。上記した以外の課題、構成および効果は、以下の実施の形態の説明により明らかにされる。
According to the present invention, a plurality of remote work robots can be controlled by a small number of operators. Problems, configurations, and effects other than those described above will be clarified by the following description of embodiments.
本発明の遠隔作業ロボット制御システムおよび遠隔作業ロボットの制御方法の一実施の形態を、図1乃至図6Cを用いて説明する。
Embodiments of a remote work robot control system and a remote work robot control method according to the present invention will be described with reference to FIGS. 1 to 6C.
本実施の形態では、対象物が未知な状況、かつ高放射線環境において作業装置が運搬作業を実施する場合を例示しながら遠隔作業ロボット(作業ロボット7および自律監視ロボット10)を制御するための遠隔作業ロボット制御システム1000の内容を説明する。
In the present embodiment, a remote for controlling a remote work robot (work robot 7 and autonomous monitoring robot 10) while illustrating a situation in which an object is unknown and a work device performs a carrying work in a high radiation environment. The contents of the work robot control system 1000 will be described.
最初に、遠隔作業ロボット制御システム1000の概要について図1および図2を用いて説明する。図1は、本実施の形態に係る作業環境における機器構成を示す図である。図2は、作業ロボットの遠隔作業ロボットの制御システムの全体構成を示す機能ブロック図である。
First, an outline of the remote work robot control system 1000 will be described with reference to FIGS. 1 and 2. FIG. 1 is a diagram showing a device configuration in a work environment according to the present embodiment. FIG. 2 is a functional block diagram showing the overall configuration of the remote robot control system for the work robot.
図1および図2において、オペレータ1の居る場所とは異なる作業環境(遠隔地)13には、作業ロボット7が存在しており、対象物12に対する作業をアーム機構(把持アーム)71および移動機構72により実施する。本実施の形態では、作業ロボット7は移動機構72としてクローラ型、作業機構としてのアーム機構71をマニピュレータ型とする構成としているが、それらに限定はされない。
In FIG. 1 and FIG. 2, a work robot 7 exists in a work environment (remote place) 13 different from the place where the operator 1 is, and works on the object 12 by an arm mechanism (grip arm) 71 and a moving mechanism. 72. In this embodiment, the work robot 7 has a crawler type as the moving mechanism 72 and a manipulator type as the arm mechanism 71 as the work mechanism, but is not limited thereto.
作業ロボット7は作業ロボット制御装置6と有線および/または無線で接続されており、作業ロボット制御装置6はデータ処理装置5と接続されている。
The work robot 7 is connected to the work robot control device 6 by wire and / or wireless, and the work robot control device 6 is connected to the data processing device 5.
作業ロボット7のアーム機構71の各駆動部には、作業フェーズ判定用として関節角度センサ(作業ロボットセンサ)8が取り付けられている。この関節角度センサ8は、例えば、作業ロボット7内に搭載された、各駆動部の移動量や回転量を検出する検出装置としてのレーザ距離計、エンコーダ、ポテンショメータ、傾斜計、地磁気センサ、ジャイロセンサのうち、いずれか一つ以上とすることができる。なお、関節角度センサ8に加えて、対象物12との相互作用を検出する検出装置としてのとして周辺環境の映像データを取得するカメラ、超音波距離計、周辺環境や対象物12の形状を測定するレーザ距離計、ロボットの手先に加わる力・トルクを測定する力・トルクセンサ、温度計、感圧センサ、各駆動部の動作を検出する検出装置としての電流センサ、のうちいずれか一つ以上を用いることができるとともに、関節角度センサ8の代わりにこれらのセンサのうちいずれか一つ以上を用いることができる。
A joint angle sensor (work robot sensor) 8 is attached to each drive unit of the arm mechanism 71 of the work robot 7 for work phase determination. The joint angle sensor 8 is, for example, a laser distance meter, encoder, potentiometer, inclinometer, geomagnetic sensor, gyro sensor as a detection device that is mounted in the work robot 7 and detects the amount of movement and rotation of each drive unit. Any one or more of them. In addition to the joint angle sensor 8, as a detection device for detecting the interaction with the object 12, a camera that acquires image data of the surrounding environment, an ultrasonic distance meter, the shape of the surrounding environment and the object 12 are measured. Any one or more of: a laser distance meter, a force / torque sensor that measures the force and torque applied to the hand of the robot, a thermometer, a pressure sensor, and a current sensor as a detection device that detects the operation of each drive unit Can be used, and any one or more of these sensors can be used instead of the joint angle sensor 8.
作業を管理・監視するオペレータ1は、オペレータ1による作業ロボット7の操作を指示するジョイスティックコントローラ(操作装置)3を用いて操作制御装置4へ作業指示を入力し、作業ロボット7を操作する。
The operator 1 who manages and monitors the work inputs a work instruction to the operation control device 4 using a joystick controller (operation device) 3 that instructs the operation of the work robot 7 by the operator 1 to operate the work robot 7.
このオペレータ1の居る環境にはオペレータ1の挙動を検出することで作業フェーズを判定するために用いられるオペレータセンサとして、監視映像表示部45に表示された監視映像中でのオペレータ1の注視点を検出する視線計測装置2が設けられている。
In the environment where the operator 1 is present, an operator sensor used to determine the work phase by detecting the behavior of the operator 1 is used to set the gaze point of the operator 1 in the monitoring video displayed on the monitoring video display unit 45. A line-of-sight measuring device 2 for detection is provided.
なお、オペレータセンサは、視線計測装置2の他に、オペレータの操作を検知する力・トルクセンサ、感圧センサ、地磁気センサ、ジャイロセンサ、加速度センサ等の装着型センサ、モーションキャプチャ等の非接触型センサを一つ以上用いることができる。
In addition to the line-of-sight measuring device 2, the operator sensor is a force / torque sensor that detects an operator's operation, a pressure-sensitive sensor, a geomagnetic sensor, a gyro sensor, an acceleration sensor, or a non-contact type sensor such as a motion capture. One or more sensors can be used.
自律監視ロボット10は、作業ロボット7による遠隔作業を監視するロボットであり、自律監視ロボット10を移動させる移動機構102と、作業ロボット7を撮像する監視カメラと、この監視カメラを把持するアーム機構101と、自律監視ロボット10の各駆動部に取り付けられた関節角度センサ(自律監視ロボットセンサ)100を有している。監視カメラは、光学カメラ11、超音波スキャナ、レーザスキャナのいずれか1つとすることができる。ここでは光学カメラ11とする。
The autonomous monitoring robot 10 is a robot that monitors the remote work performed by the work robot 7, and includes a moving mechanism 102 that moves the autonomous monitoring robot 10, a monitoring camera that captures the work robot 7, and an arm mechanism 101 that holds the monitoring camera. And a joint angle sensor (autonomous monitoring robot sensor) 100 attached to each drive unit of the autonomous monitoring robot 10. The monitoring camera can be any one of the optical camera 11, an ultrasonic scanner, and a laser scanner. Here, the optical camera 11 is used.
自律監視ロボット10は自律監視ロボット制御装置9と有線および/または無線で接続されており、自律監視ロボット制御装置9はデータ処理装置5と接続されている。
The autonomous monitoring robot 10 is connected to the autonomous monitoring robot control device 9 in a wired and / or wireless manner, and the autonomous monitoring robot control device 9 is connected to the data processing device 5.
自律監視ロボット10では、オペレータ1が装着した視線計測装置2と作業ロボット7の関節角度センサ8からの信号をもとに、データ処理装置5において作業フェーズを判定し、自律監視ロボット制御装置9において自律監視ロボット10の動作を生成する。作業中は、オペレータ1は、自律監視ロボット10に取り付けた光学カメラ11からの監視映像を通して、作業の進行状況や装置の状態等を監視することが可能となっている。
In the autonomous monitoring robot 10, the work phase is determined in the data processing device 5 based on the signals from the line-of-sight measuring device 2 worn by the operator 1 and the joint angle sensor 8 of the work robot 7, and in the autonomous monitoring robot control device 9. The operation of the autonomous monitoring robot 10 is generated. During the operation, the operator 1 can monitor the progress of the operation, the state of the apparatus, and the like through the monitoring video from the optical camera 11 attached to the autonomous monitoring robot 10.
監視映像表示部(表示装置)45は、自律監視ロボット10の光学カメラ11が撮像した監視映像を表示する液晶等のディスプレイである。
The monitoring video display unit (display device) 45 is a display such as a liquid crystal that displays the monitoring video captured by the optical camera 11 of the autonomous monitoring robot 10.
操作制御装置4はジョイスティックコントローラ3によるオペレータ1の操作信号の入力を受けて作業ロボット7を操作する操作信号を生成する装置であり、操作入力部40、目標値算出部41、注視点位置パターン検出部(パターン検出部)42、監視映像出力部43、およびデータ送受信部44を含んでいる。
The operation control device 4 is a device that generates an operation signal for operating the work robot 7 in response to the input of the operation signal of the operator 1 by the joystick controller 3, and includes an operation input unit 40, a target value calculation unit 41, and a gazing point position pattern detection. Section (pattern detection section) 42, monitoring video output section 43, and data transmission / reception section 44.
操作入力部40は、オペレータ1がジョイスティックコントローラ3を用いた操作入力を操作制御装置4へ取り込む部分である。
The operation input unit 40 is a part in which the operator 1 inputs an operation input using the joystick controller 3 to the operation control device 4.
目標値算出部41は、操作入力部40において取り込んだ操作入力に基づいて作業ロボット7の目標動作を算出する部分である。
The target value calculation unit 41 is a part that calculates the target motion of the work robot 7 based on the operation input captured by the operation input unit 40.
注視点位置パターン検出部42は、オペレータ1が装着した視線計測装置2により監視映像表示部45に表示された監視映像中でオペレータ1が注視する作業位置パターンを検出する部分である。
The gazing point position pattern detection unit 42 is a part that detects a work position pattern that the operator 1 gazes in the monitoring image displayed on the monitoring image display unit 45 by the line-of-sight measurement device 2 worn by the operator 1.
監視映像出力部43は、オペレータ1への監視映像を監視映像表示部45に対して出力する部分である。
The monitoring video output unit 43 is a part that outputs a monitoring video for the operator 1 to the monitoring video display unit 45.
データ送受信部44は、データ処理装置5とのデータの送受信を管理する部分である。
The data transmission / reception unit 44 is a part that manages data transmission / reception with the data processing device 5.
データ処理装置5は、視線計測装置2や関節角度センサ8からのセンサ情報に基づいて自律監視ロボット10による自律監視動作のための制御動作信号を生成する装置であり、データ送受信部50、作業フェーズ判定部51、記憶部53、カメラ位置・姿勢生成部52、および自律動作算出部55を含んでいる。
The data processing device 5 is a device that generates a control operation signal for an autonomous monitoring operation by the autonomous monitoring robot 10 based on the sensor information from the line-of-sight measurement device 2 and the joint angle sensor 8. A determination unit 51, a storage unit 53, a camera position / posture generation unit 52, and an autonomous operation calculation unit 55 are included.
データ送受信部50は、操作制御装置4、作業ロボット制御装置6、自律監視ロボット制御装置9とのデータ授受を管理する部分である。
The data transmission / reception unit 50 is a part that manages data exchange with the operation control device 4, the work robot control device 6, and the autonomous monitoring robot control device 9.
作業フェーズ判定部51は、作業のフェーズを判定する部分である。この作業フェーズ判定部51では、作業ロボット7に搭載された関節角度センサ8の信号から作業ロボット7のアーム機構71の操作速度を求め、アーム機構71の操作速度およびアーム機構71と対象物12との距離との関係から作業フェーズを判定し、判定した作業フェーズに適した制御動作信号を生成する。また、作業フェーズ判定部51は、視線計測装置2によって取得したオペレータ1の注視点に基づいて作業フェーズを判定,生成し、判定した作業フェーズをカメラ位置・姿勢生成部52に出力する。
The work phase determination unit 51 is a part that determines a work phase. In this work phase determination unit 51, the operation speed of the arm mechanism 71 of the work robot 7 is obtained from the signal of the joint angle sensor 8 mounted on the work robot 7, the operation speed of the arm mechanism 71, the arm mechanism 71, the object 12, The work phase is determined from the relationship with the distance, and a control operation signal suitable for the determined work phase is generated. Further, the work phase determination unit 51 determines and generates a work phase based on the gaze point of the operator 1 acquired by the line-of-sight measurement device 2, and outputs the determined work phase to the camera position / posture generation unit 52.
記憶部53は、作業フェーズ判定部51での作業フェーズを判断するために用いる作業訓練データ54を格納するハードディスクやRAMから構成される装置である。
The storage unit 53 is a device that includes a hard disk and a RAM that store work training data 54 used to determine the work phase in the work phase determination unit 51.
カメラ位置・姿勢生成部52は、作業フェーズの推定結果と、作業ロボット7と対象物12との幾何学的位置関係と、から光学カメラ11の位置を生成する部分である。
The camera position / posture generation unit 52 is a part that generates the position of the optical camera 11 from the estimation result of the work phase and the geometric positional relationship between the work robot 7 and the object 12.
自律動作算出部55は、光学カメラ11をカメラ目標位置に配置するための自律監視ロボット10の動作に必要な動作信号を生成する部分である。
The autonomous movement calculation unit 55 is a part that generates an operation signal necessary for the movement of the autonomous monitoring robot 10 for placing the optical camera 11 at the camera target position.
作業ロボット制御装置6は、ジョイスティックコントローラ3によって入力され、操作制御装置4内で生成された操作信号を作業ロボット7の指令信号に変換する装置であり、データ送受信部60、指令電圧算出部61、およびロボット制御部62を含んでいる。
The work robot control device 6 is a device that converts an operation signal input by the joystick controller 3 and generated in the operation control device 4 into a command signal of the work robot 7, and includes a data transmission / reception unit 60, a command voltage calculation unit 61, And a robot controller 62.
データ送受信部60は、データ処理装置5とデータの授受を行う部分である。
The data transmission / reception unit 60 is a part that exchanges data with the data processing device 5.
指令電圧算出部61は、データ処理装置5から送信される作業ロボット7の指令信号から作業ロボット7の移動機構72とアーム機構71の目標制御量(移動機構72のクローラ回転速度、アーム機構71の各関節の関節角度)を算出する部分である。
The command voltage calculation unit 61 receives the target control amount (the crawler rotation speed of the moving mechanism 72, the crawler rotation speed of the moving mechanism 72, the arm mechanism 71) from the command signal of the working robot 7 transmitted from the data processing device 5. This is a part for calculating the joint angle of each joint.
ロボット制御部62は、算出された目標値と作業ロボット7の各駆動部に内蔵される関節角度センサ8の信号から算出される現在角度(現在姿勢)を用いて作業ロボット7の各駆動部の目標指令電圧を生成する部分である。
The robot control unit 62 uses the calculated target value and the current angle (current posture) calculated from the signal of the joint angle sensor 8 built in each drive unit of the work robot 7 to set the drive unit of each work robot 7. This is the part that generates the target command voltage.
自律監視ロボット制御装置9は、データ処理装置5で生成された制御動作信号を自律監視ロボット10の指令信号に変換する装置であり、データ送受信部90、指令電圧算出部91、ロボット制御部92、および監視映像取得部93を含んでいる。
The autonomous monitoring robot control device 9 is a device that converts the control operation signal generated by the data processing device 5 into a command signal of the autonomous monitoring robot 10, and includes a data transmission / reception unit 90, a command voltage calculation unit 91, a robot control unit 92, And a monitoring video acquisition unit 93.
データ送受信部90は、データ処理装置5とデータの授受を行う部分である。
The data transmission / reception unit 90 is a part that exchanges data with the data processing device 5.
指令電圧算出部91は、データ処理装置5から送信される自律監視ロボット10の指令信号から自律監視ロボット10の移動機構102とアーム機構101の目標制御量(移動機構102のクローラ回転速度、アーム機構101の各関節の関節角度)を算出する部分である。
The command voltage calculation unit 91 receives the target control amount (the crawler rotation speed of the movement mechanism 102, the arm mechanism) from the command signal of the autonomous monitoring robot 10 transmitted from the data processing device 5 based on the movement mechanism 102 and the arm mechanism 101 of the autonomous monitoring robot 10. (Joint angle of each joint 101) is calculated.
ロボット制御部92は、算出された目標値と自律監視ロボット10の各駆動部に内蔵される関節角度センサ100の信号から算出される現在角度(現在姿勢)を用いて自律監視ロボット10の各駆動部の目標指令電圧を生成する部分である。
The robot control unit 92 drives each of the autonomous monitoring robots 10 using the calculated target value and the current angle (current posture) calculated from the signal of the joint angle sensor 100 built in each driving unit of the autonomous monitoring robot 10. This is a part for generating the target command voltage of the part.
監視映像取得部93は、自律監視ロボット10のアーム機構101に取り付けられた光学カメラ11からの映像を取得する部分である。
The monitoring video acquisition unit 93 is a part that acquires a video from the optical camera 11 attached to the arm mechanism 101 of the autonomous monitoring robot 10.
なお、操作制御装置4、データ処理装置5、作業ロボット制御装置6、および自律監視ロボット制御装置9の各装置は、統合・分割が適宜可能であり、図1,2の構成に限定されない。例えば、操作制御装置4とデータ処理装置5を統合して一つの装置にしても良いし、作業ロボット制御装置6を移動機構用と作業機構用に分割しても良い。また、各装置間は有線ケーブルにより接続されているが無線化されたシステム構成でも良い。
The operation control device 4, the data processing device 5, the work robot control device 6, and the autonomous monitoring robot control device 9 can be appropriately integrated and divided, and are not limited to the configurations shown in FIGS. For example, the operation control device 4 and the data processing device 5 may be integrated into a single device, or the work robot control device 6 may be divided into a movement mechanism and a work mechanism. In addition, each device is connected by a wired cable, but a wireless system configuration may be used.
次に、本実施の形態に係る作業ロボット7および自律監視ロボットからなる遠隔作業ロボットの制御処理の方法について図3乃至図6Cを参照して説明する。
Next, a method for controlling the remote work robot including the work robot 7 and the autonomous monitoring robot according to the present embodiment will be described with reference to FIGS. 3 to 6C.
図3は、本実施形態の遠隔作業ロボット制御システムによる作業ロボットの動作制御処理および自律監視ロボット制御処理内容を概略的に示すPAD図である。図4は、本実施形態の遠隔作業ロボット制御システムにおける、作業ロボット関節角度センサ信号から自律監視動作への変換処理フローを示すPAD図である。
FIG. 3 is a PAD diagram schematically showing the contents of the operation control process and the autonomous monitoring robot control process of the work robot by the remote work robot control system of the present embodiment. FIG. 4 is a PAD showing a conversion process flow from the work robot joint angle sensor signal to the autonomous monitoring operation in the remote work robot control system of the present embodiment.
最初に、遠隔作業ロボットの制御処理の全体の流れについて図3を用いて説明する。なお、遠隔作業ロボット制御システム1000では、図3に示す処理を電源がONの間は実行し続ける。
First, the overall flow of the remote robot control process will be described with reference to FIG. In the remote work robot control system 1000, the processing shown in FIG. 3 is continuously executed while the power is on.
図3において、遠隔作業ロボット制御システム1000は、まず、作業ロボット7の関節角度センサ8を初期化する(ステップS1)。
In FIG. 3, the remote work robot control system 1000 first initializes the joint angle sensor 8 of the work robot 7 (step S1).
次に、オペレータ1によって操作指示が入力されたことを認識する(ステップS10)と、以下の処理を開始する。
Next, when it is recognized that an operation instruction has been input by the operator 1 (step S10), the following processing is started.
最初に、入力信号に応じて移動機構72の移動方向・移動速度を算出(ステップS11)して、移動機構72の制御量(クローラ等の電圧指令値)を生成(算出)する(ステップS12)。また、オペレータ1が操作指示するアーム手先位置からアーム機構71の各関節座標を求める逆運動学を用いて、目標関節の位置・角度を算出(ステップS13)して、アーム機構71の制御量(各関節モータの電圧指令値)を生成(算出)する(ステップS14)。
First, the moving direction / moving speed of the moving mechanism 72 is calculated according to the input signal (step S11), and a control amount (voltage command value of a crawler or the like) of the moving mechanism 72 is generated (calculated) (step S12). . In addition, the position and angle of the target joint is calculated (step S13) using inverse kinematics that obtains the joint coordinates of the arm mechanism 71 from the arm hand position indicated by the operator 1, and the control amount of the arm mechanism 71 ( A voltage command value for each joint motor is generated (calculated) (step S14).
次に、作業ロボット7の動作生成処理を開始する(ステップS20)。このステップS20は、オペレータ1の操作指示による作業ロボット7の操作信号を作業ロボット7の指令信号に変換する作業ロボット制御ステップであり、上述したステップS12,S14において作業指示から算出されたロボットの制御量を基に、作業ロボット7の目標動作を生成する。
Next, the operation generation process of the work robot 7 is started (step S20). This step S20 is a work robot control step for converting the operation signal of the work robot 7 according to the operation instruction of the operator 1 into the command signal of the work robot 7, and the robot control calculated from the work instruction in the above-described steps S12 and S14. Based on the quantity, a target action of the work robot 7 is generated.
ステップS20では、まず、ステップS12,S14で算出された移動機構72・アーム機構71の電圧指令値を読み込む(ステップS21)と、各モータへの電圧指令値を出力する(ステップS22)。その後、作業ロボット7に搭載した関節角度センサ8の情報を基に、作業ロボット7の状態を検出し(ステップS23)、データ処理装置5を介して操作制御装置4に向けて関節角度センサ8の取得データを送信する(ステップS24)。
In step S20, first, when the voltage command values of the moving mechanism 72 and the arm mechanism 71 calculated in steps S12 and S14 are read (step S21), the voltage command values to each motor are output (step S22). Thereafter, based on the information of the joint angle sensor 8 mounted on the work robot 7, the state of the work robot 7 is detected (step S 23), and the joint angle sensor 8 is directed toward the operation control device 4 via the data processing device 5. The acquired data is transmitted (step S24).
次に、作業ロボット制御装置6から得られる情報を基に、作業状況を分析する(ステップS30)。このステップS30は、オペレータ1の挙動を検出する視線計測装置2および作業ロボット7の各駆動部に取り付けられた関節角度センサ8とからのセンサ情報に基づいて自律監視ロボット10による自律監視動作のための制御動作信号を生成するデータ処理ステップである。
Next, based on the information obtained from the work robot controller 6, the work situation is analyzed (step S30). This step S30 is for an autonomous monitoring operation by the autonomous monitoring robot 10 based on sensor information from the line-of-sight measuring device 2 that detects the behavior of the operator 1 and the joint angle sensor 8 attached to each drive unit of the work robot 7. This is a data processing step for generating the control operation signal.
ステップS30では、まず、データ処理装置5から出力された作業ロボット7の関節角度センサ8の取得データを受信する(ステップS31)。次いで、取得データから作業ロボット7の姿勢を順運動学により導出する(ステップS32)。その後、作業アームの姿勢と対象物12との位置関係(アーム機構71と対象物12との距離との関係)や作業アームの姿勢の時間変化などの動作信号(アーム機構71の操作速度)や、注視点位置パターン(オペレータ1の注視点)により作業フェーズを判定する(ステップS33)。その後、ステップS33で判定した作業フェーズに応じて、作業ロボット7と対象物12との位置関係から光学カメラ11の目標位置・姿勢を生成する(ステップS34)。次いで、ステップS34で生成した光学カメラ11の目標位置・姿勢となるような自律監視ロボット10の自律動作(移動機構102の制御量等)を算出する(ステップS35)。その後、自律監視ロボット10のアーム機構101の制御量(各関節モータの電圧指令値)を生成する(ステップS36)。
In step S30, first, the acquired data of the joint angle sensor 8 of the work robot 7 output from the data processing device 5 is received (step S31). Next, the posture of the work robot 7 is derived from the acquired data by forward kinematics (step S32). Thereafter, an operation signal (operation speed of the arm mechanism 71) such as a positional relationship between the posture of the work arm and the object 12 (a relationship between the distance between the arm mechanism 71 and the object 12), a time change of the posture of the work arm, and the like. The work phase is determined based on the gazing point position pattern (gaze point of the operator 1) (step S33). Thereafter, the target position / orientation of the optical camera 11 is generated from the positional relationship between the work robot 7 and the object 12 according to the work phase determined in step S33 (step S34). Next, the autonomous operation (the control amount of the moving mechanism 102, etc.) of the autonomous monitoring robot 10 that calculates the target position / posture of the optical camera 11 generated in step S34 is calculated (step S35). Thereafter, a control amount (voltage command value of each joint motor) of the arm mechanism 101 of the autonomous monitoring robot 10 is generated (step S36).
Next, based on the control amounts of the autonomous monitoring robot 10 calculated in step S30, the motion generation process that generates a target motion of the autonomous monitoring robot 10 is started (step S40). Step S40 is an autonomous monitoring robot control step that converts the control operation signal generated in step S35 of the data processing step into a command signal for the autonomous monitoring robot 10.
In step S40, the voltage command values for the moving mechanism 102 and the arm mechanism 101 calculated in step S35 are first read in (step S41), and the voltage command values are then output to each motor (step S42). Thereafter, monitoring video is acquired by the optical camera 11 mounted on the autonomous monitoring robot 10 (step S43), and the acquired monitoring video is transmitted to the data processing device 5 (step S44).
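As a rough illustration of the command-and-video cycle of steps S41 to S44, the sketch below uses stub motor, camera, and data-link classes in place of the actual device interfaces, which are not specified here.

```python
from dataclasses import dataclass

@dataclass
class MotorCommand:
    joint_id: int
    voltage: float   # voltage command value computed in step S35

class StubMotors:
    def apply_voltage(self, joint_id, voltage):
        print(f"motor {joint_id}: {voltage:.2f} V")          # step S42

class StubCamera:
    def capture(self):
        return b"frame-bytes"                                # step S43

class StubLink:
    def send(self, frame):
        print(f"sent {len(frame)} bytes to data processor")  # step S44

def run_monitoring_cycle(commands, motors, camera, link):
    """One cycle of step S40: output the voltage commands read in step
    S41, then acquire and forward one monitoring frame."""
    for cmd in commands:                                     # S41 -> S42
        motors.apply_voltage(cmd.joint_id, cmd.voltage)
    link.send(camera.capture())                              # S43 -> S44

run_monitoring_cycle([MotorCommand(0, 1.5), MotorCommand(1, -0.8)],
                     StubMotors(), StubCamera(), StubLink())
```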
Next, presentation of the monitoring video to the operator 1 is started (step S50). Step S50 is a display step that displays the monitoring video captured by the optical camera 11 of the autonomous monitoring robot 10.
In step S50, the monitoring video output unit 43 of the operation control device 4 first receives the monitoring video output in step S44 (step S51), and then outputs the monitoring video to the monitoring video display unit 45 (step S52).
Here, the details of the work phase determination process of step S33, based on the arm motion signals and the gaze point position pattern, will be described with reference to FIG. 4.
In FIG. 4, the data of the joint angle sensors 8 mounted on the work robot 7 is first read in, and the gaze point position pattern of the operator 1 on the monitoring video displayed on the monitoring video display unit 45 is read in from the line-of-sight measurement device 2 (step S100).
Thereafter, the position, posture, and movement speed of the work robot 7 are derived (step S101), and the geometric positional relationship between the object 12 and the arm mechanism 71 is detected (step S102).
The work phase is then determined from the derived and detected current position, posture, movement speed, and geometric positional relationship of the work robot 7, together with the motion signals in the work training data 54, such as the position, posture, and movement speed of the work robot 7, its positional relationship with the object 12, and the gaze point position of the operator 1, acquired in advance through training or the like and stored in the storage unit 53 (step S103).
In addition, in order to keep the work training data 54 updated for the current work, the combination of the current motion signals and the determined work phase is recorded (step S104). The above constitutes the basic flow of the work phase determination process.
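The basic loop of FIG. 4 can be summarized in code as follows. This sketch assumes the work training data 54 can be represented as labeled (motion signal, phase) samples and uses a nearest-neighbor rule as a stand-in classifier; the actual decision rule defined by the training data may differ.

```python
import math

# Hypothetical stand-in for work training data 54: motion-signal
# feature vectors (distance to object, arm speed, gaze ratio on hand)
# labeled with the phase recorded during prior operator training.
training_data = [
    ((0.80, 0.40, 0.30), "approach"),
    ((0.10, 0.05, 0.70), "adjust"),
    ((0.30, 0.90, 0.20), "abnormal"),
]

def determine_phase(signal):
    """Step S103: pick the phase of the nearest training sample."""
    return min(training_data, key=lambda s: math.dist(s[0], signal))[1]

def record_sample(signal, phase):
    """Step S104: append the current (signal, phase) combination so the
    training data keeps adapting to the current work."""
    training_data.append((signal, phase))

current = (0.15, 0.08, 0.65)   # derived in steps S101-S102
phase = determine_phase(current)
record_sample(current, phase)
print(phase)  # -> "adjust"
```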
Next, an example of the work phase determination in step S33 of step S30 in FIG. 3 will be described in detail with reference to FIGS. 5A to 6C.
FIGS. 5A and 5B are diagrams showing an example of work phase determination based on sensor signals in the remote work robot control system of this embodiment. FIG. 6A is a diagram showing an example of the target geometric arrangement relationships in the remote work robot control system of this embodiment, and FIGS. 6B and 6C are diagrams showing examples of determining the target geometric arrangement.
First, an example of the work phase determination process based on the relationship between the operation speed of the arm mechanism 71 of the work robot 7 and the distance between the arm mechanism 71 and the object 12, for the case where the remote work is a grasping task, will be described with reference to FIGS. 5A and 5B.
In the remote work robot control method of this embodiment, the work phases up to the completion of grasping are set in advance in three stages: an "approach phase" in which the arm mechanism 71 moves toward or away from the object 12; a "position adjustment phase" in which the fine position for grasping the object 12 with the arm mechanism 71 is adjusted; and an "abnormal operation phase" in which the speed of the arm mechanism 71 is too high relative to the distance between the object 12 and the work robot 7 and the operation is therefore not a normal one. The boundaries between the work phases are obtained from data recorded during prior remote operation training by the operator 1 and stored in the storage unit 53 as the work training data 54, as shown in FIG. 5A.
During actual remote work, the work phase determination unit 51 determines online where the combination of the distance between the arm mechanism 71 of the work robot 7 and the object 12 and the speed of the arm mechanism 71 falls in FIG. 5A, and outputs the work phase to which the determined point belongs to the camera position/posture generation unit 52.
At this time, when the speed of the arm mechanism 71 of the work robot 7 is zero, the operation is stopped, and the phase is determined to be the "current-state maintenance phase".
When the distance between the object and the work robot is zero, the grasping operation is considered to be in progress and the phase is determined to be the "work phase". In this case, it is desirable to output a signal to the monitoring video display unit 45 of the operation control device 4 so that a "grasp possible" indication is displayed.
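A minimal sketch of this online determination over the regions of FIG. 5A might look as follows, where the boundary values separating the phases are assumed placeholders; in the described system they would come from the operator's work training data 54.

```python
def phase_from_distance_speed(distance, speed,
                              adjust_dist=0.2, abnormal_gain=2.0):
    """Classify the work phase from arm-object distance [m] and arm
    speed [m/s], following the regions sketched in FIG. 5A.
    adjust_dist and abnormal_gain are illustrative boundary values."""
    if speed == 0.0:
        return "current-state maintenance"   # operation stopped
    if distance <= 0.0:
        return "work"                        # contact: grasp possible
    if speed > abnormal_gain * distance:
        return "abnormal operation"          # too fast for the distance
    if distance < adjust_dist:
        return "position adjustment"
    return "approach"

for d, v in [(0.5, 0.3), (0.1, 0.05), (0.1, 0.9), (0.4, 0.0), (0.0, 0.1)]:
    print(f"d={d}, v={v} -> {phase_from_distance_speed(d, v)}")
```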
Note that the work phase determination unit 51 in this step preferably records the combinations of the current motion signals and the determined work phases, as shown in step S104 of FIG. 4, updates the boundary values of the work phases according to the operator 1 and the work environment 13 as shown in FIG. 5B, and stores them in the storage unit 53.
In the abnormal operation phase, in order to prevent an unexpected situation from occurring, it is desirable to output, via the camera position/posture generation unit 52, a signal for stopping the operation of the arm mechanism 71 and the moving mechanism 72 or for invalidating the operation input signal from the joystick controller 3 of the operator 1.
Next, an example of the work phase determination process using the gaze point position pattern of the operator 1 acquired by the line-of-sight measurement device 2 worn by the operator will be described with reference to FIGS. 6A to 6C.
As shown in FIG. 6A, the work phase determination unit 51 divides the gaze point positions in the monitoring video into three regions: the work robot body J1, the work robot hand J2, and the object J3.
With the gaze points divided in this way, the work phase determination unit 51 obtains, for each region, how long the operator 1 has gazed at it, as shown in FIGS. 6B and 6C, and calculates the fraction of time spent gazing at each position per unit time of operating the work robot 7 (the gaze ratio). The work phase is then determined from the obtained gaze ratios.
For example, as shown in FIG. 6B, when the regions of the work robot body J1, the work robot hand J2, and the object J3 are gazed at relatively evenly, the phase is determined to be the "approach phase". As shown in FIG. 6C, when the ratio of gazing at the work robot hand J2 and the object J3 is high compared with the work robot body J1, the phase is determined to be the "position adjustment phase".
When the ratio of gaze points falling outside the regions J1, J2, and J3 is determined to be high, or when the operator 1 is not gazing at the monitoring image and there are almost no gaze points per unit time, it is desirable to determine the phase to be the "current-state maintenance phase".
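The gaze-ratio rule of FIGS. 6A to 6C could be sketched as below. The evenness margin and the minimum attention ratio are assumed thresholds, not values given in this description.

```python
def phase_from_gaze(samples):
    """Classify the work phase from gaze samples, each a region label
    in {'J1' (robot body), 'J2' (hand), 'J3' (object), None (elsewhere)}
    collected over one unit time of operation."""
    n = max(len(samples), 1)
    ratio = {r: samples.count(r) / n for r in ("J1", "J2", "J3")}
    attended = sum(ratio.values())

    # Few or off-target gaze points: keep the current camera view.
    if attended < 0.2:
        return "current-state maintenance"
    # Roughly even attention over J1, J2, J3 (FIG. 6B).
    if max(ratio.values()) - min(ratio.values()) < 0.15:
        return "approach"
    # Attention concentrated on hand J2 and object J3 (FIG. 6C).
    if ratio["J2"] + ratio["J3"] > 2 * ratio["J1"]:
        return "position adjustment"
    return "approach"

even = ["J1", "J2", "J3"] * 10 + [None] * 3
focused = ["J2", "J3"] * 12 + ["J1"] * 3 + [None] * 3
print(phase_from_gaze(even), phase_from_gaze(focused))
```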
In this step as well, it is desirable for the work phase determination unit 51 to record the combinations of the current gaze ratios and the determined work phases, update the boundary values of the work phases, and store them in the storage unit 53, in the same manner as shown in step S104 of FIG. 4 and FIG. 5B.
When the work phase determination based on the relationship between the operation speed of the arm mechanism 71 of the work robot 7 and the distance between the arm mechanism 71 and the object 12 as shown in FIGS. 5A and 5B differs from the determination based on the gaze point of the operator 1 acquired by the line-of-sight measurement device 2 as shown in FIGS. 6A to 6C, the work phase determination unit 51 may give priority to either result, and an order of priority can be set in advance, as illustrated in the sketch below.
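Where the two determinations disagree, a fixed priority order as just described can be sketched as follows; the default of preferring the sensor-based result is an assumption for illustration, not a choice mandated by this description.

```python
def resolve_phase(sensor_phase, gaze_phase, priority="sensor"):
    """Combine the distance/speed-based and gaze-based determinations;
    the priority order is configurable and assumed here."""
    if sensor_phase == gaze_phase:
        return sensor_phase
    return sensor_phase if priority == "sensor" else gaze_phase

print(resolve_phase("approach", "position adjustment"))  # -> "approach"
```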
Next, an example of the method of calculating the position of the optical camera 11 provided on the autonomous monitoring robot 10 in step S34 of step S30 in FIG. 3 will be described.
When the work phase determination unit 51 determines that the work phase is the "approach phase", the camera position/posture generation unit 52 generates a position from which the monitoring image of the optical camera 11 captures the entire work robot 7 and the object 12 so that their positional relationship is easy to understand. When the determined work phase is the "position adjustment phase" or the "work phase", it generates a position from which the optical camera 11 zooms in on the object 12 and the arm mechanism 71 of the work robot 7. When the "current-state maintenance phase" is determined, it generates a position signal that maintains the current position.
In contrast, when the determined work phase is "abnormal operation", no new position and posture are calculated, so that the optical camera 11 of the autonomous monitoring robot 10 maintains its current position and posture. In this case, it is desirable to output a signal to the monitoring video display unit 45 of the operation control device 4 so that a warning indicating that an "abnormal operation" has occurred is displayed.
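Expressing this phase-to-view mapping of the camera position/posture generation unit 52 in code, one possibility is the following sketch; the viewing offsets and zoom values are hypothetical.

```python
def camera_goal(phase, robot_pose, object_pose, current_goal):
    """Map the determined work phase to a camera target (x, y, zoom),
    following the behavior described for unit 52. Returns the target
    and an optional warning string for the display unit 45."""
    mid = tuple((r + o) / 2 for r, o in zip(robot_pose, object_pose))
    if phase == "approach":
        # Wide shot that keeps robot and object in frame together.
        return (mid[0], mid[1] + 2.0, 1.0), None
    if phase in ("position adjustment", "work"):
        # Close-up on the object and the arm hand.
        return (object_pose[0], object_pose[1] + 0.5, 3.0), None
    if phase == "abnormal operation":
        # Hold the current view and warn the operator.
        return current_goal, "abnormal operation"
    return current_goal, None        # current-state maintenance

goal, warn = camera_goal("abnormal operation", (0.0, 0.0), (1.0, 0.2),
                         current_goal=(0.5, 2.0, 1.0))
print(goal, warn)
```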
Next, the effects of this embodiment will be described.
According to the remote work robot control system 1000 and the remote work robot control method of the embodiment of the present invention described above, the work of the operator 1 is classified based on sensor information from at least one of the line-of-sight measurement device 2 and the joint angle sensors 8, and the motion signals required for the autonomous monitoring operation of the autonomous monitoring robot 10 are generated. For this reason, the monitoring video required for the work can be acquired autonomously from the motions related to the operation of the work robot 7 while reflecting the intention of the operator 1, and without assigning a monitoring operator the system can respond flexibly to changes in the work environment and work situation that arise during work, which was difficult with the conventional technology. In addition, since a monitoring operator is no longer required, no time is spent on communication between operators, and a small number of operators can operate a plurality of robots.
Furthermore, the data processing device 5 determines the work phase from the relationship between the operation speed of the arm mechanism 71 of the work robot 7 and the distance between the arm mechanism 71 and the object 12, and generates the control operation signal suited to the determined work phase. Monitoring video that reflects the intention of the operator 1 can therefore be acquired with high accuracy, and changes in the work environment and work situation that arise during work can be handled more flexibly.
Moreover, the operator sensor is the line-of-sight measurement device 2 that detects the gaze point of the operator 1 in the monitoring video displayed on the monitoring video display unit 45, and the data processing device 5 determines the work phase based on the gaze point of the operator 1 acquired by the line-of-sight measurement device 2 and generates the control operation signal suited to the determined work phase. By this as well, monitoring video that reflects the intention of the operator 1 can be acquired with high accuracy, and changes in the work environment and work situation that arise during work can be handled more flexibly.
<Others>
The present invention is not limited to the embodiment described above and includes various work examples (cutting, drilling, etc.). The above embodiment has been described in detail for ease of understanding of the present invention, and the invention is not necessarily limited to one having all of the described functions. Some or all of the above configurations and functions may be realized, for example, by designing them as integrated circuits. The above configurations and functions may also be realized in software by a processor interpreting and executing programs that realize the respective functions.
DESCRIPTION OF SYMBOLS
1 ... Operator
2 ... Line-of-sight measurement device
3 ... Joystick controller
4 ... Operation control device
5 ... Data processing device
6 ... Work robot control device
7 ... Work robot
8 ... Joint angle sensor
9 ... Autonomous monitoring robot control device
10 ... Autonomous monitoring robot
11 ... Optical camera
12 ... Object
13 ... Work environment
40 ... Operation input unit
41 ... Target value calculation unit
42 ... Gaze point position pattern detection unit
43 ... Monitoring video output unit
44 ... Data transmission/reception unit
45 ... Monitoring video display unit
50 ... Data transmission/reception unit
51 ... Work phase determination unit
52 ... Camera position/posture generation unit
53 ... Storage unit
54 ... Work training data
55 ... Autonomous motion calculation unit
60 ... Data transmission/reception unit
61 ... Command voltage calculation unit
62 ... Robot control unit
71 ... Arm mechanism
72 ... Moving mechanism
90 ... Data transmission/reception unit
91 ... Command voltage calculation unit
92 ... Robot control unit
93 ... Monitoring video acquisition unit
100 ... Joint angle sensor
101 ... Arm mechanism
102 ... Moving mechanism
1000 ... Remote work robot control system
Claims (13)
1. A remote work robot control system comprising:
a work robot disposed at a remote site different from the place where an operator is located;
an operation device with which the operator instructs operation of the work robot;
an operation control device that receives input of the operator's operations from the operation device and generates an operation signal for operating the work robot;
a work robot control device that converts the operation signal into a command signal for the work robot;
a work robot sensor attached to each drive unit of the work robot;
an autonomous monitoring robot that has a monitoring camera for imaging the work robot and monitors remote work performed by the work robot;
an operator sensor that detects behavior of the operator;
a data processing device that generates a control operation signal for an autonomous monitoring operation by the autonomous monitoring robot, based on sensor information from at least one of the operator sensor and the work robot sensor;
an autonomous monitoring robot control device that converts the control operation signal generated by the data processing device into a command signal for the autonomous monitoring robot; and
a display device that displays monitoring video captured by the monitoring camera of the autonomous monitoring robot.

2. The remote work robot control system according to claim 1, wherein the data processing device determines a work phase from the relationship between an operation speed of a gripping arm of the work robot and a distance between the gripping arm and an object, and generates the control operation signal suited to the determined work phase.

3. The remote work robot control system according to claim 1, wherein the operator sensor is a line-of-sight measurement device that detects a gaze point of the operator in the monitoring video displayed on the display device, and the data processing device determines a work phase based on the operator's gaze point acquired by the line-of-sight measurement device and generates the control operation signal suited to the determined work phase.

4. The remote work robot control system according to claim 1, wherein the data processing device includes:
a data transmission/reception unit that manages data exchange with the operation control device, the work robot control device, and the autonomous monitoring robot control device;
a work phase determination unit that estimates a phase of the remote work from signals of the operator sensor and the work robot sensor;
a storage unit that stores work training data used by the work phase determination unit to determine the work phase;
a camera position/posture generation unit that generates a position of the monitoring camera from the estimated work phase and the geometric positional relationship between the work robot and an object; and
an autonomous motion calculation unit that generates motion signals required for the autonomous monitoring robot to place the monitoring camera at a target position.

5. The remote work robot control system according to claim 1, wherein the operation control device includes:
an operation input unit that captures the operator's operation input;
a target value calculation unit that calculates a target motion of the work robot based on the captured operation input;
a monitoring video output unit that outputs the monitoring video to the operator; and
a data transmission/reception unit that manages data transmission and reception.

6. The remote work robot control system according to claim 1, wherein the operator sensor is at least one of a line-of-sight measurement device, a force/torque sensor, a pressure-sensitive sensor, a geomagnetic sensor, a gyro sensor, and an acceleration sensor, and the operation control device has a pattern detection unit that detects the work position targeted by the operator in the monitoring video displayed on the display device.

7. The remote work robot control system according to claim 1, wherein the work robot sensor is mounted in the work robot and is at least one of: a laser rangefinder, an encoder, a potentiometer, an inclinometer, a geomagnetic sensor, or a gyro sensor as a detection device that detects the movement amount or rotation amount of each drive unit; a camera, an ultrasonic rangefinder, a laser rangefinder, a force/torque sensor, a thermometer, or a pressure-sensitive sensor as a detection device that detects interaction with an object; or a current sensor as a detection device that detects the operation of each drive unit.

8. The remote work robot control system according to claim 1, wherein the work robot control device includes:
a data transmission/reception unit that exchanges data with the data processing device;
a command voltage calculation unit that calculates target control values for a moving mechanism and an arm mechanism of the work robot from the command signal for the work robot; and
a robot control unit that generates a target command voltage for each drive unit of the work robot using the calculated target control values and the current posture calculated from signals of the work robot sensor.

9. The remote work robot control system according to claim 1, further comprising an autonomous monitoring robot sensor attached to each drive unit of the autonomous monitoring robot, wherein the autonomous monitoring robot control device includes:
a data transmission/reception unit that exchanges data with the data processing device;
a command voltage calculation unit that calculates target control values for a moving mechanism and an arm mechanism of the autonomous monitoring robot from the control operation signal;
a robot control unit that generates a target command voltage for each drive unit of the autonomous monitoring robot using the calculated target control values and the current posture calculated from signals of the autonomous monitoring robot sensor; and
a monitoring video acquisition unit that acquires video from the monitoring camera.

10. The remote work robot control system according to claim 1, wherein the monitoring camera includes any one of an optical camera, an ultrasonic scanner, and a laser scanner.

11. A method of controlling a remote work robot comprising a work robot disposed at a remote site different from the place where an operator is located and an autonomous monitoring robot that has a monitoring camera for imaging the work robot and monitors remote work performed by the work robot, the method comprising:
a work robot control step of converting an operation signal of the work robot, produced according to operation instructions of the operator, into a command signal for the work robot;
a data processing step of generating a control operation signal for an autonomous monitoring operation by the autonomous monitoring robot, based on sensor information from at least one of an operator sensor that detects behavior of the operator and a work robot sensor attached to each drive unit of the work robot;
an autonomous monitoring robot control step of converting the control operation signal generated in the data processing step into a command signal for the autonomous monitoring robot; and
a display step of displaying monitoring video captured by the monitoring camera of the autonomous monitoring robot.

12. The method of controlling a remote work robot according to claim 11, wherein in the data processing step a work phase is determined from the relationship between an operation speed of a gripping arm of the work robot and a distance between the gripping arm and an object, and the control operation signal suited to the determined work phase is generated.

13. The method of controlling a remote work robot according to claim 11, wherein the operator sensor is a line-of-sight measurement device that detects a gaze point of the operator in the monitoring video displayed in the display step, and in the data processing step a work phase is determined based on the operator's gaze point acquired by the line-of-sight measurement device and the control operation signal suited to the determined work phase is generated.
Applications Claiming Priority (2)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2017110121A (JP6807280B2) | 2017-06-02 | 2017-06-02 | Remote work robot control system and remote work robot control method |
| JP2017-110121 | 2017-06-02 | | |
Publications (1)

| Publication Number | Publication Date |
|---|---|
| WO2018221053A1 | 2018-12-06 |
Family
ID=64455380
Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2018/016084 (WO2018221053A1) | Remote operation robot control system and remote operation robot control method | | 2018-04-19 |
Country Status (2)

| Country | Link |
|---|---|
| JP | JP6807280B2 |
| WO | WO2018221053A1 |
Cited By (1)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN109693238A | 2018-12-18 | 2019-04-30 | 航天时代电子技术股份有限公司 | Multi-sensor information display method and equipment and human body follow-up teleoperation robot |
Families Citing this family (8)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP7043141B2 | 2018-10-29 | 2022-03-29 | 株式会社大一商会 | Pachinko machine |
| JP7114183B2 | 2018-10-29 | 2022-08-08 | 株式会社大一商会 | Game machine |
| JP7114184B2 | 2018-10-29 | 2022-08-08 | 株式会社大一商会 | Game machine |
| JP7114187B2 | 2018-10-29 | 2022-08-08 | 株式会社大一商会 | Game machine |
| JP7114185B2 | 2018-10-29 | 2022-08-08 | 株式会社大一商会 | Game machine |
| JP7396819B2 | 2019-06-21 | 2023-12-12 | ファナック株式会社 | Monitoring device equipped with a camera that captures video images of robot equipment operations |
| WO2021025960A1 | 2019-08-02 | 2021-02-11 | Dextrous Robotics, Inc. | A robotic system for picking and placing objects from and into a constrained space |
| WO2023205176A1 | 2022-04-18 | 2023-10-26 | Dextrous Robotics, Inc. | System and/or method for grasping objects |
Citations (2)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH05233059A | 1992-02-25 | 1993-09-10 | Toshiba Corp | Decentralized work robot system |
| JPH0976063A | 1995-09-16 | 1997-03-25 | Sanshiyuuzen Kogyo Kk | Welding equipment |
Family Cites Families (2)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0811071A | 1994-06-29 | 1996-01-16 | Yaskawa Electric Corp | Controller for manipulator |
| JP6559525B2 | 2015-09-29 | 2019-08-14 | 日立Geニュークリア・エナジー株式会社 | Remote work support system and remote work support method |

- 2017-06-02: JP application JP2017110121A filed (patent JP6807280B2, active)
- 2018-04-19: PCT application PCT/JP2018/016084 filed (WO2018221053A1, application filing)
Also Published As

| Publication number | Publication date |
|---|---|
| JP2018202541A | 2018-12-27 |
| JP6807280B2 | 2021-01-06 |
Similar Documents

| Publication | Publication Date | Title |
|---|---|---|
| WO2018221053A1 | | Remote operation robot control system and remote operation robot control method |
| JP6420229B2 | | A robot system including a video display device that superimposes and displays an image of a virtual object on a video of a robot |
| US9002517B2 | | Telematic interface with directional translation |
| CA2945189C | | Robotic systems and methods of operating robotic systems |
| JP4167940B2 | | Robot system |
| JP5022868B2 | | Information processing apparatus and information processing method |
| JP6445092B2 | | Robot system displaying information for teaching robots |
| JP2012011498A | | System and method for operating robot arm |
| KR20140066544A | | Robot and friction compensation method for the robot |
| JP2015043488A | | Remote controller and remote construction method using the same |
| JP2006000977A | | Device for presenting action state of force between robot and environment |
| JP2011200997A | | Teaching device and method for robot |
| US11618166B2 | | Robot operating device, robot, and robot operating method |
| JP3912584B2 | | Remote control device |
| CN113752236B | | Device, calibration rod and method for teaching mechanical arm |
| JP2019000918A | | System and method for controlling arm attitude of working robot |
| JP7577017B2 | | Control device, robot system, control method, and program |
| JP6679446B2 | | Remote control system for work robots |
| JP5361058B2 | | Robot remote control system and work robot used therefor |
| CN112743537A | | Annotating device |
| US11926065B2 | | Vision-based operation for robot |
| CN114905478B | | Bilateral teleoperation system and control method |
| JP3376029B2 | | Robot remote control device |
| Minamoto et al. | | Tele-Operation of Robot by Image Processing of Markers Attached to Operator's Head |
| KR20160094105A | | Robot control visualization apparatus |
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18808924; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 18808924; Country of ref document: EP; Kind code of ref document: A1 |