WO2022230854A1 - Data processing device, data processing method, and data processing system - Google Patents
Data processing device, data processing method, and data processing system
- Publication number
- WO2022230854A1 (application PCT/JP2022/018831)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- controller
- movable part
- data processing
- image data
- data
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
- G09G5/37—Details of the operation on graphic patterns
- G09G5/377—Details of the operation on graphic patterns for mixing or overlaying two or more graphic patterns
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/06—Control stands, e.g. consoles, switchboards
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J3/00—Manipulators of master-slave type, i.e. both controlling unit and controlled unit perform corresponding spatial movements
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/4104—Peripherals receiving signals from specially adapted client devices
- H04N21/4131—Peripherals receiving signals from specially adapted client devices home appliance, e.g. lighting, air conditioning system, metering devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/4223—Cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/4312—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
- H04N21/4316—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04Q—SELECTING
- H04Q9/00—Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04Q—SELECTING
- H04Q2209/00—Arrangements in telecontrol or telemetry systems
- H04Q2209/40—Arrangements in telecontrol or telemetry systems using a wireless architecture
Definitions
- The present disclosure relates to a data processing device, a data processing method, and a data processing system.
- Patent Literature 1 discloses a technique of remotely controlling a robot using a controller worn by an operator and displaying an image captured by an imaging device provided on the robot on a terminal used by the operator.
- The present disclosure has been made in view of this point, and aims to provide a data processing device and the like that allow an operator of a device such as a robot to easily operate the device using a controller.
- A data processing device according to one aspect of the present disclosure processes data for operating a movable part of a device according to a position of a controller, and has a control unit configured to execute: acquiring controller information including the position when an operator operates the movable part using the controller; acquiring captured image data generated by an imaging device imaging the movable part while the movable part is being operated according to the position; generating composite image data by superimposing a controller image indicating the position included in the controller information on a region corresponding to the movable part on the captured image data; and displaying the composite image data on a display device visible to the operator.
- According to the present disclosure, it is possible to provide a data processing device and the like that allow an operator of a device such as a robot to easily operate the device using a controller.
- FIG. 1 is a diagram showing an outline of a data processing system S;
- FIG. 2 is a diagram showing the configuration of a data processing device 1;
- FIG. 3 is a schematic diagram for explaining an operation device 22;
- FIG. 4 is a schematic diagram of a controller image generated by a synthesizing unit;
- FIG. 5 is a schematic diagram of composite image data displayed by a display device;
- FIG. 6 is a sequence diagram of a data processing method executed by the data processing system;
- FIG. 7 is a flowchart of a data processing method executed by the data processing device;
- FIG. 8 is a schematic diagram of composite image data displayed by the display device;
- FIG. 9 is a schematic diagram of composite image data displayed by the display device.
- FIG. 1 is a diagram showing an outline of the data processing system S.
- The data processing system S is a system that provides, for example, a telexistence environment in which the operator U can operate an object at a remote location in real time while feeling as if the object were nearby.
- The data processing system S has a data processing device 1, an operation device 2, and a robot 3.
- The data processing device 1 is a device, for example a computer, that controls the robot 3 according to operations performed by the operator U on the operation device 2.
- The data processing device 1 may be installed in a room where either the operator U or the robot 3 is present, or at a location different from both.
- The operation device 2 is a device worn by the operator U, and has a display device 21, an operation device 22, and a communication unit 23.
- The display device 21 is, for example, a goggle-type display device having a display on which the operator U can view an image based on the composite image data generated by the data processing device 1.
- The image displayed by the display device 21 is, for example, an omnidirectional (360-degree) spherical image.
- The robot 3 is a device that is remotely operated by the operator U and operates based on control data received from the data processing device 1 via a network N.
- The robot 3 has an imaging device (hereinafter referred to as "camera 31") that generates captured image data, and an image transmission unit 32 that transmits the captured image data generated by the camera 31 to the data processing device 1.
- The robot 3 has movable parts, such as a head, hands, and arms, that operate according to the movement of the operation device 2.
- The movable parts of the robot 3 operate in synchronization with the position and orientation of the operation device 2.
- The robot 3 transmits, to the operation device 2 via the data processing device 1, at least one item of robot state data among tactile data indicating the tactile sense detected by the robot 3, sound data indicating sounds collected by the robot 3, and joint state data indicating the state of the joints of the robot 3.
- The robot 3 according to the present embodiment performs a task of arranging products on shelves, according to the operation of the operator U, in a store where a large number of products are sold on shelves; however, the place where the robot 3 is provided and the contents of its work are arbitrary.
- The data processing device 1 acquires captured image data generated by the camera 31 imaging the scene in front of the robot 3, and causes the display device 21 of the operation device 2 to display the acquired captured image data.
- The data processing device 1 generates composite image data by superimposing a controller image indicating the position and orientation of the operation device 22 on the captured image data generated by the camera 31.
- The operability for the operator U can be improved by having the data processing device 1 display the composite image data on the operation device 2.
- The display device 21 has a sensor that detects the angle of the display device 21 with respect to a reference orientation (hereinafter sometimes referred to as the "device angle").
- The reference orientation is, for example, the orientation of the display device 21 when it is worn by the operator U facing a predetermined direction.
- The display device 21 detects the angular difference between the reference orientation and the current orientation of the display device 21 as the device angle.
- The device angle of the display device 21 is represented, for example, by a combination of two angles of a spherical coordinate system in three-dimensional space.
- The display device 21 generates head operation data indicating the device angle of the display device 21 with respect to the reference orientation.
- The display device 21 notifies the communication unit 23 of the head operation data at predetermined time intervals.
- The predetermined time interval is determined, for example, based on the speed at which the robot 3 can change the angle of its head: the faster the robot 3 can change the head angle, the shorter the predetermined time interval. Since the interval at which the display device 21 notifies the device angle corresponds to the speed at which the robot 3 can change the angle of its head, the device angle is not detected more frequently than necessary, and the power consumption of the operation device 2 can be suppressed.
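A minimal sketch of how the device angle could be computed as the wrapped difference of two spherical-coordinate angles. This is an illustration only; the angle representation, units, and wrapping convention are assumptions not specified in the disclosure.

```python
import math

def wrap_angle(a: float) -> float:
    """Wrap an angle in degrees to the range (-180, 180]."""
    return (a + 180.0) % 360.0 - 180.0

def device_angle(current, reference):
    """Device angle of the display device: the angular difference between the
    current orientation and the reference orientation, each given as
    (azimuth, elevation) in degrees (two angles of a spherical coordinate
    system in three-dimensional space)."""
    azimuth = wrap_angle(current[0] - reference[0])
    elevation = wrap_angle(current[1] - reference[1])
    return (azimuth, elevation)

# Example: the operator turns the head 30 degrees right and 10 degrees up.
print(device_angle((30.0, 10.0), (0.0, 0.0)))  # -> (30.0, 10.0)
```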
- The operation device 22 (controller) is a device used by the operator U to operate the hands and arms of the robot 3, and has sensors for detecting the movements of the operator U's hands and arms.
- The operation device 22 generates hand-arm operation data indicating the movements of the operator U's hands and arms.
- The operation device 22 notifies the communication unit 23 of the generated hand-arm operation data.
- The operation device 22 may also receive an operation for moving the robot 3 forward, backward, left, or right.
- The operation device 22 further has elements for generating heat, pressure, vibration, or the like corresponding to the state of the robot 3, so that the operator U can sense the state of the robot 3.
- The communication unit 23 has a communication controller that transmits operation data based on the content of the operation by the operator U to the data processing device 1 and receives composite image data from the data processing device 1.
- The communication unit 23 synchronizes the head operation data notified from the display device 21 with the hand-arm operation data notified from the operation device 22, and transmits operation data including the synchronized head operation data and hand-arm operation data to the data processing device 1 at predetermined time intervals.
- The communication unit 23 also inputs the composite image data received from the data processing device 1 to the display device 21.
- The communication unit 23 may transmit operation data for moving the robot 3 forward, backward, leftward, or rightward to the data processing device 1.
- The communication unit 23 may be included in either the display device 21 or the operation device 22, or may be housed in a housing different from those of the display device 21 and the operation device 22.
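The following sketch illustrates one hypothetical form the synchronized operation data could take: the latest head and hand-arm samples bundled under a common timestamp and sent at a fixed interval. The field names and newline-delimited JSON transport are assumptions; the disclosure does not specify a wire format.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class OperationData:
    timestamp: float         # common sampling instant for the synchronized pair
    head_angle: tuple        # device angle of the display device 21: (azimuth, elevation)
    hand_position: tuple     # position of the operation device 22: (x, y, z)
    hand_orientation: tuple  # orientation of the operation device 22: (roll, pitch, yaw)
    buttons: dict            # pressed state of each button, e.g. {"trigger": True}

def send_synchronized(sock, head_angle, hand_position, hand_orientation, buttons):
    """Bundle the latest head and hand-arm samples under one timestamp and
    transmit them to the data processing device 1."""
    data = OperationData(time.time(), head_angle, hand_position, hand_orientation, buttons)
    sock.sendall(json.dumps(asdict(data)).encode() + b"\n")
```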
- The robot 3 changes the orientation of its head to the same angle as that of the display device 21, based on head control data that the data processing device 1 generates from the head operation data transmitted from the operation device 2.
- Similarly, the robot 3 moves its hands and arms in the same manner as the hands and arms of the operator U, based on hand-arm control data that the data processing device 1 generates from the hand-arm operation data transmitted from the operation device 2.
- The camera 31 is provided, for example, on the head of the robot 3, and generates captured image data of a range including the movable parts of the robot 3 by imaging the scene in front of the robot 3.
- The camera 31 may instead be an imaging device provided separately from the robot 3 at a position from which it can image the range including the movable parts of the robot 3.
- The image transmission unit 32 has a communication controller for transmitting captured image data via the network N.
- Although FIG. 1 illustrates the case where the camera 31 includes the image transmission unit 32, the location of the image transmission unit 32 is arbitrary.
- The operation device 2 generates operation data according to the movements of the operator U's head, hands, and arms, and transmits the operation data to the data processing device 1.
- The data processing device 1 generates control data for operating the robot 3 based on the operation data, and transmits the control data to the robot 3.
- The robot 3 operates based on the control data received from the data processing device 1. After operating based on the control data, the robot 3 may transmit feedback information (for example, ACK information) indicating that it has operated according to the control data to the data processing device 1. While the robot 3 is operating, the captured image data generated by the camera 31 is transmitted to the data processing device 1 at predetermined time intervals (for example, 5-millisecond intervals).
- The data processing device 1 generates composite image data by superimposing a controller image indicating the position and orientation of the operation device 22 on the region corresponding to the movable part (for example, an arm of the robot 3) on the captured image data generated by the camera 31, and transmits the composite image data to the operation device 2.
- The display device 21 displays the composite image data received from the data processing device 1.
- FIG. 2 is a diagram showing the configuration of the data processing device 1.
- The data processing device 1 has a communication unit 11, a storage unit 12, and a control unit 13.
- The control unit 13 has a controller information acquisition unit 131, an operation control unit 132, an image data acquisition unit 133, an identifying unit 134, a synthesizing unit 135, and a display control unit 136.
- The communication unit 11 has a communication interface for transmitting and receiving various data to and from the operation device 2 and the robot 3 via the network N.
- The communication unit 11 inputs, for example, the controller information received from the operation device 2 to the controller information acquisition unit 131, and inputs the captured image data received from the robot 3 to the image data acquisition unit 133.
- The communication unit 11 also transmits the control data input from the operation control unit 132 to the robot 3, and transmits the composite image data input from the display control unit 136 to the operation device 2.
- The storage unit 12 has storage media such as a ROM (Read Only Memory), a RAM (Random Access Memory), and an SSD (Solid State Drive).
- The storage unit 12 stores programs executed by the control unit 13.
- The storage unit 12 also temporarily stores the captured image data received from the robot 3.
- The control unit 13 has, for example, a CPU (Central Processing Unit) as a processor.
- The control unit 13 functions as the controller information acquisition unit 131, the operation control unit 132, the image data acquisition unit 133, the identifying unit 134, the synthesizing unit 135, and the display control unit 136 by executing the programs stored in the storage unit 12.
- The processing executed by the data processing device 1 is described in detail below, taking as an example the processing by which the operator U operates the hands and arms of the robot 3 using the operation device 22; the operator U can operate other movable parts of the robot 3 in a similar manner.
- The controller information acquisition unit 131 acquires controller information including the position and orientation of the operation device 22 when the operator U operates a movable part of the robot 3 using the operation device 22 (controller).
- FIG. 3 is a schematic diagram for explaining the operation device 22.
- The controller information acquisition unit 131 receives hand-arm operation data from the operation device 22 via the communication unit 11, and acquires the received hand-arm operation data as controller information including the position and orientation of the operation device 22.
- The controller information may also include information indicating whether each of one or more buttons of the operation device 22 is pressed.
- The controller information acquisition unit 131 inputs the acquired controller information to the operation control unit 132, the image data acquisition unit 133, and the identifying unit 134.
- The operation control unit 132 generates control data for operating the robot 3 based on the controller information input from the controller information acquisition unit 131.
- The control data is, for example, information indicating the amount and direction (that is, a vector) of movement or rotation of the movable part of the robot 3 according to the position and orientation of the operation device 22 indicated by the controller information.
- The operation control unit 132 may generate the control data by, for example, adding or subtracting a predetermined correction value to or from the value indicated by the controller information.
- The operation control unit 132 transmits the generated control data to the robot 3 via the communication unit 11.
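As an illustration, the control data described above amounts to a motion vector derived from the change in controller pose plus a predetermined correction value. The function and field names below are hypothetical:

```python
def make_control_data(prev_pose, new_pose, correction=(0.0, 0.0, 0.0)):
    """Generate control data: the amount and direction (a vector) by which
    the movable part should move, derived from the change in the position of
    the operation device 22 plus a predetermined correction value."""
    move = tuple(n - p + c for n, p, c in zip(new_pose, prev_pose, correction))
    return {"type": "move", "vector": move}

# Example: the controller moved 2 cm along x since the previous sample.
print(make_control_data((0.0, 0.0, 0.0), (0.02, 0.0, 0.0)))
```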
- The image data acquisition unit 133 acquires captured image data generated by the camera 31 imaging the movable part while the movable part of the robot 3 is being operated according to the position and orientation of the operation device 22.
- The image data acquisition unit 133 inputs the acquired captured image data to the synthesizing unit 135.
- The identifying unit 134 identifies the calculated position and orientation of the movable part of the robot 3 based on the position and orientation of the operation device 22 included in the controller information.
- The position of the movable part is, for example, the position of the wrist of the robot 3, and the orientation of the movable part is, for example, the orientation of the hand with respect to the position of the wrist of the robot 3.
- The identifying unit 134 computes the calculated position and orientation of the movable part of the robot 3 by sequentially adding or subtracting, to or from a predetermined initial position of the movable part, the amount and direction (that is, the vector) of movement or rotation of the movable part indicated by the control data generated by the operation control unit 132.
- The identifying unit 134 is not limited to the specific method shown here, and may identify the calculated position and orientation of the movable part of the robot 3 corresponding to the position and orientation of the operation device 22 by other methods.
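A sketch of the dead-reckoning computation just described: the calculated position is the initial position with every commanded movement vector added in sequence. Names are illustrative assumptions:

```python
def calculated_position(initial_position, control_history):
    """Calculated position of the movable part: the predetermined initial
    position with every movement vector issued so far added sequentially."""
    x, y, z = initial_position
    for command in control_history:
        dx, dy, dz = command["vector"]
        x, y, z = x + dx, y + dy, z + dz
    return (x, y, z)

# Example: two commands of 2 cm each along x.
history = [{"vector": (0.02, 0.0, 0.0)}, {"vector": (0.02, 0.0, 0.0)}]
print(calculated_position((0.0, 0.0, 0.0), history))  # -> (0.04, 0.0, 0.0)
```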
- The synthesizing unit 135 generates a controller image indicating the position and orientation of the operation device 22 included in the controller information acquired by the controller information acquisition unit 131.
- FIG. 4 is a schematic diagram of the controller image IM1 generated by the synthesizing unit 135.
- The controller image IM1 is an image showing the position and orientation of at least one of the hand or wrist of the operator U holding the operation device 22, corresponding to the position and orientation included in the controller information.
- The synthesizing unit 135 calculates the center position A of the wrist by, for example, adding or subtracting a predetermined correction value to or from the position of the operation device 22.
- The synthesizing unit 135 calculates the orientation B of the hand and the angle C of the wrist by, for example, adding or subtracting a predetermined correction value to or from the orientation of the operation device 22.
- The synthesizing unit 135 generates the controller image IM1 representing the calculated center position A of the wrist, orientation B of the hand, and angle C of the wrist.
- The controller image IM1 illustrated in FIG. 4 represents the center position A of the wrist by the position of the spherical portion, the orientation B of the hand by the orientation of the tip portion connected to the spherical portion, and the angle C of the wrist by the rotation angle of the spherical portion.
- The controller image IM1 is not limited to the specific form shown here, and may represent the center position A of the wrist, the orientation B of the hand, and the angle C of the wrist in other ways.
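One possible way to derive the wrist center position A, hand orientation B, and wrist angle C from the pose of the operation device 22 by applying fixed correction offsets. The offset values and the roll/pitch/yaw decomposition are assumptions for illustration:

```python
def controller_image_pose(device_position, device_orientation,
                          pos_offset=(0.0, 0.0, -0.05),  # hypothetical offset, metres
                          ori_offset=(0.0, 0.0, 0.0)):   # hypothetical offset, radians
    """Derive the wrist center position A, hand orientation B, and wrist
    angle C shown by the controller image IM1 by adding predetermined
    correction values to the pose of the operation device 22."""
    a = tuple(p + o for p, o in zip(device_position, pos_offset))  # center position A
    roll, pitch, yaw = (v + o for v, o in zip(device_orientation, ori_offset))
    b = (pitch, yaw)  # orientation B of the hand
    c = roll          # angle C of the wrist (rotation about the forearm axis)
    return a, b, c
```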
- The synthesizing unit 135 generates composite image data by superimposing the generated controller image on the region corresponding to the movable part on the captured image data.
- The region corresponding to the movable part on the captured image data is the coordinates on the captured image data corresponding to the calculated position and orientation of the movable part identified by the identifying unit 134.
- The synthesizing unit 135 calculates the coordinates on the captured image data by, for example, converting the calculated position and orientation of the movable part according to a predetermined rule.
- The synthesizing unit 135 superimposes the controller image at the calculated coordinates on the captured image data.
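The "predetermined rule" for converting the calculated 3-D position into coordinates on the captured image data is not specified in the disclosure; a pinhole-camera projection is one plausible choice, sketched below with assumed intrinsic parameters fx, fy, cx, cy:

```python
def project_to_image(point_cam, fx, fy, cx, cy):
    """Convert the calculated 3-D position of the movable part (in the
    camera coordinate frame of camera 31) into pixel coordinates on the
    captured image data, using a pinhole camera model."""
    x, y, z = point_cam
    if z <= 0:
        return None  # behind the camera; nothing to superimpose
    u = fx * x / z + cx
    v = fy * y / z + cy
    return (int(round(u)), int(round(v)))

# Example with assumed intrinsics for a 1280x720 image.
print(project_to_image((0.1, 0.0, 0.5), fx=900, fy=900, cx=640, cy=360))
```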
- Thereby, the data processing device 1 allows the operator U to recognize the position and orientation of the operation device 22 even when the operator U cannot see the operation device 22.
- When the movable part moves, the synthesizing unit 135 superimposes the controller image on the region corresponding to the moved movable part on the captured image data.
- Thereby, the data processing device 1 makes the controller image follow the movable part, allowing the operator U to recognize that the controller image is displayed according to the position and orientation of the operation device 22.
- The synthesizing unit 135 superimposes, on the captured image data, an operation image indicating a state in which the operation device 22 is being operated, in association with the controller image. For example, the synthesizing unit 135 superimposes an operation image representing one or more buttons of the operation device 22 near the controller image, and changes the display mode of the operation image based on whether the controller information indicates that each of the one or more buttons is pressed.
- Thereby, the data processing device 1 allows the operator U to recognize the operating state of the operation device 22 even when the operator U cannot see the operation device 22.
- Furthermore, by means of the controller image and the operation image, the data processing device 1 allows the operator U to recognize the position, orientation, and operating state of the operation device 22 even when the movable parts of the robot 3 are hidden behind the body of the robot 3 itself or other objects and cannot be seen.
- The display control unit 136 causes the display device 21 visible to the operator U to display the composite image data by transmitting display data including the composite image data generated by the synthesizing unit 135 to the operation device 2 via the communication unit 11.
- FIG. 5 is a schematic diagram of the composite image data IM2 displayed by the display device 21.
- In the composite image data IM2, the controller image IM1 is displayed in the region corresponding to the movable part P, and an operation image IM3 is displayed near the controller image IM1. Even when the operator U cannot see the operation device 22, the operator U can easily recognize the position and orientation of the operation device 22, as well as its operating state, by viewing the composite image data IM2.
- FIG. 6 is a sequence diagram of the data processing method executed by the data processing system S.
- When the operator U uses the operation device 22 to operate a movable part of the robot 3, the operation device 22 generates operation data (hand-arm operation data) indicating the movements of the operator U's hands and arms (S1).
- The operation device 22 transmits the generated operation data to the data processing device 1 via the communication unit 23.
- The controller information acquisition unit 131 receives the operation data from the operation device 22 via the communication unit 11, and acquires the received operation data as controller information including the position and orientation of the operation device 22.
- The controller information acquisition unit 131 inputs the acquired controller information to the operation control unit 132, the image data acquisition unit 133, and the identifying unit 134.
- The operation control unit 132 generates control data for operating the robot 3 based on the controller information input from the controller information acquisition unit 131 (S2).
- The operation control unit 132 transmits the generated control data to the robot 3 via the communication unit 11.
- The robot 3 receives the control data from the data processing device 1 and operates the movable part according to the received control data. After operating the movable part according to the control data, the robot 3 may transmit feedback information (for example, ACK information) indicating that it has operated according to the control data to the data processing device 1.
- The data processing device 1 causes the storage unit 12 to store the feedback information received from the robot 3.
- The camera 31 generates captured image data by imaging the movable part while the movable part is being operated (S3).
- The image transmission unit 32 transmits the captured image data generated by the camera 31 to the data processing device 1.
- The image data acquisition unit 133 acquires the captured image data generated by the camera 31 from the robot 3.
- The identifying unit 134 identifies the calculated position and orientation of the movable part of the robot 3 based on the position and orientation of the operation device 22 included in the controller information (S4).
- The synthesizing unit 135 generates a controller image indicating the position and orientation of the operation device 22 included in the controller information acquired by the controller information acquisition unit 131.
- The synthesizing unit 135 generates composite image data by superimposing the generated controller image on the region corresponding to the movable part on the captured image data (S5).
- The display control unit 136 transmits display data including the composite image data generated by the synthesizing unit 135 to the operation device 2 via the communication unit 11.
- The display device 21 displays the composite image data received from the data processing device 1 via the communication unit 23 (S6).
- FIG. 7 is a flowchart of a data processing method executed by the data processing device 1. The flowchart starts when the operation device 2 is powered on.
- The controller information acquisition unit 131 receives operation data from the operation device 22 via the communication unit 11, and acquires the received operation data as controller information including the position and orientation of the operation device 22 (S11).
- The image data acquisition unit 133 acquires the captured image data generated by the camera 31 from the robot 3 (S12). S11 and S12 may be performed in reverse order or in parallel.
- The identifying unit 134 identifies the calculated position and orientation of the movable part of the robot 3 based on the position and orientation of the operation device 22 included in the controller information (S13).
- The synthesizing unit 135 generates a controller image indicating the position and orientation of the operation device 22 included in the controller information acquired by the controller information acquisition unit 131 (S14).
- The synthesizing unit 135 generates composite image data by superimposing the generated controller image on the region corresponding to the movable part on the captured image data (S15). The synthesizing unit 135 also superimposes an operation image representing each of one or more buttons of the operation device 22 near the controller image.
- If the controller information indicates that one or more buttons of the operation device 22 are pressed (YES in S16), the synthesizing unit 135 changes the display mode of the operation image corresponding to the pressed button based on the controller information (S17). If the controller information does not indicate that any button is pressed (NO in S16), the synthesizing unit 135 does not change the display mode of the operation image.
- The display control unit 136 transmits display data including the composite image data generated by the synthesizing unit 135 to the operation device 2 via the communication unit 11 (S18).
- The control unit 13 repeats the processing from S11 to S18 until the operator U performs an operation to end operation on the operation device 2 (NO in S19). If the operator U has performed an operation to end operation (YES in S19), the control unit 13 ends the processing.
- As described above, in the data processing system S according to the present embodiment, the data processing device 1 causes the display device 21 to display composite image data generated by superimposing a controller image indicating the position and orientation of the operation device 22 on the region corresponding to the movable part on the captured image data generated by the camera 31. Thereby, even when the operator U cannot see the operation device 22 at hand, the operator U can recognize the movable part of the robot 3 and the controller image corresponding to the operation device 22 in association with each other on the captured image data, which makes the robot 3 easier to operate.
- In the above-described embodiment, because the motors, gears, and the like used to drive the robot 3 have errors, the calculated position of the movable part computed based on the position and orientation of the operation device 22 may deviate from the actual position of the movable part. The data processing system S according to this modification makes the robot 3 easier for the operator U to operate by displaying the deviation (difference) between the actual position of the movable part and its calculated position. The differences from the above-described embodiment are mainly described below.
- In this modification, the identifying unit 134 identifies the actual position and orientation of the movable part of the robot 3, in addition to identifying its calculated position and orientation by the method described above.
- For example, the identifying unit 134 executes known image recognition processing and extracts regions in the captured image data that are similar to predefined image patterns of the movable part in various orientations.
- The identifying unit 134 identifies the position of a region similar to a specific image pattern in the captured image data as the actual position of the movable part, and identifies the orientation corresponding to that image pattern as the actual orientation of the movable part.
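One common way to implement the "known image recognition processing" mentioned above is template matching, sketched below with OpenCV. The grayscale templates keyed by orientation label are an assumption; the disclosure does not name a specific algorithm:

```python
import cv2

def find_movable_part(frame_gray, templates):
    """Locate the actual position of the movable part by comparing the
    captured image with predefined image patterns of the movable part in
    various orientations. `templates` maps an orientation label to a
    grayscale template image; returns (location, orientation, score)."""
    best = (None, None, -1.0)
    for orientation, tmpl in templates.items():
        result = cv2.matchTemplate(frame_gray, tmpl, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(result)
        if max_val > best[2]:
            best = (max_loc, orientation, max_val)
    return best
```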
- Alternatively, the identifying unit 134 may, for example, calculate the position and orientation of the movable part using a sensor provided on the operation device 2 or on the movable part, and identify the calculated values as the actual position and orientation.
- A sensor for identifying the position of the movable part is, for example, an angle sensor provided at each of a plurality of joints of the movable part.
- The identifying unit 134 acquires the angle of each of the plurality of joints of the movable part from the angle sensors, and identifies the actual position and orientation of the movable part based on the acquired angles and the predefined structure of the movable part (arm lengths and the like).
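A sketch of the forward-kinematics computation from joint angles and link lengths, simplified to a planar two-dimensional arm for illustration (a real robot would use a 3-D kinematic chain):

```python
import math

def wrist_position_2d(joint_angles, link_lengths, base=(0.0, 0.0)):
    """Actual wrist position from joint angle sensors: forward kinematics of
    a planar arm whose structure (link lengths) is predefined. Joint angles
    are in radians, each relative to the previous link, and are accumulated."""
    x, y = base
    theta = 0.0
    for angle, length in zip(joint_angles, link_lengths):
        theta += angle
        x += length * math.cos(theta)
        y += length * math.sin(theta)
    return (x, y)

# Example: two 0.3 m links, both joints bent 45 degrees.
print(wrist_position_2d([math.pi / 4, math.pi / 4], [0.3, 0.3]))
```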
- The identifying unit 134 identifies the difference between the actual position and orientation of the movable part and the calculated position and orientation of the movable part.
- The identifying unit 134 identifies, for example, a vector pointing from the actual position of the movable part to its calculated position as the positional difference, and the angle between the actual orientation of the movable part and its calculated orientation as the orientation difference.
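The positional and orientation differences just defined can be computed directly, as in this sketch (direction vectors are assumed to be nonzero):

```python
import math

def pose_difference(actual_pos, calc_pos, actual_dir, calc_dir):
    """Positional difference: the vector from the actual position of the
    movable part to its calculated position. Orientation difference: the
    angle between the actual and calculated direction vectors, in radians."""
    diff_vec = tuple(c - a for c, a in zip(calc_pos, actual_pos))
    dot = sum(a * c for a, c in zip(actual_dir, calc_dir))
    norm_a = math.sqrt(sum(a * a for a in actual_dir))
    norm_c = math.sqrt(sum(c * c for c in calc_dir))
    angle = math.acos(max(-1.0, min(1.0, dot / (norm_a * norm_c))))
    return diff_vec, angle
```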
- The display control unit 136 displays the difference between the actual position and orientation of the movable part and the calculated position and orientation of the movable part on the composite image data displayed on the display device 21, in association with the controller image.
- For example, the display control unit 136 superimposes on the composite image data a difference image including a message representing the amount and direction of the difference between the actual position and orientation of the movable part and the calculated position and orientation.
- The synthesizing unit 135 may superimpose the controller image at the coordinates on the captured image data corresponding to the actual position and orientation of the movable part instead of the calculated position and orientation.
- FIG. 8 is a schematic diagram of the composite image data IM2 displayed by the display device 21.
- A difference image IM4 including a message representing the difference is displayed near the controller image IM1.
- The display control unit 136 may also display the difference by displaying an arrow indicating the amount and direction of the difference, or by changing the color of the controller image IM1 according to the magnitude of the difference. By viewing the difference image IM4, the operator U can easily recognize the deviation between the actual position and orientation of the movable part and the calculated position and orientation.
- The operation control unit 132 may operate the movable part according to the position and orientation included in the controller information on condition that the difference is equal to or less than a predetermined value. That is, the operation control unit 132 permits the operator U to control the movement of the robot 3 when the difference is equal to or less than the predetermined value, and does not permit it when the difference is greater than the predetermined value. As a result, the data processing device 1 can prevent the robot 3 from moving when there is a large difference between the actual position and orientation of the movable part and the calculated position and orientation, thereby improving safety.
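A minimal sketch of this permission check; the numeric thresholds are illustrative assumptions, not values from the disclosure:

```python
def motion_permitted(diff_vec, angle_diff, pos_limit=0.05, ang_limit=0.17):
    """Permit motion control only while the identified difference is at or
    below a predetermined value (here: 5 cm of position and roughly 10
    degrees of orientation; both thresholds are assumptions)."""
    pos_err = sum(d * d for d in diff_vec) ** 0.5
    return pos_err <= pos_limit and angle_diff <= ang_limit
```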
- The display control unit 136 may change the display mode (color, pattern, or the like) of the composite image data between the state in which the operator U is permitted to control the movement of the robot 3 and the state in which the operator U is not. Thereby, the data processing device 1 can make the operator U aware that the robot 3 will not move while the deviation is large.
- The data processing device 1 may accept a correction of the difference from the operator U after displaying the difference on the display device 21.
- In this case, the operation control unit 132 temporarily stops the control of the movable part, and the controller information acquisition unit 131 accepts, through the controller information received from the operation device 2, an operation to move the position of the controller image. When the position of the controller image matches the position of the movable part, the operator U performs an operation on the operation device 22 to end the correction.
- The controller information acquisition unit 131 causes the storage unit 12 to store the amount and direction of movement of the controller image during the correction as a correction amount.
- Alternatively, the synthesizing unit 135 may temporarily fix the position of the controller image, and the controller information acquisition unit 131 may accept an operation to move the movable part. When the position of the movable part matches the position of the controller image, the operator U performs an operation on the operation device 22 to end the correction.
- In that case, the controller information acquisition unit 131 causes the storage unit 12 to store the amount and direction of movement of the movable part during the correction as the correction amount.
- After that, the operation control unit 132 generates control data by adding or subtracting the correction amount stored in the storage unit 12 to or from the position and orientation indicated by the controller information. Thereby, the data processing device 1 can correct the deviation between the actual position and orientation of the movable part and the calculated position and orientation according to the operation by the operator U.
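A sketch of how the stored correction amount might be applied to subsequent control data; the storage interface and command format are hypothetical:

```python
class OperationControl:
    """Applies the correction amount stored during the correction procedure
    to every subsequent control command."""

    def __init__(self, storage):
        self.storage = storage  # e.g. {"correction": (dx, dy, dz)}

    def control_data(self, controller_position):
        corr = self.storage.get("correction", (0.0, 0.0, 0.0))
        target = tuple(p + c for p, c in zip(controller_position, corr))
        return {"type": "move_to", "position": target}
```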
- The data processing device 1 may also automatically correct the position and orientation of the operation device 22 indicated by the controller information in accordance with the difference identified by the identifying unit 134.
- Correcting the difference includes the case of correcting the movement of the robot 3 and the case of correcting the position of the controller image.
- When correcting the movement of the robot 3, the operation control unit 132 generates control data by, for example, adding or subtracting the value of the difference identified by the identifying unit 134 to or from the position and orientation indicated by the controller information. Thereby, the motion of the robot 3 is corrected according to the difference.
- When correcting the position of the controller image, the synthesizing unit 135 adds or subtracts the value of the difference identified by the identifying unit 134 to or from the coordinates on the captured image data corresponding to the calculated position and orientation of the movable part, and superimposes the controller image at the resulting coordinates. Thereby, the position of the controller image is corrected according to the difference.
- In this way, the data processing device 1 can automatically correct the deviation between the actual position and orientation of the movable part and the calculated position and orientation.
- Whereas in the above-described embodiment the data processing device 1 causes the display device 21 to display an operation image showing the state in which the operation device 22 is being operated, in this modification the data processing device 1 causes the display device 21 to display a suggested image indicating the operation to be performed next on the operation device 22. The differences from the above-described embodiment are mainly described below.
- The storage unit 12 stores in advance work schedule information that associates one or more work contents scheduled to be performed by the operator U with the operation to be performed on the robot 3 in each of the one or more work contents.
- The identifying unit 134 acquires the work schedule information from the storage unit 12 before the synthesizing unit 135 generates the composite image data.
- The identifying unit 134 identifies the work content that the operator U is to perform next based on the work schedule information.
- The identifying unit 134 identifies the next scheduled work content from among the one or more work contents indicated by the work schedule information, for example, by receiving an input of the work progress status from the operation device 22.
- The identifying unit 134 identifies the operation associated with the next scheduled work content in the work schedule information.
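A sketch of how the work schedule information might associate work contents with suggested operations; the entries and the structure are purely illustrative:

```python
WORK_SCHEDULE = [
    # (work content, operation to be performed next on the operation device 22)
    ("place bottles on shelf A", "press the grip button to select the parallel hand"),
    ("place boxes on shelf B", "hold the trigger to switch to the suction hand"),
]

def next_suggestion(completed_steps: int):
    """Identify the next scheduled work content and the operation associated
    with it in the work schedule information."""
    if completed_steps >= len(WORK_SCHEDULE):
        return None  # all scheduled work contents are finished
    work, operation = WORK_SCHEDULE[completed_steps]
    return {"work_information": work, "suggested_operation": operation}
```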
- The synthesizing unit 135 superimposes, on the captured image data, an operation image indicating the state in which the operation device 22 is being operated, in association with the controller image. The synthesizing unit 135 further superimposes, on the captured image data, a suggested image indicating the operation to be performed next on the operation device 22, according to the work content identified by the identifying unit 134.
- The operation to be performed next on the operation device 22 is, for example, an operation to change the hand portion of the robot 3 to be used in the work content to be performed next.
- The suggested image is, for example, an image representing a message explaining the operation to be performed next on the operation device 22. The synthesizing unit 135 may also superimpose work information indicating the content of the work to be performed next on the captured image data.
- The display control unit 136 causes the display device 21 visible to the operator U to display the composite image data by transmitting display data including the composite image data generated by the synthesizing unit 135 to the operation device 2 via the communication unit 11.
- FIG. 9 is a schematic diagram of the composite image data IM2 displayed by the display device 21.
- An operation image IM3 is displayed near the controller image IM1, work information IM5 indicating the content of the work to be performed next is displayed, and a suggested image IM6 indicating the operation to be performed next on the operation device 22 is displayed near the operation image IM3.
- Thereby, the data processing device 1 can propose to the operator U an operation according to the content of the work to be performed next, and facilitate the progress of the work.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Robotics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computer Networks & Wireless Communication (AREA)
- Business, Economics & Management (AREA)
- Marketing (AREA)
- Automation & Control Theory (AREA)
- Manipulator (AREA)
Claims (14)
- A data processing device that processes data for operating a movable part of a device according to a position of a controller, the data processing device having a control unit configured to execute: acquiring controller information including the position when an operator operates the movable part using the controller; acquiring captured image data generated by an imaging device imaging the movable part while the movable part is being operated according to the position; generating composite image data by superimposing a controller image indicating the position included in the controller information on a region corresponding to the movable part on the captured image data; and causing a display device visible to the operator to display the composite image data.
- The data processing device according to claim 1, wherein the control unit acquires the controller information including an orientation of the controller in addition to the position, and generates the composite image data by superimposing the controller image indicating the position and the orientation included in the controller information.
- The data processing device according to claim 2, wherein the controller image is an image indicating an orientation of at least one of a hand or a wrist of the operator holding the controller, corresponding to the orientation included in the controller information.
- The data processing device according to any one of claims 1 to 3, wherein, when the movable part has moved, the control unit superimposes the controller image on a region corresponding to the moved movable part on the captured image data.
- The data processing device according to any one of claims 1 to 4, wherein the control unit superimposes, on the captured image data, an image indicating a state in which the controller is being operated, in association with the controller image.
- The data processing device according to any one of claims 1 to 5, wherein the control unit superimposes, on the captured image data, an image indicating an operation to be performed next on the controller, according to work content that the operator is scheduled to perform next.
- The data processing device according to any one of claims 1 to 5, wherein the control unit is further configured to identify a difference between an actual position of the movable part and a calculated position of the movable part computed based on the position included in the controller information.
- The data processing device according to claim 7, wherein the control unit acquires the controller information including an orientation of the controller in addition to the position, and identifies the difference between an actual position and orientation of the movable part and a calculated position and orientation of the movable part computed based on the position and the orientation included in the controller information.
- The data processing device according to claim 7 or 8, wherein the control unit causes the display device to display the difference in association with the controller image.
- The data processing device according to any one of claims 7 to 9, wherein the control unit is further configured to operate the movable part according to the position included in the controller information after applying a correction corresponding to the difference to the position included in the controller information.
- The data processing device according to claim 10, wherein the control unit operates the movable part according to the position included in the controller information on condition that the difference is equal to or less than a predetermined value.
- A data processing method in which a computer processes data for operating a movable part of a device according to a position of a controller, the data processing method including: acquiring controller information including the position when an operator operates the movable part using the controller; acquiring captured image data generated by an imaging device imaging the movable part while the movable part is being operated according to the position; generating composite image data by superimposing a controller image indicating the position included in the controller information on a region corresponding to the movable part on the captured image data; and causing a display device visible to the operator to display the composite image data.
- A program for causing a computer to execute a data processing method of processing data for operating a movable part of a device according to a position of a controller, the data processing method including: acquiring controller information including the position when an operator operates the movable part using the controller; acquiring captured image data generated by an imaging device imaging the movable part while the movable part is being operated according to the position; generating composite image data by superimposing a controller image indicating the position included in the controller information on a region corresponding to the movable part on the captured image data; and causing a display device visible to the operator to display the composite image data.
- A data processing system including an operation apparatus associated with a controller for operating a device, and a data processing device that processes data for operating a movable part of the device according to a position of the controller, wherein the data processing device has a control unit configured to execute: acquiring controller information including the position when an operator operates the movable part using the controller; acquiring captured image data generated by an imaging device imaging the movable part while the movable part is being operated according to the position; generating composite image data by superimposing a controller image indicating the position included in the controller information on a region corresponding to the movable part on the captured image data; and transmitting the composite image data to the operation apparatus; and wherein the operation apparatus has a communication unit that transmits the controller information based on content of an operation by the operator to the data processing device, and a display unit, visible to the operator, that displays the composite image data transmitted by the data processing device.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020237031576A KR20240004248A (ko) | 2021-04-28 | 2022-04-26 | Data processing device, data processing method, and data processing system |
CN202280027818.5A CN117121503A (zh) | 2021-04-28 | 2022-04-26 | Data processing device, data processing method, and data processing system |
JP2023517542A JPWO2022230854A1 (ja) | 2021-04-28 | 2022-04-26 | |
US18/286,912 US20240202877A1 (en) | 2021-04-28 | 2022-04-26 | Data processing apparatus, data processing method, and data processing system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021075776 | 2021-04-28 | ||
JP2021-075776 | 2021-04-28 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022230854A1 true WO2022230854A1 (ja) | 2022-11-03 |
Family
ID=83848406
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/018831 WO2022230854A1 (ja) | Data processing device, data processing method, and data processing system | 2021-04-28 | 2022-04-26 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240202877A1 (ja) |
JP (1) | JPWO2022230854A1 (ja) |
KR (1) | KR20240004248A (ja) |
CN (1) | CN117121503A (ja) |
WO (1) | WO2022230854A1 (ja) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011189431A (ja) * | 2010-03-12 | 2011-09-29 | Denso Wave Inc | Robot system |
JP2014065100A (ja) * | 2012-09-25 | 2014-04-17 | Denso Wave Inc | Robot system and robot teaching method |
US20200163731A1 (en) * | 2017-07-13 | 2020-05-28 | Intuitive Surgical Operations, Inc. | Systems and methods for switching control between multiple instrument arms |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6666400B1 (ja) | Robot control device, robot control method, and robot control system | 2018-09-10 | 2020-03-13 |
-
2022
- 2022-04-26 KR KR1020237031576A patent/KR20240004248A/ko unknown
- 2022-04-26 CN CN202280027818.5A patent/CN117121503A/zh active Pending
- 2022-04-26 JP JP2023517542A patent/JPWO2022230854A1/ja active Pending
- 2022-04-26 WO PCT/JP2022/018831 patent/WO2022230854A1/ja active Application Filing
- 2022-04-26 US US18/286,912 patent/US20240202877A1/en active Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011189431A (ja) * | 2010-03-12 | 2011-09-29 | Denso Wave Inc | Robot system |
JP2014065100A (ja) * | 2012-09-25 | 2014-04-17 | Denso Wave Inc | Robot system and robot teaching method |
US20200163731A1 (en) * | 2017-07-13 | 2020-05-28 | Intuitive Surgical Operations, Inc. | Systems and methods for switching control between multiple instrument arms |
Also Published As
Publication number | Publication date |
---|---|
JPWO2022230854A1 (ja) | 2022-11-03 |
KR20240004248A (ko) | 2024-01-11 |
CN117121503A (zh) | 2023-11-24 |
US20240202877A1 (en) | 2024-06-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110394780B (zh) | Robot simulation device | |
EP3342561B1 (en) | Remote control robot system | |
US7714895B2 (en) | Interactive and shared augmented reality system and method having local and remote access | |
JP5526881B2 (ja) | Robot system | |
US20160158937A1 (en) | Robot system having augmented reality-compatible display | |
CN110977931A (zh) | Robot control device and display device using augmented reality and mixed reality | |
US20190160662A1 (en) | Teaching device for performing robot teaching operations and teaching method | |
JP6445092B2 (ja) | Robot system that displays information for teaching a robot | |
JP7000253B2 (ja) | Force visualization device, robot, and force visualization program | |
US12017351B2 (en) | Remote control system, information processing method, and non-transitory computer-readable recording medium | |
JP7564327B2 (ja) | Teaching device | |
JP4949799B2 (ja) | Work support device and method | |
WO2019026790A1 (ja) | Robot system and method for operating same | |
US20200361092A1 (en) | Robot operating device, robot, and robot operating method | |
WO2022230854A1 (ja) | Data processing device, data processing method, and data processing system | |
JP2003181785A (ja) | Remote control device | |
JPH1011122A (ja) | Information presentation device | |
JPH1177568A (ja) | Teaching support method and device | |
WO2022220208A1 (ja) | Data processing device, data processing method, program, and data processing system | |
US20230010302A1 (en) | Generation of image for robot operation | |
JP4949798B2 (ja) | Work support device and method | |
JP2019215769A (ja) | Operation device and operation method | |
JPS60126985A (ja) | Work status display device for remote operation | |
JP2005267554A (ja) | Image processing method and image processing device | |
JPH09297611A (ja) | Robot teaching method and device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22795767; Country of ref document: EP; Kind code of ref document: A1 |
| WWE | Wipo information: entry into national phase | Ref document number: 2023517542; Country of ref document: JP |
| WWE | Wipo information: entry into national phase | Ref document number: 18286912; Country of ref document: US |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 22795767; Country of ref document: EP; Kind code of ref document: A1 |