
CN111383348A - Method for remotely and synchronously controlling robot through virtual reality - Google Patents

Method for remotely and synchronously controlling robot through virtual reality

Info

Publication number
CN111383348A
Authority
CN
China
Prior art keywords
robot
virtual
depth
glasses
real
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010185883.3A
Other languages
Chinese (zh)
Inventor
陈学超
王晨征
黄强
余张国
董岳
黄高
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Original Assignee
Beijing Institute of Technology BIT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Technology (BIT)
Priority to CN202010185883.3A
Publication of CN111383348A
Current legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00 Indexing scheme for image generation or computer graphics
    • G06T2210/21 Collision detection, intersection

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides a method for remotely and synchronously controlling a robot through virtual reality. A depth scene is reconstructed in the Unity3D platform from video data and depth data of the real environment where the robot is located; after mixed reality is performed on the depth scene and a virtual object, the three-dimensional information of the real environment where the robot is located is restored and displayed in VR glasses, and an operator remotely and synchronously controls the robot by controlling the movement of a virtual robot. The steering engines can be controlled through the pose of the VR glasses to rotate by corresponding angles, so that the visual angle of the robot rotates with the operator's head and the robot can be remotely and synchronously controlled in another visual range. In addition, real-time collision detection is performed between the surface of the reconstructed virtual object and each surface of the virtual robot, giving an early warning before the real robot actually collides and thereby reducing damage to the robot. The invention can faithfully restore the three-dimensional information of the scene and thus realize remote synchronous control of the robot through virtual reality.

Description

Method for remotely and synchronously controlling robot through virtual reality
Technical Field
The invention relates to the field of robots, in particular to a method for remotely and synchronously controlling a robot through virtual reality.
Background
In order to enable robots to perform a variety of complex tasks in complex environments, the prior art uses a humanoid robot to mimic or follow the actions of a remote operator. To let the operator understand the robot's task environment, the remote operator needs to synchronize with the humanoid robot's visual angle in real time and with a high sense of telepresence. Virtual reality technology is one of the best solutions for achieving high telepresence.
In the prior art, an operator can remotely obtain the robot's vision by means of virtual reality and experience a strong sense of telepresence. However, these so-called "virtual reality" systems simply transfer the video streams of two cameras on the robot to the left and right eyes of the VR glasses, which is essentially the same as watching a 3D movie through 3D glasses: no three-dimensional information is acquired (the third dimension being depth), so the information is difficult to reuse. True virtual reality starts from known three-dimensional information of an object/scene and then uses software to render the views seen by a person's left and right eyes respectively. The "virtual reality" used with robots in the prior art only provides the object/scene as viewed from the left-eye and right-eye perspectives, and cannot restore the three-dimensional information of the object/scene for further use.
Disclosure of Invention
Aiming at the defects in the prior art, the invention provides a method for remotely and synchronously controlling a robot through virtual reality, which is used for restoring three-dimensional information of a real scene where the robot is located and realizing the remote and synchronous control of the robot through the virtual reality.
The present invention achieves the above-described object by the following technical means.
A method for remotely and synchronously controlling a robot through virtual reality comprises the following steps: a computer provided with the Unity3D platform imports video data and depth data of the real environment where the robot is located, a depth scene is reconstructed from the video data and the depth data in the Unity3D platform, and the depth scene and a virtual object are presented in VR (virtual reality) glasses after mixed reality is performed on them; an operator controls the virtual robot to move according to the real environment of the robot presented in the mixed reality, so that the real robot moves correspondingly. The process of mixed reality is as follows: for each video image area with the same depth value, if there is no virtual object with a depth smaller than that of the area, the area is directly projected to the corresponding area of the VR glasses; otherwise, the virtual object is rendered and projected to the corresponding area of the VR glasses.
Further, each surface of the virtual robot is enlarged in the Unity3D platform, real-time collision detection is performed between the reconstructed virtual object surface and each surface of the virtual robot, and an early warning is given before the real robot actually collides with the real environment.
Still further, the reconstructed virtual object is obtained by performing, with the depth data, three-dimensional reconstruction on the two-dimensional image generated by projecting the surface of the real object into the stereo camera.
Further, the Unity3D platform collects the pose of the VR glasses and controls the steering engines to rotate by corresponding angles, so that the visual angle of the robot rotates with the operator's head, realizing remote synchronous control of the robot through virtual reality in another visual range.
Further, the pose of the VR glasses can be expressed in XYZ Euler angles as (ψ, θ, φ). The Unity3D platform transmits the pose of the VR glasses to the industrial personal computer, and the industrial personal computer controls the rotating shafts of the steering engines to rotate by the corresponding angles.
Further, the rendering specifically comprises: projecting each virtual object into a two-dimensional image, according to the position and angle of the virtual camera, for display in the VR glasses.
Further, the depth scene is restored by combining the video data, each frame of the video data is a two-dimensional image, and each area of the two-dimensional image has a corresponding depth value.
Further, the depth values are: the distance between the projection of each area of the two-dimensional image on the main optical axis of the stereo camera and the center of the stereo camera.
The invention has the beneficial effects that: a depth scene is reconstructed in the Unity3D platform from the video data and depth data, acquired by a stereo camera, of the real environment where the robot is located; after mixed reality is performed on the depth scene and a virtual object, the three-dimensional information of the real scene where the robot is located is restored and presented in VR glasses, and an operator controls the virtual robot to move according to the real environment presented in the mixed reality, so that the real robot moves correspondingly, realizing remote synchronous control of the robot through virtual reality. The Unity3D platform collects the pose of the VR glasses and controls the steering engines to rotate by corresponding angles, so that the visual angle of the robot rotates with the operator's head and the robot can be remotely and synchronously controlled through virtual reality in another visual range. In addition, each surface of the virtual robot is enlarged in the Unity3D platform, real-time collision detection is performed between the reconstructed virtual object surface and each surface of the virtual robot, and an early warning can be given before the robot actually collides, reducing damage to the robot.
Drawings
FIG. 1 is a flow chart of the method for remotely and synchronously controlling a robot through virtual reality according to the present invention;
FIG. 2 is a schematic view of the installation of a stereo camera according to the present invention;
FIG. 3 is a schematic diagram showing depth values corresponding to each region of a two-dimensional image according to the present invention;
FIG. 4 is a schematic diagram of a scenario in an embodiment of the present invention;
FIG. 5 is a schematic view of a reconstructed depth scene according to the present invention;
FIG. 6 is a schematic view of a virtual scene according to the present invention;
FIG. 7 is a schematic diagram of the mixed reality of the present invention; FIG. 7(a) is a schematic diagram before mixed reality, and FIG. 7(b) is a schematic diagram after mixed reality;
fig. 8 is a schematic diagram illustrating the effect of the remote synchronous control robot according to the present invention.
Detailed Description
The invention will be further described with reference to the following figures and specific examples, but the scope of the invention is not limited thereto.
As shown in fig. 1, the method for remotely and synchronously controlling a robot through virtual reality of the invention specifically comprises the following steps:
Step one, a stereo camera is installed on the robot's neck pan-tilt, which is formed by two steering engines (as shown in FIG. 2), and is connected to a development board through a data line; the collected video data, depth data and position information (from the IMU built into the stereo camera) of the real environment where the robot is located are transmitted to a computer at the operator's end through an Ethernet cable or a wireless network.
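By way of non-limiting illustration, the transmission in step one might be sketched in Python as below; the frame layout, port number and host name are assumptions made for the sketch and are not specified by the invention:

```python
# Hypothetical robot-side sender: one RGB frame plus its depth map,
# each prefixed by its byte length so the receiver can split the stream.
import socket
import struct

import numpy as np

def send_frame(sock: socket.socket, rgb: np.ndarray, depth: np.ndarray) -> None:
    """Send one video frame and its depth map with simple length prefixes."""
    rgb_bytes = rgb.astype(np.uint8).tobytes()
    depth_bytes = depth.astype(np.float32).tobytes()
    header = struct.pack("!II", len(rgb_bytes), len(depth_bytes))
    sock.sendall(header + rgb_bytes + depth_bytes)

# Assumed usage: the operator-side computer listens on port 5000.
# with socket.create_connection(("operator-pc", 5000)) as sock:
#     send_frame(sock, rgb_frame, depth_frame)
```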
Step two, a computer at the operator end is provided with a Unity3D platform and is connected with VR glasses, and the computer imports various information transmitted by the stereo camera to perform the following operations:
(1) firstly, reconstructing a depth scene from video data and depth data of a real environment where the robot is located in a Unity3D platform, then mixing the depth scene and a virtual object, rendering through a Unity3D platform, and finally presenting in VR glasses.
Each frame of the video data of the real environment where the robot is located is a two-dimensional image, and each area of the two-dimensional image has a corresponding depth value, namely the distance between the projection of that area on the main optical axis of the stereo camera and the center of the stereo camera (FIG. 3). A stereo scene can be restored by combining the two-dimensional image with the depth values; see FIGS. 4 and 5:
In the scene of FIG. 4 there is an arrow-shaped object. Whether viewed by an ordinary camera or a stereo camera, it appears as a line segment in the video image; the stereo camera, however, additionally obtains a depth value for each region of the image. Assuming that the resolution of the depth information is relatively low, so that the Unity3D platform can only divide the line-segment image presented by the camera equally into seven regions, the Unity3D platform can then reconstruct the depth scene shown in FIG. 5 by combining the video image with the depth information.
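By way of non-limiting illustration, this reconstruction can be sketched as back-projecting one point per image region. The pinhole camera model, the intrinsics fx, fy, cx, cy, the region size `step` and the per-pixel depth map are assumptions of the sketch; the text above only states that each region carries one depth value:

```python
import numpy as np

def reconstruct_regions(depth: np.ndarray, fx: float, fy: float,
                        cx: float, cy: float, step: int = 16) -> np.ndarray:
    """Back-project one 3D point per step-by-step image region.

    depth holds, per pixel, the distance along the camera's main optical
    axis, matching the depth definition given above.
    """
    points = []
    h, w = depth.shape
    for v in range(step // 2, h, step):        # region centers, image rows
        for u in range(step // 2, w, step):    # region centers, image columns
            z = float(depth[v, u])
            if z <= 0.0:                       # no depth measured here
                continue
            x = (u - cx) * z / fx              # pinhole back-projection
            y = (v - cy) * z / fy
            points.append((x, y, z))
    return np.asarray(points)
```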
Assuming that the position and orientation of the virtual camera in the virtual scene is the same as the position and orientation of the stereo camera mounted on the robot neck pan-tilt in the real environment, the Unity3D platform mixes the reconstructed depth scene with the virtual object: for a region with the same depth value in the video image of the stereo camera, if there is no virtual object with a depth smaller than that of the region (i.e. a virtual object closer to the center of the stereo camera), the region of video data is directly projected to a corresponding region in the VR glasses, otherwise the virtual object is rendered and projected to the corresponding region in the VR glasses.
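By way of non-limiting illustration, this mixing rule reduces to a depth test between the reconstructed scene and the rendered virtual objects; the dense per-pixel array formulation below is an assumption of the sketch, since the text works with regions of equal depth:

```python
import numpy as np

def composite(video_rgb: np.ndarray, video_depth: np.ndarray,
              virtual_rgb: np.ndarray, virtual_depth: np.ndarray) -> np.ndarray:
    """Depth test between the reconstructed scene and virtual objects.

    virtual_depth is set to np.inf wherever no virtual object covers the
    pixel, so the video image always wins there.
    """
    virtual_is_closer = virtual_depth < video_depth
    # Keep the rendered virtual object where it is closer, else the video.
    return np.where(virtual_is_closer[..., None], virtual_rgb, video_rgb)
```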
A general virtual scene comprises a plurality of virtual objects and a virtual camera (corresponding to the VR glasses worn by a user in reality). The rendering specifically comprises: each virtual object is projected into a two-dimensional image, according to the position and angle of the virtual camera, for display in the VR glasses; this is a built-in function of the Unity3D platform. As shown in fig. 6, assume there is only one virtual cylindrical object in the virtual scene and three virtual cameras (camera 1, camera 2 and camera 3) at different positions, where the orientation of camera 1 differs from that of cameras 2 and 3, and cameras 2 and 3 share the same orientation; then the VR glasses for camera 1 will show the user a larger circle, the VR glasses for camera 2 will show a smaller rectangle, and the VR glasses for camera 3 will show nothing.
As shown in fig. 7(a), the real object photographed by the stereo camera is a triangular tip and the virtual object photographed by the virtual camera is a circle; after the two are mixed, the result displayed in the VR glasses is a circle superimposed on a line segment, as shown in fig. 7(b), realizing the mixed reality effect.
(2) The depth information can be used not only to realize mixed reality but also for three-dimensional reconstruction of the object surfaces within the stereo camera's view. Specifically, each region reconstructed in fig. 5 is simplified into one point, points that are close to each other are connected by line segments, and all the line segments form a gridded surface. On this basis, the three-dimensionally reconstructed mesh surfaces are imported into the Unity3D platform and become virtual objects in its virtual scene. The 3D model used to control the real robot (the virtual robot model) is imported into the virtual scene of the Unity3D platform; an operator wearing VR glasses observes the fused, rendered real environment where the robot is located and controls the virtual robot to move, making the real robot move correspondingly, so that the following overall effect is achieved: the actions of the robot model, the shapes of the virtual object surfaces and the positions of the virtual objects relative to the robot model in the virtual scene are kept consistent at all times with the actions of the real robot, the shapes of the real object surfaces and the positions of the real objects relative to the real robot in the real environment, as shown in fig. 8.
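By way of non-limiting illustration, connecting the grid of reconstructed points into a surface can be sketched as emitting two triangles per grid cell; this particular triangulation is a common convention assumed for the sketch, not prescribed by the text:

```python
import numpy as np

def grid_mesh_indices(rows: int, cols: int) -> np.ndarray:
    """Return triangle index triples for a rows-by-cols grid of points."""
    tris = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            i = r * cols + c                              # top-left point of the cell
            tris.append((i, i + 1, i + cols))             # upper-left triangle
            tris.append((i + 1, i + cols + 1, i + cols))  # lower-right triangle
    return np.asarray(tris, dtype=np.int32)
```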
The Unity3D platform has a built-in function for detecting in real time whether the surfaces of virtual objects collide. Using this function, the Unity3D platform performs real-time collision detection between the reconstructed virtual object surfaces and the surfaces of the virtual robot, thereby indirectly realizing real-time collision detection between the real robot and the actual environment. If the surfaces of the virtual robot are suitably enlarged in the Unity3D platform, a collision can be detected in the virtual environment before the real collision occurs, providing a collision early warning.
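By way of non-limiting illustration, the early-warning idea can be sketched with spheres standing in for the enlarged robot surfaces. In Unity3D itself this is done with the platform's own collision detection on enlarged colliders, so the sphere test and the margin value below are only assumed stand-ins:

```python
import numpy as np

def collision_warning(link_centers: np.ndarray, link_radii: np.ndarray,
                      scene_points: np.ndarray, margin: float = 0.05) -> bool:
    """Return True if any enlarged link comes within `margin` of the scene."""
    for center, radius in zip(link_centers, link_radii):
        dists = np.linalg.norm(scene_points - center, axis=1)
        if np.any(dists < radius + margin):  # enlarged surface touches the scene
            return True
    return False
```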
(3) To further expand the visual range of the real robot, the Unity3D platform collects the position and pose of the VR glasses and calculates the corresponding rotation angles of the steering engines from the pose information, so that the visual angle of the robot rotates with the operator's head; steps (1) and (2) are then repeated, realizing remote synchronous control of the real robot through virtual reality in another visual range.
The Unity3D platform is compatible with a variety of conventional VR glasses, and the pose of the VR glasses in the world coordinate system can be queried at any time through the platform's built-in program interface. This pose is expressed in XYZ Euler angles, recorded as (ψ, θ, φ), with (ψ, θ, φ) = (0, 0, 0) in the initial state. The neck pan-tilt on which the stereo camera is installed has only two rotating shafts (FIG. 2). Suppose a rotating shaft 3, perpendicular to both rotating shaft 1 and rotating shaft 2, were added at rotating shaft 2, with the stereo camera finally installed on shaft 3, and suppose that in the initial state shafts 1, 2 and 3 point in the same directions as the x, y and z axes of the world coordinate system respectively. Then, by the definition of Euler angles, if the pose of the VR glasses is expressed in XYZ Euler angles as (ψ, θ, φ), the Unity3D platform transmits this pose to the industrial personal computer, and the industrial personal computer sends commands to rotate shaft 1, shaft 2 and shaft 3 by ψ, θ and φ respectively, after which the pose of the stereo camera is the same as that of the VR glasses. To simplify the mechanical design, rotating shaft 3 is removed and φ is omitted: only the steering engines corresponding to shaft 1 and shaft 2 are rotated, by ψ and θ respectively, so that the stereo camera follows the VR glasses (and hence the wearer's head) up, down, left and right.
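By way of non-limiting illustration, the resulting head-to-camera mapping can be sketched as below; the clamping ranges and the command strings sent to the steering engines are assumptions of the sketch:

```python
def servo_commands(psi_deg: float, theta_deg: float) -> list[str]:
    """Map the glasses' psi/theta Euler angles to commands for shafts 1 and 2."""
    psi = max(-90.0, min(90.0, psi_deg))      # clamp to an assumed servo range
    theta = max(-45.0, min(45.0, theta_deg))
    return [f"SERVO 1 {psi:.1f}", f"SERVO 2 {theta:.1f}"]

# Assumed usage: psi and theta are read from the glasses' pose each frame,
# and the strings are written to the industrial personal computer's serial port.
# for cmd in servo_commands(psi, theta):
#     serial_port.write((cmd + "\n").encode())
```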
The present invention is not limited to the above-described embodiments, and any obvious improvements, substitutions or modifications can be made by those skilled in the art without departing from the spirit of the present invention.

Claims (8)

1. A method for remotely and synchronously controlling a robot through virtual reality, characterized in that a computer provided with the Unity3D platform imports video data and depth data of the real environment where the robot is located, reconstructs a depth scene from the video data and the depth data in the Unity3D platform, and displays the depth scene and a virtual object in VR glasses after mixed reality is performed on them; an operator controls the virtual robot to move according to the real environment of the robot presented in the mixed reality, so that the real robot moves correspondingly; the process of mixed reality is as follows: for each video image area with the same depth value, if there is no virtual object with a depth smaller than that of the area, the area is directly projected to the corresponding area of the VR glasses; otherwise, the virtual object is rendered and projected to the corresponding area of the VR glasses.
2. The method of claim 1, wherein each surface of the virtual robot is enlarged in the Unity3D platform, real-time collision detection is performed between the reconstructed virtual object surfaces and the surfaces of the virtual robot, and an early warning is given before the real robot actually collides with the real environment.
3. The method for remotely and synchronously controlling a robot through virtual reality according to claim 2, wherein the reconstructed virtual object is obtained by performing, with the depth data, three-dimensional reconstruction on the two-dimensional image generated by projecting the surface of the real object into the stereo camera.
4. The method for remotely and synchronously controlling a robot through virtual reality according to claim 1, wherein the Unity3D platform collects the pose of the VR glasses and controls the steering engines to rotate by corresponding angles, so that the visual angle of the robot rotates with the operator's head, realizing remote synchronous control of the robot through virtual reality in another visual range.
5. The method for remotely and synchronously controlling a robot through virtual reality according to claim 4, wherein the pose of the VR glasses can be expressed in XYZ Euler angles as (ψ, θ, φ), the Unity3D platform transmits the pose of the VR glasses to the industrial personal computer, and the industrial personal computer controls the rotating shafts of the steering engines to rotate by the corresponding angles.
6. The method for remotely and synchronously controlling a robot through virtual reality according to claim 1, wherein the rendering specifically comprises: projecting each virtual object into a two-dimensional image, according to the position and angle of the virtual camera, for display in the VR glasses.
7. The method of claim 1, wherein the depth scene is restored by combining the video data with the depth data: each frame of the video data is a two-dimensional image, and each area of the two-dimensional image has a corresponding depth value.
8. The method for remotely and synchronously controlling a robot through virtual reality according to claim 7, wherein the depth value is the distance between the projection, on the main optical axis of the stereo camera, of each area of the two-dimensional image and the center of the stereo camera.
CN202010185883.3A 2020-03-17 2020-03-17 Method for remotely and synchronously controlling robot through virtual reality Pending CN111383348A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010185883.3A CN111383348A (en) 2020-03-17 2020-03-17 Method for remotely and synchronously controlling robot through virtual reality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010185883.3A CN111383348A (en) 2020-03-17 2020-03-17 Method for remotely and synchronously controlling robot through virtual reality

Publications (1)

Publication Number Publication Date
CN111383348A true CN111383348A (en) 2020-07-07

Family

ID=71219196

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010185883.3A Pending CN111383348A (en) 2020-03-17 2020-03-17 Method for remotely and synchronously controlling robot through virtual reality

Country Status (1)

Country Link
CN (1) CN111383348A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112667179A (en) * 2020-12-18 2021-04-16 北京理工大学 Remote synchronous collaboration system based on mixed reality
CN112819966A (en) * 2021-01-05 2021-05-18 上海大学 Environment fusion system and method suitable for man-machine interaction operation of underwater remote control robot
CN112906118A (en) * 2021-03-12 2021-06-04 河北工业大学 Construction robot remote operation method under virtual-real coupling environment
US20210347053A1 (en) * 2020-05-08 2021-11-11 Vangogh Imaging, Inc. Virtual presence for telerobotics in a dynamic scene
CN114281190A (en) * 2021-12-14 2022-04-05 Oppo广东移动通信有限公司 Information control method, device, system, equipment and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101794349A (en) * 2010-02-09 2010-08-04 北京邮电大学 Experimental system and method for augmented reality of teleoperation of robot
CN108830940A (en) * 2018-06-19 2018-11-16 广东虚拟现实科技有限公司 Hiding relation processing method, device, terminal device and storage medium
CN108898676A (en) * 2018-06-19 2018-11-27 青岛理工大学 Method and system for detecting collision and shielding between virtual and real objects
CN109196447A (en) * 2016-03-31 2019-01-11 奇跃公司 Use the interaction of posture and more DOF controllers and 3D virtual objects
CN110136082A (en) * 2019-05-10 2019-08-16 腾讯科技(深圳)有限公司 Occlusion culling method, apparatus and computer equipment
CN110531846A (en) * 2018-05-24 2019-12-03 明日基金知识产权控股有限公司 The two-way real-time 3D interactive operation of real-time 3D virtual objects in the range of real-time 3D virtual world representing real world

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101794349A (en) * 2010-02-09 2010-08-04 北京邮电大学 Experimental system and method for augmented reality of teleoperation of robot
CN109196447A (en) * 2016-03-31 2019-01-11 奇跃公司 Use the interaction of posture and more DOF controllers and 3D virtual objects
CN110531846A (en) * 2018-05-24 2019-12-03 明日基金知识产权控股有限公司 The two-way real-time 3D interactive operation of real-time 3D virtual objects in the range of real-time 3D virtual world representing real world
CN108830940A (en) * 2018-06-19 2018-11-16 广东虚拟现实科技有限公司 Hiding relation processing method, device, terminal device and storage medium
CN108898676A (en) * 2018-06-19 2018-11-27 青岛理工大学 Method and system for detecting collision and shielding between virtual and real objects
CN110136082A (en) * 2019-05-10 2019-08-16 腾讯科技(深圳)有限公司 Occlusion culling method, apparatus and computer equipment

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
ZHANG, HUI: "Research on Control Technology of Semi-autonomous Teleoperation Robots Based on Multi-sensor Fusion", China Master's Theses Full-text Database, Information Science and Technology Series (Monthly) *
ZHANG, JINLING: "Research on Construction of Augmented Reality Simulation Scenes for Teleoperation of Robots inside Space Capsules", China Doctoral Dissertations Full-text Database, Information Science and Technology Series (Monthly) *
LI, YONGTIAN et al. (eds.): "Practical Software Encyclopedia: A Quick-Reference Dictionary of Graphics and Image Software", China Commerce Press, 31 December 2001 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210347053A1 (en) * 2020-05-08 2021-11-11 Vangogh Imaging, Inc. Virtual presence for telerobotics in a dynamic scene
CN112667179A (en) * 2020-12-18 2021-04-16 北京理工大学 Remote synchronous collaboration system based on mixed reality
CN112667179B (en) * 2020-12-18 2023-03-28 北京理工大学 Remote synchronous collaboration system based on mixed reality
CN112819966A (en) * 2021-01-05 2021-05-18 上海大学 Environment fusion system and method suitable for man-machine interaction operation of underwater remote control robot
CN112906118A (en) * 2021-03-12 2021-06-04 河北工业大学 Construction robot remote operation method under virtual-real coupling environment
CN114281190A (en) * 2021-12-14 2022-04-05 Oppo广东移动通信有限公司 Information control method, device, system, equipment and storage medium

Similar Documents

Publication Publication Date Title
CN111383348A (en) Method for remotely and synchronously controlling robot through virtual reality
US10818099B2 (en) Image processing method, display device, and inspection system
US11577159B2 (en) Realistic virtual/augmented/mixed reality viewing and interactions
JP5032343B2 (en) Method and apparatus for displaying a virtual object, and method and apparatus for overlaying a virtual object on an environmental image
US7714895B2 (en) Interactive and shared augmented reality system and method having local and remote access
Krückel et al. Intuitive visual teleoperation for UGVs using free-look augmented reality displays
JP5709440B2 (en) Information processing apparatus and information processing method
US20160163063A1 (en) Mixed-reality visualization and method
JP6589604B2 (en) Teaching result display system
JP2006302034A (en) Image processing method and image processor
CN102221884A (en) Visual tele-existence device based on real-time calibration of camera and working method thereof
CN117542253A (en) Pilot cockpit training system
JP7517803B2 (en) ROBOT TEACHING SYSTEM, IMAGE GENERATION METHOD, AND PROGRAM
JP6682624B2 (en) Image processing device
CN111947650A (en) Fusion positioning system and method based on optical tracking and inertial tracking
JPH0421105A (en) Stereoscopic teaching device for manipulator
CN115514885B (en) Remote augmented reality follow-up sensing system and method based on monocular and binocular fusion
CN105828021A (en) Specialized robot image acquisition control method and system based on augmented reality technology
Saraiji et al. Real-time egocentric superimposition of operator's own body on telexistence avatar in virtual environment
JP6890524B2 (en) Attitude control system
JP2020031413A (en) Display device, mobile body, mobile body control system, manufacturing method for them, and image display method
CN116205980A (en) Method and device for positioning and tracking virtual reality in mobile space
WO2017191703A1 (en) Image processing device
JP7451084B2 (en) Information processing device and information processing method
WO2024070398A1 (en) Information processing device, information processing method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20200707