GB2594810A - An enhanced reality underwater maintenance system by using a virtual reality manipulator (VRM) - Google Patents
An enhanced reality underwater maintenance system by using a virtual reality manipulator (VRM)
- Publication number
- GB2594810A GB2594810A GB2108167.4A GB202108167A GB2594810A GB 2594810 A GB2594810 A GB 2594810A GB 202108167 A GB202108167 A GB 202108167A GB 2594810 A GB2594810 A GB 2594810A
- Authority
- GB
- United Kingdom
- Prior art keywords
- manipulator
- underwater
- force
- human
- prc
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- 238000012423 maintenance Methods 0.000 title claims 15
- 238000013473 artificial intelligence Methods 0.000 claims 9
- 210000004556 brain Anatomy 0.000 claims 9
- 238000005259 measurement Methods 0.000 claims 7
- 210000004247 hand Anatomy 0.000 claims 4
- 238000009434 installation Methods 0.000 claims 4
- XLYOFNOQVPJJNP-UHFFFAOYSA-N water Substances O XLYOFNOQVPJJNP-UHFFFAOYSA-N 0.000 claims 3
- 210000000707 wrist Anatomy 0.000 claims 3
- XAGFODPZIPBFFR-UHFFFAOYSA-N aluminium Chemical compound [Al] XAGFODPZIPBFFR-UHFFFAOYSA-N 0.000 claims 2
- 229910052782 aluminium Inorganic materials 0.000 claims 2
- 238000005516 engineering process Methods 0.000 claims 2
- 239000000835 fiber Substances 0.000 claims 2
- 238000012546 transfer Methods 0.000 claims 2
- 108700009949 PTP protocol Proteins 0.000 claims 1
- 238000013459 approach Methods 0.000 claims 1
- 238000013528 artificial neural network Methods 0.000 claims 1
- 230000003190 augmentative effect Effects 0.000 claims 1
- 238000010276 construction Methods 0.000 claims 1
- 238000013523 data management Methods 0.000 claims 1
- 230000005611 electricity Effects 0.000 claims 1
- 230000002068 genetic effect Effects 0.000 claims 1
- 210000003128 head Anatomy 0.000 claims 1
- 238000004519 manufacturing process Methods 0.000 claims 1
- 238000000034 method Methods 0.000 claims 1
- 238000000053 physical method Methods 0.000 claims 1
- 238000012545 processing Methods 0.000 claims 1
- 230000035945 sensitivity Effects 0.000 claims 1
- 230000001960 triggered effect Effects 0.000 claims 1
- 238000012795 verification Methods 0.000 claims 1
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/02—Hand grip control means
- B25J13/025—Hand grip control means comprising haptic means
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/04—Programme control other than numerical control, i.e. in sequence controllers or logic controllers
- G05B19/042—Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0011—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
- G05D1/0038—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B63—SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
- B63G—OFFENSIVE OR DEFENSIVE ARRANGEMENTS ON VESSELS; MINE-LAYING; MINE-SWEEPING; SUBMARINES; AIRCRAFT CARRIERS
- B63G8/00—Underwater vessels, e.g. submarines; Equipment specially adapted therefor
- B63G8/001—Underwater vessels adapted for special purposes, e.g. unmanned underwater vessels; Equipment specially adapted therefor, e.g. docking stations
- B63G2008/002—Underwater vessels adapted for special purposes, e.g. unmanned underwater vessels; Equipment specially adapted therefor, e.g. docking stations unmanned
- B63G2008/005—Underwater vessels adapted for special purposes, e.g. unmanned underwater vessels; Equipment specially adapted therefor, e.g. docking stations unmanned remotely controlled
- B63G2008/007—Underwater vessels adapted for special purposes, e.g. unmanned underwater vessels; Equipment specially adapted therefor, e.g. docking stations unmanned remotely controlled by means of a physical link to a base, e.g. wire, cable or umbilical
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/32—Operator till task planning
- G05B2219/32014—Augmented reality assists operator in maintenance, repair, programming, assembly, use of head mounted display with 2-D 3-D display and voice feedback, voice and gesture command
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Aviation & Aerospace Engineering (AREA)
- Human Computer Interaction (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
Abstract
The present invention relates to a virtual reality manipulator (VRM), an advanced solution that integrates ROV manipulator arms, a virtual reality helmet and controllers, 3D cameras, and force and torque sensors to replace human divers underwater while providing the same quality of work. The main features of the said manipulator are 3D vision, moving the manipulators as if they were the operator's own arms, torque/force feeling, and a Photo Realistic Cloud (PRC) feature.
Claims (13)
1. A remote underwater maintenance system using a virtual reality manipulator (VRM), comprising:
A. A manipulator system consisting of a virtual reality helmet (1), controllers (2) and Artificial Intelligence (AI), wherein the said manipulators are arms mounted on the Remotely Operated Vehicle (ROV) (5) and are used to replace divers in deep water, extreme conditions and hazardous areas to perform subsea manoeuvring, driven either by electric power or by a hydraulic pump driven by an electric motor. ROV manipulators can have from 1 up to 7 joints depending on their size and the required tasks. ROVs are equipped with 1 or 2 manipulators in order to perform basic or complicated tasks, wherein the said manipulators can be electrically or hydraulically driven and can carry up to 380 kg.
B. PRC and stereo cameras, wherein the PRC cameras are installed on the WCROV manipulator to produce the three-dimensional environment and populate a vision similar to that interpreted by the human brain through the human eyes, and wherein Z-mini cameras are installed on a pan and tilt unit (8); when the operator moves the helmet (1) in any direction or angle, this move is translated through software into a move of the pan and tilt unit (8) and accordingly of the camera to the desired direction or angle. Such flexibility enables the operator to explore the underwater environment exactly as if he were in it.
C. A virtual reality solution consisting of both the virtual reality helmet (1) and the controllers (2), by using which:
- the WCROV operator can wear the helmet (1) on the vessel topside to view the three-dimensional environment underwater, this view being generated by using the said 3D cameras,
- the operator can also hold the controllers (2) in his hands to move the WCROV manipulator arms (4) accordingly; the result of the movement is sent in real time to the Manipulator Movement Artificial Intelligence to move the manipulator arms, and the controllers can perform free moves on the manipulator's 7 joints to perform different tasks.
D. A force/torque sensor (9) on each manipulator: every manipulator arm is equipped with one force and torque sensor able to work underwater at full ocean depth without its sensitivity to feel force/torque in all 6 directions being affected. The ROV manipulator arm has been modified to allow the installation of a force/torque sensor inside it, an underwater pod has been created for it, and cabling/wiring has been implemented so that the force and torque generated by the manipulator arm during its different actions can be measured; these measurements are passed to the artificial intelligence module and consequently understood, wherein the force and torque sensor (9) is integrated inside the manipulator arm so the arm can sense the forces and torques used to move objects or carry out specific operations. Characterized in that:
• these data appear to the person wearing the virtual reality helmet, allowing the operator to take decisions and to understand whether the force being used is excessive enough to break the object, or whether more power should be used to tighten bolts, etc.;
• the measurements from the force/torque sensor are stored in the AI module, where they are understood by this software module and affect the decision making of the solution in automatic mode.
E. Manipulator Movement Artificial Intelligence (MMAI): the MMAI is a replacement for the diver's back brain, which moves his arms so that his hand reaches a certain position. When the human moves his arm, the MMAI computes the best setting for each joint of the manipulator so that the position of the manipulator hand matches the position of the human hand in the virtual reality environment the human brain sees; the MMAI then applies these settings to the manipulator joints simultaneously to maintain the same path as the human hands while reaching the final position (for example, when inserting a bolt into a flange hole, the manipulator hand must move in a certain direction for the bolt to be installed). Characterized in that the ROV manipulator arms, the virtual reality helmet and controllers, the 3D cameras and the force and torque sensors are all integrated together to replace human divers underwater while providing the same quality.
F. A Photo Realistic 3D Cloud (PRC) system used to recognize underwater features such as pipelines and subsea assets. The Photo Realistic 3D Cloud (PRC) is a cutting-edge innovative technology for scanning complete structures and pipeline sections underwater to create a 3D cloud of millions of points representing the as-built state of any scanned object with an accuracy better than 1/1000 (e.g. 1 mm accuracy over a 1 m measurement). Characterized in that the PRC helps the solution pass exact dimensions to the computer brain to perform accurate installation tasks, such as recognizing a flange position and its dimensions, thereby enabling the MMAI to finish the task of installing all the other bolts and nuts on a flange automatically.
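A minimal sketch of the helmet-to-pan/tilt mapping described in item B of claim 1 above, assuming the helmet SDK exposes yaw and pitch angles and the pan and tilt unit (8) accepts angle commands; names such as `read_helmet_orientation` and `PanTiltUnit`, and the axis limits, are illustrative assumptions, not part of the patent.

```python
# Illustrative sketch only: maps VR helmet orientation to pan/tilt camera commands.
# The helmet SDK call and the PanTiltUnit interface are assumed/hypothetical.
import time

PAN_LIMIT_DEG = (-170.0, 170.0)   # assumed mechanical limits of the pan axis
TILT_LIMIT_DEG = (-60.0, 60.0)    # assumed mechanical limits of the tilt axis

def clamp(value, limits):
    lo, hi = limits
    return max(lo, min(hi, value))

class PanTiltUnit:
    """Stand-in for the underwater pan and tilt unit (8)."""
    def command(self, pan_deg: float, tilt_deg: float) -> None:
        # Real hardware would receive these commands over the ROV umbilical.
        print(f"pan={pan_deg:.1f} deg, tilt={tilt_deg:.1f} deg")

def read_helmet_orientation():
    """Placeholder for the VR helmet SDK: returns (yaw_deg, pitch_deg)."""
    return 0.0, 0.0

def follow_helmet(unit: PanTiltUnit, rate_hz: float = 30.0) -> None:
    period = 1.0 / rate_hz
    while True:
        yaw, pitch = read_helmet_orientation()
        unit.command(clamp(yaw, PAN_LIMIT_DEG), clamp(pitch, TILT_LIMIT_DEG))
        time.sleep(period)
```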
2. The remote underwater maintenance system according to claim 1, wherein the manipulator is equipped with a torque and force sensor to read the force generated by different moves, or to fasten bolts with a specific torque and precisely calculated movements of the manipulator joints; such results are displayed as numbers on the VR image and transmitted as vibrations in the movement sensors held by the operator.
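A minimal sketch of the feedback path in claim 2, assuming the force/torque sensor reports a 6-axis reading and the controller exposes a vibration amplitude in [0, 1]; the `overlay` and `controller` calls and the full-scale constant are hypothetical.

```python
# Illustrative sketch only: turns 6-axis force/torque readings into an on-screen
# number and a controller vibration level. Sensor and controller APIs are assumed.
import math

MAX_FORCE_N = 400.0   # assumed full-scale force used to normalise the vibration

def force_magnitude(reading):
    """reading = (fx, fy, fz, tx, ty, tz); returns the force magnitude in newtons."""
    fx, fy, fz = reading[:3]
    return math.sqrt(fx * fx + fy * fy + fz * fz)

def to_vibration(force_n: float) -> float:
    """Map force to a vibration amplitude in [0, 1]."""
    return max(0.0, min(1.0, force_n / MAX_FORCE_N))

def update_feedback(reading, overlay, controller):
    """overlay.show_text and controller.vibrate are hypothetical callbacks."""
    force_n = force_magnitude(reading)
    overlay.show_text(f"Force: {force_n:.1f} N")   # number on the VR image
    controller.vibrate(to_vibration(force_n))      # haptic cue in the operator's hand
```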
3. The remote underwater maintenance system according to claim 1, wherein the PRC camera housing (3) is a sealed aluminum enclosure able to tolerate underwater pressure down to 3000 m, wherein the said housing is equipped with a dome, as the original camera dome can't operate in water, and all camera ports are equipped with underwater connectivity bulkheads to connect electricity and network to the camera housing so that it can be connected to the rest of the system.
4. The remote underwater maintenance system according to claim 1, wherein the system can take the torque/force readings to enable the AI module to achieve a task just as a human brain uses the feeling in the fingers to sense accurately where a hole in a flange is in order to insert a bolt; the MMAI is therefore capable of automatically achieving a preprogrammed task of inserting a bolt into a flange hole, taking over control from the human brain as soon as it is triggered once the manipulator hand reaches the vicinity of the flange hole. It also gives vibration feedback when it cannot reach a solution because of an obstacle or an unexpectedly high torque while screwing a nut onto a bolt.
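A minimal sketch of the handover logic described in claim 4, under the assumption that the flange-hole position comes from the PRC model and that distances and torques are available in metres and newton-metres; the thresholds, mode names and `controller.vibrate` call are illustrative only.

```python
# Illustrative sketch only: switches from manual control to the preprogrammed
# bolt-insertion task when the hand is near the hole, and vibrates the controller
# when torque exceeds an (assumed) safe limit. All thresholds are illustrative.
import math

TRIGGER_RADIUS_M = 0.05     # assumed "near the flange hole" distance
MAX_SCREW_TORQUE_NM = 80.0  # assumed safe torque while screwing a nut

def distance(a, b):
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def control_step(hand_pos, hole_pos, screw_torque_nm, mode, controller):
    """Returns the (possibly updated) mode: 'manual' or 'auto'."""
    if mode == "manual" and distance(hand_pos, hole_pos) < TRIGGER_RADIUS_M:
        mode = "auto"                       # MMAI takes over the insertion task
    if screw_torque_nm > MAX_SCREW_TORQUE_NM:
        controller.vibrate(1.0)             # warn the operator (hypothetical API)
        mode = "manual"                     # hand control back for a decision
    return mode
```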
5. The remote underwater maintenance system according to claims 1 to 4, wherein the system provides two modes: a manual mode in which the person/user wearing the helmet is totally in control of the operation; in manual mode the user wears the helmet and holds the controllers to move the arms, views the environment in 3D from the 3D stereo cameras and understands the measurements coming from the PRC software-generated model.
6. The remote underwater maintenance system according to claim 5, further comprising an automatic mode, which is the main function of the artificial intelligence software module; this module gathers all data coming from the PRC and the force and torque sensor to accomplish complete tasks autonomously according to already set-up missions.
7. The remote underwater maintenance system according to claims 1 to 6, wherein the Z-mini camera is enclosed in the ZM camera housing (6) to adapt it to underwater use down to 3000 m below the sea surface and to bring the best of virtual and augmented reality together; this camera is needed for the virtual reality helmet to view the underwater environment in 3D by using advanced depth-sensing technology.
8. The remote underwater maintenance system according to claims 1 to 7, wherein Artificial Intelligence (AI) is used to move the arm with all of its 7 joints into a correct configuration so that the manipulator reaches the target desired by the user with the correct orientation. It runs in real time to compute the 7 joint angles that match the desired target and orientation, and it uses different AI algorithms such as neural networks and genetic algorithms.
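A minimal sketch of a genetic-algorithm joint-angle search of the kind named in claim 8, simplified to a planar 7-link arm with assumed link lengths; this is not the patented MMAI, only an example of the technique under stated assumptions.

```python
# Illustrative sketch only: a tiny genetic-algorithm IK solver for a simplified
# planar 7-joint arm (real ROV arms have full 3D kinematics and orientation goals).
import math
import random

LINK_LENGTHS = [0.4, 0.35, 0.3, 0.25, 0.2, 0.15, 0.1]  # assumed link lengths (m)

def forward_kinematics(joint_angles):
    """End-effector (x, y) of a planar serial chain."""
    x = y = theta = 0.0
    for angle, length in zip(joint_angles, LINK_LENGTHS):
        theta += angle
        x += length * math.cos(theta)
        y += length * math.sin(theta)
    return x, y

def fitness(joint_angles, target):
    x, y = forward_kinematics(joint_angles)
    return -math.hypot(x - target[0], y - target[1])  # closer to the target is better

def solve_ik(target, population=200, generations=300, mutation=0.05):
    pop = [[random.uniform(-math.pi, math.pi) for _ in LINK_LENGTHS]
           for _ in range(population)]
    for _ in range(generations):
        pop.sort(key=lambda ind: fitness(ind, target), reverse=True)
        parents = pop[: population // 4]
        children = []
        while len(children) < population - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(LINK_LENGTHS))
            child = a[:cut] + b[cut:]                              # crossover
            child = [g + random.gauss(0.0, mutation) for g in child]  # mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda ind: fitness(ind, target))

# Example: find joint angles that place the hand near (0.8, 0.6) metres.
best = solve_ik((0.8, 0.6))
print(forward_kinematics(best))
```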
9. The remote underwater maintenance system according to claims 1 to 8, wherein the camera feed continuously scans the scene in 3D, creating accurate dimensions of all objects, which enables the operator to switch the manipulator to autonomous operations, for example tying two flanges together by inserting bolts into all holes of the flange, inserting nuts on the other side and then tightening all nuts to a given torque.
10. The remote underwater maintenance system according to claims 1 to 9, wherein the PRC solution requires the existence of an underwater network switch; this network switch (7) performs the two main functions below:
(1) it is used to synchronize the images taken by the two cameras at exactly the same time for image processing, through the PTP (Precision Time Protocol) protocol;
(2) it converts the Ethernet connectivity into a fiber-optic signal connected to the ROV fiber-optic cable, so as to transfer the taken images instantly to the vessel topside. The switch is not originally intended to go under water; accordingly it is enclosed in an aluminum housing/pod so that it can tolerate pressure down to 3000 m below the sea surface.
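A minimal sketch of the frame-pairing step enabled by the PTP synchronization in function (1) above, assuming each camera delivers frames stamped with PTP-disciplined hardware timestamps; the `Frame` structure and the 1 ms tolerance are assumptions for the example.

```python
# Illustrative sketch only: pairs left/right frames by their (PTP-disciplined)
# hardware timestamps and rejects pairs whose capture times differ too much.
from dataclasses import dataclass
from typing import Optional, Tuple

MAX_SKEW_S = 0.001  # assumed acceptable capture-time difference (1 ms)

@dataclass
class Frame:
    timestamp_s: float   # hardware timestamp, already synchronized via PTP
    pixels: bytes

def pair_frames(left: Frame, right: Frame) -> Optional[Tuple[Frame, Frame]]:
    """Return the stereo pair if the two captures are close enough in time."""
    if abs(left.timestamp_s - right.timestamp_s) <= MAX_SKEW_S:
        return left, right
    return None  # drop the pair; depth computed from mismatched frames would be wrong
```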
11. The remote underwater maintenance system according to claims 1 to 10, wherein the said system enables the WCROV to replace the presence of a human diver underwater and to perform all the required tasks with better quality thanks to the features below:
• 3D Vision: the WCROV manipulator is equipped with multiple underwater cameras and MCS Video Codec software to deliver video feeds through the human eyes to the human brain, which easily interprets the video feed and simulates a 3D vision environment in which the human can feel depth, distances and sizes through a virtual reality helmet.
• Moving the manipulators as if they were his own arms: when the human operator moves his arms holding the motion sensors in his hands, the result of the movement is sent in real time to the MCS Manipulator Movement Artificial Intelligence (MMAI) software, which acts as the human back brain and solves the movement of each of the 7 joints of the manipulator so that its finger position and direction match the human hand. The manipulator hand will be exactly where the brain wants it to be when the human hands are moved.
• Sensing through force and torque: the system is equipped with a force and torque sensor; the readings coming from the force and torque sensor enable the system to sense the actions required to perform each maintenance task.
• 3D measurements: thanks to the presence of the PRC module, the VRM solution is able to obtain accurate measurements of subsea assets from the generated 3D models.
12. The remote underwater maintenance system according to claims 1 to 11, wherein the Photo Realistic 3D Cloud (PRC) system provides the features below:
1. It produces a full set of as-found baseline drawings for subsea structures within millimetre accuracy and a precise, fully dimensioned integrated 3D model.
2. It eliminates the need for as-built verification visits whenever a new installation is needed, removing the necessity for frequent vessel visits to obtain physical measurements.
3. It is a faster, diver-less and more accurate approach than the conventional way of collecting as-built/as-found information, avoiding human error.
4. It provides a precise 3D reference for the position/orientation of all features, convenient for future construction/maintenance (i.e. riser installation, flange/clamp fabrication).
5. The database can be integrated within the company's Assets Integrity Data Management System.
6. It supports in-depth integrity assessment and lifetime extension.
7. The PRC equipment can be mounted on ROVs or carried by divers.
8. The PRC system of the present invention is the only way for a user in manual mode to view the full object being operated on; imagine the user is performing an operation on a pipeline's flange. Characterized in that through the helmet the operator only views the flange in full view, but through the PRC the operator can see the full pipeline (if it was surveyed before), the flange's dimensions, the bolt and nut dimensions, how many bolts are installed for which pipe diameter, etc.
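A minimal sketch of how a dimension might be read off a PRC-style point cloud, assuming the cloud is simply a list of XYZ points in metres; the nearest-point snapping and the toy flange points are illustrative, not the patented PRC software.

```python
# Illustrative sketch only: measures the distance between two picked points of a
# 3D point cloud, e.g. across a flange, to obtain an as-built dimension in metres.
import math

def nearest_point(cloud, picked):
    """Snap a roughly picked coordinate to the closest scanned point."""
    return min(cloud, key=lambda p: math.dist(p, picked))

def measure(cloud, picked_a, picked_b):
    a = nearest_point(cloud, picked_a)
    b = nearest_point(cloud, picked_b)
    return math.dist(a, b)  # dimension in the same unit as the cloud (metres)

# Example with a toy cloud: two opposite points on a 0.5 m wide flange face.
cloud = [(0.00, 0.0, 0.0), (0.50, 0.0, 0.0), (0.25, 0.1, 0.0)]
print(f"flange width ~ {measure(cloud, (0.01, 0.0, 0.0), (0.49, 0.0, 0.0)):.3f} m")
```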
13. A method of operating the virtual reality manipulator (VRM) of the present invention, comprising the steps below:
1. The user is required to wear a VR helmet on his head to see a live 3D video of the object(s) he wants to interact with, as well as of the robotic arm.
2. The user also has to hold both VR controllers, one in each hand; the VR controllers transfer the user's wrist movement or rotation to the mechanical arm, so any movement the user makes with his wrist is emitted and replicated by the mechanical arm.
3. After the user wears both the VR helmet and the VR controllers, the user can consider the robotic arm(s) as his own and begin to interact, moving and rotating his wrist to interact virtually with the objects; the robotic arm replicates his movements, thereby interacting with the objects in the real world.
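A minimal sketch of the per-frame operating loop implied by this method claim, assuming hypothetical helper functions for reading the controller wrist pose and commanding the arm; the joint-angle solve stands in for the MMAI step (see the genetic-algorithm sketch after claim 8).

```python
# Illustrative sketch only: the per-frame teleoperation loop of the method above.
# read_wrist_pose, solve_joint_angles and send_to_arm are assumed placeholders.
import time

def read_wrist_pose(controller_id: str):
    """Placeholder for the VR controller SDK: returns (position, orientation)."""
    return (0.0, 0.0, 0.0), (0.0, 0.0, 0.0, 1.0)

def solve_joint_angles(position, orientation):
    """Placeholder for the MMAI/IK step that computes the 7 joint angles."""
    return [0.0] * 7

def send_to_arm(arm_id: str, joint_angles) -> None:
    """Placeholder: would forward the joint commands over the ROV umbilical."""
    pass

def teleoperation_loop(rate_hz: float = 60.0) -> None:
    period = 1.0 / rate_hz
    while True:
        for controller_id, arm_id in (("left", "arm_left"), ("right", "arm_right")):
            position, orientation = read_wrist_pose(controller_id)
            send_to_arm(arm_id, solve_joint_angles(position, orientation))
        time.sleep(period)
```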
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EG2018110018 | 2018-11-08 | ||
EG2019050687 | 2019-05-02 | ||
PCT/EG2019/000024 WO2020094205A1 (en) | 2018-11-08 | 2019-11-03 | An enhanced reality underwater maintenance syestem by using a virtual reality manipulator (vrm) |
Publications (3)
Publication Number | Publication Date |
---|---|
GB202108167D0 GB202108167D0 (en) | 2021-07-21 |
GB2594810A true GB2594810A (en) | 2021-11-10 |
GB2594810B GB2594810B (en) | 2023-01-11 |
Family
ID=70611727
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB2108167.4A Active GB2594810B (en) | 2018-11-08 | 2019-11-03 | An enhanced reality underwater maintenance system by using a virtual reality manipulator (VRM) |
Country Status (2)
Country | Link |
---|---|
GB (1) | GB2594810B (en) |
WO (1) | WO2020094205A1 (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102422355B1 (en) * | 2021-01-07 | 2022-07-19 | 한국생산기술연구원 | An remote control system based on augmented reality for underater robot sterilization robot based on 3d environmental recognition |
CN112894820A (en) * | 2021-01-29 | 2021-06-04 | 清华大学深圳国际研究生院 | Flexible mechanical arm remote operation man-machine interaction device and system |
CN114131635B (en) * | 2021-12-08 | 2024-07-12 | 山东大学 | Multi-degree-of-freedom auxiliary grasping outer limb robot system integrating visual touch active sensing |
CN114927016A (en) * | 2022-03-31 | 2022-08-19 | 江苏集萃清联智控科技有限公司 | Seabed multitask simulation system, device and method |
CN116160435A (en) * | 2023-04-24 | 2023-05-26 | 海南坤联科技有限公司 | Somatosensory-control bionic mechanical arm mounted on submarine |
- 2019
- 2019-11-03 GB GB2108167.4A patent/GB2594810B/en active Active
- 2019-11-03 WO PCT/EG2019/000024 patent/WO2020094205A1/en active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6016385A (en) * | 1997-08-11 | 2000-01-18 | Fanu America Corp | Real time remotely controlled robot |
US20170106537A1 (en) * | 2014-03-03 | 2017-04-20 | University Of Washington | Haptic Virtual Fixture Tools |
US20180250086A1 (en) * | 2017-03-02 | 2018-09-06 | KindHeart, Inc. | Telerobotic surgery system using minimally invasive surgical tool with variable force scaling and feedback and relayed communications between remote surgeon and surgery station |
Non-Patent Citations (1)
Title |
---|
IASTREBOV, et al. "Vision enhancement using stereoscopic telepresence for remotely operated underwater robotic vehicles." Journal of Intelligent and Robotic Systems 52.1 (2008): 139-154, entire document [online] URL <https://s3.amazonaws.com/academia.edu.documents/46193062/s10846-008-3DVision_Enhan * |
Also Published As
Publication number | Publication date |
---|---|
WO2020094205A1 (en) | 2020-05-14 |
WO2020094205A4 (en) | 2020-09-24 |
GB2594810B (en) | 2023-01-11 |
GB202108167D0 (en) | 2021-07-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
GB2594810A (en) | An enhanced reality underwater maintenance system by using a virtual reality manipulator (VRM) | |
CN107968915B (en) | Real-time control system and method for underwater robot camera pan-tilt | |
Birk et al. | Dexterous underwater manipulation from onshore locations: Streamlining efficiencies for remotely operated underwater vehicles | |
Sanz et al. | TRIDENT An European project targeted to increase the autonomy levels for underwater intervention missions | |
CN112634318B (en) | Teleoperation system and method for underwater maintenance robot | |
Bruno et al. | Augmented reality visualization of scene depth for aiding ROV pilots in underwater manipulation | |
KR20140021354A (en) | Apparartus and method for generating an around view of a remotely operated vehicle | |
Di Lillo et al. | Underwater intervention with remote supervision via satellite communication: Developed control architecture and experimental results within the dexrov project | |
CN110682291B (en) | Robot teleoperation system based on VR and teleoperation method thereof | |
Di Lillo et al. | Advanced ROV autonomy for efficient remote control in the DexROV project | |
KR20160055609A (en) | Underwater IMR (Installation, Maintenance, and Repair) Task Management System and Its Method | |
CN110794710A (en) | Underwater robot simulation system | |
Bruno et al. | A ROV for supporting the planned maintenance in underwater archaeological sites | |
CN113093914B (en) | High-presence visual perception method and device based on VR | |
Wang et al. | Research and experiment of an underwater stereo vision system | |
Smith et al. | Computer vision control of an underwater manipulator | |
Sagara et al. | A Stereo Vision System for Underwater Vehicle-Manipulator Systems-Proposal of a Novel Concept Using Pan-Tilt-Slide Cameras-. | |
Transeth et al. | A robotic concept for remote maintenance operations: A robust 3D object detection and pose estimation method and a novel robot tool | |
Bian et al. | An autonomous underwater vehicle manipulator system for underwater target capturing | |
Kim et al. | Line Laser mounted Small Agent ROV based 3D Reconstruction Method for Precision Underwater Manipulation | |
Choi et al. | New concepts for smart ROV to increase efficiency and productivity | |
Lee et al. | System design of an ROV with manipulators and adaptive control of it | |
CN210616555U (en) | Six-degree-of-freedom ROV (remote operated vehicle) driving simulation system | |
Wang et al. | Manipulator Oriented Grasp Control Based on Image Recognition | |
Santos et al. | Autonomous tracking system of a moving target for underwater operations of work-class rovs |