Map Representation and Navigation Planning for Legged Climbing UGVs in 3D Environments
Figure 1. The overall framework of the proposed method. The input consists of RGB images and depth information captured by a depth camera. The RGB images are used to estimate the relative pose, while the depth images are converted into point clouds. The point clouds are first used to extract plane and boundary information, which are then used to construct the proposed MLEM. The numbers 1 and 2 indicate the labels of the different planes. Based on the MLEM, a universal hierarchical planning framework performs path planning and motion generation in challenging 3D environments. The outputs of the planning framework are the joint angle trajectories, and a position controller is used to track them.
Figure 2. The multi-layer elevation map construction pipeline. Based on the extracted plane and boundary information, the point cloud of each plane is used to construct the MLEM. Each plane and its corresponding elevation map are labeled, and the boundary information stores the connections between different planes.
Figure 3. The transformation process between two adjacent planes.
Figure 4. The cross-plane path planning and map unfolding process. After the MLEM is constructed and each plane is numbered, possible paths from the start plane to the target plane are searched in the constructed 3D space map. In order of priority, the planes involved in each candidate path are unfolded and a path search is conducted until a viable path is found.
Figure 5. Different unfolding results obtained when the same planes are unfolded in a different order.
Figure 6. The two forms of collision checking. For soft-collision checks, the validity of the grid cells surrounding the foothold is checked. A foot is considered supportive when the number of valid grid cells around the foothold exceeds a specified threshold. A state is deemed valid only when all six feet are supportive.
Figure 7. The sampling space [x, y, θ] represents the coordinates on the map and the rotation about the z-axis.
Figure 8. Boundary motion state. (a) The simplified model and a hexapod prototype. (b) The three independent basic motion patterns. The transition between two adjacent states is realized through a single basic motion.
Figure 9. The convex hull of the foot-valid workspace.
Figure 10. The region transition process between the plane region and the boundary region.
Figure 11. The transition between two states is treated as either a translation or a rotation around a certain point.
Figure 12. (a) The world coordinate system and the body coordinate system. (b) The change of the foot end within the body coordinate system when the body posture adjusts.
Figure 13. (a) The obstacle-free planned path and its simulation scene. (b) The soft-collision-check planned path and its simulation scene. Numbers 1–4 represent the sequence of movements.
Figure 14. In the simulation environment, the UGV first walks along the wall boundary and passes through a narrow gap. Numbers 1–16 represent the sequence of movements. It then executes a plane-to-plane transition to move from the ground to the wall. After performing obstacle-avoidance maneuvers on the wall, it returns to the ground. This process fully demonstrates the locomotion capabilities of a legged climbing UGV in a complex 3D environment.
Figure 15. (a) The red line represents the planned trajectory, while the blue line connecting the scatter points represents the measured trajectory. (b) The absolute motion error over time.
Figure 16. (1–4) The physical UGV moves along plane boundaries to traverse narrow gaps. (a–d) The UGV conducts the plane-to-plane transition. More experiment and simulation details can be found in the video provided.
Figure 17. The plane transition process under different angles.
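Two of the mechanisms outlined in the captions above are easiest to see in code: the MLEM keeps one labeled elevation layer per detected plane plus the boundary connections between planes (Figures 1 and 2), and the soft-collision check counts valid grid cells around each foothold against a threshold (Figure 6). The following is a minimal sketch of those ideas only; the class and function names, the neighborhood radius, and the threshold values are illustrative assumptions, not the authors' implementation.

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class PlaneLayer:
    """One labeled plane of the multi-layer elevation map (illustrative)."""
    plane_id: int          # plane label, e.g., 1, 2 as in Figure 1
    elevation: np.ndarray  # 2D grid of heights in the plane's own frame
    resolution: float      # grid cell size in meters

@dataclass
class Boundary:
    """Connection between two adjacent planes (illustrative)."""
    plane_a: int
    plane_b: int
    dihedral_angle: float  # angle between the two connected planes

@dataclass
class MultiLayerElevationMap:
    layers: dict = field(default_factory=dict)      # plane_id -> PlaneLayer
    boundaries: list = field(default_factory=list)  # list of Boundary

def foothold_supportive(layer, cell, radius=1, min_valid=6):
    """Soft-collision check in the spirit of Figure 6: a foothold is
    supportive when enough cells in its neighborhood hold valid (finite)
    elevation values. 'radius' and 'min_valid' are assumed values."""
    r0, c0 = cell
    rows, cols = layer.elevation.shape
    valid = 0
    for dr in range(-radius, radius + 1):
        for dc in range(-radius, radius + 1):
            r, c = r0 + dr, c0 + dc
            if 0 <= r < rows and 0 <= c < cols and np.isfinite(layer.elevation[r, c]):
                valid += 1
    return valid >= min_valid

def state_valid(layer, foothold_cells):
    """A body state is deemed valid only when all six feet are supportive."""
    return len(foothold_cells) == 6 and all(
        foothold_supportive(layer, cell) for cell in foothold_cells)
```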
Abstract
1. Introduction
- A novel map representation suitable for LC-UGVs, termed the multi-layer elevation map (MLEM), is proposed.
- We introduce a universal hierarchical planning framework for LC-UGVs in 3D environments. This architecture integrates wall transitions and single-plane movements into a unified path planning method and is applicable to various configurations of LC-UGVs. With this framework, LC-UGVs can not only traverse obstacles on the ground but also perform wall transitions and even move along gaps between planes (an illustrative sketch of the hierarchy follows this list).
- To validate the effectiveness of the proposed framework, extensive simulations alongside physical tests are performed.
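As a rough illustration of how such a hierarchy could be wired together (global plane-sequence search, local path planning on the unfolded map, then per-step motion generation, following Sections 3.3–3.5 in the outline below), consider the sketch that follows. All names, signatures, and the division of responsibilities are assumptions made for illustration rather than the authors' code.

```python
def plan_navigation(mlem, start, goal,
                    search_plane_sequences, unfold_planes,
                    plan_local_path, generate_single_step):
    """Illustrative top-level flow of a hierarchical LC-UGV planner.
    The four callables are hypothetical hooks for the global search,
    map unfolding, local planning, and motion generation stages."""
    # Global layer: enumerate candidate plane sequences from the start
    # plane to the goal plane, ordered by priority.
    for plane_sequence in search_plane_sequences(mlem, start, goal):
        # Unfold the planes of this sequence into a common 2D elevation map.
        unfolded = unfold_planes(mlem, plane_sequence)
        # Local layer: plan a collision-checked path across plane regions,
        # boundary regions, and the key states between them.
        path = plan_local_path(unfolded, start, goal)
        if path is None:
            continue  # try the next candidate plane sequence
        # Motion layer: turn each consecutive pair of states into a
        # single-step joint-angle trajectory.
        return [generate_single_step(a, b) for a, b in zip(path, path[1:])]
    return None  # no viable plane sequence found
```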
2. Related Work
2.1. Map Representation for Navigation
2.2. Navigation Planning for Legged Climbing UGVs
3. Method
3.1. Overall Framework
3.2. Multi-Layer Elevation Map
Algorithm 1: Plane segmentation and boundary detection.
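Algorithm 1 covers plane segmentation and boundary detection from the input point cloud. As a hedged, minimal example of the plane-segmentation step only (not the paper's Algorithm 1), iterative RANSAC plane fitting in the style of Fischler and Bolles can be sketched with Open3D; the thresholds, stopping criterion, and the Open3D dependency below are assumptions for illustration.

```python
import open3d as o3d

def segment_planes(pcd, distance_threshold=0.02, min_inliers=500, max_planes=10):
    """Peel planes off a point cloud one at a time and label them 1, 2, ...
    (illustrative sketch; parameter values are assumed)."""
    planes = []
    remaining = pcd
    for plane_id in range(1, max_planes + 1):
        if len(remaining.points) < min_inliers:
            break
        # Fit one plane model to the remaining points with RANSAC.
        model, inliers = remaining.segment_plane(
            distance_threshold=distance_threshold, ransac_n=3, num_iterations=1000)
        if len(inliers) < min_inliers:
            break
        planes.append((plane_id, model, remaining.select_by_index(inliers)))
        # Remove the inlier points and keep searching for the next plane.
        remaining = remaining.select_by_index(inliers, invert=True)
    return planes

# Boundary detection (intersection lines between adjacent fitted planes)
# would then follow from the plane models; it is omitted in this sketch.
```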
3.3. Global Planner and Map Unfold
3.4. Local Path Planner
3.4.1. Map Pre-Processing
3.4.2. Path Planning in Plane Motion Region
3.4.3. Path Planning in Boundary Motion Region
3.4.4. Inter-Regional Key-State
3.5. Motion Generator
3.5.1. Plane Motion Single-Step Generation
3.5.2. Boundary Motion Single-Step Generation
4. Experiment Results
4.1. Hexapod Climbing UGV Platform
4.2. Map Representation Validation
4.3. Simulation
4.4. Physical Experiment
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Abbreviations
LC-UGVs | Legged Climbing Unmanned Ground Vehicles
MLEM | Multi-Layer Elevation Map
SLAM | Simultaneous Localization and Mapping
ESDF | Euclidean Signed Distance Field
| Input Size/Resolution | Octomap Time Cost (s) | Octomap Memory Footprint (MB) | 3D Occupancy Map Time Cost (s) | 3D Occupancy Map Memory Footprint (MB) | MLEM Time Cost (s) | MLEM Memory Footprint (MB) |
|---|---|---|---|---|---|---|
| 3 MB/0.025 m | 0.025 | 1.29 | 0.015 | 13 | 0.041 | 0.232 |
| 3 MB/0.05 m | 0.012 | 0.368 | 0.004 | 1.64 | 0.019 | 0.058 |
| 60 MB/0.025 m | 0.763 | 13.48 | 0.493 | 394 | 1.72 | 2.89 |
| 60 MB/0.05 m | 0.32 | 3.15 | 0.146 | 49 | 1.35 | 0.732 |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Xiang, A.; Gong, C.; Fan, L. Map Representation and Navigation Planning for Legged Climbing UGVs in 3D Environments. Drones 2024, 8, 768. https://doi.org/10.3390/drones8120768