DCP-SLAM: Distributed Collaborative Partial Swarm SLAM for Efficient Navigation of Autonomous Robots
Figure 1. Illustration of the DCP-SLAM technique. Unknown obstacles are shown in grey; once an obstacle (or part of it) is detected by a robot's onboard sensors, it is shown in red. (a) Initial setup: the launchpad from which the robots are launched and the unknown obstacles. (b) Robot 1 detects an obstacle while navigating towards the goal and broadcasts the information to the following robots. (c) Robot 1 partially detects the second obstacle; Robot 2 navigates towards the waypoint dropped by Robot 1. (d) Robot 1 finds an unobstructed path to the goal; Robot 2 navigates around the other side of the obstacle while broadcasting the information. (e) Robot 2 finds an unobstructed path to the goal; the followers opt for the optimal path.
Figure 2. Shortest path planning with obstacle avoidance.
Figure 3. Immediate waypoint selection.
Figure 4. Concept of fuzzification.
Figure 5. Simulation snapshots showing the operation of the IWp-OA algorithm. (a) Robot 1 avoids the first detected obstacle from its bottom edge. (b) Robot 2 circumvents the obstacle from above to discover an alternate route. (c) Robots 1 and 2 have partially discovered obstacle 5. (d) Robot 3 discovers obstacle 2 and, using fuzzification, merges it with the already discovered obstacle 3. (e) Robot 4 uses the broadcast information to choose a trajectory that bypasses obstacle 3 from below. (f) Robot 5 uses the information provided by all other robots and chooses the optimal trajectory.
Figure 6. Simulation results: the different experimental scenarios. (a–j) Experimental scenarios 1–10.
Figure 7. Simulation results for the respective experiments, showing the total distance traveled by each robot and the time it took to reach the goal. (a–j) Experimental scenarios 1–10.
Figure 8. Distance (minimum, maximum, and average) traveled by the swarm in all experimental scenarios.
Figure 9. Comparative results of the distance traveled by the swarm as a whole.
Abstract
1. Introduction
2. Related Work
Collaborative Sensing for Map Building
3. Proposed Approach
Algorithm 1: Navigation
3.1. Collision Avoidance and Map Building
3.2. Merge Objects and Update Map
Algorithm 2: Merge Objects and Update Map
3.3. Shortest Path with Obstacle Avoidance
Algorithm 3: Shortest Path with Obstacle Avoidance
1. The robot draws a straight line from its current position R to its goal G and determines whether it intercepts any edge of the known obstacles. In this case, the line is intercepted by the edge of obstacle C at point I1.
2. Next, it chooses two waypoints that are a minimum safe distance away from vertices C1 and C2; let us call these W1 and W2. The possible paths are R–W1 followed by W1–G, and R–W2 followed by W2–G.
3. The robot iterates steps 1 and 2 for all resulting path segments, always progressing from left to right, creating a new waypoint for each edge that intercepts any path.
4. The result is a directed acyclic graph (DAG) containing multiple obstacle-free paths from the robot's current position to the goal.
5. The robot selects the first waypoint on the shortest obstacle-free path as its temporary goal and starts moving towards it (a code sketch of this procedure follows the list).
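The steps above can be approximated in a few dozen lines. The following Python sketch is illustrative rather than the authors' implementation: it assumes axis-aligned rectangular obstacles taken from the shared map, offsets each obstacle corner by a safety margin, and runs a plain Dijkstra search over the resulting waypoint graph (a visibility-graph variant of the described DAG construction). Names such as `plan_first_waypoint`, `SAFE`, and the sampled `segment_hits_rect` test are assumptions made for this example.

```python
# Hypothetical sketch of the waypoint-graph shortest-path idea (Algorithm 3).
# Obstacles are axis-aligned rectangles (xmin, ymin, xmax, ymax) from the
# shared map; the structure is illustrative, not the paper's implementation.
import heapq
import math

SAFE = 2.0  # assumed minimum safe distance from an obstacle vertex (m)

def segment_hits_rect(p, q, rect):
    """Conservative sampled test: does segment p->q pass through the rectangle?"""
    xmin, ymin, xmax, ymax = rect
    for i in range(101):
        t = i / 100.0
        x = p[0] + t * (q[0] - p[0])
        y = p[1] + t * (q[1] - p[1])
        if xmin < x < xmax and ymin < y < ymax:
            return True
    return False

def collision_free(p, q, rects):
    return not any(segment_hits_rect(p, q, r) for r in rects)

def corner_waypoints(rects):
    """One candidate waypoint per obstacle corner, pushed SAFE metres outwards."""
    pts = []
    for xmin, ymin, xmax, ymax in rects:
        pts += [(xmin - SAFE, ymin - SAFE), (xmin - SAFE, ymax + SAFE),
                (xmax + SAFE, ymin - SAFE), (xmax + SAFE, ymax + SAFE)]
    return pts

def plan_first_waypoint(robot, goal, rects):
    """Build the waypoint graph, run Dijkstra, and return the first waypoint
    of the shortest obstacle-free path (the robot's temporary goal).
    If the goal is unreachable in this graph, the goal itself is returned."""
    nodes = [robot, goal] + corner_waypoints(rects)
    dist = {0: 0.0}
    prev = {}
    pq = [(0.0, 0)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == 1:                       # goal node finalized
            break
        if d > dist.get(u, math.inf):
            continue
        for v in range(len(nodes)):
            if v != u and collision_free(nodes[u], nodes[v], rects):
                nd = d + math.dist(nodes[u], nodes[v])
                if nd < dist.get(v, math.inf):
                    dist[v] = nd
                    prev[v] = u
                    heapq.heappush(pq, (nd, v))
    # Walk back from the goal to recover the first hop after the robot.
    node = 1
    while prev.get(node, 0) != 0:
        node = prev[node]
    return nodes[node]
```

As a usage example under the same assumptions, `plan_first_waypoint((0, 0), (100, 0), [(40, -10, 60, 10)])` finds the direct line blocked by the rectangle and returns an offset corner of it, e.g. (38.0, -12.0), as the temporary goal.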
1. The robot draws a straight line from its current position R to its goal G and determines whether it intercepts any edge of the known obstacles. In this case, the line is intercepted by the edge of obstacle C at point I1.
2. Since the top-left corner C1 is nearer to I1 than the bottom-left corner C2, the robot decides to circumvent the obstacle from C1. It chooses a waypoint that is a minimum safe distance away from C1, let us call it W1, and sets this waypoint as its temporary goal.
3. It repeats steps 1 and 2 with W1 as the goal; this time, the line is intercepted by the edge of obstacle A at point I2. The robot avoids obstacle A from its bottom-left corner A2 and chooses a waypoint WP2 that is a minimum safe distance away from vertex A2. The robot now sets its temporary goal to WP2.
4. The robot repeats step 1 by drawing a straight line from its current position R to WP2. Since this line is not intercepted by any known obstacle, the algorithm finishes, and the robot starts moving towards its temporary goal WP2 (see the sketch below).
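The sketch below is a hypothetical rendering of this immediate-waypoint procedure, not the paper's code: it repeatedly checks the straight line from the robot to its current temporary goal and, whenever a known obstacle edge blocks it, detours around the nearer corner of that edge at the minimum safe distance. The polygonal obstacle representation, the `SAFE` constant, and the centroid-based corner offset are assumptions; a full implementation would also pick the interception closest to the robot rather than the first one found.

```python
# Hypothetical sketch of immediate waypoint selection (illustrative only).
import math

SAFE = 2.0  # assumed minimum safe distance from an obstacle corner (m)

def seg_intersect(p1, p2, p3, p4):
    """Return the intersection point of segments p1-p2 and p3-p4, or None."""
    d1 = (p2[0] - p1[0], p2[1] - p1[1])
    d2 = (p4[0] - p3[0], p4[1] - p3[1])
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        return None  # parallel (or collinear) segments: ignored in this sketch
    t = ((p3[0] - p1[0]) * d2[1] - (p3[1] - p1[1]) * d2[0]) / denom
    u = ((p3[0] - p1[0]) * d1[1] - (p3[1] - p1[1]) * d1[0]) / denom
    if 0 <= t <= 1 and 0 <= u <= 1:
        return (p1[0] + t * d1[0], p1[1] + t * d1[1])
    return None

def offset_corner(corner, centroid):
    """Push a corner SAFE metres further away from the obstacle centroid."""
    dx, dy = corner[0] - centroid[0], corner[1] - centroid[1]
    norm = math.hypot(dx, dy) or 1.0
    return (corner[0] + SAFE * dx / norm, corner[1] + SAFE * dy / norm)

def immediate_waypoint(robot, goal, polygons, max_iters=50):
    """Return the robot's next temporary goal; `polygons` are known obstacles
    given as lists of corner points (from the shared map)."""
    temp_goal = goal
    for _ in range(max_iters):
        hit = None
        for poly in polygons:
            centroid = (sum(x for x, _ in poly) / len(poly),
                        sum(y for _, y in poly) / len(poly))
            for a, b in zip(poly, poly[1:] + poly[:1]):
                point = seg_intersect(robot, temp_goal, a, b)
                if point is not None:
                    hit = (point, a, b, centroid)
                    break
            if hit:
                break
        if hit is None:
            return temp_goal        # line of sight is clear: move towards it
        point, a, b, centroid = hit
        # Detour around whichever corner of the blocking edge is nearer to
        # the interception point, offset by the safe distance.
        corner = a if math.dist(a, point) <= math.dist(b, point) else b
        temp_goal = offset_corner(corner, centroid)
    return temp_goal
```

Compared with the graph search sketched for Algorithm 3, this is a greedy, purely local rule: it only commits to the next waypoint, which keeps the per-step cost low while the shared map is still incomplete.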
4. Simulation Results
- The robots pass through a vast passage area between the launch zone and the delivery zone.
- Multiple obstacles are randomly placed in the passage area.
- The communication channel between the robots is considered ideal and lossless.
- The robots obtain their position vectors using onboard localization methods.
- The range of the LiDAR sensors is 100 m (these assumptions are summarized in the configuration sketch below).
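For concreteness, these assumptions can be captured in a small configuration object. In the sketch below, `lidar_range_m`, the lossless-channel flag, and the swarm size of ten robots come from the stated setup and the result tables; the area dimensions, obstacle sizes, and obstacle count are placeholders, not values reported in the paper.

```python
# Minimal scenario configuration sketch (values marked as placeholders are
# assumptions for illustration, not parameters taken from the paper).
from dataclasses import dataclass
import random

@dataclass
class ScenarioConfig:
    num_robots: int = 10              # swarm size used in the experiments
    lidar_range_m: float = 100.0      # onboard LiDAR range (stated assumption)
    lossless_comms: bool = True       # ideal, lossless channel between robots
    num_obstacles: int = 5            # placeholder: obstacles per scenario
    area_m: tuple = (500.0, 500.0)    # placeholder passage-area size
    seed: int = 0                     # seed for random obstacle placement

def random_obstacles(cfg: ScenarioConfig):
    """Place axis-aligned rectangular obstacles uniformly in the passage area."""
    rng = random.Random(cfg.seed)
    rects = []
    for _ in range(cfg.num_obstacles):
        w, h = rng.uniform(20, 80), rng.uniform(20, 80)
        x = rng.uniform(0, cfg.area_m[0] - w)
        y = rng.uniform(0, cfg.area_m[1] - h)
        rects.append((x, y, x + w, y + h))
    return rects
```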
5. Conclusions and Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
Robot | Exp 1 Raw | Exp 1 Obj | Exp 2 Raw | Exp 2 Obj | Exp 3 Raw | Exp 3 Obj | Exp 4 Raw | Exp 4 Obj | Exp 5 Raw | Exp 5 Obj | Exp 6 Raw | Exp 6 Obj | Exp 7 Raw | Exp 7 Obj | Exp 8 Raw | Exp 8 Obj | Exp 9 Raw | Exp 9 Obj | Exp 10 Raw | Exp 10 Obj
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
1 | 29,340 | 105 | 32,353 | 154 | 29,907 | 245 | 29,187 | 185 | 22,097 | 101 | 30,606 | 167 | 20,900 | 140 | 26,582 | 160 | 16,420 | 131 | 27,225 | 140 |
2 | 27,393 | 89 | 30,766 | 105 | 35,216 | 80 | 35,197 | 107 | 20,523 | 35 | 21,868 | 42 | 15,338 | 0 | 19,078 | 11 | 17,867 | 32 | 18,770 | 10 |
3 | 12,182 | 0 | 23,914 | 38 | 18,781 | 0 | 15,495 | 0 | 11,987 | 0 | 22,112 | 1 | 26,824 | 82 | 18,704 | 1 | 14,270 | 56 | 18,717 | 1 |
4 | 12,182 | 0 | 22,402 | 0 | 18,781 | 0 | 15,854 | 5 | 11,987 | 0 | 22,112 | 0 | 22,022 | 10 | 18,700 | 0 | 14,271 | 0 | 18,717 | 0 |
5 | 12,182 | 0 | 19,030 | 0 | 18,781 | 0 | 15,854 | 0 | 11,987 | 0 | 22,112 | 0 | 23,951 | 1 | 18,700 | 0 | 10,052 | 0 | 18,717 | 0 |
6 | 12,182 | 0 | 19,030 | 0 | 18,781 | 0 | 11,092 | 0 | 11,987 | 0 | 22,112 | 0 | 16,933 | 56 | 18,700 | 0 | 10,052 | 0 | 18,717 | 0 |
7 | 12,182 | 0 | 19,030 | 0 | 18,781 | 0 | 11,092 | 0 | 11,987 | 0 | 22,112 | 0 | 16,933 | 0 | 18,700 | 0 | 10,052 | 0 | 18,717 | 0 |
8 | 12,182 | 0 | 19,030 | 0 | 18,781 | 0 | 11,092 | 0 | 11,987 | 0 | 22,112 | 0 | 15,726 | 2 | 18,700 | 0 | 10,052 | 0 | 18,717 | 0 |
9 | 12,182 | 0 | 19,030 | 0 | 18,781 | 0 | 11,092 | 0 | 11,987 | 0 | 22,112 | 0 | 15,726 | 0 | 18,700 | 0 | 10,052 | 0 | 18,717 | 0 |
10 | 12,182 | 0 | 19,030 | 0 | 18,781 | 0 | 11,092 | 0 | 11,987 | 0 | 22,112 | 0 | 15,726 | 0 | 18,700 | 0 | 10,052 | 0 | 18,717 | 0 |
Robot | Exp 1 Raw | Exp 1 Obj | Exp 2 Raw | Exp 2 Obj | Exp 3 Raw | Exp 3 Obj | Exp 4 Raw | Exp 4 Obj | Exp 5 Raw | Exp 5 Obj | Exp 6 Raw | Exp 6 Obj | Exp 7 Raw | Exp 7 Obj | Exp 8 Raw | Exp 8 Obj | Exp 9 Raw | Exp 9 Obj | Exp 10 Raw | Exp 10 Obj
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
1 | 82.15 | 5.46 | 90.59 | 8.01 | 83.74 | 12.74 | 81.72 | 9.62 | 61.87 | 5.25 | 85.69 | 8.68 | 58.52 | 7.28 | 74.43 | 8.32 | 45.98 | 6.81 | 76.23 | 7.28 |
2 | 76.70 | 4.63 | 86.15 | 5.46 | 98.61 | 4.16 | 98.55 | 5.56 | 57.46 | 1.82 | 61.23 | 2.18 | 42.95 | 0 | 53.42 | 0.57 | 50.03 | 1.66 | 52.56 | 0.52 |
3 | 34.11 | 0 | 66.96 | 1.98 | 52.59 | 0 | 43.39 | 0 | 33.56 | 0 | 61.91 | 0.05 | 75.11 | 4.26 | 52.37 | 0.05 | 39.96 | 2.91 | 52.41 | 0.05 |
4 | 34.11 | 0 | 62.73 | 0 | 52.59 | 0 | 44.39 | 0.26 | 33.56 | 0 | 61.91 | 0 | 61.66 | 0.52 | 52.36 | 0 | 39.96 | 0 | 52.41 | 0 |
5 | 34.11 | 0 | 53.28 | 0 | 52.59 | 0 | 44.39 | 0 | 33.56 | 0 | 61.91 | 0 | 67.06 | 0.05 | 52.36 | 0 | 28.15 | 0 | 52.41 | 0 |
6 | 34.11 | 0 | 53.28 | 0 | 52.59 | 0 | 31.06 | 0 | 33.56 | 0 | 61.91 | 0 | 47.41 | 2.91 | 52.36 | 0 | 28.15 | 0 | 52.41 | 0 |
7 | 34.11 | 0 | 53.28 | 0 | 52.59 | 0 | 31.06 | 0 | 33.56 | 0 | 61.91 | 0 | 47.41 | 0 | 52.36 | 0 | 28.15 | 0 | 52.41 | 0 |
8 | 34.11 | 0 | 53.28 | 0 | 52.59 | 0 | 31.06 | 0 | 33.56 | 0 | 61.91 | 0 | 44.03 | 0.10 | 52.36 | 0 | 28.15 | 0 | 52.41 | 0 |
9 | 34.11 | 0 | 53.28 | 0 | 52.59 | 0 | 31.06 | 0 | 33.56 | 0 | 61.91 | 0 | 44.03 | 0 | 52.36 | 0 | 28.15 | 0 | 52.41 | 0 |
10 | 34.11 | 0 | 53.28 | 0 | 52.59 | 0 | 31.06 | 0 | 33.56 | 0 | 61.91 | 0 | 44.03 | 0 | 52.36 | 0 | 28.15 | 0 | 52.41 | 0 |
Robot | Exp 1 Raw | Exp 1 Obj | Exp 2 Raw | Exp 2 Obj | Exp 3 Raw | Exp 3 Obj | Exp 4 Raw | Exp 4 Obj | Exp 5 Raw | Exp 5 Obj | Exp 6 Raw | Exp 6 Obj | Exp 7 Raw | Exp 7 Obj | Exp 8 Raw | Exp 8 Obj | Exp 9 Raw | Exp 9 Obj | Exp 10 Raw | Exp 10 Obj
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
1 | 82.15 | 0.55 | 90.59 | 0.80 | 83.74 | 1.27 | 81.72 | 0.96 | 61.87 | 0.53 | 85.69 | 0.87 | 58.52 | 0.73 | 74.43 | 0.83 | 45.98 | 0.68 | 76.23 | 0.73 |
2 | 76.70 | 0.46 | 86.14 | 0.55 | 98.61 | 0.42 | 98.55 | 0.56 | 57.46 | 0.18 | 61.23 | 0.22 | 42.95 | 0 | 53.42 | 0.06 | 50.03 | 0.17 | 52.56 | 0.05 |
3 | 34.11 | 0 | 66.96 | 0.19 | 52.59 | 0 | 43.39 | 0 | 33.56 | 0 | 61.91 | 0.01 | 75.11 | 0.43 | 52.37 | 0.01 | 39.96 | 0.29 | 52.41 | 0.01 |
4 | 34.11 | 0 | 62.73 | 0 | 52.59 | 0 | 44.39 | 0.03 | 33.56 | 0 | 61.91 | 0 | 61.66 | 0.05 | 52.36 | 0 | 39.96 | 0 | 52.41 | 0 |
5 | 34.11 | 0 | 53.28 | 0 | 52.59 | 0 | 44.39 | 0 | 33.56 | 0 | 61.91 | 0 | 67.06 | 0.01 | 52.36 | 0 | 28.15 | 0 | 52.41 | 0 |
6 | 34.11 | 0 | 53.28 | 0 | 52.59 | 0 | 31.06 | 0 | 33.56 | 0 | 61.91 | 0 | 47.41 | 0.29 | 52.36 | 0 | 28.15 | 0 | 52.41 | 0 |
7 | 34.11 | 0 | 53.28 | 0 | 52.59 | 0 | 31.06 | 0 | 33.56 | 0 | 61.91 | 0 | 47.41 | 0 | 52.36 | 0 | 28.15 | 0 | 52.41 | 0 |
8 | 34.11 | 0 | 53.28 | 0 | 52.59 | 0 | 31.06 | 0 | 33.56 | 0 | 61.91 | 0 | 44.03 | 0.01 | 52.36 | 0 | 28.15 | 0 | 52.41 | 0 |
9 | 34.11 | 0 | 53.28 | 0 | 52.59 | 0 | 31.06 | 0 | 33.56 | 0 | 61.91 | 0 | 44.03 | 0 | 52.36 | 0 | 28.15 | 0 | 52.41 | 0 |
10 | 34.11 | 0 | 53.28 | 0 | 52.59 | 0 | 31.06 | 0 | 33.56 | 0 | 61.91 | 0 | 44.03 | 0 | 52.36 | 0 | 28.15 | 0 | 52.41 | 0 |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Mahboob, H.; Yasin, J.N.; Jokinen, S.; Haghbayan, M.-H.; Plosila, J.; Yasin, M.M. DCP-SLAM: Distributed Collaborative Partial Swarm SLAM for Efficient Navigation of Autonomous Robots. Sensors 2023, 23, 1025. https://doi.org/10.3390/s23021025