Leader–Follower Approach for Non-Holonomic Mobile Robots Based on Extended Kalman Filter Sensor Data Fusion and Extended On-Board Camera Perception Controlled with Behavior Tree
Figure 1. (a) Leader robot. (b) Follower robot.
Figure 2. Behavior tree of leader robot.
Figure 3. Behavior tree of follower robot.
Figure 4. Top view of the leader and follower robots.
Figure 5. Representation of the axis between the ArUco marker and the camera.
Figure 6. Flow chart of pitch angle θ_R modification of ArUco marker.
Figure 7. Representation of ArUco marker corner points.
Figure 8. Pose representation of the leader and follower robots.
Figure 9. EKF of leader robot.
Figure 10. (a) EKF 1 of follower robot. (b) EKF 2 of follower robot.
Figure 11. Experiment result 1 (a). (a) (x,y) plot of all robots. (b) Orientation of all the robots. (c) (x,y) plot of all robots with EKF. (d) Error in x-axis for leader robot. (e) Error in y-axis for leader robot. (f) Error of leader’s orientation. (g) Linear velocity of leader robot. (h) Angular velocity of leader robot. (i) Error in x-axis for follower robot. (j) Error in y-axis for follower robot.
Figure 12. Experiment result 1 (b). (a) Error of follower’s orientation. (b) Linear velocity of follower robot. (c) Angular velocity of follower robot. (d) Marker detection information.
Figure 13. Experiment result 2 (a). (a) (x,y) plot of all robots. (b) (x,y) plot of leader robot. (c) Error in x-axis for follower robot. (d) Error in y-axis for follower robot. (e) Linear velocity of follower robot. (f) Angular velocity of follower robot. (g) Landmark x-distance. (h) Landmark y-distance. (i) Landmark id information. (j) Marker x-distance.
Figure 14. Experiment result 2 (b). (a) Marker y-distance. (b) Marker detection information.
Abstract
1. Introduction
- (a) A decentralized vision-based marker system is presented in [13]. There, each robot carries a truncated regular octagon and a fixed camera, whereas in this paper the camera is movable and its motion is independent of the mobile platform.
- (b)
- (c) In [15], the ArUco markers are placed on top of the robots and a camera mounted on a stand observes them from above. The drawback of that method is that the overhead camera must cover the entire environment, which limits the working area to the region covered by the measurement system. The algorithm presented in this paper removes this limitation.
- (d) The paper [16] presents a hitchhiking approach for leader–follower robots. The authors use a QR marker, and mapping and computation are carried out by the driver robot; however, that work uses a fixed camera and a control algorithm completely different from the one presented in this paper.
2. Behavior Tree
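The robots' high-level logic is coordinated with behavior trees (the leader's and follower's trees are shown in Figures 2 and 3); the paper references the BehaviorTree.CPP library [24]. As an illustration only (not the authors' implementation), the two core composite nodes, Sequence and Fallback, can be sketched in a few lines of Python; the follower logic wired up at the end is a hypothetical example:

```python
# Minimal behavior-tree core: a Sequence ticks children until one fails,
# a Fallback ticks children until one succeeds. Illustrative sketch only.
SUCCESS, FAILURE, RUNNING = "SUCCESS", "FAILURE", "RUNNING"

class Action:
    """Leaf node wrapping a callable that returns a status string."""
    def __init__(self, fn):
        self.fn = fn
    def tick(self):
        return self.fn()

class Sequence:
    """Succeeds only if every child succeeds, in order."""
    def __init__(self, children):
        self.children = children
    def tick(self):
        for child in self.children:
            status = child.tick()
            if status != SUCCESS:
                return status  # FAILURE or RUNNING stops the sequence
        return SUCCESS

class Fallback:
    """Succeeds as soon as any child succeeds (a.k.a. Selector)."""
    def __init__(self, children):
        self.children = children
    def tick(self):
        for child in self.children:
            status = child.tick()
            if status != FAILURE:
                return status  # SUCCESS or RUNNING stops the fallback
        return FAILURE

# Hypothetical follower logic: if the marker is visible, follow the leader;
# otherwise search for the marker.
marker_visible = Action(lambda: SUCCESS)
follow_leader = Action(lambda: RUNNING)
search_marker = Action(lambda: RUNNING)
root = Fallback([Sequence([marker_visible, follow_leader]), search_marker])
```

Ticking `root` runs the follow branch while the marker is visible and falls back to searching otherwise; BehaviorTree.CPP provides the same tick semantics with richer node types.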
3. ArUco Marker
3.1. ArUco Marker Placement on Leader Robot
3.2. Artificial Landmarks
3.3. Pose Estimation from ArUco Markers
3.3.1. Case 1—When One Marker Detected
3.3.2. Case 2—When Two Markers Detected
3.4. Pose Calculation of Leader Robot
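Sections 3.3 and 3.4 recover the leader's pose from the detected marker's corner points (Figure 7), and the reference list cites Braden's surveyor's (shoelace) area formula [28]. One plausible use, sketched below under that assumption, is computing the projected area of the marker quadrilateral from its four image corners (the function name and its exact role in the paper are assumptions, not taken from the source):

```python
def shoelace_area(corners):
    """Surveyor's (shoelace) formula: area of a simple polygon given
    its vertices in order; the absolute value makes it orientation-free."""
    n = len(corners)
    twice_area = 0.0
    for i in range(n):
        x1, y1 = corners[i]
        x2, y2 = corners[(i + 1) % n]  # wrap around to close the polygon
        twice_area += x1 * y2 - x2 * y1
    return abs(twice_area) / 2.0

# Example: four corner points of a detected marker in pixel coordinates.
marker_corners = [(100.0, 100.0), (180.0, 102.0), (178.0, 182.0), (98.0, 180.0)]
area_px = shoelace_area(marker_corners)
```

The projected area shrinks with distance and with oblique viewing angles, which makes it a cheap cue alongside the full pose estimate obtained from the marker.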
4. Extended Kalman Filter
4.1. Wheel Encoder Data of the EKF
4.2. IMU Data to the EKF
4.3. Landmark Data to the EKF
4.4. Pose Prediction of Leader Robot
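Section 4 fuses wheel-encoder, IMU, and landmark data in extended Kalman filters over the unicycle pose (x, y, θ) (Figures 9 and 10). A minimal predict/update sketch with NumPy is given below, assuming a velocity motion model and a landmark-derived position measurement; the noise matrices and numbers are illustrative placeholders, not the paper's tuning:

```python
import numpy as np

def ekf_predict(x, P, v, w, dt, Q):
    """Propagate pose [x, y, theta] through the unicycle motion model."""
    px, py, th = x
    x_pred = np.array([px + v * dt * np.cos(th),
                       py + v * dt * np.sin(th),
                       th + w * dt])
    # Jacobian of the motion model with respect to the state
    F = np.array([[1.0, 0.0, -v * dt * np.sin(th)],
                  [0.0, 1.0,  v * dt * np.cos(th)],
                  [0.0, 0.0,  1.0]])
    return x_pred, F @ P @ F.T + Q

def ekf_update(x, P, z, R):
    """Correct with a position fix z = [x, y] (e.g. derived from landmarks)."""
    H = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])   # linear measurement: observe (x, y)
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    return x + K @ y, (np.eye(3) - K @ H) @ P

# One predict/update cycle with illustrative noise levels.
x = np.zeros(3); P = np.eye(3) * 0.1
Q = np.eye(3) * 1e-3; R = np.eye(2) * 1e-2
x, P = ekf_predict(x, P, v=0.2, w=0.0, dt=0.1, Q=Q)
x, P = ekf_update(x, P, z=np.array([0.02, 0.0]), R=R)
```

In the paper's setup the prediction step would consume encoder/IMU velocities and the update step the landmark observations; only the measurement model H and noise matrices change between the leader's and follower's filters.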
5. Control Algorithm
5.1. Control for Rotating Platform
5.2. Control Leader and Follower Robots
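For the robot-level tracking in Section 5, a classic nonlinear tracking law for nonholonomic platforms (in the spirit of the cited literature [30], though not necessarily the authors' exact controller) maps the pose error expressed in the robot frame to velocity commands; gains below are illustrative:

```python
import math

def tracking_control(e_x, e_y, e_th, v_ref, w_ref, k_x=1.0, k_y=4.0, k_th=2.0):
    """Kanayama-style tracking law for a unicycle robot.
    (e_x, e_y, e_th): reference pose error expressed in the robot frame.
    (v_ref, w_ref): feed-forward velocities of the reference (leader) trajectory.
    Gains k_x, k_y, k_th are illustrative values, not the paper's."""
    v = v_ref * math.cos(e_th) + k_x * e_x
    w = w_ref + v_ref * (k_y * e_y + k_th * math.sin(e_th))
    return v, w
```

With zero tracking error the commands reduce to the feed-forward pair (v_ref, w_ref), so the follower simply replays the leader's motion; nonzero lateral or heading error steers it back onto the reference.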
6. Methodology
7. Experiment Results
7.1. Experiment 1
7.2. Experiment 2
8. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Prassler, E.; Ritter, A.; Schaeffer, C. A Short History of Cleaning Robots. Auton. Robot. 2000, 9, 211–226. [Google Scholar] [CrossRef]
- Joon, A.; Kowalczyk, W. Design of Autonomous Mobile Robot for Cleaning in the Environment with Obstacles. Appl. Sci. 2021, 11, 8076. [Google Scholar] [CrossRef]
- Kowalczyk, W.; Kozłowski, K. Trajectory tracking and collision avoidance for the formation of two-wheeled mobile robots. Bull. Pol. Acad. Sci. 2019, 67, 915–924. [Google Scholar] [CrossRef]
- Kiełczewski, M.; Kowalczyk, W.; Krysiak, B. Differentially-Driven Robots Moving in Formation—Leader–Follower Approach. Appl. Sci. 2022, 12, 7273. [Google Scholar] [CrossRef]
- Romanov, A.M.; Tararin, A.A. An Automatic Docking System for Wheeled Mobile Robots. In Proceedings of the IEEE Conference of Russian Young Researchers in Electrical and Electronic Engineering (ElConRus), Moscow, Russia, 26–29 January 2021; pp. 1040–1045. [Google Scholar] [CrossRef]
- Mráz, E.; Rodina, J.; Babinec, A. Using fiducial markers to improve localization of a drone. In Proceedings of the 23rd International Symposium on Measurement and Control in Robotics (ISMCR), Budapest, Hungary, 15–17 October 2020; pp. 1–5. [Google Scholar] [CrossRef]
- Kam, H.C.; Yu, Y.K.; Wong, K.H. An Improvement on ArUco Marker for Pose Tracking Using Kalman Filter. In Proceedings of the 2018 19th IEEE/ACIS International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing (SNPD), Busan, Republic of Korea, 27–29 June 2018; pp. 65–69. [Google Scholar] [CrossRef]
- Marcotte, R.; Hamilton, H.J. Behavior Trees for Modelling Artificial Intelligence in Games: A Tutorial. Comput. Game J. 2017, 6, 171–184. [Google Scholar] [CrossRef]
- Colledanchise, M.; Ögren, P. Behavior Trees in Robotics and AI: An Introduction; CRC Press: Boca Raton, FL, USA, 2018. [Google Scholar]
- Desai, J.P.; Ostrowski, J.; Kumar, V. Controlling formations of multiple mobile robots. In Proceedings of the 1998 IEEE International Conference on Robotics and Automation (Cat. No. 98CH36146), Leuven, Belgium, 20 May 1998; IEEE: Piscataway Township, NJ, USA, 1998; Volume 4, pp. 2864–2869. [Google Scholar]
- Dai, Y.; Lee, S.-G. The leader-follower formation control of nonholonomic mobile robots. Int. J. Control. Autom. Syst. 2012, 10, 350–361. [Google Scholar]
- Sert, H.; Kökösy, A.; Perruquetti, W. A single landmark based localization algorithm for non-holonomic mobile robots. In Proceedings of the 2011 IEEE International Conference on Robotics and Automation, Shanghai, China, 9–13 May 2011; pp. 293–298. [Google Scholar] [CrossRef]
- Orqueda, O.A.A.; Fierro, R. Visual tracking of mobile robots in formation. In Proceedings of the 2007 American Control Conference, New York, NY, USA, 9–13 July 2007; pp. 5940–5945. [Google Scholar] [CrossRef]
- Orqueda, O.A.A.; Fierro, R. Robust vision-based nonlinear formation control. In Proceedings of the 2006 American Control Conference, Minneapolis, MN, USA, 14–16 June 2006; p. 6. [Google Scholar] [CrossRef]
- Arcos, L.; Calala, C.; Maldonado, D.; Cruz, P.J. ROS based Experimental Testbed for Multi-Robot Formation Control. In Proceedings of the 2020 IEEE ANDESCON, Quito, Ecuador, 13–16 October 2020; pp. 1–6. [Google Scholar] [CrossRef]
- Ravankar, A.; Ravankar, A.A.; Kobayashi, Y.; Emaru, T. Hitchhiking Robots: A Collaborative Approach for Efficient Multi-Robot Navigation in Indoor Environments. Sensors 2017, 17, 1878. [Google Scholar] [CrossRef] [PubMed]
- Jaoura, N.; Hassan, L.; Alkafri, A. Distributed Consensus Problem of Multiple Non-holonomic Mobile Robots. J. Control Autom. Electron. Syst. 2022, 33, 419–433. [Google Scholar] [CrossRef]
- Shan, M.; Zou, Y.; Guan, M.; Wen, C.; Lim, K.; Ng, C.; Tan, P. Probabilistic trajectory estimation based leader following for multi-robot systems. In Proceedings of the 2016 14th International Conference on Control, Automation, Robotics and Vision (ICARCV), Phuket, Thailand, 13–15 November 2016; pp. 1–6. [Google Scholar] [CrossRef]
- Hurwitz, D.; Filin, S.; Klein, I. Relative Constraints and Their Contribution to Image Configurations. IEEE Sens. J. 2023, 23, 7750–7757. [Google Scholar] [CrossRef]
- Yoo, H.; Kim, D.; Sohn, J.; Lee, K.; Kim, C. Development of a Worker-Following Robot System: Worker Position Estimation and Motion Control under Measurement Uncertainty. Machines 2023, 11, 366. [Google Scholar] [CrossRef]
- Tsoukalas, A.; Tzes, A.; Khorrami, F. Relative Pose Estimation of Unmanned Aerial Systems. In Proceedings of the 2018 26th Mediterranean Conference on Control and Automation (MED), Zadar, Croatia, 19–22 June 2018; pp. 155–160. [Google Scholar] [CrossRef]
- Joon, A.; Kowalczyk, W. Leader Following Control of Non-holonomic Mobile Robots Using EKF-based Localization. In Proceedings of the 2023 27th International Conference on Methods and Models in Automation and Robotics (MMAR), Międzyzdroje, Poland, 22–25 August 2023; pp. 57–62. [Google Scholar] [CrossRef]
- Behavior Tree. Available online: https://www.behaviortree.dev (accessed on 16 July 2023).
- Garrido-Jurado, S.; Muñoz-Salinas, R.; Madrid-Cuevas, F.J.; Marín-Jiménez, M.J. Automatic generation and detection of highly reliable fiducial markers under occlusion. Pattern Recogn. 2014, 47, 2280–2292. [Google Scholar] [CrossRef]
- ArUco Markers Generator. Available online: https://chev.me/arucogen/ (accessed on 17 July 2023).
- Detection of ArUco Markers. Available online: https://docs.opencv.org/4.x/d5/dae/tutorial_aruco_detection.html (accessed on 17 July 2023).
- ROS Wrapper for Intel. Available online: https://github.com/IntelRealSense/realsense-ros (accessed on 17 July 2023).
- Braden, B. The Surveyor’s Area Formula. Coll. Math. J. 1986, 17, 326–337. [Google Scholar] [CrossRef]
- Khatib, E.I.A.L.; Jaradat, M.A.K.; Abdel-Hafez, M.F. Low-Cost Reduced Navigation System for Mobile Robot in Indoor/Outdoor Environments. IEEE Access 2020, 8, 25014–25026. [Google Scholar] [CrossRef]
- de Wit, C.C.; Khennouf, H.; Samson, C.; Sordalen, O.J. Nonlinear Control Design for Mobile Robots. Recent Trends Mob. Rob. 1994, 11, 121–156. [Google Scholar] [CrossRef]
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Joon, A.; Kowalczyk, W. Leader–Follower Approach for Non-Holonomic Mobile Robots Based on Extended Kalman Filter Sensor Data Fusion and Extended On-Board Camera Perception Controlled with Behavior Tree. Sensors 2023, 23, 8886. https://doi.org/10.3390/s23218886