Evaluation of the Path-Tracking Accuracy of a Three-Wheeled Omnidirectional Mobile Robot Designed as a Personal Assistant
Figure 1. Image of one of the authors of the paper preparing the mobile robot APR-02 for an autonomous exploration under COVID-19 restrictions.
Figure 2. (a) Location of the center of the mobile robot (x, y) and absolute angular orientation (θ) of the omnidirectional mobile robot relative to the world reference frame (X_W, Y_W). The reference frame (X_R, Y_R) is the mobile frame of the robot, where the axis X_R depicts the front of the robot. (b) Parameters of a motion vector (v, α, ω): (R_a, R_b, R_c) are the radial distances of each omnidirectional wheel relative to the center of the robot, (δ_a, δ_b, δ_c) are the angular orientations of the wheels relative to the mobile robot reference frame (X_R, Y_R), and (ω_a, ω_b, ω_c) are the angular rotational speeds of the wheels.
Figure 3. Displacement of an omnidirectional mobile robot located at (x_i, y_i, θ_i) when executing a motion command M = (v, α, ω, t_r). R is the radius of the circular trajectory, (x_c, y_c) is the location of the center of the circular trajectory, β is the angular displacement of the robot along the circular path, and (x_f, y_f, θ_f) is the final position of the robot.
Figure 4. Simulation of the trajectories generated by motion commands M = (v, α, ω, t_r = 10 s). The starting point is (x = 0, y = 0, θ = 0) and the velocity is v = 0.3 m/s. Eight angular orientations α are shown: 0° (red), 45° (green), 90° (blue), 135° (cyan), 180° (black), 225° (yellow), 270° (magenta), 315° (olive), with different angular rotational speeds: (a) ω = 0.0 rad/s, (b) ω = 1.0 rad/s, (c) ω = 1.5 rad/s, (d) ω = 2.0 rad/s, (e) ω = −1.0 rad/s, (f) ω = −1.5 rad/s, (g) ω = −2.0 rad/s.
Figure 5. Trajectory of the mobile robot: (a) translation and rotation in the counterclockwise direction (ω > 0); (b) translation and rotation in the clockwise direction (ω < 0).
Figure 6. Motion command M = (v, α, ω, t_r) and the mobile robot trajectory (green line) required to move from a starting point P_i (green circle) to a planned destination P_f, depending on the number of intermediate waypoints defined using an interpolation procedure: (a) direct trajectory with only one motion command and no intermediate waypoints; (b) trajectory with one intermediate waypoint P1, requiring the computation of two motion commands; (c) trajectory with four intermediate waypoints P1, P2, P3 and P4, requiring the computation of five motion commands.
Figure 7. Comparison between the planned (blue line) and real (magenta) trajectories of the mobile robot moving at a constant speed of 0.15 m/s. The green line depicts the expected initial and final positions and orientations of the robot: (a) circular trajectory with the robot facing inward; (b) eight-shaped trajectory with the robot facing forward.
Figure 8. Location error (x_e, y_e, θ_e) when the mobile robot moves at a constant translational velocity of 0.15 m/s: (a) following a circular trajectory with the robot facing inward; (b) following an eight-shaped trajectory with the robot facing forward.
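The motion model behind Figures 3 and 4 — a constant translational velocity v at heading offset α while the body rotates at ω for t_r seconds — admits a closed-form final pose, since the path is a circular arc of radius R = v/|ω| (a straight line when ω = 0). A minimal Python sketch under that assumption (the function name and structure are illustrative, not taken from the paper):

```python
import math

def propagate(x, y, theta, v, alpha, omega, t_r):
    """Final pose after executing a motion command M = (v, alpha, omega, t_r).

    The world-frame velocity direction at time t is theta(t) + alpha with
    theta(t) = theta + omega * t, so integrating
    dx/dt = v*cos(theta + alpha + omega*t), dy/dt = v*sin(...) gives a
    circular arc of radius R = v / |omega|; omega == 0 degenerates to a line.
    """
    if abs(omega) < 1e-12:
        # No rotation: straight-line translation along direction theta + alpha.
        return (x + v * t_r * math.cos(theta + alpha),
                y + v * t_r * math.sin(theta + alpha),
                theta)
    a0 = theta + alpha                # initial velocity direction
    a1 = theta + alpha + omega * t_r  # final velocity direction
    xf = x + (v / omega) * (math.sin(a1) - math.sin(a0))
    yf = y - (v / omega) * (math.cos(a1) - math.cos(a0))
    return (xf, yf, theta + omega * t_r)
```

For example, with v = 0.3 m/s, α = 0, ω = 2π/10 rad/s and t_r = 10 s the heading advances by exactly 2π, so the robot traces a full circle and returns to its starting (x, y), consistent with the closed loops shown in Figure 4.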
Abstract
1. Introduction
New Contribution
2. The Mobile Robot APR
3. Kinematics of the Omnidirectional Mobile Robot APR
3.1. Motion Originated by the Execution of a Single Motion Command
3.2. Estimation of the Motion Command to Reach a Target Position for a Known t_r
3.2.1. Translation and Rotation in the Counterclockwise Direction for a Known t_r
3.2.2. Translation and Rotation in the Clockwise Direction for a Known t_r
3.2.3. Translation without Rotation for a Known t_r
3.2.4. Static Rotation without Translation
3.3. Kinematic Model: Determination of (ω_a, ω_b, ω_c) from M = (v, α, ω)
3.4. Kinematic Model: Determination of M = (v, α, ω) from (ω_a, ω_b, ω_c)
3.5. Odometry: Determination of (x, y, θ) from (ω_a, ω_b, ω_c)
4. Path Planning and Path Following
4.1. Rough Trajectory Definition through Waypoints
4.2. Path Planning: Linearizing and Smoothing the Trajectory
4.3. Path-Following Procedure
5. Experimental Evaluation of the Path-Tracking Accuracy
6. Discussion and Conclusions
Author Contributions
Funding
Conflicts of Interest
References
| Speed (m/s) | Distance RMSE (m) | Distance Abs. Max. Error (m) | Angular Orientation RMSE (°) | Angular Orientation Abs. Max. Error (°) |
|---|---|---|---|---|
| 0.10 | 0.017203 | 0.042762 | 6.7002 | 13.4761 |
| 0.15 | 0.021478 | 0.043392 | 3.9457 | 7.9026 |
| 0.20 | 0.023732 | 0.050270 | 5.6000 | 12.8638 |
| 0.25 | 0.033889 | 0.060157 | 5.3974 | 10.8083 |
| 0.30 | 0.032467 | 0.077929 | 6.2730 | 12.6040 |
| 0.35 | 0.051998 | 0.125830 | 7.3219 | 15.2344 |
| 0.40 | 0.038150 | 0.080402 | 6.3755 | 12.9896 |
| 0.45 | 0.040882 | 0.101140 | 6.0992 | 11.1705 |
| 0.50 | 0.032762 | 0.070848 | 8.0095 | 17.6987 |
| Speed (m/s) | Distance RMSE (m) | Distance Abs. Max. Error (m) | Angular Orientation RMSE (°) | Angular Orientation Abs. Max. Error (°) |
|---|---|---|---|---|
| 0.10 | 0.017036 | 0.045863 | 8.9909 | 20.8563 |
| 0.15 | 0.015341 | 0.044073 | 7.8065 | 17.9776 |
| 0.20 | 0.017418 | 0.039401 | 7.0846 | 17.6608 |
| 0.25 | 0.025817 | 0.068124 | 6.5313 | 17.2021 |
| 0.30 | 0.039706 | 0.088557 | 7.7615 | 21.5102 |
| 0.35 | 0.059036 | 0.123140 | 7.6493 | 19.2291 |
| 0.40 | 0.065989 | 0.151920 | 9.4878 | 20.7115 |
| 0.45 | 0.087276 | 0.220810 | 11.5129 | 24.3387 |
| 0.50 | 0.100260 | 0.265240 | 12.2232 | 22.5929 |
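The error statistics reported above are a root-mean-square error and an absolute maximum over the per-sample tracking errors of each trial. A minimal sketch of how such statistics can be computed from a sequence of per-sample errors (the function name is illustrative, not taken from the paper):

```python
import math

def error_stats(errors):
    """Return (RMSE, absolute maximum error) for a sequence of per-sample
    tracking errors, e.g. distances between the planned and measured
    positions (m), or angular-orientation differences (degrees)."""
    n = len(errors)
    rmse = math.sqrt(sum(e * e for e in errors) / n)
    max_abs = max(abs(e) for e in errors)
    return rmse, max_abs
```

For example, `error_stats([0.03, -0.04])` returns roughly `(0.0354, 0.04)`: signed errors contribute symmetrically to the RMSE, while the maximum is taken over magnitudes.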
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Palacín, J.; Rubies, E.; Clotet, E.; Martínez, D. Evaluation of the Path-Tracking Accuracy of a Three-Wheeled Omnidirectional Mobile Robot Designed as a Personal Assistant. Sensors 2021, 21, 7216. https://doi.org/10.3390/s21217216