User Local Coordinate-Based Accompanying Robot for Human Natural Movement of Daily Life
Figure 1. The movements of the accompanying robot and the user. (a) The robot translates and rotates to the target position; (b) the user translates without rotation; (c) the user rotates in place without translation; (d) the user translates and rotates.
Figure 2. “String-pots” to sense LCS_U and LCS_R.
Figure 3. Control system structure.
Figure 4. System structure block diagram.
Figure 5. Assembly of the tested accompanying robot.
Figure 6. Top-view motion graph of the movements of the robot and user w.r.t. the WCS. (a) Forward walking test; (b) backward walking test; (c) left lateral walking test; (d) right lateral walking test; (e) walking test with pivot turning to the left; (f) walking test with pivot turning to the right.
Figure 7. Relative distance between the robot and the user w.r.t. the LCS, and encoder values θ_Robot and θ_User in the LCS. (a) Forward walking test; (b) backward walking test; (c) left lateral walking test; (d) right lateral walking test; (e) walking test with pivot turning to the left; (f) walking test with pivot turning to the right.
Figure 8. Top-view motion graph of walking along the rectangle.
Figure 9. Top-view motion graph of clockwise walking.
Figure 10. Top-view motion graph of counterclockwise walking.
Abstract
1. Introduction
2. Methods
2.1. Concept of the User Local Coordinate-Based Accompanying Method
1. Initialize the system.
2. Set up and reset the user’s coordinate system.
3. Set up and reset the robot’s coordinate system.
4. Set the target position Pt of the accompanying robot w.r.t. the user.
5. Measure the current position Pc of the accompanying robot w.r.t. the user.
6. Calculate the error Eu between Pc and Pt.
7. Convert Eu to Er in the accompanying robot’s local coordinate system.
8. Move the robot to reduce the error Er according to a control algorithm.
9. Repeat steps 4–8 until the system stops following.
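The accompanying procedure above (set target, measure, compute error, convert frames, drive, repeat) can be sketched as a minimal control loop. This is only an illustrative sketch: the function names (`measure_position`, `to_robot_frame`, `drive`) and the proportional control law are assumptions, not the authors’ implementation.

```python
import time

def accompany_loop(target_u, measure_position, to_robot_frame, drive,
                   gain=1.0, period=0.02, stop_flag=lambda: False):
    """Minimal accompanying loop (steps 4-8), repeated until stopped.

    target_u         : target position Pt of the robot in the user's LCS, (x, y, theta)
    measure_position : callable returning the current position Pc in the user's LCS
    to_robot_frame   : callable converting an error from the user's LCS to the robot's LCS
    drive            : callable sending a velocity command to the robot base
    """
    while not stop_flag():                        # step 9: repeat until following stops
        current_u = measure_position()            # step 5: measure Pc w.r.t. the user
        err_u = tuple(t - c for t, c in zip(target_u, current_u))  # step 6: Eu = Pt - Pc
        err_r = to_robot_frame(err_u)             # step 7: convert Eu to Er
        drive(tuple(gain * e for e in err_r))     # step 8: proportional control toward Pt
        time.sleep(period)                        # loop at a fixed control period
```

With a proportional law the error shrinks geometrically each cycle; any stabilizing controller could replace the `gain * e` term in step 8.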
2.2. Embodiment of the Accompanying Robot with the Viewpoint
- : velocity of the robot
- : angular velocity of the four Mecanum wheels
- : radius of the Mecanum wheels
- : motor constant
- : pulse width modulation
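The wheel-level quantities listed above are related by the standard inverse kinematics of a four-Mecanum-wheel base. The sketch below uses the common textbook form with 45° rollers; the symbols (`vx`, `vy`, `wz`, wheel radius `r`, half-lengths `lx`, `ly`) and the wheel ordering are assumptions for illustration, not taken from the paper.

```python
def mecanum_wheel_speeds(vx, vy, wz, r, lx, ly):
    """Inverse kinematics of a four-Mecanum-wheel base (rollers at 45 deg).

    vx, vy : linear velocity of the robot (forward, left)
    wz     : angular velocity of the robot about the vertical axis
    r      : radius of the Mecanum wheels
    lx, ly : half the wheelbase and half the track width

    Returns the angular velocities of the four wheels in the order
    front-left, front-right, rear-left, rear-right.
    """
    k = lx + ly
    w_fl = (vx - vy - k * wz) / r
    w_fr = (vx + vy + k * wz) / r
    w_rl = (vx + vy - k * wz) / r
    w_rr = (vx - vy + k * wz) / r
    return (w_fl, w_fr, w_rl, w_rr)
```

With pure forward motion (vy = wz = 0) all four wheels turn at vx/r; sign conventions for wheel order and roller orientation vary between platforms, so the formula must be matched to the actual chassis.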
2.3. Detailed System Hardware
3. Methods of System Verification
3.1. Testing Tasks
3.2. Testing Environment
4. Results of System Verification
4.1. Basic Walking Tasks
4.2. Combined Tasks
4.2.1. Walking along a Rectangle
4.2.2. Clockwise-Curve Walking Test
4.2.3. Counterclockwise-Curve Walking Test
5. Discussion
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Tests | Start x (mm) | Start y (mm) | Start θ (°) | Stop x (mm) | Stop y (mm) | Stop θ (°) | Δx (mm) | Δy (mm) | Δθ (°)
---|---|---|---|---|---|---|---|---|---
Forward walking (user) | 0 | 0 | 99.6 | −3 | 1914 | 91.8 | −3 | 1914 | −7.8
Forward walking (robot) | 24 | 438 | 78.1 | 90 | 2361 | 77.9 | 66 | 1933 | −0.2
Backward walking (user) | 0 | 0 | 92.4 | 28 | −1980 | 92.8 | 28 | −1980 | 0.4
Backward walking (robot) | 88 | 451 | 78.8 | 82 | −1534 | 81.9 | −6 | −1985 | 3.1
Left lateral walking (user) | 0 | 0 | 97.4 | −1520 | 64 | 92.0 | −1520 | 64 | −5.4
Left lateral walking (robot) | 220 | 583 | 95.7 | −1512 | 582 | 108.4 | −1732 | −2 | 12.7
Right lateral walking (user) | 0 | 0 | 92 | 1558 | −34 | 90.4 | 1558 | −34 | −1.8
Right lateral walking (robot) | −8 | 447 | 83.8 | 1658 | 363 | 77.9 | 1666 | −84 | −5.9
Left pivot turning (user) | 0 | 0 | 97.2 | −265 | −222 | 170.9 | −266 | −222 | 73.7
Left pivot turning (robot) | 10 | 417 | 95.9 | −796 | −279 | 167.5 | −807 | −696 | 71.6
Right pivot turning (user) | 0 | 0 | 95.5 | 309 | −165 | 3.4 | 309 | −165 | −92.1
Right pivot turning (robot) | 34 | 588 | 98.6 | 871 | −209 | 10.1 | 837 | −797 | −88.4
Tests | Target/Start x (mm) | Target/Start y (mm) | Target/Start θ (°) | Stop x (mm) | Stop y (mm) | Stop θ (°) | Average x (mm) | Average y (mm) | Average θ (°)
---|---|---|---|---|---|---|---|---|---
Forward walking | 95 | 418 | −12.5 | 107 | 443 | −13.6 | 111 ± 33 | 320 ± 40 | −19.3 ± 6.5
Backward walking | 106 | 447 | −13.2 | 75 | 443 | −9.7 | 128 ± 32 | 574 ± 44 | −12.5 ± 3.1
Left lateral walking | 292 | 550 | −28.0 | 26 | 517 | −2.9 | 259 ± 31 | 482 ± 36 | −28.3 ± 3.3
Right lateral walking | 9 | 447 | −0.7 | 101 | 397 | −14.3 | −159 ± 50 | 422 ± 13 | 20.7 ± 6.5
Left pivot turning | 63 | 412 | −8.8 | −9 | 533 | 1.0 | 310 ± 167 | 440 ± 26 | −32.6 ± 14.9
Right pivot turning | 91 | 582 | −8.8 | −36 | 563 | 3.7 | −106 ± 189 | 536 ± 54 | 11.5 ± 19.7
Walking along a rectangle | −121 | 384 | 17.5 | −98 | 376 | 14.6 | −115 ± 125 | 323 ± 85 | 19.8 ± 20.3
Clockwise-curve walking | 42 | 541 | −4.4 | 73 | 514 | −8.0 | 11 ± 72 | 347 ± 53 | −1.8 ± 11.6
Counterclockwise-curve walking | −51 | 530 | 5.5 | −147 | 492 | 16.6 | 14 ± 241 | 315 ± 74 | −1.6 ± 37.8
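The average-position columns report mean ± standard deviation over the positions logged during each walking task. A summary in that form can be computed as below; the sample values are made up for illustration and are not taken from the experiments.

```python
import statistics

def mean_sd(samples):
    """Return (mean, sample standard deviation) of a list of logged values."""
    return statistics.mean(samples), statistics.stdev(samples)

# e.g. logged x positions of the robot in the user's LCS (mm), made-up values
xs = [95, 140, 110, 88, 122]
m, sd = mean_sd(xs)
print(f"x = {m:.0f} \u00b1 {sd:.0f} mm")
```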
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Wu, H.-K.; Chen, P.-Y.; Wu, H.-Y.; Yu, C.-H. User Local Coordinate-Based Accompanying Robot for Human Natural Movement of Daily Life. Sensors 2021, 21, 3889. https://doi.org/10.3390/s21113889