Development of Dual-Arm Human Companion Robots That Can Dance
Figure 1. The JF-2 (left) and JF-mini (right) human companion robots without casing.
Figure 2. Hardware configurations and dimensions of the JF-2 and JF-mini robots.
Figure 3. Four-DOF arm mechanisms of the JF-2 and JF-mini robots.
Figure 4. Use cases of the chest display. (a) Robot control. (b) Synchronously showing an animation while dancing.
Figure 5. Various facial expressions with the circular LCD head.
Figure 6. Software architecture of the JF-2 and JF-mini robots.
Figure 7. Three different human motion capture methods. (a) Motion capture-based. (b) VR tracker-based. (c) Keypoint detection-based.
Figure 8. Human motion-replicating process. (a) Shoulder, elbow, and hand 3D positions assuming zero chest tilt and roll angles. (b) Retargeted joint angles in a simulated environment. (c) Arm postures realized with the JF-2 robot.
Figure 9. (a) Snapshots of the safety test. (b) Time-force results.
Figure 10. JF-2 and JF-mini robots dancing along to four different children's songs.
Figure 11. JF-2 robot executing the gift delivery task.
Figure 12. Examples of dance motion by the JF-2.
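Figures 7 and 8 depict the motion-retargeting pipeline: 3D keypoints of the shoulder, elbow, and hand are converted into joint angles for the robots' four-DOF arms. As a rough illustration of how such a mapping can work, the Python sketch below derives shoulder pitch, roll, and yaw plus elbow flexion from the three keypoints. The torso frame, joint order, axis conventions, and the `retarget_arm` function itself are assumptions made for illustration; the paper's actual JF-2 kinematics are not reproduced here.

```python
# Minimal sketch of keypoint-to-joint retargeting for a 4-DOF arm
# (shoulder pitch, roll, yaw + elbow flexion), in the spirit of Figure 8.
# Frames and sign conventions are assumptions, not the JF-2's actual ones:
# torso frame has x forward, y left, z up; the arm hangs along -z at rest,
# and the shoulder rotation is modeled as R = Ry(pitch) @ Rx(roll).
import numpy as np

def _rx(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def _ry(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def retarget_arm(shoulder, elbow, hand):
    """Map 3D keypoints (torso frame) to [pitch, roll, yaw, elbow] in radians."""
    u = (elbow - shoulder) / np.linalg.norm(elbow - shoulder)  # upper-arm direction
    f = (hand - elbow) / np.linalg.norm(hand - elbow)          # forearm direction

    # Elbow flexion: angle between the upper arm and the forearm.
    elbow_pitch = np.arccos(np.clip(u @ f, -1.0, 1.0))

    # Shoulder pitch/roll: decompose u, assuming Ry(pitch) @ Rx(roll)
    # rotates the rest direction (0, 0, -1) onto u.
    roll = np.arcsin(np.clip(u[1], -1.0, 1.0))
    pitch = np.arctan2(-u[0], -u[2])

    # Shoulder yaw: orientation of the elbow-flexion plane, read off from the
    # forearm expressed in the post-pitch/roll shoulder frame. Yaw is
    # undefined when the arm is straight, so fall back to zero there.
    f_local = (_ry(pitch) @ _rx(roll)).T @ f
    yaw = np.arctan2(f_local[1], f_local[0]) if elbow_pitch > 1e-3 else 0.0
    return np.array([pitch, roll, yaw, elbow_pitch])

# Hypothetical keypoints (meters, torso frame) for a bent right arm:
angles = retarget_arm(np.array([0.00, -0.20, 1.40]),
                      np.array([0.05, -0.25, 1.12]),
                      np.array([0.30, -0.22, 1.15]))
```

A real deployment would additionally clamp the result to the robot's joint limits and low-pass filter it over time so the arms track human motion without abrupt reversals; those steps are omitted from this sketch.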
Abstract
1. Introduction
2. Hardware Architecture
2.1. Arm
2.2. Mobile Base
2.3. Chest Display and Sensor
2.4. Head
2.5. Control and Power
3. Software Architecture
3.1. SLAM and Navigation
3.2. Object and Human Detection
3.3. Human Motion Acquisition
3.4. Behavior Control
4. Experimental Results
4.1. Safety Test
4.2. Dancing Along Task
4.3. Gift Delivery Task
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
MDPI and ACS Style
Kim, J.; Kang, T.; Song, D.; Ahn, G.; Yi, S.-J. Development of Dual-Arm Human Companion Robots That Can Dance. Sensors 2024, 24, 6704. https://doi.org/10.3390/s24206704

AMA Style
Kim J, Kang T, Song D, Ahn G, Yi S-J. Development of Dual-Arm Human Companion Robots That Can Dance. Sensors. 2024; 24(20):6704. https://doi.org/10.3390/s24206704
Chicago/Turabian Style
Kim, Joonyoung, Taewoong Kang, Dongwoon Song, Gijae Ahn, and Seung-Joon Yi. 2024. "Development of Dual-Arm Human Companion Robots That Can Dance" Sensors 24, no. 20: 6704. https://doi.org/10.3390/s24206704
APA Style
Kim, J., Kang, T., Song, D., Ahn, G., & Yi, S.-J. (2024). Development of Dual-Arm Human Companion Robots That Can Dance. Sensors, 24(20), 6704. https://doi.org/10.3390/s24206704