Clustering Social Touch Gestures for Human-Robot Interaction

Published: 03 December 2023

Abstract

Social touch provides a rich non-verbal communication channel between humans and robots. Prior work has identified a set of touch gestures for human-robot interaction and described them with natural language labels (e.g., stroking, patting). Yet, no data exists on the semantic relationships between the touch gestures in users’ minds. To endow robots with touch intelligence, we investigated how people perceive the similarities of social touch labels from the literature. In an online study, 45 participants grouped 36 social touch labels based on their perceived similarities and annotated their groupings with descriptive names. We derived quantitative similarities of the gestures from these groupings and analyzed the similarities using hierarchical clustering. The analysis resulted in 9 clusters of touch gestures formed around the social, emotional, and contact characteristics of the gestures. We discuss the implications of our results for designing and evaluating touch sensing and interactions with social robots.
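The pipeline the abstract describes — deriving pairwise similarities from participants' groupings and analyzing them with hierarchical clustering — can be sketched as follows. The labels, groupings, and distance threshold below are illustrative assumptions, not the study's data (the paper uses 36 labels, 45 participants, and reports 9 clusters):

```python
from collections import Counter  # stdlib; not strictly needed, shown for clarity

import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

# Hypothetical touch-gesture labels and participant groupings.
labels = ["stroke", "pat", "tap", "hit", "hug", "hold"]

# Each participant partitions the labels into groups of perceived similarity.
participant_groupings = [
    [{"stroke", "pat"}, {"tap", "hit"}, {"hug", "hold"}],
    [{"stroke", "pat", "tap"}, {"hit"}, {"hug", "hold"}],
    [{"stroke"}, {"pat", "tap"}, {"hit"}, {"hug", "hold"}],
]

# Similarity of two labels = fraction of participants who put them in one group.
n = len(labels)
idx = {name: i for i, name in enumerate(labels)}
co_occurrence = np.zeros((n, n))
for groups in participant_groupings:
    for group in groups:
        for a in group:
            for b in group:
                co_occurrence[idx[a], idx[b]] += 1
similarity = co_occurrence / len(participant_groupings)

# Convert similarity to distance, then cluster with average linkage and cut
# the dendrogram at an (assumed) distance threshold to get flat clusters.
distance = 1.0 - similarity
np.fill_diagonal(distance, 0.0)
Z = linkage(squareform(distance, checks=False), method="average")
clusters = fcluster(Z, t=0.6, criterion="distance")
for name, c in zip(labels, clusters):
    print(name, c)
```

With these toy groupings, the cut yields three clusters (stroke/pat/tap, hit, hug/hold); in the actual study the analogous analysis over all 36 labels produced the 9 reported clusters.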



      Published In

      Social Robotics: 15th International Conference, ICSR 2023, Doha, Qatar, December 3–7, 2023, Proceedings, Part I
      Dec 2023
      444 pages
ISBN: 978-981-99-8714-6
DOI: 10.1007/978-981-99-8715-3
      • Editors:
      • Abdulaziz Al Ali,
      • John-John Cabibihan,
      • Nader Meskin,
      • Silvia Rossi,
      • Wanyue Jiang,
      • Hongsheng He,
      • Shuzhi Sam Ge

      Publisher

      Springer-Verlag

      Berlin, Heidelberg


      Author Tags

      1. Social Touch
      2. Touch Dictionary
      3. Non-Verbal Communication
      4. Crowdsourcing Study
