
Sawarimōto: A Vision and Touch Sensor-based Method for Automatic or Tele-operated Android-to-human Face Touch

Published: 13 March 2023

Abstract

Although robot-to-human touch experiments have been performed, they have all relied on direct tele-operation with a remote controller, pre-programmed hand motions, or wearable trackers on the human. This report introduces a project that aims to visually track and touch a person's face with a humanoid android, using a single RGB-D camera for 3D pose estimation. There are three major components: 3D pose estimation, a touch sensor for the android's hand, and a controller that combines the pose and sensor information to direct the android's actions. The pose estimation is working and has been released as open source. A touch sensor glove has been built, and work has begun on an under-skin version. Finally, we have tested android face-touch control. These tests revealed many hurdles that remain to be overcome, but also showed how convincing the experience already is, demonstrating this technology's potential to elicit strong emotional responses.
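The controller component described above fuses 3D pose estimates with touch-sensor readings to direct the android's hand. The paper does not specify its internals; a minimal sketch of such a fusion loop, in which every class name, threshold, and step size is an illustrative assumption rather than the authors' implementation, might look like:

```python
# Hypothetical sketch of a pose/touch fusion controller: each control tick,
# move the hand toward the tracked face keypoint, and freeze as soon as the
# touch sensor reports contact. Names and parameters are assumptions, not
# taken from the paper.

from dataclasses import dataclass


@dataclass
class TouchController:
    contact_threshold: float = 0.2  # normalized sensor reading counted as contact
    max_step: float = 0.01          # metres the hand may move per tick

    def step(self, hand_pos, face_pos, touch_reading):
        """Return (next hand position, contact flag) for one control tick."""
        if touch_reading >= self.contact_threshold:
            return hand_pos, True  # contact detected: stop moving
        # Move toward the tracked face keypoint, clamped to max_step.
        delta = [f - h for f, h in zip(face_pos, hand_pos)]
        dist = sum(d * d for d in delta) ** 0.5
        if dist == 0.0:
            return hand_pos, False
        scale = min(self.max_step, dist) / dist
        return [h + d * scale for h, d in zip(hand_pos, delta)], False
```

Gating every motion step on the latest sensor reading, rather than executing a pre-planned trajectory, is what distinguishes this closed-loop approach from the pre-programmed hand motions the abstract contrasts it with.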

Supplementary Material

MP4 File (HRI-LBR1176.mp4)
Presentation video of Sawarimōto: A Vision and Touch Sensor-based Method for Automatic or Tele-operated Android-to-human Face Touch. This brief video outlines the goals, methods, and results of this research. Most importantly, it shows footage of live demonstrations of the 3D tracking system, touch sensors, and actual face touch on a human subject. For details not explained in this video, please see the full paper at https://doi.org/10.1145/3568294.3580131


    Information & Contributors

    Information

    Published In

    HRI '23: Companion of the 2023 ACM/IEEE International Conference on Human-Robot Interaction
    March 2023
    612 pages
    ISBN: 9781450399708
    DOI: 10.1145/3568294

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. android
    2. hri
    3. pose estimation
    4. touch

    Qualifiers

    • Short-paper

    Funding Sources

    • JST Moonshot R&D

    Conference

    HRI '23
    Acceptance Rates

    Overall Acceptance Rate 268 of 1,124 submissions, 24%
