DOI: 10.1109/IROS51168.2021.9636365
research-article

Exploring Consequential Robot Sound: Should We Make Robots Quiet and Kawaii-et?

Published: 27 September 2021

Abstract

All robots create consequential sound—sound produced as a result of the robot’s mechanisms—yet little work has explored how sound impacts human-robot interaction. Recent work shows that the sound of different robot mechanisms affects perceived competence, trust, human-likeness, and discomfort. However, the physical sound characteristics responsible for these perceptions have not been clearly identified. In this paper, we aim to explore key characteristics of robot sound that might influence perceptions. A pilot study from our past work showed that quieter and higher-pitched robots may be perceived as more competent and less discomforting. To better understand how variance in these attributes affects perception, we performed audio manipulations on two sets of industrial robot arm videos within a series of four new studies presented in this paper. Results confirmed that quieter robots were perceived as less discomforting. In addition, higher-pitched robots were perceived as more energetic, happy, warm, and competent. Despite the robot’s industrial purpose and appearance, participants seemed to prefer more "cute" (or "kawaii") sound profiles, which could have implications for the design of more acceptable and fulfilling sound profiles for human-robot interactions with practical collaborative robots.


Cited By

  • (2024) Investigation of Low-Moral Actions by Malicious Anonymous Operators of Avatar Robots. ACM Transactions on Human-Robot Interaction, 14(1), 1-34. DOI: 10.1145/3696466. Online publication date: 21-Sep-2024.
  • (2024) Kawaii Computing: Scoping Out the Japanese Notion of Cute in User Experiences with Interactive Systems. Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, 1-9. DOI: 10.1145/3613905.3651001. Online publication date: 11-May-2024.
  • (2023) New Design Potentials of Non-mimetic Sonification in Human–Robot Interaction. ACM Transactions on Human-Robot Interaction, 12(4), 1-17. DOI: 10.1145/3611646. Online publication date: 13-Dec-2023.
  • (2023) Probing Aesthetics Strategies for Robot Sound: Complexity and Materiality in Movement Sonification. ACM Transactions on Human-Robot Interaction, 12(4), 1-22. DOI: 10.1145/3585277. Online publication date: 13-Dec-2023.
  • (2023) Nonverbal Sound in Human-Robot Interaction: A Systematic Review. ACM Transactions on Human-Robot Interaction, 12(4), 1-46. DOI: 10.1145/3583743. Online publication date: 13-Dec-2023.
  • (2023) The Effects of Natural Sounds and Proxemic Distances on the Perception of a Noisy Domestic Flying Robot. ACM Transactions on Human-Robot Interaction, 12(4), 1-32. DOI: 10.1145/3579859. Online publication date: 13-Dec-2023.
  • (2023) Kawaii Game Vocalics: A Preliminary Model. Companion Proceedings of the Annual Symposium on Computer-Human Interaction in Play, 202-208. DOI: 10.1145/3573382.3616099. Online publication date: 6-Oct-2023.
  • (2023) Can Voice Assistants Sound Cute? Towards a Model of Kawaii Vocalics. Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, 1-7. DOI: 10.1145/3544549.3585656. Online publication date: 19-Apr-2023.


Published In

2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Sep 2021, 7915 pages
Publisher: IEEE Press

