User Experience Design for Social Robots: A Case Study in Integrating Embodiment
Figure 1. Methodology to evaluate the user experience.
Figure 2. The Pepper robot and the environment of the experiment.
Figure 3. User interacting with Pepper.
Figure 4. Flowchart diagram of the conversation.
Figure 5. Diagram of the robot’s initial dialogue with the user, including voice and gestures.
Figure 6. Participants’ perceptions concerning the robot.
Figure 7. Participants’ feelings towards the robot.
Figure 8. Feelings during the session.
Figure 9. Perceptions of robot interactions.
Figure 10. Median of answers by age.
Figure 11. Pearson correlation coefficient of results.
Abstract
1. Introduction
- How does a robot’s verbal and nonverbal communication impact the user experience?
- When communicating with a robot, how do users perceive its nature, and to what extent do they view it as a machine or as a human-like entity?
2. Methodology
2.1. Experiment Design
- Voice and sound: The robot’s voice is female and clearly distinguishable from a human voice. Its tone remains stable throughout the conversation, with slight variations reflecting the mood of the responses (sadness, happiness, confusion, etc.).
- Personality: The robot’s personality is extroverted: it takes the initiative and actively questions the user. The robot adapts its behaviour and the conversation to the user’s behaviour, detecting whether the user is shy, feels like talking, is scared, etc., and generates an individual profile for each user (a minimal sketch of such profile-driven adaptation follows this list).
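The paper does not detail the profiling logic, so the following is only a minimal sketch of how behaviour-driven adaptation of this kind could be structured; the `UserProfile` fields, the shyness heuristic, and the prompts are illustrative assumptions, not the authors’ implementation.

```python
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    """Illustrative per-user profile built up turn by turn (hypothetical fields)."""
    user_id: str
    utterance_lengths: list = field(default_factory=list)  # words per user answer
    response_delays: list = field(default_factory=list)    # seconds before answering

    def record_turn(self, utterance: str, delay_s: float) -> None:
        self.utterance_lengths.append(len(utterance.split()))
        self.response_delays.append(delay_s)

    @property
    def seems_shy(self) -> bool:
        # Assumed heuristic: short answers and long pauses suggest a reserved user.
        if not self.utterance_lengths:
            return False
        avg_words = sum(self.utterance_lengths) / len(self.utterance_lengths)
        avg_delay = sum(self.response_delays) / len(self.response_delays)
        return avg_words < 4 and avg_delay > 2.0

def next_robot_move(profile: UserProfile) -> tuple:
    """Pick the next prompt and gesture from the inferred user state."""
    if profile.seems_shy:
        # Soften the tone and ask an easy, closed question.
        return ("Would you like me to tell you something about myself?", "calm_gesture")
    # Extroverted default: take the initiative with an open question.
    return ("Tell me, what do you enjoy doing in your free time?", "open_arms_gesture")
```

In a real dialogue loop, `record_turn` would be called after each user utterance and `next_robot_move` before each robot turn, so the conversation plan tracks the evolving profile.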
2.2. Session Structure: Interaction with Pepper
3. Results and Discussion
3.1. Quantitative Results
3.2. Qualitative Results
4. Conclusions, Limitations, and Future Research
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
Name | Movements and Eye Colours |
---|---|
Great_1 | It nods its head. The robot also raises its right arm to the level of its torso and brings its elbow backwards. Then, it lowers it slightly. |
ScratchEye_1 | It raises its arm towards its eyes and rotates it repeatedly (as if pretending to scratch its eye). Eyes with light blue and blue LEDs. |
Surprise_3 | It raises its head, looking forward. It opens its arms slightly, palms facing outwards. Gasp of surprise. Eyes with yellow LEDs. |
Shocked_1 | It looks straight ahead. It moves its torso forward, crouching slightly. It brings its hands close to each other and separates them, leaving its arms hanging by its sides. Eyes with blue LEDs. |
Mocker_1 | It looks up, then lowers its head to the side and repeatedly raises and lowers it. The robot points forward with its left arm and moves its right arm, closed into a fist, towards the lower part of the torso. Mocking laugh sound. Eyes with white LEDs. |
Laugh_2 | It lowers its head and then raises it, looking straight ahead. It bends its right arm, putting its fist on its chin, and bends its left arm, bringing the fist to the lower part of the torso. Sound of brief, discreet laughter. Eyes with light blue and green LEDs. |
Happy_4 | It shakes its head up and down as if nodding. It bends its arms up slightly, bringing its hands closer. Eyes with green LEDs. |
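Gestures like those in the table are predefined behaviours that can be triggered through Pepper’s NAOqi framework. The sketch below assumes the legacy `naoqi` Python SDK (Python 2.7) and an illustrative robot address; the animation package path is an assumption, since the exact behaviour names depend on what is installed on the robot.

```python
from naoqi import ALProxy

PEPPER_IP, PORT = "192.168.1.10", 9559  # assumed robot address

tts = ALProxy("ALTextToSpeech", PEPPER_IP, PORT)
animation = ALProxy("ALAnimationPlayer", PEPPER_IP, PORT)
leds = ALProxy("ALLeds", PEPPER_IP, PORT)

# Tint the eye LEDs green over 0.5 s, as the table describes for Happy_4.
leds.fadeRGB("FaceLeds", 0.0, 1.0, 0.0, 0.5)

# Run a predefined behaviour; this path is illustrative -- the installed
# animation package determines the exact name.
animation.run("animations/Stand/Emotions/Positive/Happy_4")

# Speech can then be layered on top of the gesture.
tts.say("That is great news!")
```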
Age Group | Participants (%) |
---|---|
18 to 20 years | 47.22% |
20 to 25 years | 30.56% |
26 to 30 years | 2.78% |
30 to 40 years | 5.56% |
More than 40 years | 13.89% |
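A distribution like the one above, and the per-group medians shown in Figure 10, can be computed directly from the raw questionnaire responses. The sketch below assumes a pandas DataFrame with hypothetical column names and sample values, not the authors’ actual dataset.

```python
import pandas as pd

# Hypothetical raw questionnaire data; column names are illustrative.
df = pd.DataFrame({
    "age_group": ["18 to 20", "20 to 25", "18 to 20", "more than 40"],
    "robot_friendly": [5, 4, 4, 3],    # 1-5 Likert item
    "robot_humanlike": [2, 3, 2, 2],   # 1-5 Likert item
})

# Share of participants per age group (as in the table above).
print(df["age_group"].value_counts(normalize=True).mul(100).round(2))

# Median answer per age group for each Likert item (as in Figure 10).
print(df.groupby("age_group")[["robot_friendly", "robot_humanlike"]].median())
```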
Dimensions | Categories | Sub-Categories |
---|---|---|
Environment | Room | |
Environment | Video camera | |
Environment | User body language | |
Talk to a robot | Selfie | Positive |
Talk to a robot | Selfie | Disturbing |
Talk to a robot | Selfie | Auto selfie |
Talk to a robot | Movement | Natural |
Talk to a robot | Movement | Only in answer |
Talk to a robot | Movement | Move hands |
Technical aspects | Language processing | |
Technical aspects | Structure | |
Interview | Conversation | |
Interview | Interaction | |
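When tallying coded interview excerpts against a frame like this, it helps to hold the dimensions, categories, and sub-categories in a nested mapping. The structure below mirrors the table; the tally helper is an illustrative assumption, not the authors’ analysis tooling.

```python
from collections import Counter

# Coding frame from the table: dimension -> category -> sub-categories.
CODING_FRAME = {
    "Environment": {"Room": [], "Video camera": [], "User body language": []},
    "Talk to a robot": {
        "Selfie": ["Positive", "Disturbing", "Auto selfie"],
        "Movement": ["Natural", "Only in answer", "Move hands"],
    },
    "Technical aspects": {"Language processing": [], "Structure": []},
    "Interview": {"Conversation": [], "Interaction": []},
}

def tally(coded_excerpts):
    """Count how often each (dimension, category) code was applied."""
    counts = Counter()
    for dimension, category in coded_excerpts:
        if category not in CODING_FRAME.get(dimension, {}):
            raise ValueError("Unknown code: %s/%s" % (dimension, category))
        counts[(dimension, category)] += 1
    return counts

# Example: two excerpts coded under 'Talk to a robot / Selfie'.
print(tally([("Talk to a robot", "Selfie"), ("Talk to a robot", "Selfie")]))
```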
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Corrales-Paredes, A.; Sanz, D.O.; Terrón-López, M.-J.; Egido-García, V. User Experience Design for Social Robots: A Case Study in Integrating Embodiment. Sensors 2023, 23, 5274. https://doi.org/10.3390/s23115274