Abstract
A successful Human-Robot Interaction (HRI) depends on the empathy that the robot is able to elicit in the user, namely through the expression of emotions. In this pilot study, we examined university students' recognition of emotions expressed by a service robot in a virtual environment (VE). The VE was a corridor, neutral in terms of context of use. The robot's facial expressions, body movements, and displacement were manipulated to express eight basic emotions. Results showed that participants had difficulty recognizing the emotions (33% correct recognition). Results also suggested that the participants established empathy with the robot. Further work is needed to improve the emotional expression of this robot, which is intended to interact with hospitalized children.
1 Introduction
Initially, the main objective of the robotics field was the production of industrial robots able to replace humans in the most routine and dangerous tasks [1]. Recently, however, a paradigm shift can be witnessed with the appearance of service robots [2–5]. Service robots are intelligent artificial partners that interact with humans to promote social and intellectual benefits. In this sense, disciplines such as robotics, computer science, and design, among others, are today focusing on Human-Robot Interaction (HRI) [6]. This type of robot is mainly intended to interact with, accompany, and help humans while they perform their work and daily tasks [1]. In order to interact with humans, robots should have characteristics that make for an effective and fluent interaction. However, interacting with a robot is significantly different from interacting with another human, even when both share the same capacities and communication skills, whether verbal (e.g., speech) or nonverbal (e.g., facial expressions). It is known, through studies in Psychology and Sociology, that emotions play a fundamental role in the regulation of the individual's behavior in social contexts [7]. Emotions are what allow humans to initiate, maintain, and/or terminate a relationship or interaction with the environment and with other humans [8].
In the design field, the importance of emotions in the conception of objects and products, in addition to usability, is also recognized. In this context, Emotional Design [9] emerges, based on the premise that products and objects able to elicit emotions in the user (preferably positive ones) allow a greater and better human-product interaction.
Taking into account the fundamental role that emotions play in human interaction processes, it is believed that a service robot must combine artificial intelligence with emotions [10]. HRI is only effective if the robot is able to express emotions [11]. Robots must be able to recognize emotions expressed by humans and respond accordingly, including with empathic responses. In other words, this interaction should be as similar as possible to the interaction between humans. It must be taken into account, however, that even though robots can recognize the emotional states of humans and respond in accordance with them, this does not occur in the same way as in interaction between humans, because robots are unable to attribute cognitive meaning to an experience or situation as humans do [12].
However, humans do not interact based on emotions alone; they also interact through facial expressions, speech, and body movements, among other channels [13]. It is in this context that humanoid robots appear. The human-like bodies of humanoid robots enable humans to intuitively understand their gestures and cause people to unconsciously behave as if they were communicating with humans; that is, if a humanoid robot uses its body effectively, people will communicate naturally with it [14].
In this context, it is important to understand the essential features inherent to a successful HRI, thereby providing fundamental guidelines for the design of robots (i.e., service robots).
This paper presents a pilot study on the emotional design of a service robot to be used in healthcare contexts for children [15]. The main purpose was to define how the robot would express eight emotions (i.e., joy, trust, fear, surprise, sadness, disgust, anger, and anticipation) by using components such as facial expressions (e.g., eyes, mouth), body movements (e.g., head, arms), and displacement (e.g., backward and forward, turns, sideways). Recommendations regarding the design of this type of robot are practically non-existent; consequently, this study seeks to fill this important gap and contributes to this expanding area of research. The use and creation of a functional robot involve various costs, including time and money. However, these costs may be reduced through the use of Virtual Environments (VEs). Virtual Reality has important advantages as a research tool, such as the opportunity to conduct a myriad of evaluations without a fully functional physical prototype, the high replicability of the experiments, and the easy manipulation of the robot's design variables (e.g., facial expressions), with good internal, external, and ecological validity. Moreover, the use of VEs for the evaluation of user experience is still largely unexplored.
2 Method
The method is divided into three stages: Definition; Design; and Test, as described next.
2.1 Stage 1 - Definition
At the outset of the study, this stage was conducted to define the specific combination of components (i.e., facial expressions, body movements, and displacement) that would represent each emotion, from which the robot's expressions could be created.
Participants. Ten undergraduate Design students participated, aged between 19 and 21 (M = 19.40; SD = 0.66) years old, equally distributed by gender.
Stimuli and apparatus. Eight target emotions (i.e., joy, trust, fear, surprise, sadness, disgust, anger, and anticipation) were selected, drawn from Plutchik's theory [16]. The procedure took place in a photography studio of IADE's Media Lab, in which a video camera was used for data collection. A square area of 2 by 2 meters was marked on the studio's floor, inside which the participants were to perform. To better simulate the robot's behavior, two arms made of cardboard were attached to the participants' arms.
Procedure. Upon arrival, participants received a brief explanation about the purpose of the study and informed consent was obtained. After being fitted with the cardboard arms, they were given instructions to follow and were requested to perform each one of the emotions. It was stressed that they should mimic a robot; as such, their movements should be, to some extent, constrained; e.g., they could move their arms only up and down, walk forward and backward, move sideways, and rotate their head and body to the left and right, but they could not bend their body. All participants were requested to perform a training session. After they declared that they were able to perform the procedure as requested, the experimental session began. Following the training, participants were requested to perform the emotions, which were printed on a sheet of paper. The order of presentation of the emotions was randomized. After the participant had finished performing each emotion, the researcher asked him/her to clarify any ambiguous facial expressions or movements. The experimental session ended after all eight emotions were performed.
Results. Two independent researchers analyzed all the recorded videos in order to identify the characteristics that, according to the participants, would best express each emotion. Special attention was given to the facial expression (eyes and mouth) and to arm and body movement (up and down; forward or backward). The researchers, working individually, filled in a table identifying the features for each emotion. At the end of this process, the two sets of observations were compared and all discrepancies were resolved with the help of the research team. The results were implemented in the robot as shown in Fig. 1.
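To make this coding procedure concrete, the sketch below illustrates in Python how the two coders' feature tables could be compared and discrepancies flagged for discussion. The emotion names are those of the study, but the feature labels, variable names, and data structures are hypothetical and only illustrative of the process described above; they are not the analysis script actually used.

```python
# Illustrative sketch (not the study's actual analysis script): each coder
# fills a table mapping every emotion to the features observed in the videos.
from typing import Dict, Set

# Hypothetical annotations for two independent coders (feature labels assumed).
coder_a: Dict[str, Set[str]] = {
    "joy": {"eyes_wide", "mouth_smile", "arms_up", "move_forward"},
    "fear": {"eyes_narrow", "mouth_open", "arms_up", "move_backward"},
    # ... remaining emotions
}
coder_b: Dict[str, Set[str]] = {
    "joy": {"eyes_wide", "mouth_smile", "arms_up"},
    "fear": {"eyes_narrow", "mouth_open", "move_backward"},
    # ... remaining emotions
}

def discrepancies(a: Dict[str, Set[str]], b: Dict[str, Set[str]]) -> Dict[str, Set[str]]:
    """Return, per emotion, the features on which the two coders disagree."""
    report = {}
    for emotion in a.keys() & b.keys():
        diff = a[emotion] ^ b[emotion]  # symmetric difference of feature sets
        if diff:
            report[emotion] = diff
    return report

# Disagreements would then be resolved with the help of the research team.
print(discrepancies(coder_a, coder_b))
```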
2.2 Stage 2 - Design
This stage aimed to program the virtual robot prototype to express the 8 emotions, based on the features defined in the previous stage. For that, a Virtual Environment (VE) was created, which was contextually neutral and approximately 5 m long. The robot was placed inside the VE (see Fig. 2). Both the robot and the VE were modeled in Rhinoceros and then exported to Unity 3D.
For the Test stage, the scene was envisioned as follows: the robot started at the end of the corridor and approached the participant. At the beginning, the robot would move closer to the participants while remaining completely on the screen (i.e., approximately 2.3 m away from the participants). During this movement, the robot exhibited a Neutral emotion (i.e., arms rested, mouth shown as a single line, and eyes lit but without any glow), so that the participants became accustomed to the VE. Afterwards, the robot would perform the first emotion (each emotion was displayed for 10 s), and at the end of each one the participants had to match the emotion displayed with one in a list of 16 possibilities. The emotions displayed by the robot were changed by the researcher via a keypress. Between each presented emotion, the robot would return to its position in front of the participants and remain turned off (i.e., arms rested, eyes and mouth turned off) until the researcher activated the next emotion. So as not to create an order effect in the presentation of the emotions, two randomized sequences were created, in which the emotions were presented in the following order: Sequence 1 - Surprise, Sadness, Trust, Fear, Anticipation, Disgust, Anger, Joy; Sequence 2 - Sadness, Fear, Trust, Joy, Surprise, Anger, Anticipation, Disgust. At the end of this task, the virtual robot prototype was ready to be used in the Test stage.
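The actual prototype was implemented in Unity 3D; as an illustration only, the minimal Python sketch below captures the presentation logic described above (neutral approach, researcher-triggered emotions, 10 s display, off state between emotions, two fixed sequences). Names such as set_state and run_session are hypothetical placeholders for the engine calls, not part of the implementation used in the study.

```python
import time

SEQUENCE_1 = ["surprise", "sadness", "trust", "fear",
              "anticipation", "disgust", "anger", "joy"]
SEQUENCE_2 = ["sadness", "fear", "trust", "joy",
              "surprise", "anger", "anticipation", "disgust"]

DISPLAY_SECONDS = 10  # each emotion is shown for 10 s


def set_state(state: str) -> None:
    """Placeholder for the engine call that changes the robot's appearance
    (e.g., 'neutral', 'off', or one of the eight emotions)."""
    print(f"robot state -> {state}")


def run_session(sequence: list) -> None:
    # The robot approaches while neutral so the participant gets used to the VE.
    set_state("neutral")
    input("Press Enter when the robot has stopped ~2.3 m from the participant...")

    for emotion in sequence:
        # The researcher triggers each emotion with a keypress.
        input(f"Press Enter to display '{emotion}'...")
        set_state(emotion)
        time.sleep(DISPLAY_SECONDS)
        # Between emotions the robot returns to the off state while the
        # participant matches the displayed emotion against the response list.
        set_state("off")


if __name__ == "__main__":
    run_session(SEQUENCE_1)  # or SEQUENCE_2, counterbalanced across participants
```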
2.3 Stage 3 - Test
This stage aimed to test the extent to which the robot was able to display the intended emotions in a way that was correctly identified by the participants. To do so, participants were requested to watch the robot perform and to identify the emotion being expressed.
Participants. Thirteen participants volunteered for this experiment: 7 were female (53.8%) and 6 male (46.2%), aged between 19 and 37 (M = 24.6; SD = 5.02) years old.
Materials and design. The experiment was conducted using a 3D projection-based virtual reality system, with a resolution of 1280 × 720 pixels at 120 Hz. Participants sat 100 cm away from the wall screen and wore 3D shutter glasses. The experiment took place in a darkened room to prevent outside distractions. The stimuli were the 8 emotions expressed by the robot in the VE, as described before.
A questionnaire based on Lakatos et al. [17] was administered, divided into three parts: (1) before the interaction with the robot; (2) during the interaction; and (3) after the interaction. Part 1 begins with demographic questions (e.g., age, degree). Afterwards, a brief definition of service robots was given, and participants were asked whether they had had previous contact with a service robot (if yes, they had to indicate the context) and whether they owned a service robot. A Technological Attitude Scale, consisting of 9 statements, followed, aiming to understand the participant's relationship with technology. A five-point Likert scale (1 - I Strongly Disagree; 2 - I Disagree; 3 - Undecided; 4 - I Agree; 5 - I Strongly Agree) was employed. Then, to assess the participants' opinions about robots, the Perception Scale about Robots was applied, which is based on the Negative Attitudes Toward Robots Scale [18]. This questionnaire had 12 items and used the same scale as before. Part 2 was presented during the interaction with the robot and aimed to assess the participants' ability to identify the emotion being expressed by the robot. The participants had to select either one emotion from a list of sixteen emotions (the 8 emotions under study plus 8 distracting emotions - Anxiety, Irritation, Shame, Contempt, Guilt, Pleasure, Despair, Proud), or the option "none of the above emotions is correct". The participants did this matching for every emotion expressed. After the interaction with the robot, Part 3 of the questionnaire was presented. This part aimed to understand the participants' perception of the virtual prototype of the robot. In this sense, three questions were asked, using the same Likert scale mentioned above: 1. I would feel comfortable if I had to interact with this robot; 2. I would not like having this robot in my house; 3. I would not feel sorry if I had to destroy this robot. Finally, participants were also asked about the robot's perceived gender and about what function they thought the robot could perform.
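As an illustration of how the Part 2 responses could be recorded, the sketch below lists the 17 response options described above (the 8 target emotions, the 8 distractors, and the "none of the above" option) and stores one forced-choice answer per displayed emotion. The function name and data structure are hypothetical; this is not the questionnaire software used in the study.

```python
TARGET_EMOTIONS = ["joy", "trust", "fear", "surprise",
                   "sadness", "disgust", "anger", "anticipation"]
DISTRACTORS = ["anxiety", "irritation", "shame", "contempt",
               "guilt", "pleasure", "despair", "proud"]
NONE_OPTION = "none of the above emotions is correct"
OPTIONS = TARGET_EMOTIONS + DISTRACTORS + [NONE_OPTION]  # 17 options in total


def record_response(displayed: str, chosen: str, answers: dict) -> None:
    """Store one forced-choice answer, validating it against the option list."""
    if chosen not in OPTIONS:
        raise ValueError(f"'{chosen}' is not one of the {len(OPTIONS)} options")
    answers[displayed] = chosen


# Example: one participant's answers for two displayed emotions.
participant = {}
record_response("joy", "pleasure", participant)
record_response("anger", "irritation", participant)
print(participant)
```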
Procedure. Participants performed the experiment at the IADE-UX.Lab. Upon arrival, participants read the informed consent form, containing brief information about the objectives of the study, and were warned about the possibility of slight negative effects due to the use of 3D glasses (e.g., nausea). Once informed consent was acquired, participants answered Part 1 of the questionnaire. They were then asked to watch the robot's performance, while wearing the 3D glasses, and to reply to Part 2 of the questionnaire. After they did so for the 8 emotions, they responded to Part 3 of the questionnaire. The total duration of the procedure was about 15 min. At the end of the experiment, participants were thanked, debriefed, and dismissed.
3 Results and Discussion
The small number of participants did not allow quantitative analyses of the data; therefore, a qualitative analysis is presented.
With regard to Part 1 of the questionnaire, it was found that only 3 participants had been in contact with a service robot in the past. Regarding the Technological Attitude Scale, it is of note that, on average, participants considered their technological knowledge to be good and enjoyed exploring new technological devices. Moreover, participants revealed that they are afraid that robots might be used for bad purposes and that, in the future, robots might dominate society. It is also of note that the participants would like to have a service robot in their homes and think that service robots are useful. These results suggest that participants have a positive attitude towards technology and service robots.
The most important data from this pilot study relate to the ability to correctly recognize the emotion being expressed by the robot (Part 2 of the questionnaire). Table 1 shows the associations between the displayed emotions and the listed emotions.
Results revealed that participants had some difficulty in recognizing the emotions expressed by the robot, since the emotions were correctly recognized in only 33% of the cases. However, looking at the recognition of each emotion separately, two of the emotions were correctly recognized by more than 50% of the participants: joy (56%) and surprise (54%). Anticipation and sadness were also recognized correctly in 46% of the cases. On the other hand, the remaining emotions attained a low percentage of correct recognition: fear (23%), trust (15%), disgust (8%), and anger (0%). It should be noted that the participants could choose from 16 possible emotions (8 target emotions and 8 distracting emotions).
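The per-emotion and overall recognition rates reported above correspond to simple proportions of correct matches. The sketch below shows, with a small hypothetical response table (the real data are in Table 1), how such rates could be computed; the variable names are illustrative only.

```python
from collections import defaultdict

# Hypothetical responses: (displayed emotion, emotion chosen by a participant).
responses = [
    ("joy", "joy"), ("joy", "pleasure"),
    ("anger", "irritation"), ("anger", "anxiety"),
    ("surprise", "surprise"), ("trust", "joy"),
    # ... one entry per participant and displayed emotion
]

correct = defaultdict(int)
total = defaultdict(int)
for displayed, chosen in responses:
    total[displayed] += 1
    if chosen == displayed:
        correct[displayed] += 1

# Per-emotion recognition rate (proportion of participants who matched correctly).
for emotion in total:
    print(f"{emotion}: {correct[emotion] / total[emotion]:.0%}")

# Overall recognition rate across all displayed emotions.
overall = sum(correct.values()) / sum(total.values())
print(f"overall: {overall:.0%}")
```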
The trust emotion was wrongly identified as joy by 46% of the participants. One possible explanation for the poorly identified emotions may be the limitation of the robot's mouth design; e.g., displaying a smile can easily be confused with a state of contentment/happiness. Furthermore, it should be highlighted that trust is not considered a basic emotion by many authors (e.g., [19]), which may explain the difficulty in recognizing it. Disgust was identified as anxiety, shame, or fear in 46% of the cases, or selected as not corresponding to any emotion in the list (31%). This may be because, for this emotion, the robot raises one arm to face level, and this movement can be interpreted as hiding the face as a sign of shame or fear. Moreover, a large number of the participants were not able to identify the disgust emotion, which demonstrates the need to reprogram it. With regard to anger, it was confused with anxiety or irritation 61% of the time. According to the participants' feedback, this is because the robot's arm movements (up and down, relatively quickly) give the idea of anxiety and impatience. Furthermore, participants showed some difficulty in distinguishing anger from irritation; this could be because, in the Portuguese language, they are very similar concepts. However, despite the low level of correct recognition, there is a tendency for negative emotions to be recognized as negative (e.g., anger-irritation) and positive emotions as positive (e.g., joy-pleasure); i.e., there is congruence in the recognition of the emotions' valence.
In Part 3 of the questionnaire, participants answered a few questions about the robot. Participants revealed that they would feel comfortable if they had to interact with this robot, that they would like to have this robot in their homes, and that they would feel sorry if they had to destroy it. These results suggest that the robot creates some degree of empathy with the participants. With regard to the robot's gender, 10 participants said that the robot does not have a defined gender, and the remaining 3 said that it is male. Regarding the function of the robot, participants answered that it could be a robot to help with housework and a companion robot for people with disabilities, elderly people, and children.
4 Conclusion
The main objective of this study was to examine the recognition of emotions expressed by a service robot in a Virtual Environment (VE). Thus, in the Definition stage, the set of components that the robot should perform to express each emotion was defined. The Design stage aimed to program the virtual robot prototype in a VE, based on the components defined in the Definition stage. Finally, in the Test stage, the goal was to test the virtual robot prototype, in order to understand whether the emotions were correctly identified and recognized by the participants.
The results showed that the participants had some trouble correctly recognizing some of the defined emotions (i.e., fear, disgust, trust, and anger). However, through the feedback given by the participants with respect to the difficulties experienced during the test, it was possible to identify some problems in the presentation of those emotions. For future work, it is intended to implement changes in the expression of these emotions, making them easier to recognize, by manipulating/changing existing components (i.e., eyes, mouth) and/or adding different ones (i.e., eyebrows).
References
Siegel, M., Breazeal, C., Norton, M.: Persuasive robotics: the influence of robot gender on human behavior. In: IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2563–2568. IEEE Press, New York (2009)
Goris, K., Saldien, J., Lefeber, D.: Probo: a testbed for human robot interaction. In: ACM/IEEE International Conference on Human Robot Interaction, pp. 253–254. IEEE Press, New York (2009)
Graaf, M., Allouch, S.: Exploring influencing variables for the acceptance of social robots. Robot. Auton. Syst. 61, 1476–1486 (2013)
Hollinger, G.A., Georgiev, Y., Manfredi, A., Maxwell, B.A., Pezzementi, Z.A., Mitchell, B.: Design of a social mobile robot using emotion-based decision mechanisms. In: IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 3093–3098. IEEE Press, New York (2006)
Oh, K., Kim, M.: Social attributes of robotic products: observations of child-robot interactions in a school environment. Int. J. Des. 4, 45–55 (2010)
Breazeal, C.: Designing Sociable Robots. MIT Press, Cambridge (2002)
Plutchik, R.: The Emotions. University Press of America, Lanham (1991)
Frijda, N.: The Emotions. Cambridge University Press, Cambridge (1986)
Norman, D.: Emotional Design: Why we Love (or Hate) Everyday Things. Basic Books, New York (2004)
Picard, R.: Affective computing. Technical report, MIT Media Laboratory - Perceptual Computing Section, vol. 321, pp. 1–16 (1997)
Picard, R.: Towards computers that recognize and respond to user emotion. IBM Syst. J. 39, 705–719 (2000)
Blow, M., Dautenhahn, K., Appleby, A., Nehaniv, C., Lee, D.: Perception of robot smiles and dimensions for human-robot interaction design. In: 15th IEEE International Symposium on Robot and Human Interactive Communication, pp. 469–474. IEEE Press, New York (2006)
Bartneck, C.: How convincing is Mr Data’s Smile: affective expressions of machines. User Model. User Adap. Interact. 11, 279–295 (2001)
Kanda, T., Iwase, K., Shiomi, M., Ishiguro, H.: Moderating user’s tension to enable them to exhibit other emotions. In: Kanda, T., Ishiguro, H. (eds.) Human-Robot Interaction in Social Robots, pp. 299–311. CRC Press Taylor & Francis, Florida (2005)
MOnarCH Project. http://monarch-fp7.eu/
Plutchik, R.: Emotion: A Psychoevolutionary Synthesis. Harper & Row, New York (1980)
Lakatos, G., Gácsi, M., Konok, V., Brúder, I., Bereczky, B., Korondi, P., Miklosi, A.: Emotion attribution to a non-humanoid robot in different social situations. PLoS ONE 9(12), e114207 (2014)
Nomura, T., Suzuki, T., Kanda, T., Kato, K.: Altered attitudes of people toward robots: investigation through the Negative Attitudes toward Robots Scale. In: AAAI-06 Workshop on Human Implications of Human-Robot Interaction, pp. 29–35. AAAI Press, California (2006)
Ekman, P., Friesen, W., Ellsworth, P.: Emotion in the Human Face. Cambridge University Press, Cambridge (1982)
Acknowledgements
The authors would like to thank Isabel Ferreira and João Sequeira, from the MOnarCH Project for the encouragement and the opportunity to use the virtual prototype of the robot.