• Gervasi R, Barravecchia F, Mastrogiacomo L and Franceschini F. (2022). Applications of affective computing in human-robot interaction: State-of-art and challenges for manufacturing. Proceedings of the Institution of Mechanical Engineers, Part B: Journal of Engineering Manufacture. 10.1177/09544054221121888. 237:6-7. (815-832). Online publication date: 1-May-2023.

    http://journals.sagepub.com/doi/10.1177/09544054221121888

  • Chin H and Yi M. (2021). Voices that Care Differently: Understanding the Effectiveness of a Conversational Agent with an Alternative Empathy Orientation and Emotional Expressivity in Mitigating Verbal Abuse. International Journal of Human–Computer Interaction. 10.1080/10447318.2021.1987680. 38:12. (1153-1167). Online publication date: 21-Jul-2022.

    https://www.tandfonline.com/doi/full/10.1080/10447318.2021.1987680

  • Penčić M, Čavić M, Oros D, Vrgović P, Babković K, Orošnjak M and Čavić D. (2022). Anthropomorphic Robotic Eyes: Structural Design and Non-Verbal Communication Effectiveness. Sensors. 10.3390/s22083060. 22:8. (3060).

    https://www.mdpi.com/1424-8220/22/8/3060

  • Wairagkar M, Lima M, Bazo D, Craig R, Weissbart H, Etoundi A, Reichenbach T, Iyengar P, Vaswani S, James C, Barnaghi P, Melhuish C and Vaidyanathan R. Emotive Response to a Hybrid-Face Robot and Translation to Consumer Social Robots. IEEE Internet of Things Journal. 10.1109/JIOT.2021.3097592. 9:5. (3174-3188).

    https://ieeexplore.ieee.org/document/9486848/

  • McColl D, Hong A, Hatakeyama N, Nejat G and Benhabib B. (2016). A Survey of Autonomous Human Affect Detection Methods for Social Robots Engaged in Natural HRI. Journal of Intelligent and Robotic Systems. 82:1. (101-133). Online publication date: 1-Apr-2016.

    https://doi.org/10.1007/s10846-015-0259-2

  • Supratak A, Wu C, Dong H, Sun K and Guo Y. (2016). Survey on Feature Extraction and Applications of Biosignals. Machine Learning for Health Informatics. 10.1007/978-3-319-50478-0_8. (161-182).

    http://link.springer.com/10.1007/978-3-319-50478-0_8

  • Fitter N and Kuchenbecker K. (2016). Designing and Assessing Expressive Open-Source Faces for the Baxter Robot. Social Robotics. 10.1007/978-3-319-47437-3_33. (340-350).

    http://link.springer.com/10.1007/978-3-319-47437-3_33

  • Mäkäräinen M, Kätsyri J and Takala T. (2014). Exaggerating Facial Expressions: A Way to Intensify Emotion or a Way to the Uncanny Valley?. Cognitive Computation. 10.1007/s12559-014-9273-0. 6:4. (708-721). Online publication date: 1-Dec-2014.

    http://link.springer.com/10.1007/s12559-014-9273-0

  • Bennett C, Sabanovic S, Fraune M and Shaw K. (2014). Context congruency and robotic facial expressions: Do effects on human perceptions vary across culture? 2014 RO-MAN: The 23rd IEEE International Symposium on Robot and Human Interactive Communication. 10.1109/ROMAN.2014.6926296. 978-1-4799-6765-0. (465-470).

    http://ieeexplore.ieee.org/document/6926296/

  • Lawrence L and Abdel Nabi D. (2013). The Compilation and Validation of a Collection of Emotional Expression Images Communicated by Synthetic and Human Faces. International Journal of Synthetic Emotions. 4:2. (34-62). Online publication date: 1-Jul-2013.

    https://doi.org/10.4018/ijse.2013070104

  • Bazo D, Vaidyanathan R, Lentz A and Melhuish C. (2010). Design and testing of a hybrid expressive face for a humanoid robot. 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2010). 10.1109/IROS.2010.5651469. 978-1-4244-6674-0. (5317-5322).

    http://ieeexplore.ieee.org/document/5651469/

  • Nguyen H and Masthoff J. Designing empathic computers. Proceedings of the 4th International Conference on Persuasive Technology. (1-9).

    https://doi.org/10.1145/1541948.1541958

  • Khanum A, Mufti M, Javed M and Shafiq M. (2009). Fuzzy case-based reasoning for facial expression recognition. Fuzzy Sets and Systems. 160:2. (231-250). Online publication date: 15-Jan-2009.

    https://doi.org/10.1016/j.fss.2008.05.022

  • Fitrianie S and Rothkrantz L. The Generation of Emotional Expressions for a Text-Based Dialogue Agent. Proceedings of the 11th international conference on Text, Speech and Dialogue. (569-576).

    https://doi.org/10.1007/978-3-540-87391-4_72

  • Gong L. (2007). Is happy better than sad even if they are both non-adaptive? Effects of emotional expressions of talking-head interface agents. International Journal of Human-Computer Studies. 65:3. (183-191). Online publication date: 1-Mar-2007.

    https://doi.org/10.1016/j.ijhcs.2006.09.005

  • Hone K. (2006). Empathic agents to reduce user frustration. Interacting with Computers. 18:2. (227-245). Online publication date: 1-Mar-2006.

    https://doi.org/10.1016/j.intcom.2005.05.003

  • Garau M. Selective Fidelity: Investigating Priorities for the Creation of Expressive Avatars. Avatars at Work and Play. 10.1007/1-4020-3898-4_2. (17-38).

    http://link.springer.com/10.1007/1-4020-3898-4_2

  • Kapur A, Kapur A, Virji-Babul N, Tzanetakis G and Driessen P. Gesture-Based affective computing on motion capture data. Proceedings of the First international conference on Affective Computing and Intelligent Interaction. (1-7).

    https://doi.org/10.1007/11573548_1

  • Bartneck C and Reichenbach J. (2005). Subtle emotional expressions of synthetic characters. International Journal of Human-Computer Studies. 62:2. (179-192). Online publication date: 1-Feb-2005.

    https://doi.org/10.1016/j.ijhcs.2004.11.006

  • Takahashi T, Bartneck C, Katagiri Y and Arai N. (2005). TelMeA—Expressive avatars in asynchronous communications. International Journal of Human-Computer Studies. 10.1016/j.ijhcs.2004.11.005. 62:2. (193-209). Online publication date: 1-Feb-2005.

    https://linkinghub.elsevier.com/retrieve/pii/S1071581904001247

  • Brave S, Nass C and Hutchinson K. (2005). Computers that care. International Journal of Human-Computer Studies. 62:2. (161-178). Online publication date: 1-Feb-2005.

    https://doi.org/10.1016/j.ijhcs.2004.11.002

  • Schiano D, Ehrlich S and Sheridan K. Categorical imperative NOT. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. (49-56).

    https://doi.org/10.1145/985692.985699

  • Brave S and Nass C. Emotion in human-computer interaction. The human-computer interaction handbook. (81-96).

    https://dl.acm.org/doi/10.5555/772072.772081

  • Bartneck C. (2001). How Convincing is Mr. Data's Smile. User Modeling and User-Adapted Interaction. 11:4. (279-295). Online publication date: 1-Nov-2001.

    https://doi.org/10.1023/A:1011811315582

  • Schiano D, Ehrlich S and Sheridan K. Categorical perception of facial affect. CHI '01 Extended Abstracts on Human Factors in Computing Systems. (299-300).

    https://doi.org/10.1145/634067.634244

  • Schiano D, Ehrlich S, Rahardja K and Sheridan K. (2000). Measuring and modeling facial affect. Behavior Research Methods, Instruments, & Computers. 10.3758/BF03200822. 32:4. (505-514). Online publication date: 1-Dec-2000.

    http://link.springer.com/10.3758/BF03200822

  • Ehrlich S, Schiano D and Sheridan K. Communicating facial affect. CHI '00 Extended Abstracts on Human Factors in Computing Systems. (251-252).

    https://doi.org/10.1145/633292.633439

  • Johnson B, Prabhakar A and Hindus D. Interval Research Corporation. CHI '00 Extended Abstracts on Human Factors in Computing Systems. (231-232).

    https://doi.org/10.1145/633292.633427