Challenges of Emotion Detection Using Facial Expressions and Emotion Visualisation in Remote Communication

Research article · Published: 24 September 2021 · DOI: 10.1145/3460418.3479341

Abstract

Emotion detection research has gained significant attention across scientific disciplines, including Human-Computer Interaction and Ubiquitous Computing. In this work, we provide an overview of the challenges researchers and designers face when building and deploying emotion detection systems. We discuss these challenges in the context of an online video-conferencing tool we designed to detect emotions from facial expressions. Finally, we propose ideas for better designing emotion inference systems and for providing visual feedback on emotions.




Published In

UbiComp/ISWC '21 Adjunct: Adjunct Proceedings of the 2021 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2021 ACM International Symposium on Wearable Computers. September 2021, 711 pages. ISBN: 9781450384612. DOI: 10.1145/3460418

Publisher

Association for Computing Machinery, New York, NY, United States

Author Tags

  1. Affective Computing
  2. Emotion Detection
  3. Face Detection
  4. Remote Chat
  5. Video Conferencing Tool

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

UbiComp '21

Acceptance Rates

Overall acceptance rate: 764 of 2,912 submissions (26%)

Article Metrics

  • Downloads (last 12 months): 101
  • Downloads (last 6 weeks): 14

Reflects downloads up to 11 Dec 2024

Cited By

  • Towards Universal Visualisation of Emotional States for Information Systems. Proceedings of the 32nd International Conference on Information Systems Development (2024). DOI: 10.62036/ISD.2024.117
  • The Effects of Font-Based Emotional Expression Messenger on User Interaction in Text-Based Communication. Archives of Design Research 37, 4 (2024), 315–334. DOI: 10.15187/adr.2024.08.37.4.315
  • EmoWear: Exploring Emotional Teasers for Voice Message Interaction on Smartwatches. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems (2024), 1–16. DOI: 10.1145/3613904.3642101
  • Decoding Group Emotional Dynamics in a Web-Based Collaborative Environment: A Novel Framework Utilizing Multi-Person Facial Expression Recognition. International Journal of Human–Computer Interaction (2024), 1–19. DOI: 10.1080/10447318.2024.2338614
  • Challenges of Facial Expression Recognition and Recommendations for the Use of Emotion AI in Video Conferences. Human-Computer Interaction (2024), 272–290. DOI: 10.1007/978-3-031-60405-8_18
  • Development of a Universal Validation Protocol and an Open-Source Database for Multi-Contextual Facial Expression Recognition. Sensors 23, 20 (2023), 8376. DOI: 10.3390/s23208376
  • EmoFlow: Visualizing Emotional Changes in Video Chat - Preliminary Study. Proceedings of the 25th International Conference on Mobile Human-Computer Interaction (2023), 1–7. DOI: 10.1145/3565066.3608702
  • Enhancing Facial Emotion Detection with CNN: Exploring the Impact of Hyperparameters. 2023 IEEE Region 10 Symposium (TENSYMP) (2023), 1–6. DOI: 10.1109/TENSYMP55890.2023.10223480
  • Facial Emotion Recognition in-the-Wild Using Deep Neural Networks: A Comprehensive Review. SN Computer Science 5, 1 (2023). DOI: 10.1007/s42979-023-02423-7
  • Emotion recognition in talking-face videos using persistent entropy and neural networks. Electronic Research Archive 30, 2 (2022), 644–660. DOI: 10.3934/era.2022034
