DOI: 10.1145/3552465.3555038
Research article · Open access

A More Objective Quantification of Micro-Expression Intensity through Facial Electromyography

Published: 10 October 2022

Abstract

Micro-expressions are brief facial expressions that individuals reveal when trying to hide their genuine emotions. They have potential applications in areas such as lie detection and national security. Micro-expressions are generally believed to have three essential characteristics: short duration, low intensity, and local asymmetry. Most previous studies have investigated micro-expressions through the lens of short duration; to our knowledge, no empirical study has examined the low-intensity characteristic. In this paper, we use facial electromyography (EMG) for the first time to study the low intensity of micro-expressions. In our experiment, micro-expressions were elicited from subjects using the second-generation micro-expression elicitation paradigm while their facial EMG was recorded simultaneously. We collected and annotated 33 macro-expressions and 48 micro-expressions. Comparing two EMG indicators, (1) the apex value as a percentage of maximum voluntary contraction (MVC%) and (2) the area under the EMG signal curve (integrated EMG, iEMG), we found that both the MVC% and the iEMG of micro-expressions were significantly smaller than those of macro-expressions. This result demonstrates that the intensity of micro-expressions is significantly lower than that of macro-expressions.
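The two intensity indicators named in the abstract can be sketched in a few lines. This is a minimal illustration, not the authors' code: the sampling rate, the synthetic signal shapes, and the MVC reference amplitude below are all illustrative assumptions, and the iEMG is approximated by a simple Riemann sum over the rectified signal.

```python
import numpy as np

def mvc_percent(emg, mvc_amplitude):
    """Apex (peak) of the rectified EMG as a percentage of the
    maximum-voluntary-contraction (MVC) reference amplitude."""
    apex = np.max(np.abs(emg))
    return 100.0 * apex / mvc_amplitude

def integrated_emg(emg, fs):
    """Integrated EMG (iEMG): area under the rectified signal curve,
    approximated as a Riemann sum (units: amplitude x seconds)."""
    return np.sum(np.abs(emg)) / fs

# Illustrative signals: a weak, micro-expression-like burst vs. a
# stronger, macro-expression-like one (amplitudes are assumptions).
fs = 1000  # Hz, assumed sampling rate
t = np.arange(0, 0.5, 1.0 / fs)
micro_like = 0.05 * np.sin(2 * np.pi * 10 * t)  # low-intensity activation
macro_like = 0.40 * np.sin(2 * np.pi * 10 * t)  # higher-intensity activation
mvc = 1.0  # assumed MVC reference amplitude

print(mvc_percent(micro_like, mvc))  # smaller apex for the weak burst
print(integrated_emg(micro_like, fs) < integrated_emg(macro_like, fs))
```

Under this sketch, a lower-intensity burst yields both a smaller MVC% and a smaller iEMG, which is the direction of the difference the paper reports between micro- and macro-expressions.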


Cited By

  • (2024) "Acquisition and Analysis of Facial Electromyographic Signals for Emotion Recognition." Sensors 24(15):4785. DOI: 10.3390/s24154785. Online publication date: 24 Jul 2024.
  • (2024) "Demystifying Mental Health by Decoding Facial Action Unit Sequences." Big Data and Cognitive Computing 8(7):78. DOI: 10.3390/bdcc8070078. Online publication date: 9 Jul 2024.
  • (2024) "Attention-guided three-stream convolutional neural network for microexpression recognition." Journal of Image and Graphics 29(1):111-122. DOI: 10.11834/jig.230053. Online publication date: 2024.
  • (2023) "Data Leakage and Evaluation Issues in Micro-Expression Analysis." IEEE Transactions on Affective Computing 15(1):186-197. DOI: 10.1109/TAFFC.2023.3265063. Online publication date: 6 Apr 2023.


Published In

    FME '22: Proceedings of the 2nd Workshop on Facial Micro-Expression: Advanced Techniques for Multi-Modal Facial Expression Analysis
    October 2022
    23 pages
    ISBN:9781450394956
    DOI:10.1145/3552465
    This work is licensed under a Creative Commons Attribution International 4.0 License.

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. facial electromyography
    2. micro-expression characteristic
    3. micro-expression intensity


Conference

MM '22

Acceptance Rates

FME '22 Paper Acceptance Rate: 2 of 5 submissions, 40%
Overall Acceptance Rate: 2 of 5 submissions, 40%

Article Metrics

• Downloads (last 12 months): 206
• Downloads (last 6 weeks): 33
Reflects downloads up to 11 Dec 2024
