DOI: 10.1145/3293663.3293671 · AIVR Conference Proceedings · Research article

The Indonesian Mixed Emotion Dataset (IMED): A Facial Expression Dataset for Mixed Emotion Recognition

Published: 23 November 2018

Abstract

In Social Signal Processing (SSP) and affective computing, publicly available facial expression datasets for emotion recognition are still largely limited to basic emotion categories, whereas in everyday life humans express far more than the basic emotions, including mixed emotions. To enrich the diversity of existing datasets, we developed the Indonesian Mixed Emotion Dataset (IMED). The objective of this dataset is to provide annotated data for mixed emotion recognition that can serve as a ground truth for benchmarking. Mixed emotions are constructed by combining basic emotion categories to produce new ones, and the dataset is designed to facilitate mixed emotion recognition experiments. IMED contains 19 emotion categories performed by 15 Indonesian subjects of various ethnicities: Javanese, Sundanese, Malay, Bataknese, Minang, and Manadonese. The subjects, 60% female and 40% male and aged 17 to 32, demonstrated both basic and mixed emotion classes in videos. We then used a computational model to show that the mixed emotion categories are discriminable by machine classifiers. We believe the IMED dataset will be useful for researchers in this field to test novel methods.



    Published In

    AIVR 2018: Proceedings of the 2018 International Conference on Artificial Intelligence and Virtual Reality
    November 2018
    144 pages
    ISBN:9781450366410
    DOI:10.1145/3293663

    In-Cooperation

    • University of Tsukuba

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. Affective computing
    2. Emotion recognition
    3. Facial expression dataset
    4. Mixed emotion
    5. Social signal processing

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Conference

    AIVR 2018


    Article Metrics

    • Downloads (last 12 months): 25
    • Downloads (last 6 weeks): 5

    Reflects downloads up to 20 Dec 2024


    Cited By

    • (2024) Implementation of Dynamic Image for Facial Expression Recognition on Indonesian Facial Expression Dataset. 2024 IEEE International Conference on Artificial Intelligence in Engineering and Technology (IICAIET), pp. 7--12. DOI: 10.1109/IICAIET62352.2024.10730516. Online publication date: 26-Aug-2024.
    • (2023) Implementing Vision Transformer to Model Emotions Recognition from Facial Expressions. 2023 4th International Conference on Artificial Intelligence and Data Sciences (AiDAS), pp. 48--53. DOI: 10.1109/AiDAS60501.2023.10284712. Online publication date: 6-Sep-2023.
    • (2022) Conditional Generative Adversarial Networks with Total Variation and Color Correction for Generating Indonesian Face Photo from Sketch. Applied Sciences, 12(19), 10006. DOI: 10.3390/app121910006. Online publication date: 5-Oct-2022.
    • (2022) The Development of Facial Expressions Dataset for Teaching Context: Preliminary Research. 2022 5th International Conference of Computer and Informatics Engineering (IC2IE), pp. 263--269. DOI: 10.1109/IC2IE56416.2022.9970043. Online publication date: 13-Sep-2022.
    • (2022) Emotional Meta-annotation of Educational Videos. A Review of Emotional Database Characterization. Applications and Usability of Interactive TV, pp. 53--69. DOI: 10.1007/978-3-031-22210-8_4. Online publication date: 17-Dec-2022.
    • (2021) Separable convolutional neural networks for facial expressions recognition. Journal of Big Data, 8(1). DOI: 10.1186/s40537-021-00522-x. Online publication date: 16-Oct-2021.
    • (2021) Facial Emotion Generation using StarGAN with Differentiable Augmentation. Proceedings of the 2021 4th International Conference on Signal Processing and Machine Learning, pp. 66--71. DOI: 10.1145/3483207.3483218. Online publication date: 18-Aug-2021.
    • (2021) Facial Expression Classification for User Experience Testing Using K-Nearest Neighbor. 2021 IEEE International Conference on Industry 4.0, Artificial Intelligence, and Communications Technology (IAICT), pp. 63--68. DOI: 10.1109/IAICT52856.2021.9532535. Online publication date: 27-Jul-2021.
    • (2020) A Systematic Literature Review of Recognition of Compound Facial Expression of Emotions. Proceedings of the 2020 4th International Conference on Video and Image Processing, pp. 116--121. DOI: 10.1145/3447450.3447469. Online publication date: 25-Dec-2020.
    • (2020) Facial Expression Recognition on Video Data with Various Face Poses Using Deep Learning. 2020 12th International Conference on Information Technology and Electrical Engineering (ICITEE), pp. 362--367. DOI: 10.1109/ICITEE49829.2020.9271740. Online publication date: 6-Oct-2020.
