Human emotion recognition using deep belief network architecture

Published: 01 November 2019

Highlights

A Fine Gaussian SVM and DBN architecture is proposed for recognizing emotions.
A feature-fusion vector is used to classify emotions.
A significant increase in emotion recognition accuracy was identified.

Abstract

Recently, deep learning methodologies have become popular for analysing physiological signals across multiple modalities via hierarchical architectures for human emotion recognition. Most state-of-the-art human emotion recognition systems use deep learning for emotion classification; however, deep learning is most effective for deep feature extraction. Therefore, in this research, we applied an unsupervised deep belief network (DBN) for depth-level feature extraction from fused observations of Electro-Dermal Activity (EDA), Photoplethysmogram (PPG) and Zygomaticus Electromyography (zEMG) sensor signals. The DBN-produced features are then combined with statistical features of EDA, PPG and zEMG to prepare a feature-fusion vector. The prepared feature vector is used to classify five basic emotions, namely Happy, Relaxed, Disgust, Sad and Neutral. As the emotion classes are not linearly separable in the feature-fusion vector, a Fine Gaussian Support Vector Machine (FGSVM) with a radial basis function kernel is used for non-linear classification of human emotions. Our experiments on a public multimodal physiological signal dataset show that the DBN- and FGSVM-based model significantly increases emotion recognition accuracy compared with existing state-of-the-art emotion classification techniques.
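The pipeline the abstract describes (unsupervised DBN features, fused with per-channel statistical features, classified by an RBF-kernel SVM) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: it uses synthetic stand-in data for the EDA/PPG/zEMG windows, stacks two scikit-learn `BernoulliRBM` layers as a DBN-style feature extractor, and uses a small statistical feature set; the paper's actual features, layer sizes and hyperparameters are not given here and will differ.

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC

def statistical_features(window):
    # Simple per-channel descriptors (mean, std, min, max);
    # the paper's exact statistical feature set may differ.
    return np.concatenate([window.mean(axis=0), window.std(axis=0),
                           window.min(axis=0), window.max(axis=0)])

# Synthetic stand-in for fused sensor windows:
# (n_samples, n_timesteps, 3 channels for EDA / PPG / zEMG).
rng = np.random.default_rng(0)
X_raw = rng.normal(size=(100, 64, 3))
y = rng.integers(0, 5, size=100)  # 5 classes: Happy, Relaxed, Disgust, Sad, Neutral

# RBMs expect inputs scaled to [0, 1].
X_flat = X_raw.reshape(len(X_raw), -1)
X01 = MinMaxScaler().fit_transform(X_flat)

# Two stacked RBMs trained greedily, layer by layer, as an
# unsupervised DBN-style deep feature extractor.
rbm1 = BernoulliRBM(n_components=64, n_iter=10, random_state=0).fit(X01)
h1 = rbm1.transform(X01)
rbm2 = BernoulliRBM(n_components=32, n_iter=10, random_state=0).fit(h1)
deep_feats = rbm2.transform(h1)

# Feature-fusion vector: deep features + statistical features.
stat_feats = np.array([statistical_features(w) for w in X_raw])
fused = np.hstack([deep_feats, stat_feats])

# "Fine Gaussian SVM" here is approximated by an RBF-kernel SVC.
clf = SVC(kernel="rbf", gamma="scale").fit(fused, y)
pred = clf.predict(fused[:5])
```

The greedy layer-wise RBM training mirrors how a DBN is pretrained without labels, which matches the abstract's point that deep learning is used here for feature extraction rather than as the classifier itself.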


Published In

Information Fusion, Volume 51, Issue C, November 2019, 287 pages

Publisher

Elsevier Science Publishers B.V., Netherlands


Author Tags

1. Emotion recognition
2. Physiological signals
3. Fusion model
4. Deep belief network
5. Fine Gaussian support vector machine

Qualifiers

• Research-article