EEG-RegNet: Regressive Emotion Recognition in Continuous VAD Space Using EEG Signals
Figure 1. Overview of EEG-RegNet's emotion recognition process, including preprocessing, spatiotemporal feature extraction, and regressive emotion recognition.
Figure 2. Detailed framework of EEG-RegNet, illustrating the internal architecture for spatiotemporal feature extraction and the two-stage regressive emotion recognition process.
Figure 3. Emotional spaces for representing emotions: (a) Russell's valence–arousal space; (b) Mehrabian's valence–arousal–dominance space.
Figure 4. Confusion matrix of EEG-RegNet predictions for nonary emotion classification across valence, arousal, and dominance on the DEAP dataset.
Figure 5. Impact of the hybrid loss balance: (a) average RMSE for emotional score prediction; (b) average error rates for fine-grained nonary classification across valence, arousal, and dominance as τ varies.
Figure 6. Probability distributions for emotional score prediction across valence, arousal, and dominance on the DEAP dataset: (a) baseline predictions using only MSE loss; (b) predictions generated by EEG-RegNet with the hybrid loss.
Abstract
1. Introduction
- We introduce EEG-RegNet, a novel deep learning model for emotion recognition from EEG signals. By combining 2D CNNs for spatial feature extraction with a 1D CNN for temporal dynamics, EEG-RegNet captures both the spatial and temporal relationships in EEG data, improving the representation of emotional states in the VAD space.
- We propose a hybrid loss function that integrates CE loss with MSE loss, addressing the tendency of MSE to bias predictions toward neutral states and improving discriminability. In addition, a Bernoulli penalty sharpens the confidence of the probability estimates, further improving performance in both continuous score prediction and fine-grained emotion classification (a minimal sketch of this loss follows the list below).
- Experimental results demonstrate that EEG-RegNet achieves state-of-the-art performance on the DEAP dataset, outperforming existing approaches in predicting continuous emotional scores and classifying fine-grained emotional categories.
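Since the loss is the core contribution, a minimal PyTorch sketch may help fix ideas. It combines CE over nine discrete score bins, MSE on the expectation of the bin probabilities, and a Bernoulli-style confidence penalty; the bin discretization, the exact penalty form, and the weights `tau` and `lam` are illustrative assumptions, not the authors' published code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HybridLoss(nn.Module):
    """Sketch of a hybrid loss: CE over 9 score bins + MSE on the
    expected score + a Bernoulli-style confidence penalty. The penalty
    form and the weights tau/lam are assumptions for illustration."""
    def __init__(self, tau: float = 0.5, lam: float = 0.1):
        super().__init__()
        self.tau = tau          # MSE/CE balance (cf. Section 4.4)
        self.lam = lam          # weight of the Bernoulli penalty (assumed)
        # Bin centers for the nine SAM scores 1..9 on each VAD axis.
        self.register_buffer("bins", torch.arange(1.0, 10.0))

    def forward(self, logits: torch.Tensor, target_score: torch.Tensor):
        # logits: (B, 9) per-axis class scores; target_score: (B,) in [1, 9].
        probs = F.softmax(logits, dim=-1)
        # Continuous prediction as the expectation over bin centers.
        expected = (probs * self.bins).sum(dim=-1)
        mse = F.mse_loss(expected, target_score)
        # CE against the nearest integer bin (assumed discretization).
        target_bin = (target_score.round().clamp(1, 9) - 1).long()
        ce = F.cross_entropy(logits, target_bin)
        # Bernoulli penalty: push the top probability toward 1 via
        # -[p log p + (1-p) log(1-p)] on the max bin (assumed form).
        p = probs.max(dim=-1).values.clamp(1e-6, 1 - 1e-6)
        bernoulli = -(p * p.log() + (1 - p) * (1 - p).log()).mean()
        return self.tau * mse + (1 - self.tau) * ce + self.lam * bernoulli
```

Taking the expectation over bin probabilities, rather than an argmax, is what lets a classification-style head emit a continuous score; this matches the Expectation row of the architecture table later on this page.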
2. Related Work
2.1. Hand-Crafted Features vs. Deep Features
Comparison of prior EEG-based emotion recognition methods on the DEAP dataset (accuracy, %):

| Papers | Feature Type | Model | V | A | D | Recognition |
|---|---|---|---|---|---|---|
| Yin et al. [20] | Hand-crafted Features | LSSVM | 67.97 | 65.10 | - | High, Low |
| Du et al. [15] | Hand-crafted Features | ATDD-LSTM | 69.06 | 72.97 | - | High, Low |
| Li et al. [16] | Hand-crafted Features | CADD-DCCNN | 69.45 | 70.50 | - | High, Low |
| Mert and Akan [17] | Hand-crafted Features | ANN | 72.87 | 75.00 | - | High, Low |
| Ahmed et al. [21] | Hand-crafted Features | InvBase + MLP | 91.50 | 92.10 | - | High, Low |
| Çelebi et al. [19] | Hand-crafted Features | CNN + BiLSTM + GRU + AT | 90.57 | 90.59 | - | High, Low |
| Xing et al. [22] | Deep Features | LSTM | 81.10 | 74.38 | - | High, Low |
| Jin and Kim [10] | Deep Features | E-EmotiConNet | 93.09 | 93.69 | - | High, Low |
| Chao et al. [18] | Hand-crafted Features | CapsNet | 66.73 | 68.28 | 67.25 | High, Low |
| Nawaz et al. [11] | Hand-crafted Features | SVM | 77.62 | 78.96 | 77.60 | High, Low |
| Liu et al. [23] | Deep Features | AP-CapsNet | 62.71 | 63.51 | 64.00 | High, Low |
| Hu et al. [24] | Deep Features | ScalingNet | 71.88 | 71.80 | 73.67 | High, Low |
| Wang et al. [12] | Deep Features | FLDNet | 83.85 | 78.22 | 77.52 | High, Low |
| Islam et al. [25] | Hand-crafted Features | CNN | 78.22 | 74.92 | - | High, Low |
| | | | 70.23 | 70.25 | - | High, Neutral, Low |
| Kim and Choi [26] | Deep Features | CNN + Attention | 90.10 | 88.30 | - | High, Low |
| | | LSTM | 86.90 | 84.10 | - | High, Neutral, Low |
| Verma and Tiwary [27] | Hand-crafted Features | KNN | 67.51 | 68.55 | 65.10 | High, Neutral, Low |
2.2. Classification vs. Score Estimation
3. Methods
3.1. Preprocessing
3.2. Spatiotemporal Feature Extraction
3.3. Regressive Emotion Recognition
Algorithm 1. Pseudo-code for training EEG-RegNet.
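As a rough illustration of how the pieces fit together, here is one plausible training epoch that reuses the HybridLoss sketch from the introduction; the optimizer choice, learning rate, and single-axis handling are assumptions rather than the authors' published procedure.

```python
import torch

def train_epoch(model, loader, tau=0.5, lr=1e-3, device="cpu"):
    # One plausible epoch: the model is assumed to return (logits, score)
    # for one VAD axis; real training would run a head per axis and sum
    # the per-axis losses.
    criterion = HybridLoss(tau=tau).to(device)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    model.train()
    for eeg, scores in loader:             # scores: (B,) labels in [1, 9]
        eeg, scores = eeg.to(device), scores.to(device)
        logits, _ = model(eeg)             # (B, 9) bin logits
        loss = criterion(logits, scores)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```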
3.3.1. Hybrid Loss
3.3.2. Bernoulli Penalty
4. Experiments
4.1. Experimental Settings
4.1.1. Dataset
4.1.2. Implementation Details
4.2. Continuous VAD Emotion Space
4.3. Experimental Results
4.3.1. Main Results
4.3.2. Incremental Performance Gains
4.4. Trade-Off Between MSE and CE
4.5. Qualitative Results
5. Discussion
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Kakunje, A.; Mithur, R.; Kishor, M. Emotional well-being, mental health awareness, and prevention of suicide: COVID-19 pandemic and digital psychiatry. Arch. Med. Health Sci. 2020, 8, 147–153.
- Liu, Y.; Sourina, O.; Nguyen, M.K. Real-time EEG-based emotion recognition and its applications. In Transactions on Computational Science XII: Special Issue on Cyberworlds; Springer: Berlin/Heidelberg, Germany, 2011; pp. 256–277.
- Bai, R.; Xiao, L.; Guo, Y.; Zhu, X.; Li, N.; Wang, Y.; Chen, Q.; Feng, L.; Wang, Y.; Yu, X.; et al. Tracking and monitoring mood stability of patients with major depressive disorder by machine learning models using passive digital data: Prospective naturalistic multicenter study. JMIR mHealth uHealth 2021, 9, e24365.
- Li, W.; Zhang, Z.; Song, A. Physiological-signal-based emotion recognition: An odyssey from methodology to philosophy. Measurement 2021, 172, 108747.
- Venkatraman, A.; Edlow, B.L.; Immordino-Yang, M.H. The brainstem in emotion: A review. Front. Neuroanat. 2017, 11, 15.
- Rached, T.S.; Perkusich, A. Emotion recognition based on brain-computer interface systems. In Brain-Computer Interface Systems-Recent Progress and Future Prospects; IntechOpen: London, UK, 2013; pp. 253–270.
- Piho, L.; Tjahjadi, T. A mutual information based adaptive windowing of informative EEG for emotion recognition. IEEE Trans. Affect. Comput. 2018, 11, 722–735.
- Alazrai, R.; Homoud, R.; Alwanni, H.; Daoud, M.I. EEG-based emotion recognition using quadratic time-frequency distribution. Sensors 2018, 18, 2739.
- Kusumaningrum, T.; Faqih, A.; Kusumoputro, B. Emotion recognition based on DEAP database using EEG time-frequency features and machine learning methods. J. Phys. Conf. Ser. 2020, 1501, 012020.
- Jin, L.; Kim, E.Y. E-EmotiConNet: EEG-based emotion recognition with context information. In Proceedings of the 2022 International Joint Conference on Neural Networks (IJCNN), Padua, Italy, 18–23 July 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 1–8.
- Nawaz, R.; Cheah, K.H.; Nisar, H.; Yap, V.V. Comparison of different feature extraction methods for EEG-based emotion recognition. Biocybern. Biomed. Eng. 2020, 40, 910–926.
- Wang, Z.; Gu, T.; Zhu, Y.; Li, D.; Yang, H.; Du, W. FLDNet: Frame-level distilling neural network for EEG emotion recognition. IEEE J. Biomed. Health Inform. 2021, 25, 2533–2544.
- Russell, J.A.; Mehrabian, A. Evidence for a three-factor theory of emotions. J. Res. Personal. 1977, 11, 273–294.
- Koelstra, S.; Muhl, C.; Soleymani, M.; Lee, J.S.; Yazdani, A.; Ebrahimi, T.; Pun, T.; Nijholt, A.; Patras, I. DEAP: A database for emotion analysis using physiological signals. IEEE Trans. Affect. Comput. 2011, 3, 18–31.
- Du, X.; Ma, C.; Zhang, G.; Li, J.; Lai, Y.K.; Zhao, G.; Deng, X.; Liu, Y.J.; Wang, H. An efficient LSTM network for emotion recognition from multichannel EEG signals. IEEE Trans. Affect. Comput. 2020, 13, 1528–1540.
- Li, C.; Bian, N.; Zhao, Z.; Wang, H.; Schuller, B.W. Multi-view domain-adaptive representation learning for EEG-based emotion recognition. Inf. Fusion 2024, 104, 102156.
- Mert, A.; Akan, A. Emotion recognition from EEG signals by using multivariate empirical mode decomposition. Pattern Anal. Appl. 2018, 21, 81–89.
- Chao, H.; Dong, L.; Liu, Y.; Lu, B. Emotion recognition from multiband EEG signals using CapsNet. Sensors 2019, 19, 2212.
- Çelebi, M.; Öztürk, S.; Kaplan, K. An emotion recognition method based on EWT-3D–CNN–BiLSTM-GRU-AT model. Comput. Biol. Med. 2024, 169, 107954.
- Yin, Z.; Liu, L.; Chen, J.; Zhao, B.; Wang, Y. Locally robust EEG feature selection for individual-independent emotion recognition. Expert Syst. Appl. 2020, 162, 113768.
- Ahmed, M.Z.I.; Sinha, N.; Ghaderpour, E.; Phadikar, S.; Ghosh, R. A novel baseline removal paradigm for subject-independent features in emotion classification using EEG. Bioengineering 2023, 10, 54.
- Xing, X.; Li, Z.; Xu, T.; Shu, L.; Hu, B.; Xu, X. SAE+LSTM: A new framework for emotion recognition from multi-channel EEG. Front. Neurorobot. 2019, 13, 37.
- Liu, S.; Wang, Z.; An, Y.; Zhao, J.; Zhao, Y.; Zhang, Y.D. EEG emotion recognition based on the attention mechanism and pre-trained convolution capsule network. Knowl.-Based Syst. 2023, 265, 110372.
- Hu, J.; Wang, C.; Jia, Q.; Bu, Q.; Sutcliffe, R.; Feng, J. ScalingNet: Extracting features from raw EEG data for emotion recognition. Neurocomputing 2021, 463, 177–184.
- Islam, M.R.; Islam, M.M.; Rahman, M.M.; Mondal, C.; Singha, S.K.; Ahmad, M.; Awal, A.; Islam, M.S.; Moni, M.A. EEG channel correlation based model for emotion recognition. Comput. Biol. Med. 2021, 136, 104757.
- Kim, Y.; Choi, A. EEG-based emotion classification using long short-term memory network with attention mechanism. Sensors 2020, 20, 6727.
- Verma, G.K.; Tiwary, U.S. Affect representation and recognition in 3D continuous valence–arousal–dominance space. Multimed. Tools Appl. 2017, 76, 2159–2183.
- Galvão, F.; Alarcão, S.M.; Fonseca, M.J. Predicting exact valence and arousal values from EEG. Sensors 2021, 21, 3414.
- Al-Fahad, R.; Yeasin, M. Robust modeling of continuous 4-D affective space from EEG recording. In Proceedings of the 2016 15th IEEE International Conference on Machine Learning and Applications (ICMLA), Anaheim, CA, USA, 18–20 December 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 1040–1045.
- Garg, N.; Garg, R.; Anand, A.; Baths, V. Decoding the neural signatures of valence and arousal from portable EEG headset. Front. Hum. Neurosci. 2022, 16, 1051463.
- Howard, A.G. MobileNets: Efficient convolutional neural networks for mobile vision applications. arXiv 2017, arXiv:1704.04861.
- Abdelrahman, A.A.; Hempel, T.; Khalifa, A.; Al-Hamadi, A.; Dinges, L. L2CS-Net: Fine-grained gaze estimation in unconstrained environments. In Proceedings of the 2023 8th International Conference on Frontiers of Signal Processing (ICFSP), Corfu, Greece, 23–25 October 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 98–102.
- Bradley, M.M.; Lang, P.J. Measuring emotion: The self-assessment manikin and the semantic differential. J. Behav. Ther. Exp. Psychiatry 1994, 25, 49–59.
- Russell, J.A. A circumplex model of affect. J. Personal. Soc. Psychol. 1980, 39, 1161.
- Mehrabian, A. Pleasure-arousal-dominance: A general framework for describing and measuring individual differences in temperament. Curr. Psychol. 1996, 14, 261–292.
- Hochreiter, S.; Schmidhuber, J. Long short-term memory. Neural Comput. 1997, 9, 1735–1780.
- Cho, K.; Van Merriënboer, B.; Gulcehre, C.; Bahdanau, D.; Bougares, F.; Schwenk, H.; Bengio, Y. Learning phrase representations using RNN encoder-decoder for statistical machine translation. arXiv 2014, arXiv:1406.1078.
- Roselli, F.; Tartaglione, B.; Federico, F.; Lepore, V.; Defazio, G.; Livrea, P. Rate of MMSE score change in Alzheimer’s disease: Influence of education and vascular risk factors. Clin. Neurol. Neurosurg. 2009, 111, 327–330.
- Arevalo-Rodriguez, I.; Smailagic, N.; Figuls, M.R.; Ciapponi, A.; Sanchez-Perez, E.; Giannakou, A.; Cullum, S. Mini-Mental State Examination (MMSE) for the detection of Alzheimer’s disease and other dementias in people with mild cognitive impairment (MCI). Cochrane Database Syst. Rev. 2015, 3.
- Gilbody, S.; Richards, D.; Brealey, S.; Hewitt, C. Screening for depression in medical settings with the Patient Health Questionnaire (PHQ): A diagnostic meta-analysis. J. Gen. Intern. Med. 2007, 22, 1596–1602.
EEG-RegNet layer configuration:

| Layer | # of Filters | Filter Size | Output Size | Activation | Options |
|---|---|---|---|---|---|
| Input | | | | | |
| Depthwise 1 | 1 | | | | Padding = 1 |
| Pointwise 1 | 32 | | | ELU | BatchNorm |
| Depthwise 2 | 32 | | | | Padding = 1 |
| Pointwise 2 | 64 | | | ELU | BatchNorm |
| Depthwise 3 | 64 | | | | Padding = 1 |
| Pointwise 3 | 128 | | | ELU | BatchNorm |
| Flatten | | | | | |
| 1D CNN | 128 | | 128 | ELU | Padding = 0 |
| Normalization | | | 128 | | |
| Fully Connected 1 | | | 128 | ELU | |
| Fully Connected 2 | | | 128 | ELU | |
| Fully Connected 3 | | | 9 | SoftMax | |
| Expectation | | | 1 | | Bernoulli Penalty |
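Read top to bottom, the table describes three depthwise-separable 2D blocks (1 → 32 → 64 → 128 channels with ELU and BatchNorm), a flatten, a 128-filter 1D CNN over time, normalization, three fully connected layers, a 9-way SoftMax, and an expectation head. The PyTorch sketch below mirrors that stack; the kernel sizes, the topographic input layout, and the pooling steps are assumptions, since the size columns above do not pin them down.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def ds_block(cin, cout):
    # Depthwise 3x3 conv (padding=1) + pointwise 1x1 conv + BatchNorm + ELU,
    # matching one Depthwise/Pointwise row pair (kernel size assumed).
    return nn.Sequential(
        nn.Conv2d(cin, cin, 3, padding=1, groups=cin),
        nn.Conv2d(cin, cout, 1),
        nn.BatchNorm2d(cout),
        nn.ELU(),
    )

class EEGRegNetSketch(nn.Module):
    """Minimal sketch of the tabled architecture, not the authors' code."""
    def __init__(self, n_bins=9):
        super().__init__()
        # Three separable 2D blocks: 1 -> 32 -> 64 -> 128 channels.
        self.spatial = nn.Sequential(ds_block(1, 32), ds_block(32, 64), ds_block(64, 128))
        # 1D CNN over the time axis, 128 filters, padding = 0 per the table.
        self.temporal = nn.Conv1d(128, 128, kernel_size=3, padding=0)
        self.norm = nn.LayerNorm(128)
        self.head = nn.Sequential(
            nn.Linear(128, 128), nn.ELU(),
            nn.Linear(128, 128), nn.ELU(),
            nn.Linear(128, n_bins),
        )
        self.register_buffer("bins", torch.arange(1.0, n_bins + 1.0))

    def forward(self, x):
        # x: (B, T, H, W) topographic EEG frames -- an assumed input layout.
        b, t, h, w = x.shape
        f = self.spatial(x.reshape(b * t, 1, h, w))   # (B*T, 128, H, W)
        f = f.mean(dim=(2, 3)).reshape(b, t, 128)     # pool space per frame
        f = F.elu(self.temporal(f.transpose(1, 2)))   # (B, 128, T-2)
        f = self.norm(f.mean(dim=2))                  # pool time, normalize
        logits = self.head(f)                         # (B, 9)
        probs = logits.softmax(dim=-1)
        score = (probs * self.bins).sum(dim=-1)       # expectation head
        return logits, score
```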
RMSE for continuous emotional score prediction on the DEAP dataset (lower is better):

| Method | Valence | Arousal | Dominance |
|---|---|---|---|
| Al-Fahad et al. [29] | 1.40 | 1.23 | 1.24 |
| Galvão et al. [28] | 1.376 * | 1.304 * | - |
| Garg et al. [30] | 1.902 | 1.769 | - |
| EEG-RegNet | 0.628 | 0.567 | 0.544 |
Ternary emotion classification accuracy (%) on the DEAP dataset:

| Method | Valence | Arousal | Dominance |
|---|---|---|---|
| Verma and Tiwary [27] | 63.47 | 69.62 | 63.57 |
| Islam et al. [25] | 70.23 | 70.25 | - |
| Kim and Choi [26] | 86.90 | 84.10 | - |
| EEG-RegNet | 94.90 | 94.70 | 95.60 |
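Both the ternary and the nonary labels are derived from the continuous 1-9 SAM scale on each VAD axis. The paper's exact bin boundaries are not quoted here, so the mapping below (nearest-integer bins for nonary; Low < 4, Neutral 4-6, High > 6 for ternary) is an illustrative assumption.

```python
def to_nonary(score: float) -> int:
    # Fine-grained nonary class: nearest integer bin on the 1-9 scale
    # (an assumed discretization).
    return int(min(max(round(score), 1), 9))

def to_ternary(score: float) -> str:
    # Coarse ternary class; the 4/6 cut points are illustrative, not
    # the paper's published boundaries.
    if score < 4:
        return "Low"
    if score > 6:
        return "High"
    return "Neutral"

print(to_nonary(6.4), to_ternary(6.4))  # -> 6 High
```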
Incremental gains from each component of the hybrid loss; parenthesized values are changes relative to the row above:

| Method | Valence | Arousal | Dominance |
|---|---|---|---|
| **Emotional Score Prediction (RMSE)** | | | |
| Baseline (MSE Loss) | 2.029 | 1.912 | 1.930 |
| + CE Loss | 0.632 (−1.397) | 0.584 (−1.328) | 0.549 (−1.381) |
| + Bernoulli Penalty | 0.628 (−0.004) | 0.567 (−0.017) | 0.544 (−0.005) |
| **Ternary Emotion Classification (%)** | | | |
| Baseline (MSE Loss) | 40.25 | 45.30 | 46.52 |
| + CE Loss | 94.82 (+54.57) | 94.50 (+49.20) | 95.51 (+48.99) |
| + Bernoulli Penalty | 94.90 (+0.08) | 94.70 (+0.20) | 95.60 (+0.09) |
| **Nonary Emotion Classification (%)** | | | |
| Baseline (MSE Loss) | 16.81 | 16.23 | 19.27 |
| + CE Loss | 94.35 (+77.54) | 94.58 (+78.35) | 94.83 (+75.56) |
| + Bernoulli Penalty | 94.35 (±0.00) | 94.83 (+0.25) | 94.98 (+0.15) |
Average RMSE for emotional score prediction as the hybrid loss balance τ varies:

| τ | Valence | Arousal | Dominance |
|---|---|---|---|
| 1.00 | 0.587 | 0.509 | 0.493 |
| 0.75 | 0.616 | 0.566 | 0.537 |
| 0.50 | 0.628 | 0.567 | 0.544 |
| 0.25 | 0.646 | 0.580 | 0.574 |
| 0.00 | 0.670 | 0.608 | 0.594 |
Nonary classification accuracy (%) as τ varies:

| τ | Valence | Arousal | Dominance |
|---|---|---|---|
| 1.00 | 86.76 | 87.49 | 87.55 |
| 0.75 | 93.92 | 94.36 | 94.42 |
| 0.50 | 94.35 | 94.83 | 94.98 |
| 0.25 | 94.66 | 95.01 | 94.98 |
| 0.00 | 94.74 | 95.15 | 95.12 |
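A sweep like the one behind these two tables could be scripted as below, reusing the `train_epoch` and `EEGRegNetSketch` sketches from earlier sections; `evaluate()`, the data loaders, and the epoch count are hypothetical placeholders, not the authors' tooling.

```python
# Hypothetical tau sweep mirroring the two tables above: train one model
# per tau value, then record RMSE and nonary accuracy on held-out data.
# train_loader, val_loader, and evaluate() are assumed to exist.
for tau in (1.00, 0.75, 0.50, 0.25, 0.00):
    model = EEGRegNetSketch()
    for epoch in range(50):                       # epoch count is an assumption
        train_epoch(model, train_loader, tau=tau)
    rmse, acc = evaluate(model, val_loader)       # evaluate() is hypothetical
    print(f"tau={tau:.2f}  RMSE={rmse:.3f}  nonary acc={acc:.2f}%")
```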
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
MDPI and ACS Style
Jon, H.J.; Jin, L.; Jung, H.; Kim, H.; Kim, E.Y. EEG-RegNet: Regressive Emotion Recognition in Continuous VAD Space Using EEG Signals. Mathematics 2025, 13, 87. https://doi.org/10.3390/math13010087

AMA Style
Jon HJ, Jin L, Jung H, Kim H, Kim EY. EEG-RegNet: Regressive Emotion Recognition in Continuous VAD Space Using EEG Signals. Mathematics. 2025; 13(1):87. https://doi.org/10.3390/math13010087

Chicago/Turabian Style
Jon, Hyo Jin, Longbin Jin, Hyuntaek Jung, Hyunseo Kim, and Eun Yi Kim. 2025. "EEG-RegNet: Regressive Emotion Recognition in Continuous VAD Space Using EEG Signals" Mathematics 13, no. 1: 87. https://doi.org/10.3390/math13010087

APA Style
Jon, H. J., Jin, L., Jung, H., Kim, H., & Kim, E. Y. (2025). EEG-RegNet: Regressive Emotion Recognition in Continuous VAD Space Using EEG Signals. Mathematics, 13(1), 87. https://doi.org/10.3390/math13010087