Asian Affective and Emotional State (A2ES) Dataset of ECG and PPG for Affective Computing Research
Figure 1. The P, QRS, and T waves in a single cycle of a standard ECG reading.
Figure 2. Typical transmission of a PPG signal with the systolic period. Reproduced with permission from [37].
Figure 3. The self-assessment form prepared in Google Forms.
Figure 4. AliveCor Kardia Mobile device and its application.
Figure 5. Maxim PPG band.
Figure 6. Distribution of the emotion labels.
Figure 7. Intensity of the emotions rated by the subjects for each video, except neutral.
Figure 8. Gender demographics of the participants.
Figure 9. Age demographics of the participants.
Figure 10. Race demographics of the participants.
Figure 11. Occupation demographics of the participants.
Figure 12. Data collection lab setup.
Figure 13. Deep learning model architecture.
Abstract
1. Introduction
2. Related Works
3. Data Collection Protocol
3.1. Emotion Annotation and Stimuli Selection
3.2. Participants
4. Data Preprocessing and Features Extraction
4.1. ECG
4.2. PPG
5. Experimental Results and Discussion
5.1. Machine Learning
5.1.1. ECG
5.1.2. PPG
5.2. Deep Learning
5.2.1. ECG
5.2.2. PPG
6. Discussion and Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- World Health Organization. COVID-19 Disrupting Mental Health Services in Most Countries, WHO Survey; World Health Organization: Geneva, Switzerland, 2020; pp. 2–5.
- KFF. KFF Health Tracking Poll—Early April 2020: The Impact of Coronavirus on Life in America; KFF: San Francisco, CA, USA, 2020; pp. 1–22.
- Son, C.; Hegde, S.; Smith, A.; Wang, X.; Sasangohar, F. Effects of COVID-19 on college students’ mental health in the United States: Interview survey study. J. Med. Internet Res. 2020, 22, 14.
- Stacey, A.; D’Eon, M.; Madojemu, G. Medical student stress and burnout: Before and after COVID-19. Can. Med. Educ. J. 2020, 11, e204.
- Koldijk, S.; Sappelli, M.; Verberne, S.; Neerincx, M.A.; Kraaij, W. The SWELL knowledge work dataset for stress and user modeling research. In Proceedings of the 2014 International Conference on Multimodal Interaction, Istanbul, Turkey, 12–16 November 2014.
- Rastgoo, M.N.; Nakisa, B.; Maire, F.; Rakotonirainy, A.; Chandran, V. Automatic driver stress level classification using multimodal deep learning. Expert Syst. Appl. 2019, 138, 112793.
- Lee, D.S.; Chong, T.W.; Lee, B.G. Stress Events Detection of Driver by Wearable Glove System. IEEE Sens. J. 2017, 17, 194–204.
- Spencer, C.; Koc, I.A.; Suga, C.; Lee, A.; Dhareshwar, A.M.; Franzén, E.; Iozzo, M.; Morrison, G.; McKeown, G.J. A Comparison of Unimodal and Multimodal Measurements of Driver Stress in Real-World Driving Conditions. arXiv 2020.
- Lee, J.; Kim, J.; Shin, M. Correlation Analysis between Electrocardiography (ECG) and Photoplethysmogram (PPG) Data for Driver’s Drowsiness Detection Using Noise Replacement Method. Procedia Comput. Sci. 2017, 116, 421–426.
- Bahreini, K.; Nadolski, R.; Westera, W. Towards real-time speech emotion recognition for affective e-learning. Educ. Inf. Technol. 2016, 21, 1367–1386.
- Wang, W.; Xu, K.; Niu, H.; Miao, X. Emotion Recognition of Students Based on Facial Expressions in Online Education Based on the Perspective of Computer Simulation. Complexity 2020, 2020, 4065207.
- Alqahtani, F.; Katsigiannis, S.; Ramzan, N. Using Wearable Physiological Sensors for Affect-Aware Intelligent Tutoring Systems. IEEE Sens. J. 2021, 21, 3366–3378.
- Koelstra, S.; Mühl, C.; Soleymani, M.; Lee, J.S.; Yazdani, A.; Ebrahimi, T.; Pun, T.; Nijholt, A.; Patras, I. DEAP: A database for emotion analysis using physiological signals. IEEE Trans. Affect. Comput. 2012, 3, 18–31.
- Katsigiannis, S.; Ramzan, N. DREAMER: A Database for Emotion Recognition Through EEG and ECG Signals from Wireless Low-cost Off-the-Shelf Devices. IEEE J. Biomed. Health Inform. 2018, 22, 98–107.
- Minhad, K.N.; Ali, S.H.M.; Reaz, M.B.I. Happy-anger emotions classifications from electrocardiogram signal for automobile driving safety and awareness. J. Transp. Health 2017, 7, 75–89.
- Song, T.; Zheng, W.; Lu, C.; Zong, Y.; Zhang, X.; Cui, Z. MPED: A multi-modal physiological emotion database for discrete emotion recognition. IEEE Access 2019, 7, 12177–12191.
- Hasnul, M.A.; Ab Aziz, N.A.; Aziz, A.A. Evaluation of TEAP and AuBT as ECG’s Feature Extraction Toolbox for Emotion Recognition System. In Proceedings of the 2021 IEEE 9th Conference on System, Process and Control (ICSPC 2021), Malacca, Malaysia, 10–11 December 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 52–57.
- Miranda Correa, J.A.; Abadi, M.K.; Sebe, N.; Patras, I. AMIGOS: A Dataset for Affect, Personality and Mood Research on Individuals and Groups. IEEE Trans. Affect. Comput. 2018, 12, 479–493.
- Park, C.Y.; Cha, N.; Kang, S.; Kim, A.; Khandoker, A.H.; Hadjileontiadis, L.; Oh, A.; Jeong, Y.; Lee, U. K-EmoCon, a multimodal sensor dataset for continuous emotion recognition in naturalistic conversations. Sci. Data 2020, 7, 293.
- Abadi, M.K.; Subramanian, R.; Kia, S.M.; Avesani, P.; Patras, I.; Sebe, N. DECAF: MEG-Based Multimodal Database for Decoding Affective Physiological Responses. IEEE Trans. Affect. Comput. 2015, 6, 209–222.
- Udovičić, G.; Ðerek, J.; Russo, M.; Sikora, M. Wearable Emotion Recognition System based on GSR and PPG Signals. In Proceedings of the 2nd International Workshop on Multimedia for Personal Health and Health Care, Mountain View, CA, USA, 13 October 2017; pp. 53–59.
- Bagirathan, A.; Selvaraj, J.; Gurusamy, A.; Das, H. Recognition of positive and negative valence states in children with autism spectrum disorder (ASD) using discrete wavelet transform (DWT) analysis of electrocardiogram signals (ECG). J. Ambient Intell. Humaniz. Comput. 2021, 12, 405–416.
- Hsu, Y.L.; Wang, J.S.; Chiang, W.C.; Hung, C.H. Automatic ECG-Based Emotion Recognition in Music Listening. IEEE Trans. Affect. Comput. 2017, 11, 85–99.
- Mand, A.A.; Wen, J.S.J.; Sayeed, M.S.; Swee, S.K. Robust stress classifier using adaptive neuro-fuzzy classifier-linguistic hedges. In Proceedings of the 2017 International Conference on Robotics, Automation and Sciences (ICORAS 2017), Melaka, Malaysia, 27–29 November 2017; pp. 1–5.
- Hasnul, M.A.; Aziz, N.A.A.; Alelyani, S.; Mohana, M.; Aziz, A.A. Electrocardiogram-based emotion recognition systems and their applications in healthcare—A review. Sensors 2021, 21, 5015.
- Rock Health; Stanford Medicine Center for Digital Health. Digital Health Consumer Adoption Report 2019; Rock Health: San Francisco, CA, USA, 2019; pp. 1–6.
- Laricchia, F. Smartwatches—Statistics and Facts. 2020. Available online: https://www.statista.com/topics/4762/smartwatches/#editorsPicks (accessed on 30 November 2022).
- Jemioło, P.; Storman, D.; Mamica, M.; Szymkowski, M.; Żabicka, W.; Wojtaszek-Główka, M.; Ligęza, A. Datasets for Automated Affect and Emotion Recognition from Cardiovascular Signals Using Artificial Intelligence—A Systematic Review. Sensors 2022, 22, 2538.
- Schmidt, P.; Reiss, A.; Duerichen, R.; Van Laerhoven, K. Introducing WESAD, a multimodal dataset for wearable stress and affect detection. In Proceedings of the 2018 International Conference on Multimodal Interaction, Boulder, CO, USA, 16–20 October 2018; pp. 400–408.
- Quiroz, J.C.; Geangu, E.; Yong, M.H. Emotion recognition using smart watch sensor data: Mixed-design study. JMIR Ment. Health 2018, 5, e10153.
- Colvonen, P.J.; DeYoung, P.N.; Bosompra, N.O.A.; Owens, R.L. Limiting racial disparities and bias for wearable devices in health science research. Sleep 2020, 43, zsaa159.
- Noseworthy, P.A.; Attia, Z.I.; Brewer, L.P.C.; Hayes, S.N.; Yao, X.; Kapa, S.; Friedman, P.A.; Lopez-Jimenez, F. Assessing and Mitigating Bias in Medical Artificial Intelligence: The Effects of Race and Ethnicity on a Deep Learning Model for ECG Analysis. Circ. Arrhythmia Electrophysiol. 2020, 13, 208–214.
- Koerber, D.; Khan, S.; Shamsheri, T.; Kirubarajan, A.; Mehta, S. The Effect of Skin Tone on Accuracy of Heart Rate Measurement in Wearable Devices: A Systematic Review. J. Am. Coll. Cardiol. 2022, 79, 1990.
- Rizal, A.; Hidayat, R.; Nugroho, H.A. Signal Domain in Respiratory Sound Analysis: Methods, Application and Future Development. J. Comput. Sci. 2016, 11, 1005–1016.
- Kamshilin, A.A.; Margaryants, N.B. Origin of Photoplethysmographic Waveform at Green Light. Phys. Procedia 2017, 86, 72–80.
- Renesas. OB1203 Heart Rate, Blood Oxygen Concentration, Pulse Oximetry, Proximity, Light and Color Sensor: Signal to Noise Ratio; Renesas: Tokyo, Japan, 2020; pp. 1–12.
- Fischer, C.; Glos, M.; Penzel, T.; Fietze, I. Extended algorithm for real-time pulse waveform segmentation and artifact detection in photoplethysmograms. Somnologie 2017, 21, 110–120.
- Lee, M.S.; Lee, Y.K.; Pae, D.S.; Lim, M.T.; Kim, D.W.; Kang, T.K. Fast emotion recognition based on single pulse PPG signal with convolutional neural network. Appl. Sci. 2019, 9, 3355.
- Preethi, M.; Nagaraj, S.; Madhan Mohan, P. Emotion based Media Playback System using PPG Signal. In Proceedings of the 2021 International Conference on Wireless Communications, Signal Processing and Networking (WiSPNET 2021), Chennai, India, 25–27 March 2021; pp. 426–430.
- Yang, W.; Rifqi, M.; Marsala, C.; Pinna, A. Physiological-Based Emotion Detection and Recognition in a Video Game Context. In Proceedings of the 2018 International Joint Conference on Neural Networks (IJCNN), Rio de Janeiro, Brazil, 8–13 July 2018.
- Zainudin, Z.; Hasan, S.; Shamsuddin, S.M.; Argawal, S. Stress Detection using Machine Learning and Deep Learning. J. Phys. Conf. Ser. 2021, 1997, 012019.
- Chen, T.; Yin, H.; Yuan, X.; Gu, Y.; Ren, F.; Sun, X. Emotion recognition based on fusion of long short-term memory networks and SVMs. Digit. Signal Process. 2021, 117, 103153.
- Ayata, D.; Yaslan, Y.; Kamasak, M.E. Emotion Recognition from Multimodal Physiological Signals for Emotion Aware Healthcare Systems. J. Med. Biol. Eng. 2020, 40, 149–157.
- Domínguez-Jiménez, J.A.; Campo-Landines, K.C.; Martínez-Santos, J.C.; Delahoz, E.J.; Contreras-Ortiz, S.H. A machine learning model for emotion recognition from physiological signals. Biomed. Signal Process. Control 2019, 55, 101646.
- Kim, B.H.; Jo, S. Deep Physiological Affect Network for the Recognition of Human Emotions. IEEE Trans. Affect. Comput. 2020, 11, 230–243.
- Li, C.; Xu, C.; Feng, Z. Analysis of physiological for emotion recognition with the IRS model. Neurocomputing 2016, 178, 103–111.
- Shahid, H.; Butt, A.; Aziz, S.; Khan, M.U.; Hassan Naqvi, S.Z. Emotion Recognition System featuring a fusion of Electrocardiogram and Photoplethysmogram Features. In Proceedings of the 2020 14th International Conference on Open Source Systems and Technologies (ICOSST), Lahore, Pakistan, 16–17 December 2020; pp. 1–6.
- Yang, C.J.; Fahier, N.; Li, W.C.; Fang, W.C. A Convolution Neural Network Based Emotion Recognition System using Multimodal Physiological Signals. In Proceedings of the 2020 IEEE International Conference on Consumer Electronics-Taiwan (ICCE-Taiwan 2020), Taoyuan, Taiwan, 28–30 September 2020.
- Raheel, A.; Majid, M.; Anwar, S.M. DEAR-MULSEMEDIA: Dataset for emotion analysis and recognition in response to multiple sensorial media. Inf. Fusion 2021, 65, 37–49.
- Sharma, K.; Castellini, C.; van den Broek, E.L.; Albu-Schaeffer, A.; Schwenker, F. A dataset of continuous affect annotations and physiological signals for emotion analysis. Sci. Data 2019, 6, 196.
- Markova, V.; Ganchev, T.; Kalinkov, K. CLAS: A Database for Cognitive Load, Affect and Stress Recognition. In Proceedings of the International Conference on Biomedical Innovations and Applications (BIA 2019), Varna, Bulgaria, 8–9 November 2019.
- Gao, Z.; Cui, X.; Wan, W.; Zheng, W.; Gu, Z. ECSMP: A dataset on emotion, cognition, sleep, and multi-model physiological signals. Data Brief 2021, 39, 107660.
- Kim, J.; André, E. Emotion recognition based on physiological changes in music listening. IEEE Trans. Pattern Anal. Mach. Intell. 2008, 30, 2067–2083.
- Subramanian, R.; Wache, J.; Abadi, M.K.; Vieriu, R.L.; Winkler, S.; Sebe, N. ASCERTAIN: Emotion and personality recognition using commercial sensors. IEEE Trans. Affect. Comput. 2018, 9, 147–160.
- Wagner, J. Augsburg Biosignal Toolbox (AuBT); University of Augsburg: Augsburg, Germany, 2014.
- Healey, J.A.; Picard, R.W. Detecting stress during real-world driving tasks using physiological sensors. IEEE Trans. Intell. Transp. Syst. 2005, 6, 156–166.
- Soleymani, M.; Lichtenauer, J.; Pun, T.; Pantic, M. A multimodal database for affect recognition and implicit tagging. IEEE Trans. Affect. Comput. 2012, 3, 42–55.
- Ringeval, F.; Sonderegger, A.; Sauer, J.; Lalanne, D. Introducing the RECOLA multimodal corpus of remote collaborative and affective interactions. In Proceedings of the 2013 10th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG), Shanghai, China, 22–26 April 2013.
- Ismail, S.N.M.S.; Aziz, N.A.A.; Ibrahim, S.Z.; Khan, C.T.; Rahman, M.A. Selecting Video Stimuli for Emotion Elicitation via Online Survey. Hum.-Cent. Comput. Inf. Sci. 2021, 11, 19.
- Sayed Ismail, S.N.M.; Ab. Aziz, N.A.; Ibrahim, S.Z. A comparison of emotion recognition system using electrocardiogram (ECG) and photoplethysmogram (PPG). J. King Saud Univ.-Comput. Inf. Sci. 2022, 34, 3539–3558.
- Lau, J.K.; Lowres, N.; Neubeck, L.; Brieger, D.B.; Sy, R.W.; Galloway, C.D.; Albert, D.E.; Freedman, S.B. iPhone ECG application for community screening to detect silent atrial fibrillation: A novel technology to prevent stroke. Int. J. Cardiol. 2013, 165, 193–194.
- Haberman, Z.C.; Jahn, R.T.; Bose, R.; Tun, H.; Shinbane, J.S.; Doshi, R.N.; Chang, P.M.; Saxon, L.A. Wireless Smartphone ECG Enables Large-Scale Screening in Diverse Populations. J. Cardiovasc. Electrophysiol. 2015, 26, 520–526.
- Tarakji, K.G.; Wazni, O.M.; Callahan, T.; Kanj, M.; Hakim, A.H.; Wolski, K.; Wilkoff, B.L.; Saliba, W.; Lindsay, B.D. Using a novel wireless system for monitoring patients after the atrial fibrillation ablation procedure: The iTransmit study. Heart Rhythm 2015, 12, 554–559.
- Lowres, N.; Mulcahy, G.; Gallagher, R.; Ben Freedman, S.; Marshman, D.; Kirkness, A.; Orchard, J.; Neubeck, L. Self-monitoring for atrial fibrillation recurrence in the discharge period post-cardiac surgery using an iPhone electrocardiogram. Eur. J. Cardiothorac. Surg. 2016, 50, 44–51.
- Desteghe, L.; Raymaekers, Z.; Lutin, M.; Vijgen, J.; Dilling-Boer, D.; Koopman, P.; Schurmans, J.; Vanduynhoven, P.; Dendale, P.; Heidbuchel, H. Performance of handheld electrocardiogram devices to detect atrial fibrillation in a cardiology and geriatric ward setting. Europace 2017, 19, 29–39.
- Bumgarner, J.M.; Lambert, C.T.; Hussein, A.A.; Cantillon, D.J.; Baranowski, B.; Wolski, K.; Lindsay, B.D.; Wazni, O.M.; Tarakji, K.G. Automated Atrial Fibrillation Detection Algorithm Using Smartwatch Technology. J. Am. Coll. Cardiol. 2018, 71, 2381–2388.
- Soleymani, M.; Villaro-Dixon, F.; Pun, T.; Chanel, G. Toolbox for Emotional feAture extraction from Physiological signals (TEAP). Front. ICT 2017, 4, 1–7.
- Shu, L.; Xie, J.; Yang, M.; Li, Z.; Li, Z.; Liao, D.; Xu, X.; Yang, X. A review of emotion recognition using physiological signals. Sensors 2018, 18, 2074.
- Mano, L.Y. Emotional condition in the Health Smart Homes environment: Emotion recognition using ensemble of classifiers. In Proceedings of the 2018 IEEE (SMC) International Conference on Innovations in Intelligent Systems and Applications (INISTA 2018), Rome, Italy, 3–5 July 2018.
- Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B. Scikit-learn: Machine Learning in Python. J. Mach. Learn. Res. 2011, 12, 2825–2830.
Dataset | Subjects and Stimuli | Stimulus Type | Modalities Used | Cardiac-Based Device | Emotion Labels
---|---|---|---|---|---
AMIGOS [18] | 40 subjects, 24 stimuli | Audio–visual | ECG, EEG, GSR | Shimmer | Valence, arousal, dominance |
ASCERTAIN [54] | 58 subjects, 36 stimuli | Audio–visual | ECG, EEG, GSR | NA | Valence, arousal |
AuBT [55] | 1 subject, 4 stimuli | Audio | ECG, EMG, RESP, GSR | NA | Joy, anger, sadness, pleasure |
CASE [50] | 30 subjects, 20 stimuli | Audio–visual | ECG, PPG, EMG, GSR | Thought Technology | Valence, arousal
CLAS [51] | 62 subjects, 32 stimuli | Audio–visual, visual | ECG, PPG, GSR | Shimmer3 | Valence, arousal |
DEAP [13] | 32 subjects, 40 stimuli | Audio–visual | PPG, EEG, GSR, EOG | Biosemi ActiveTwo | Valence, arousal, liking |
DEAR-MULSEMEDIA [49] | 18 subjects, 4 stimuli | Audio–visual, tactile, olfaction, haptic | PPG, EEG, GSR | Shimmer | Valence, arousal |
DECAF [20] | 30 subjects, 76 stimuli | Audio–visual | ECG, EOG, EMG, MEG, facial expression | NA | Valence, arousal, dominance |
DREAMER [14] | 23 subjects, 18 stimuli | Audio–visual | ECG, EEG | Shimmer | Valence, arousal, dominance |
DSDRWDT [56] | 24 subjects | Driving task | ECG, EMG, GSR, RESP | FlexComp | Stress |
ECSMP [52] | 89 subjects, 6 stimuli | Audio–visual, cognitive assessment task | ECG, PPG, EEG, GSR, TEMP, ACC | AECG-100A (ECG), Empatica E4 (PPG) | Neutral, fear, sad, happy, anger, disgust, fatigue |
EMDC [53] | 3 subjects, 360 stimuli | Audio | ECG, EMG, GSR, RESP | Procomp2 Infiniti | Valence, arousal |
K-EmoCon [19] | 32 subjects | Naturalistic conversation | ECG, PPG, EEG, GSR, TEMP | Polar H7 (ECG), Empatica E4 (PPG) | Valence, arousal |
MAHNOB-HCI [57] | 27 subjects, 20 stimuli | Audio–visual | ECG, EEG, GSR, RESP, TEMP | Biosemi ActiveTwo | Valence, arousal, dominance |
MPED [16] | 23 subjects, 28 stimuli | Audio–visual | ECG, EEG, GSR, RESP | Biopac System | Joy, funny, anger, fear, disgust, sad, neutral |
RECOLA [58] | 46 subjects | Spontaneous and naturalistic interactions | ECG, GSR, voice, facial expression | Biopac MP36 | Valence, arousal |
SWELL [5] | 25 subjects, 4 working conditions with stressors | Writing, presenting, reading, searching task | ECG, GSR, facial expression, body posture | Mobi (TMSi) | Valence, arousal, stress |
WESAD [29] | 15 subjects, 10 stimuli | Audio–visual, public speaking, mental arithmetic task | ECG, PPG, GSR, EMG, TEMP, RESP | RespiBAN Professional (ECG), Empatica E4 (PPG) | Neutral, stress, amusement
The 25 video stimuli, the emotion each was intended to elicit, and whether the target was reached:

Video | Targeted Emotion | Target Reached?
---|---|---
1 | Neutral | Yes
2 | Happy | Yes
3 | Happy | Yes
4 | Happy | No
5 | Neutral | Yes
6 | Surprise | Yes
7 | Surprise | Yes
8 | Surprise | Yes
9 | Neutral | Yes
10 | Fear | Yes
11 | Fear | Yes
12 | Fear | Yes
13 | Neutral | Yes
14 | Disgust | Yes
15 | Disgust | Yes
16 | Disgust | Yes
17 | Neutral | Yes
18 | Sad | No
19 | Sad | Yes
20 | Sad | No
21 | Neutral | Yes
22 | Anger | Yes
23 | Anger | Yes
24 | Anger | Yes
25 | Neutral | Yes

Number of subjects that chose each labelled emotion. Counts are listed in order of video number; videos in which no subject chose the label are omitted:

- Happy: 6, 37, 29, 4, 7, 7, 2, 15, 14, 2, 1, 9, 38, 6, 33, 15, 1, 15
- Sad: 5, 1, 2, 3, 1, 5, 17, 2, 7, 37, 10, 2, 2, 5, 8, 1
- Anger: 1, 1, 36, 24, 37
- Fear: 2, 2, 9, 2, 9, 2, 1, 37, 37, 24, 2, 1, 1, 2, 1
- Disgust: 1, 1, 2, 1, 5, 2, 1, 1, 42, 46, 41, 2, 14, 1
- Surprise: 4, 7, 17, 1, 27, 35, 29, 2, 3, 2, 1, 6, 4, 2, 1, 3
- Neutral: 40, 2, 9, 14, 37, 3, 8, 4, 30, 1, 4, 36, 3, 1, 1, 42, 3, 1, 4, 31, 6, 2, 2, 32
Underlying Features | Statistical Features | Number of Features
---|---|---
RR, PP, QQ, SS, TT, PQ, QS, ST intervals | Mean, Median, Stdev, Min, Max, Range | 48
P, R, S amplitudes | Mean, Median, Stdev, Min, Max, Range | 18
HRV | Mean, Median, Stdev, Min, Max, Range, pNN50, specRange | 8
HRV distribution | Mean, Median, Stdev, Min, Max, Range, TriInd | 7
Total number of features | | 81
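The interval statistics above reduce to simple array operations once the fiducial points are located. Below is a minimal Python sketch, assuming R-peak (or other wave) sample indices are already available from a peak detector; the 300 Hz sampling rate and all function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

FS = 300  # Hz; assumed sampling rate (the Kardia Mobile records single-lead ECG at 300 Hz)

def interval_stats(peaks, fs=FS):
    """Mean/Median/Stdev/Min/Max/Range of successive peak-to-peak intervals (s)."""
    intervals = np.diff(peaks) / fs
    return {
        "Mean": np.mean(intervals),
        "Median": np.median(intervals),
        "Stdev": np.std(intervals),
        "Min": np.min(intervals),
        "Max": np.max(intervals),
        "Range": np.ptp(intervals),
    }

def pnn50(r_peaks, fs=FS):
    """pNN50: proportion of successive RR-interval differences exceeding 50 ms."""
    rr_ms = np.diff(r_peaks) / fs * 1000.0
    return np.mean(np.abs(np.diff(rr_ms)) > 50.0)

# Example with hypothetical R-peak sample indices; the same interval_stats()
# call applies to the P, Q, S, and T fiducials for the other interval rows.
r_peaks = np.array([150, 410, 665, 930, 1190, 1455])
features = interval_stats(r_peaks)
features["pNN50"] = pnn50(r_peaks)
print(features)
```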
Features | Abbreviation | Description | Number of Features
---|---|---|---
IBI | meanIBI, HRV | Mean inter-beat interval (IBI) and HRV, computed as std(IBI) | 2
MSE | MSE1, MSE2, MSE3, MSE4, MSE5 | Multiscale entropy at 5 scales | 5
Tachogram power | Tachogram_LF, Tachogram_MF, Tachogram_HF | log(P_X^LF(f)), log(P_X^MF(f)), log(P_X^HF(f)), where LF: f ∈ [0.01, 0.08] Hz, MF: f ∈ [0.08, 0.15] Hz, HF: f ∈ [0.15, 0.4] Hz | 3
PSD | sp0001, sp0102, sp0203, sp0304 | log(P_X(f)) for f ∈ {[0, 0.1], [0.1, 0.2], [0.2, 0.3], [0.3, 0.4]} Hz | 4
Statistical moments | mean_ | Mean of the signal | 1
Energy ratio | sp_energyRatio, tachogram_energy_ratio | log(P_X^LF(f)/P_X^HF(f)), where LF: f ∈ [0, 0.08] Hz and HF: f ∈ [0.15, 0.5] Hz; tachogram energy ratio Tachogram_MF/(Tachogram_LF + Tachogram_HF) | 2
Total number of features | | | 17
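As an illustration of the tachogram features, the sketch below estimates the LF/MF/HF log-powers and the energy ratio from an IBI series. It is a loose Python approximation of the TEAP definitions, not the toolbox's actual code; the 4 Hz resampling rate and Welch's method are common HRV-analysis choices assumed here.

```python
import numpy as np
from scipy.signal import welch

def tachogram_features(ibi_s, fs_resample=4.0):
    """LF/MF/HF log-power and MF/(LF+HF) energy ratio of an IBI tachogram."""
    beat_times = np.cumsum(ibi_s)                    # beat times in seconds
    grid = np.arange(beat_times[0], beat_times[-1], 1.0 / fs_resample)
    tachogram = np.interp(grid, beat_times, ibi_s)   # uniformly resampled IBI series
    freqs, psd = welch(tachogram, fs=fs_resample, nperseg=min(256, len(tachogram)))

    def band_power(lo, hi):
        mask = (freqs >= lo) & (freqs < hi)
        return np.trapz(psd[mask], freqs[mask])

    lf = band_power(0.01, 0.08)
    mf = band_power(0.08, 0.15)
    hf = band_power(0.15, 0.40)
    return {
        "Tachogram_LF": np.log(lf),
        "Tachogram_MF": np.log(mf),
        "Tachogram_HF": np.log(hf),
        "tachogram_energy_ratio": mf / (lf + hf),
    }

# Example: IBIs (s) would normally come from the PPG beat detector.
ibi = np.random.normal(0.85, 0.05, size=240)  # roughly 3.4 min of synthetic beats
print(tachogram_features(ibi))
```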
ECG Accuracy (%) | SVM | DT | NB | KNN | RF
---|---|---|---|---|---
Arousal | 68.75 | 66.19 | 40.06 | 66.76 | 65.91
Valence | 58.81 | 55.40 | 54.55 | 54.83 | 54.55
Dimensional | 29.26 | 30.11 | 17.33 | 32.10 | 31.82
PPG Accuracy (%) | SVM | DT | NB | KNN | RF
---|---|---|---|---|---
Arousal | 64.61 | 63.64 | 63.31 | 60.06 | 67.30
Valence | 64.94 | 53.25 | 54.22 | 54.55 | 57.07
Dimensional | 37.01 | 36.69 | 25.00 | 34.42 | 40.00
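All five classifiers compared above are available in scikit-learn (Pedregosa et al.). The sketch below shows what such a comparison could look like; the feature matrix X and label vector y are assumed to come from the preceding extraction step, and the 70/30 stratified split and default hyperparameters are illustrative rather than the authors' exact protocol.

```python
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier

CLASSIFIERS = {
    "SVM": SVC(),
    "DT": DecisionTreeClassifier(),
    "NB": GaussianNB(),
    "KNN": KNeighborsClassifier(),
    "RF": RandomForestClassifier(),
}

def compare_classifiers(X, y, test_size=0.3, seed=42):
    """Train each classifier on the same split and return accuracy in %."""
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=test_size, random_state=seed, stratify=y)
    scores = {}
    for name, clf in CLASSIFIERS.items():
        model = make_pipeline(StandardScaler(), clf)  # scale features, then fit
        model.fit(X_tr, y_tr)
        scores[name] = 100.0 * model.score(X_te, y_te)
    return scores

# Usage (illustrative): scores = compare_classifiers(X, y); print(scores)
```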
Parameter | Setting
---|---
Input nodes | 17,920
Hidden layers | 33
Activation function (hidden layers) | ReLU
Activation function (output layer) | Sigmoid
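A minimal Keras sketch of a network matching these parameters is given below. The table does not specify the width of the hidden layers, the optimizer, or the loss, so the 64-unit width, Adam, and binary cross-entropy are assumptions; for the multi-class "dimensional" task the output layer would use softmax instead of a single sigmoid unit.

```python
from tensorflow.keras import layers, models

def build_model(input_nodes=17920, hidden_layers=33, width=64):
    """MLP with the table's settings: ReLU hidden layers, sigmoid output."""
    model = models.Sequential()
    # First of the 33 hidden layers, taking the flattened signal window as input.
    model.add(layers.Dense(width, activation="relu", input_shape=(input_nodes,)))
    for _ in range(hidden_layers - 1):
        model.add(layers.Dense(width, activation="relu"))
    model.add(layers.Dense(1, activation="sigmoid"))  # binary label, e.g., high/low arousal
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_model()
model.summary()
```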
ECG | DL Accuracy (%)
---|---
Arousal | 63.50
Valence | 53.26
Dimensional | 57.50
PPG | DL Accuracy (%)
---|---
Arousal | 34.63
Valence | 56.80
Dimensional | 24.35
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Ab. Aziz, N.A.; K., T.; Ismail, S.N.M.S.; Hasnul, M.A.; Ab. Aziz, K.; Ibrahim, S.Z.; Abd. Aziz, A.; Raja, J.E. Asian Affective and Emotional State (A2ES) Dataset of ECG and PPG for Affective Computing Research. Algorithms 2023, 16, 130. https://doi.org/10.3390/a16030130