Emotion Recognition Using EEG Signals through the Design of a Dry Electrode Based on the Combination of Type 2 Fuzzy Sets and Deep Convolutional Graph Networks
Figure 1. The proposed electrode design and customized deep architecture provide a general framework for classifying two types of emotion: positive and negative.
Figure 2. Copper bars of various diameters.
Figure 3. Electrode copper bases, machined and ready for sintering.
Figure 4. Powdered samples inside the sintering furnace.
Figure 5. Samples taken from the furnace, with a copper base and a silver top.
Figure 6. The amplifier used in the experiments with the proposed dry electrode.
Figure 7. Recording of EEG signals from one of the participants using the proposed dry electrode (three electrodes, FP1, PZ, and FZ, were used for recording, as shown in the image).
Figure 8. Musical stimulation scenario used to evoke positive and negative emotions.
Figure 9. The proposed deep network, combined with type-2 fuzzy sets, for automatic emotion recognition.
Figure 10. Details of each layer in the proposed pipeline.
Figure 11. Electrode sample at the SEM imaging point.
Figure 12. The silver powder used in the annealing procedure, together with the EDXA instrument: (a) powder particles; (b) EDX results.
Figure 13. EDXA image of the silver block formed on the copper base after sintering of the silver powder: (a) sintered silver powder; (b) EDX analysis.
Figure 14. Optimization of the number of layers and the computational efficiency of the proposed DFCGN network.
Figure 15. Polynomial values considered for the proposed DFCGN network.
Figure 16. Comparison of the error and accuracy of the proposed dry electrode with dry and wet electrodes from different brands (the proposed dry electrode, the commercial dry electrode, and the wet electrode are shown in blue, red, and yellow, respectively).
Figure 17. ROC curves for the evaluated electrodes (from left: proposed dry electrode, wet electrode, and commercial dry electrode).
Figure 18. t-SNE diagram of the first and last layers of the proposed DFCGN model for the two classes of positive and negative emotion, based on signals recorded with the proposed dry electrode.
Figure 19. The proposed network's performance compared with other networks.
Figure 20. The effect of environmental noise on the proposed dry electrode and a commercial dry electrode.
Abstract
1. Introduction
- The design and manufacture of an effective dry electrode for long-term recording of EEG signals.
- The creation of an EEG database, recorded with the proposed dry electrode, using music-based stimulation.
- A customized architecture combining type-2 fuzzy sets and deep graph convolutional networks for automatic emotion recognition.
- State-of-the-art performance in classifying positive and negative emotional classes compared with recent research.
2. Related Works
2.1. Recent Research in the Field of Automatic Recognition of Emotions
2.2. Recent Research in the Field of Dry Electrode Design and Manufacturing
3. Materials and Methods
3.1. Overview of Graph Convolutional Networks
3.2. Overview of Type-2 Fuzzy Sets
4. Proposed Model
4.1. Construction and Design of Dry Electrode
4.2. Data Collection
4.3. Pre-Processing of EEG Data
4.4. Architecture
4.5. Customized Architecture
4.6. Training, Validation, and Test Series
5. Experimental Results
5.1. Optimization of the Proposed Dry Electrode
5.2. Optimization of the Proposed Model
5.3. Simulation Results
5.4. Comparison with Previous Algorithms and Studies
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Conflicts of Interest
References
- Agung, E.S.; Rifai, A.P.; Wijayanto, T. Image-based facial emotion recognition using convolutional neural network on emognition dataset. Sci. Rep. 2024, 14, 14429. [Google Scholar] [CrossRef] [PubMed]
- Alsaadawı, H.F.T.; Daş, R. Multimodal Emotion Recognition Using Bi-LG-GCN for MELD Dataset. Balk. J. Electr. Comput. Eng. 2024, 12, 36–46. [Google Scholar] [CrossRef]
- Alslaity, A.; Orji, R. Machine learning techniques for emotion detection and sentiment analysis: Current state, challenges, and future directions. Behav. Inf. Technol. 2024, 43, 139–164. [Google Scholar] [CrossRef]
- Deshmukh, S.; Chaudhary, S.; Gayakwad, M.; Kadam, K.; More, N.S.; Bhosale, A. Advances in Facial Emotion Recognition: Deep Learning Approaches and Future Prospects. In 2024 MIT Art, Design and Technology School of Computing International Conference (MITADTSoCiCon), Pune, India, 25–27 April 2024; IEEE: Piscataway, NJ, USA, 2024; pp. 1–3. [Google Scholar]
- Farashi, S.; Bashirian, S.; Jenabi, E.; Razjouyan, K. Effectiveness of virtual reality and computerized training programs for enhancing emotion recognition in people with autism spectrum disorder: A systematic review and meta-analysis. Int. J. Dev. Disabil. 2024, 70, 110–126. [Google Scholar] [CrossRef]
- Mohajelin, F.; Sheykhivand, S.; Shabani, A.; Danishvar, M.; Danishvar, S.; Lahijan, L.Z. Automatic Recognition of Multiple Emotional Classes from EEG Signals through the Use of Graph Theory and Convolutional Neural Networks. Sensors 2024, 24, 5883. [Google Scholar] [CrossRef]
- Li, J.; Washington, P. A comparison of personalized and generalized approaches to emotion recognition using consumer wearable devices: Machine learning study. JMIR AI 2024, 3, e52171. [Google Scholar] [CrossRef]
- Liu, H.; Lou, T.; Zhang, Y.; Wu, Y.; Xiao, Y.; Jensen, C.S.; Zhang, D. EEG-based multimodal emotion recognition: A machine learning perspective. IEEE Trans. Instrum. Meas. 2024, 73, 3369130. [Google Scholar] [CrossRef]
- Peng, Z.; Fu, R.Z.; Chen, H.P.; Takahashi, K.; Tanioka, Y.; Roy, D. AI Applications in Emotion Recognition: A Bibliometric Analysis. SHS Web Conf. 2024, 194, 03005. [Google Scholar] [CrossRef]
- Ferreira, L.G.; Pimenta, T.C. Dry Electrodes for Capturing Brain Electrical Signals. In Proceedings of the 2024 31st International Conference on Mixed Design of Integrated Circuits and System (MIXDES), Gdansk, Poland, 27–28 June 2024; IEEE: Piscataway, NJ, USA, 2024; pp. 290–293. [Google Scholar]
- He, C.; Chen, Y.-Y.; Phang, C.-R.; Stevenson, C.; Chen, I.-P.; Jung, T.-P.; Ko, L.-W. Diversity and suitability of the state-of-the-art wearable and wireless EEG systems review. IEEE J. Biomed. Health Inform. 2023, 27, 3830–3843. [Google Scholar] [CrossRef]
- Jackovatz, C.O. New EEG Electrode Design Supporting Improved Comfort and Measurement Reliability for Active Users; University of Georgia: Athens, GA, USA, 2024. [Google Scholar]
- Kleeva, D.; Ninenko, I.; Lebedev, M.A. Resting-state EEG recorded with gel-based vs. consumer dry electrodes: Spectral characteristics and across-device correlations. Front. Neurosci. 2024, 18, 1326139. [Google Scholar] [CrossRef]
- Oh, J.; Nam, K.-W.; Kim, W.-J.; Kang, B.-H.; Park, S.-H. Flexible Dry Electrode Based on a Wrinkled Surface That Uses Carbon Nanotube/Polymer Composites for Recording Electroencephalograms. Materials 2024, 17, 668. [Google Scholar] [CrossRef] [PubMed]
- Pieter, B.; Victor-Paul, G.; Gilles, D.; Nicolas, G.; Alain, D.; Antoine, N. Integration of Sustainability in the Design Process of Medical Devices–Application to Dry Electrodes. In Proceedings of the 2024 IEEE International Symposium on Medical Measurements and Applications (MeMeA), Eindhoven, The Netherlands, 26–28 June 2024; IEEE: Piscataway, NJ, USA, 2024; pp. 1–6. [Google Scholar]
- Sheykhivand, S.; Mousavi, Z.; Rezaii, T.Y.; Farzamnia, A. Recognizing emotions evoked by music using CNN-LSTM networks on EEG signals. IEEE Access 2020, 8, 139332–139345. [Google Scholar] [CrossRef]
- Baradaran, F.; Farzan, A.; Danishvar, S.; Sheykhivand, S. Customized 2D CNN Model for the Automatic Emotion Recognition Based on EEG Signals. Electronics 2023, 12, 2232. [Google Scholar] [CrossRef]
- Baradaran, F.; Farzan, A.; Danishvar, S.; Sheykhivand, S. Automatic Emotion Recognition from EEG Signals Using a Combination of Type-2 Fuzzy and Deep Convolutional Networks. Electronics 2023, 12, 2216. [Google Scholar] [CrossRef]
- Yang, L.; Wang, Y.; Yang, X.; Zheng, C. Stochastic weight averaging enhanced temporal convolution network for EEG-based emotion recognition. Biomed. Signal Process. Control 2023, 83, 104661. [Google Scholar] [CrossRef]
- Hussain, M.; AboAlSamh, H.A.; Ullah, I. Emotion recognition system based on two-level ensemble of deep-convolutional neural network models. IEEE Access 2023, 11, 16875–16895. [Google Scholar] [CrossRef]
- Khubani, J.; Kulkarni, S. Inventive deep convolutional neural network classifier for emotion identification in accordance with EEG signals. Soc. Netw. Anal. Min. 2023, 13, 34. [Google Scholar] [CrossRef]
- Peng, G.; Zhao, K.; Zhang, H.; Xu, D.; Kong, X. Temporal relative transformer encoding cooperating with channel attention for EEG emotion analysis. Comput. Biol. Med. 2023, 154, 106537. [Google Scholar] [CrossRef] [PubMed]
- Xu, J.; Qian, W.; Hu, L.; Liao, G.; Tian, Y. EEG decoding for musical emotion with functional connectivity features. Biomed. Signal Process. Control 2024, 89, 105744. [Google Scholar] [CrossRef]
- Alotaibi, F.M. An AI-inspired spatio-temporal neural network for EEG-based emotional status. Sensors 2023, 23, 498. [Google Scholar] [CrossRef]
- Qiao, Y.; Mu, J.; Xie, J.; Hu, B.; Liu, G. Music emotion recognition based on temporal convolutional attention network using EEG. Front. Hum. Neurosci. 2024, 18, 1324897. [Google Scholar] [CrossRef] [PubMed]
- Yokus, M.A. Multiplexed Biochemical and Biophysical Sensing Systems for Monitoring Human Physiology; North Carolina State University: Raleigh, NC, USA, 2020. [Google Scholar]
- Jiang, Y.; Liu, L.; Chen, L.; Zhang, Y.; He, Z.; Zhang, W.; Zhao, J.; Lu, D.; He, J.; Zhu, H. Flexible and stretchable dry active electrodes with PDMS and silver flakes for bio-potentials sensing systems. IEEE Sens. J. 2021, 21, 12255–12268. [Google Scholar] [CrossRef]
- Gong, X.-B.; You, S.-J.; Wang, X.-H.; Zhang, J.-N.; Gan, Y.; Ren, N.-Q. A novel stainless steel mesh/cobalt oxide hybrid electrode for efficient catalysis of oxygen reduction in a microbial fuel cell. Biosens. Bioelectron. 2014, 55, 237–241. [Google Scholar] [CrossRef] [PubMed]
- Krachunov, S.; Casson, A.J. 3D printed dry EEG electrodes. Sensors 2016, 16, 1635. [Google Scholar] [CrossRef]
- Hsieh, J.-C.; He, W.; Venkatraghavan, D.; Koptelova, V.B.; Ahmad, Z.J.; Pyatnitskiy, I.; Wang, W.; Jeong, J.; Tang, K.K.W.; Harmeier, C. Design of an injectable, self-adhesive, and highly stable hydrogel electrode for sleep recording. Device 2024, 2, 100182. [Google Scholar] [CrossRef] [PubMed]
- Tong, A.; Perera, P.; Sarsenbayeva, Z.; McEwan, A.; De Silva, A.C.; Withana, A. Fully 3D-printed dry EEG electrodes. Sensors 2023, 23, 5175. [Google Scholar] [CrossRef]
- Wang, Z.; Ding, Y.; Yuan, W.; Chen, H.; Chen, W.; Chen, C. Active Claw-Shaped Dry Electrodes for EEG Measurement in Hair Areas. Bioengineering 2024, 11, 276. [Google Scholar] [CrossRef]
- Goh, T.L.; Peh, L.-S. WalkingWizard—A truly wearable EEG headset for everyday use. ACM Trans. Comput. Healthc. 2024, 5, 1–38. [Google Scholar] [CrossRef]
- Ghoreishi, E.; Abolhassani, B.; Huang, Y.; Acharya, S.; Lou, W.; Hou, Y.T. Cyrus: A DRL-Based Puncturing Solution to URLLC/eMBB Multiplexing in O-RAN. In Proceedings of the 2024 33rd International Conference on Computer Communications and Networks (ICCCN), Big Island, HI, USA, 29–31 July 2024; IEEE: Piscataway, NJ, USA, 2024; pp. 1–9. [Google Scholar]
- Farrokhi, S.; Dargie, W.; Poellabauer, C. Human Activity Recognition Based on Wireless Electrocardiogram and Inertial Sensors. IEEE Sens. J. 2024, 24, 6490–6499. [Google Scholar] [CrossRef]
- EskandariNasab, M.; Raeisi, Z.; Lashaki, R.A.; Najafi, H. A GRU–CNN model for auditory attention detection using microstate and recurrence quantification analysis. Sci. Rep. 2024, 14, 8861. [Google Scholar] [CrossRef]
- Dargie, W.; Farrokhi, S.; Poellabauer, C. Identification of Persons Based on Electrocardiogram and Motion Data. TechRxiv 2024. [Google Scholar] [CrossRef]
- Shavandi, M.; Taghavi, A. Maps preserving n-tuple A*B − B*A derivations on factor von Neumann algebras. Publ. Inst. Math. 2023, 113, 131–140. [Google Scholar] [CrossRef]
- Shavandi, M.; Taghavi, A. Non-linear triple product A*B − B*A derivations on *-algebras. Surv. Math. Appl. 2024, 19, 67–78. [Google Scholar]
- Zhang, S.; Tong, H.; Xu, J.; Maciejewski, R. Graph convolutional networks: A comprehensive review. Comput. Soc. Netw. 2019, 6, 1–23. [Google Scholar] [CrossRef]
- Rahmani, M.; Mohajelin, F.; Khaleghi, N.; Sheykhivand, S.; Danishvar, S. An Automatic Lie Detection Model Using EEG Signals Based on the Combination of Type 2 Fuzzy Sets and Deep Graph Convolutional Networks. Sensors 2024, 24, 3598. [Google Scholar]
- Kumar, K.V.; Sathish, A. Medical image fusion based on type-2 fuzzy sets with teaching learning based optimization. Multimed. Tools Appl. 2024, 83, 33235–33262. [Google Scholar] [CrossRef]
- Güven, Y.; Köklu, A.; Kumbasar, T. Zadeh’s Type-2 Fuzzy Logic Systems: Precision and High-Quality Prediction Intervals. In Proceedings of the 2024 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), Yokohama, Japan, 30 June–5 July 2024; IEEE: Piscataway, NJ, USA, 2024; pp. 1–6. [Google Scholar]
- Habibi, A.; Damasio, A. Music, feelings, and the human brain. Psychomusicology Music. Mind Brain 2014, 24, 92. [Google Scholar] [CrossRef]
- Gertler, S.; Otterstrom, N.T.; Gehl, M.; Starbuck, A.L.; Dallo, C.M.; Pomerene, A.T.; Trotter, D.C.; Lentine, A.L.; Rakich, P.T. Narrowband microwave-photonic notch filters using Brillouin-based signal transduction in silicon. Nat. Commun. 2022, 13, 1947. [Google Scholar] [CrossRef]
- Mahata, S.; Herencsar, N.; Kubanek, D. Optimal approximation of fractional-order Butterworth filter based on weighted sum of classical Butterworth filters. IEEE Access 2021, 9, 81097–81114. [Google Scholar] [CrossRef]
- Zhang, Y.; Yu, Y.; Wang, B.; Shen, H.; Lu, G.; Liu, Y.; Zeng, L.-L.; Hu, D. Graph learning with co-teaching for EEG-based motor imagery recognition. IEEE Trans. Cogn. Dev. Syst. 2022, 15, 1722–1731. [Google Scholar] [CrossRef]
- Abdullah, D.M.; Abdulazeez, A.M. Machine learning applications based on SVM classification a review. Qubahan Acad. J. 2021, 1, 81–90. [Google Scholar] [CrossRef]
- Desai, M.; Shah, M. An anatomization on breast cancer detection and diagnosis employing multi-layer perceptron neural network (MLP) and Convolutional neural network (CNN). Clin. eHealth 2021, 4, 1–11. [Google Scholar] [CrossRef]
- Cunningham, P.; Delany, S.J. k-nearest neighbour classifiers-a tutorial. ACM Comput. Surv. (CSUR) 2021, 54, 128. [Google Scholar] [CrossRef]
- Liu, Y.; Yu, Y.; Ye, Z.; Li, M.; Zhang, Y.; Zhou, Z.; Hu, D.; Zeng, L.-L. Fusion of spatial, temporal, and spectral EEG signatures improves multilevel cognitive load prediction. IEEE Trans. Hum.-Mach. Syst. 2023, 53, 357–366. [Google Scholar] [CrossRef]
- Jia, Z.; Lin, Y.; Cai, X.; Chen, H.; Gou, H.; Wang, J. SST-EmotionNet: Spatial-Spectral-Temporal Based Attention 3D Dense Network for EEG Emotion Recognition. In Proceedings of the 28th ACM International Conference on Multimedia, Melbourne, VIC, Australia, 12–18 October 2020; pp. 2909–2917. [Google Scholar]
- Koonce, B. ResNet 50. In Convolutional Neural Networks with Swift for Tensorflow: Image Recognition and Dataset Categorization; Apress: New York, NY, USA, 2021; pp. 63–72. [Google Scholar]
- Soria, X.; Sappa, A.; Humanante, P.; Akbarinia, A. Dense extreme inception network for edge detection. Pattern Recognit. 2023, 139, 109461. [Google Scholar] [CrossRef]
- Vaziri, A.Y.; Makkiabadi, B.; Samadzadehaghdam, N. EEGg: Generating Synthetic EEG Signals in Matlab Environment. Front. Biomed. Technol. 2023, 10, 370–381. [Google Scholar]
Emotion | N1 | P1 | N2 | P2 | P3 | N3 | N4 | P4 | N5 | P5 |
---|---|---|---|---|---|---|---|---|---|---|
Music played | Esfehani | 6&8 | Homayoun | 6&8 | 6&8 | Afshari | Esfehani | 6&8 | Dashti | 6&8 |
Layer | Weight Tensor | Bias | Parameters |
---|---|---|---|
GConv1 | (Q1, 76,800, 76,800) | 76,800 | 5,898,240,000 × Q1 + 76,800 |
GConv2 | (Q2, 76,800, 38,400) | 38,400 | 2,949,120,000 × Q2 + 38,400 |
GConv3 | (Q3, 38,400, 19,200) | 19,200 | 737,280,000 × Q3 + 19,200 |
GConv4 | (Q4, 19,200, 9600) | 9600 | 184,320,000 × Q4 + 9600 |
GConv5 | (Q5, 9600, 4800) | 4800 | 46,080,000 × Q5 + 4800 |
Flattening Layer | 4800 | 2 | 9600 |
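The per-layer parameter counts in the table above follow directly from the layer shapes: each GConv layer stores one (in × out) weight matrix per polynomial order Q_i, plus one bias per output unit. A short sketch reproduces the table's arithmetic (the helper name `gconv_params` is ours, not from the paper):

```python
def gconv_params(n_in: int, n_out: int) -> tuple[int, int]:
    """Weights per polynomial order and bias terms for one GConv layer."""
    return n_in * n_out, n_out

# Layer shapes taken from the table; Q_i stay symbolic.
layers = [
    ("GConv1", 76_800, 76_800),
    ("GConv2", 76_800, 38_400),
    ("GConv3", 38_400, 19_200),
    ("GConv4", 19_200, 9_600),
    ("GConv5", 9_600, 4_800),
]

for name, n_in, n_out in layers:
    w, b = gconv_params(n_in, n_out)
    print(f"{name}: {w:,} x Q + {b:,}")
    # e.g. GConv1: 5,898,240,000 x Q + 76,800 -- matching the table row
```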
Parameters | Values | Optimal Value |
---|---|---|
Number of Gconv | 2, 3, 4, 5, 6, 7 | 5 |
Batch Size in DFCGN | 8, 16, 32 | 16 |
Activation function | ReLU, Leaky-ReLU, TF-2 | TF-2 |
Learning Rate in DFCGN | 0.1, 0.01, 0.001, 0.0001, 0.00001 | 0.0001 |
Dropout Rate | 0.1, 0.2, 0.3 | 0.2 |
Weight of optimizer | ||
Error function | MSE, Cross Entropy | Cross Entropy |
Optimizer in DFCGN | Adam, SGD, Adadelta, Adamax | SGD |
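The candidate values above define an exhaustive search space. A minimal sketch of its enumeration (model training itself is omitted; the dictionary keys are our labels for the table rows, and the "Weight of optimizer" row is left out because its values are not listed):

```python
import itertools

# Candidate values from the hyperparameter table.
search_space = {
    "n_gconv": [2, 3, 4, 5, 6, 7],
    "batch_size": [8, 16, 32],
    "activation": ["ReLU", "Leaky-ReLU", "TF-2"],
    "learning_rate": [1e-1, 1e-2, 1e-3, 1e-4, 1e-5],
    "dropout": [0.1, 0.2, 0.3],
    "loss": ["MSE", "CrossEntropy"],
    "optimizer": ["Adam", "SGD", "Adadelta", "Adamax"],
}

# Every combination of the listed values is one candidate configuration.
combos = list(itertools.product(*search_space.values()))
print(len(combos))  # 6 * 3 * 3 * 5 * 3 * 2 * 4 = 6480
```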
Measurement Index | Accuracy | Sensitivity | Precision | Specificity | Kappa Coefficient |
---|---|---|---|---|---|
Proposed Dry Electrode | 99.2% | 98.7% | 99.4% | 98.4% | 0.9 |
Wet Electrode | 98.0% | 96.4% | 98.7% | 99.2% | 0.8 |
Dry Electrode | 90.1% | 88.7% | 91.3% | 93.8% | 0.7 |
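The indices reported above are all derived from a binary confusion matrix. An illustrative computation (the TP/TN/FP/FN counts passed in below are hypothetical, not the paper's actual test results):

```python
def metrics(tp: int, tn: int, fp: int, fn: int):
    """Accuracy, sensitivity, precision, specificity, and Cohen's kappa
    from binary confusion-matrix counts."""
    total = tp + tn + fp + fn
    accuracy = (tp + tn) / total
    sensitivity = tp / (tp + fn)      # recall on the positive class
    precision = tp / (tp + fp)
    specificity = tn / (tn + fp)
    # Cohen's kappa: agreement beyond chance expectation p_e.
    p_e = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / total**2
    kappa = (accuracy - p_e) / (1 - p_e)
    return accuracy, sensitivity, precision, specificity, kappa

# Hypothetical counts for a near-perfect two-class classifier.
print(metrics(495, 497, 3, 5))
```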
Research | Algorithms | ACC (%) |
---|---|---|
Sheykhivand et al. [16] | CNN + LSTM | 97 |
Baradaran et al. [17] | DCNN | 98 |
Baradaran et al. [18] | Type 2 Fuzzy + CNN | 98 |
Yang et al. [19] | SITCN | 95 |
Hussain et al. [20] | LP-1D-CNN | 98.43 |
Khubani et al. [21] | DCNN | 97.12 |
Peng et al. [22] | Temporal Relative (TR) Encoding | 95.58 |
Xu et al. [23] | Functional Connectivity Features | 97 |
Alotaibi et al. [24] | GoogLeNet DNN | 96.95 |
Qiao et al. [25] | CNN-SA-BiLSTM | 96.43 |
Our Model | New Dry Electrode + DFCGN Network | 99.2 |
Method | Feature Learning (ACC) | Handcrafted Features (ACC) |
---|---|---|
KNN | 65.1% | 81.8% |
SVM | 72.1% | 88.1% |
CNN | 92.7% | 71.6% |
MLP | 70.5% | 87.6% |
Proposed model (DFCGN) | 99.2% | 78.8% |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Mounesi Rad, S.; Danishvar, S. Emotion Recognition Using EEG Signals through the Design of a Dry Electrode Based on the Combination of Type 2 Fuzzy Sets and Deep Convolutional Graph Networks. Biomimetics 2024, 9, 562. https://doi.org/10.3390/biomimetics9090562