Search Results (177)

Search Parameters:
Keywords = wearable EEG

22 pages, 2293 KiB  
Article
Novel Perspectives for Sensory Analysis Applied to Piperaceae and Aromatic Herbs: A Pilot Study
by Isabella Taglieri, Alessandro Tonacci, Guido Flamini, Pierina Díaz-Guerrero, Roberta Ascrizzi, Lorenzo Bachi, Giorgia Procissi, Lucia Billeci and Francesca Venturi
Foods 2025, 14(1), 110; https://doi.org/10.3390/foods14010110 - 3 Jan 2025
Abstract
Spices and aromatic herbs are important components of everyday nutrition in many countries and cultures, thanks to their ability to enhance the flavor of many dishes and to convey significant emotional contributions in their own right. Indeed, spices and aromatic herbs deserve consideration not only for their well-known value as antimicrobial agents and flavor enhancers, but also, thanks to their olfactory and gustatory spectrum, as drivers that stimulate consumers’ memories and, even more strongly, their emotions. Given these unique characteristics, spices and aromatic herbs have caught the attention of consumer scientists and sensory-analysis experts, who have evaluated them with semi-quantitative approaches and obtained interesting evidence. As a first step in this pilot study, each studied botanical, belonging to Piperaceae or to the aromatic herbs, was subjected to headspace solid-phase micro-extraction (HS-SPME) coupled with gas chromatography–mass spectrometry (GC-MS) analysis to assess its spontaneous volatile emission, representing the complex chemical pattern encountered by the consumer’s olfactory perception. Furthermore, the present investigation, performed on 12 individuals, merged typical sensory analysis with emotional data collection and an innovative study of Autonomic and Central Nervous System activation in consumers, performed using wearable technologies and related signal processing. Beyond demonstrating the feasibility of the approach, the results confirmed, in terms of both emotional responses and biomedical signals, the significant emotional potential of spices and aromatic herbs, most of which featured an overall positive valence, albeit with inter-subject variation. Future investigations should increase the number of volunteers evaluated with this approach, to draw more stable conclusions and to attempt a customization of product preferences based on both implicit and explicit sensory responses.
(This article belongs to the Special Issue Feature Review on Food Nutrition)
Figures:
Figure 1: Olfactory wheel preliminarily developed by the consensus panel.
Figure 2: (a) Olfactory profile of Piperaceae. (b) Olfactory profile of herbs.
Figure 3: Whole spectrum of emotions/moods evoked by spices.
Figure 4: (a) Specific pattern of emotions of Piperaceae. (b) Specific pattern of emotions of herbs.
Figure 5: Group distribution of the most significant biomedical signal features depending on perceived pleasantness ((a) total GSR, (b) GSR phasic peak, (c) CSI, (d) power of the temporal region between 18 and 25 Hz).
32 pages, 11502 KiB  
Article
DETEC-ADHD: A Data-Driven Web App for Early ADHD Detection Using Machine Learning and Electroencephalography
by Ismael Santarrosa-López, Giner Alor-Hernández, Maritza Bustos-López, Jonathan Hernández-Capistrán, Laura Nely Sánchez-Morales, José Luis Sánchez-Cervantes and Humberto Marín-Vega
Big Data Cogn. Comput. 2025, 9(1), 3; https://doi.org/10.3390/bdcc9010003 - 30 Dec 2024
Abstract
Attention Deficit Hyperactivity Disorder (ADHD) diagnosis is often challenging due to subjective assessments and symptom variability, which can delay accurate detection and treatment. To address these limitations, this study introduces DETEC-ADHD, a web-based application that combines machine learning (ML) techniques with multi-source data to enhance diagnostic accuracy. Unlike traditional approaches, DETEC-ADHD primarily utilizes extensive personal, medical, and psychological information for its initial classification. DETEC-ADHD further refines diagnoses by identifying ADHD subtypes (inattentive, hyperactive, combined) through theta/beta wave ratio analysis from EEG data, offering neurophysiological insights that complement its classification process. Logistic Regression, selected for its validated accuracy and reliability, served as the ML model for the app. The case studies demonstrated DETEC-ADHD’s effectiveness, achieving 100% accuracy in children and 90% in adults. By integrating diverse data sources with real-time EEG analysis, DETEC-ADHD provides a scalable, cost-effective, and accessible solution for ADHD detection and subtype identification, addressing diagnostic challenges and supporting healthcare providers, particularly in resource-limited environments.
Figures:
Figure 1: Data collection and synchronization process workflow.
Figure 2: ADHD classification process workflow.
Figure 3: ADHD prognosis process workflow.
Figure 4: Architecture of DETEC-ADHD, detailing the workflow: (1) web client manages tests and results; (2) wearable data sent for processing; (3) data sent to ML models; (4) ADHD classification performed; (5-7) results stored and managed; (8) monitoring ensures accuracy; (9) results visualized; (10) historical data displayed.
Figure 5: DETEC-ADHD methodological architecture for ADHD diagnosis. The workflow consists of three layers: the Input Layer, which collects medical history, test results, and expert opinions; the Processing Layer, where data and EEG analysis results are processed through a machine learning model; and the Output Layer, which integrates the results to provide a comprehensive final diagnosis.
Figure 6: Ridgeline plot of the distributions of absolute errors for the predictive models, providing a comparative overview of their performance variability.
Figure 7: Comparative plot of key metrics of ML models trained with the HYPERAKTIV dataset.
Figure 8: Convergence plot for Logistic Regression, showing the decrease in log loss over iterations.
Figure 9: ROC curves of the 10 models assessed on the HYPERAKTIV dataset.
Figure 10: Impact of feature removal on F1 score (ablation experiment).
Figure 11: A case study for ADHD detection in children.
Figure 12: A case study for ADHD detection in adults.
Figure 13: DETEC-ADHD main page.
Figure 14: DETEC-ADHD example of real-time test performance.
Figure 15: Patient Profile page (new test report added).
Figure 16: Test Report screen.
Figure 17: EEG results report page.
Figure 18: Comparison of EEG signal patterns: ADHD child vs. control child.
19 pages, 10977 KiB  
Article
Comparison of EEG Signal Spectral Characteristics Obtained with Consumer- and Research-Grade Devices
by Dmitry Mikhaylov, Muhammad Saeed, Mohamed Husain Alhosani and Yasser F. Al Wahedi
Sensors 2024, 24(24), 8108; https://doi.org/10.3390/s24248108 - 19 Dec 2024
Abstract
Electroencephalography (EEG) has emerged as a pivotal tool in both research and clinical practice due to its non-invasive nature, cost-effectiveness, and ability to provide real-time monitoring of brain activity. Wearable EEG technology opens new avenues for consumer applications, such as mental health monitoring, neurofeedback training, and brain–computer interfaces. However, there is still much to verify and re-examine regarding the functionality of these devices and the quality of the signal they capture, particularly as the field evolves rapidly. In this study, we recorded the resting-state brain activity of healthy volunteers via three consumer-grade EEG devices, namely the PSBD Headband Pro, PSBD Headphones Lite, and Muse S Gen 2, and compared the spectral characteristics of the signals with those recorded via a research-grade Brain Products amplifier (BP) with mirroring montages. The results showed that all devices exhibited higher mean power in the low-frequency bands, which is characteristic of dry-electrode technology. The PSBD Headband matched BP most precisely among the examined devices. The PSBD Headphones displayed a moderate correspondence with BP and signal quality issues in the central group of electrodes. Muse demonstrated the poorest signal quality, with extremely low alignment with BP. Overall, this study underscores the importance of considering device-specific design constraints and emphasizes the need for further validation to ensure the reliability and accuracy of wearable EEG devices.
(This article belongs to the Section Biomedical Sensors)
Figures:
Figure 1: PSD plots for the signals obtained via PSBD-band, BP-band (BP-b), BP-headphones (BP-h), and PSBD-headphones (PSBD h-phones) in the open- and closed-eye conditions.
Figure 2: PSD plots for the signals obtained via Muse and BP-band (BP-b) in the open- and closed-eye conditions.
Figure 3: PSD plots for the signals obtained via PSBD-headphones and BP-headphones (BP-h) in the open- and closed-eye conditions.
Figure 4: PSD plots for the signals obtained via PSBD-band and BP-band (BP) in the open- and closed-eye conditions.
Figure 5: Box plots illustrating differences in the PSD values for the delta, theta, alpha, low beta, high beta, and gamma rhythms obtained with Muse and BP-band in the closed-/open-eye conditions at the frontal site.
Figure 6: Box plots illustrating differences in the PSD values for the delta, theta, alpha, low beta, high beta, and gamma rhythms obtained with PSBD-band, BP-band, PSBD-headphones, and BP-headphones in the closed-/open-eye conditions at the temporal site.
Figure 7: Box plots illustrating differences in the PSD values for the delta, theta, alpha, low beta, high beta, and gamma rhythms obtained with PSBD-band and BP-band in the closed-/open-eye conditions at the occipital site.
Figure 8: Box plots illustrating differences in the PSD values for the delta, theta, alpha, low beta, high beta, and gamma rhythms obtained with PSBD-headphones and BP-headphones in the closed-/open-eye conditions at the central site.
Figure 9: Scatter plots depicting power values obtained with Muse and BP-band at the frontal site.
Figure 10: Scatter plots depicting power values obtained with PSBD-band and BP-band at the temporal site.
Figure 11: Scatter plots depicting power values obtained with PSBD-band and BP-band at the occipital site.
Figure 12: Scatter plots depicting power values obtained with PSBD-headphones and BP-headphones at the temporal site.
Figure 13: Scatter plots depicting power values obtained with PSBD-headphones and BP-headphones at the central site.
13 pages, 2380 KiB  
Article
Exploring the Utility of the Muse Headset for Capturing the N400: Dependability and Single-Trial Analysis
by Hannah Begue Hayes and Cyrille Magne
Sensors 2024, 24(24), 7961; https://doi.org/10.3390/s24247961 - 13 Dec 2024
Abstract
Consumer-grade EEG devices, such as the InteraXon Muse 2 headband, present a promising opportunity to enhance the accessibility and inclusivity of neuroscience research. However, their effectiveness in capturing language-related ERP components, such as the N400, remains underexplored. This study thus aimed to investigate the feasibility of using the Muse 2 to measure the N400 effect in a semantic relatedness judgment task. Thirty-seven participants evaluated the semantic relatedness of word pairs while their EEG was recorded using the Muse 2. Single-trial ERPs were analyzed using robust Yuen t-tests and hierarchical linear modeling (HLM) to assess the N400 difference between semantically related and unrelated target words. ERP analyses indicated a significantly larger N400 effect in response to unrelated word pairs over the right frontal electrode. Additionally, dependability estimates suggested acceptable internal consistency for the N400 data. Overall, these findings illustrate the capability of the Muse 2 to reliably measure the N400 effect, reinforcing its potential as a valuable tool for language research. This study highlights the potential of affordable, wearable EEG technology to expand access to brain research by offering a portable way to study language and cognition in diverse populations and settings.
(This article belongs to the Section Biomedical Sensors)
Figures:
Figure 1: Placement of Muse 2 headset electrodes (dark blue) according to the 10–20 International System (AF = anterior frontal, TP = temporoparietal).
Figure 2: Dependability coefficients of equivalence (internal consistency) for mean N400 amplitudes (250–600 ms) in the semantically unrelated (red trace) and semantically related (blue trace) conditions as a function of the number of trials included. Dependability is a unitless measure of consistency, ranging from 0 to +1; the dotted line marks the reliability threshold, set at 0.70.
Figure 3: Grand-average waveforms of the ERPs elicited in the semantically unrelated (red trace) and semantically related (blue trace) conditions at the left (AF7) and right (AF8) frontal electrodes.
Figure 4: Statistical analysis. (a) Modeled amplitudes averaged across subjects for the semantically unrelated (red trace) and semantically related (blue trace) conditions; the shaded area around the beta parameter time courses indicates 95% robust confidence intervals. (b) T-values calculated using paired Yuen t-tests comparing the two conditions at each individual time point; the thick black line at the bottom of the right plot indicates time points that remain significant after correction for multiple comparisons (* p < 0.05).
21 pages, 5220 KiB  
Article
A Closed-Loop Ear-Worn Wearable EEG System with Real-Time Passive Electrode Skin Impedance Measurement for Early Autism Detection
by Muhammad Sheeraz, Abdul Rehman Aslam, Emmanuel Mic Drakakis, Hadi Heidari, Muhammad Awais Bin Altaf and Wala Saadeh
Sensors 2024, 24(23), 7489; https://doi.org/10.3390/s24237489 - 24 Nov 2024
Abstract
Autism spectrum disorder (ASD) is a chronic neurological disorder with the severity directly linked to the diagnosis age. The severity can be reduced if diagnosis and intervention are early (age < 2 years). This work presents a novel ear-worn wearable EEG system designed to aid in the early detection of ASD. Conventional EEG systems often suffer from bulky, wired electrodes, high power consumption, and a lack of real-time electrode–skin interface (ESI) impedance monitoring. To address these limitations, our system incorporates continuous, long-term EEG recording, on-chip machine learning for real-time ASD prediction, and a passive ESI evaluation system. The passive ESI methodology evaluates impedance using the root mean square voltage of the output signal, considering factors like pressure, electrode surface area, material, gel thickness, and duration. The on-chip machine learning processor, implemented in 180 nm CMOS, occupies a minimal 2.52 mm² of active area while consuming only 0.87 µJ of energy per classification. The performance of this ML processor is validated using the Old Dominion University ASD dataset.
(This article belongs to the Section Biomedical Sensors)
Figures:
Figure 1: (a) Conventional Autism Diagnostic Observation Schedule (ADOS-2) for ASD diagnosis, along with (b) the proposed ASD prediction system.
Figure 2: Proposed ear-worn EEG acquisition, ASD prediction, and real-time ESI monitoring system.
Figure 3: AFE schematic along with electrode placement and ESI circuit.
Figure 4: ASD classification processor block diagram.
Figure 5: Feature calculation unit's hardware implementation.
Figure 6: SNN unit's hardware implementation.
Figure 7: Skin layers and the respective equivalent Webster ESI model for wet and dry electrodes under different conditions (no-sweating and sweating).
Figure 8: Configuration schematics and measuring model for approximate ESI.
Figure 9: AFE power profile details.
Figure 10: (a) Fabricated PCB. (b) Completely assembled device. (c) Mobile application for real-time EEG, ESI measurement, and ASD classification.
Figure 11: EEG acquisition (Fp1 and Fp2) for various conditions, including (a) closed eyes, (c) open eyes, and (e) blinking eyes, along with the respective signal spectrograms (b,d,f) for 1 min using wet pre-gelled Ag/AgCl electrodes.
Figure 12: (a) Raw and (b) filtered EEG signal acquired using the developed device over a five-second window using dry electrodes.
Figure 13: EEG AFE characteristic measurement results for (a) CMRR, (b) input-referred noise, and (c) AFE variable gain.
Figure 14: Various types of electrodes along with the effective skin contact area (SCA): (a) 7.2 mm diameter dry electrodes, (b) 8 mm diameter wet electrodes, (c) 10 mm diameter wet electrodes, and (d) conductive paste.
Figure 15: Effect of pressure on ESI with (A) wet disposable pre-gelled Ag/AgCl electrodes and (B) dry reusable Ag/AgCl electrodes.
Figure 16: Effect of skin preparation on dry Ag/AgCl electrodes for ESI.
Figure 17: (a) Effect of skin locations on the electrodes. (b) Wet electrode ESI measurement on the forehead. (c) Dry electrode ESI measurement on the forearm. (d) Dry electrode ESI measurement under pressure.
Figure 18: SNN processor die photo and performance summary.
Figure 19: ASD detection processor measurement results.
34 pages, 3181 KiB  
Review
Commercial Wearables for the Management of People with Autism Spectrum Disorder: A Review
by Jonathan Hernández-Capistrán, Giner Alor-Hernández, Humberto Marín-Vega, Maritza Bustos-López, Laura Nely Sanchez-Morales and Jose Luis Sanchez-Cervantes
Biosensors 2024, 14(11), 556; https://doi.org/10.3390/bios14110556 - 15 Nov 2024
Abstract
Autism Spectrum Disorder (ASD) necessitates comprehensive management, addressing complex challenges in social communication, behavioral regulation, and sensory processing, for which wearable technologies offer valuable tools to monitor and support interventions. Therefore, this review explores recent advancements in wearable technology, categorizing devices based on executive function, psychomotor skills, and the behavioral/emotional/sensory domain, highlighting their potential to improve ongoing management and intervention. To ensure rigor and comprehensiveness, the review employs a PRISMA-based methodology. Specifically, literature searches were conducted across diverse databases, focusing on studies published between 2014 and 2024, to identify the most commonly used wearables in ASD research. Notably, 55.45% of the 110 devices analyzed had an undefined FDA status, 23.6% received 510(k) clearance, and only a small percentage were classified as FDA Breakthrough Devices or were in the submission process. Additionally, approximately 50% of the devices utilized sensors like ECG, EEG, PPG, and EMG, highlighting their widespread use in real-time physiological monitoring. Our work comprehensively analyzes a wide array of wearable technologies, including emerging and advanced devices. While these technologies have the potential to transform ASD management through real-time data collection and personalized interventions, improved clinical validation and user-centered design are essential for maximizing their effectiveness and user acceptance.
(This article belongs to the Special Issue Recent Advances in Wearable Biosensors for Human Health Monitoring)
Figures:
Figure 1: Diagram of the main body functions that can be supported by advanced diagnostic technologies in individuals with ASD.
Figure 2: PRISMA flow diagram of the search strategy.
Figure 3: Frequency of wearable form factor types used. HST: headset; HND: handheld; CPS: capsule; HP: headphone; EB: earbuds; GLS: glasses; CAM: camera; TAG: tag; TSH: t-shirt; WBD: wristband; VST: vest; AUR: auricular; BLT: belt; FP: force plate; HCP: headcap. The remaining form factors (armband, smartband, chest belt, headband, grid, shoe pod, shorts, and sleeve) account for 1.2% each.
Figure 4: FDA status for wearable devices in ASD domains.
Figure 5: Distribution of commercially utilized sensors for executive function management in autism. The chart highlights the predominance of electroencephalography (EEG) over other technologies such as transcranial direct current stimulation (tDCS), electromyography (EMG), and cameras (Cam). It also shows the use of less prevalent sensors such as accelerometers (ACC), electrocardiography (ECG), gyroscopes (Gyro), infrared (IR), functional near-infrared spectroscopy (fNIRS), photoplethysmography (PPG), and virtual reality (VR), indicating the diversity of tools employed in this domain.
Figure 6: Distribution of commercially utilized sensors for psychomotor monitoring in autism. Key sensors include electrocardiography (ECG), photoplethysmography (PPG), temperature (Temp), electromyography (EMG), blood oxygen saturation (SpO2), inertial measurement units (IMU), accelerometers (ACC), infrared sensors (IR), galvanic skin response (GSR), pressure sensors (Press), bioimpedance (BI), blood pressure (BP), cameras (CAM), capnography (CG), electrogoniometers (EG), footswitches (FS), galvanic gauges (GG), strain gauges (ST), and stretch sensors (STCH).
Figure 7: Distribution of commercially utilized sensors in the behavioral/emotional/sensory (BES) domain for autism. Key sensors include cameras (CAM), microphones (Mic), galvanic skin response (GSR), electromyography (EMG), inertial measurement units (IMU), photoplethysmography (PPG), and temperature (Temp).
Figure 8: Frequency of sensors used in wearables. ECG: electrocardiogram; EEG: electroencephalogram; PPG: photoplethysmography; EMG: electromyography; CAM: camera; TEMP: temperature; SpO2: peripheral capillary oxygen saturation; ACC: accelerometer; IMU: inertial measurement unit; IR: infrared; MIC: microphone; GSR: galvanic skin response; tDCS: transcranial direct current stimulation; GYRO: gyroscope; PRESS: pressure. The remaining sensors (bioimpedance, blood pressure, capnograph, electrogoniometer, fNIRS, footswitch, galvanometer, strain, stretch, and virtual reality) account for 0.6% each.
17 pages, 4127 KiB  
Tutorial
Optimizing EEG Signal Integrity: A Comprehensive Guide to Ocular Artifact Correction
by Vincenzo Ronca, Rossella Capotorto, Gianluca Di Flumeri, Andrea Giorgi, Alessia Vozzi, Daniele Germano, Valerio Di Virgilio, Gianluca Borghini, Giulia Cartocci, Dario Rossi, Bianca M. S. Inguscio, Fabio Babiloni and Pietro Aricò
Bioengineering 2024, 11(10), 1018; https://doi.org/10.3390/bioengineering11101018 - 12 Oct 2024
Abstract
Ocular artifacts, including blinks and saccades, pose significant challenges in the analysis of electroencephalographic (EEG) data, often obscuring crucial neural signals. This tutorial provides a comprehensive guide to the most effective methods for correcting these artifacts, with a focus on algorithms designed for both laboratory and real-world settings. We review traditional approaches, such as regression-based techniques and Independent Component Analysis (ICA), alongside more advanced methods like Artifact Subspace Reconstruction (ASR) and deep learning-based algorithms. Through detailed step-by-step instructions and comparative analysis, this tutorial equips researchers with the tools necessary to maintain the integrity of EEG data, ensuring accurate and reliable results in neurophysiological studies. The strategies discussed are particularly relevant for wearable EEG systems and real-time applications, reflecting the growing demand for robust and adaptable solutions in applied neuroscience.
(This article belongs to the Section Biosignal Processing)
Figures:
Figure 1: Raw EEG signal on frontal electrodes showing ocular artifacts, which are easily identified by their larger amplitudes compared to the EEG signal.
Figure 2: Signal composition block diagram.
Figure 3: Raw EEG signal affected by ocular artifacts, visually recognizable as the prominent peaks along the signal trace.
Figure 4: Example of the artifactual component derived from the EEG signal affected by ocular artifacts through the regression-based algorithm.
Figure 5: Overlapped representation of the raw (orange line) and clean (blue line) EEG signals, showing how the algorithm successfully identified and corrected the ocular artifacts.
Figure 6: Block diagram of the principal steps for identifying and correcting ocular artifacts in an EEG signal with a regression-based method.
Figure 7: Example of ICA's performance in removing ocular artifacts. The plots show: (i) the raw EEG from frontal electrodes; (ii) the first five components from ICA, ordered by energy; and (iii) the clean EEG from the same electrodes after removing the artifactual components (specifically, the first and second components). Green rectangles highlight blink patterns in both the raw EEG and the ICA components, while red rectangles indicate saccade patterns. After cleaning, these rectangles no longer contain artifact patterns, demonstrating the effectiveness of the removal process.
Figure 8: Representation of the ASR method's performance in correcting ocular blink artifacts, showing how it identified and corrected the artifacts in the raw EEG signal (green line) to obtain the clean EEG trace (red line).
22 pages, 4691 KiB  
Article
Wearable EEG-Based Brain–Computer Interface for Stress Monitoring
by Brian Premchand, Liyuan Liang, Kok Soon Phua, Zhuo Zhang, Chuanchu Wang, Ling Guo, Jennifer Ang, Juliana Koh, Xueyi Yong and Kai Keng Ang
NeuroSci 2024, 5(4), 407-428; https://doi.org/10.3390/neurosci5040031 - 8 Oct 2024
Abstract
Detecting stress is important for improving human health and potential, because moderate levels of stress may motivate people towards better performance at cognitive tasks, while chronic stress exposure causes impaired performance and health risks. We propose a Brain–Computer Interface (BCI) system to detect stress in the context of high-pressure work environments. The BCI system includes an electroencephalogram (EEG) headband with dry electrodes and an electrocardiogram (ECG) chest belt. We collected EEG and ECG data from 40 participants during two stressful cognitive tasks: the Cognitive Vigilance Task (CVT) and the Multi-Modal Integration Task (MMIT), which we designed. We also recorded self-reported stress levels using the Dundee Stress State Questionnaire (DSSQ). The DSSQ results indicated that performing the MMIT led to significant increases in stress, while performing the CVT did not. Subsequently, we trained two different models to classify stress from non-stress states, one using EEG features, and the other using heart rate variability (HRV) features extracted from the ECG. Our EEG-based model achieved an overall accuracy of 81.0% for MMIT and 77.2% for CVT. However, our HRV-based model only achieved 62.1% accuracy for CVT and 56.0% for MMIT. We conclude that EEG is an effective predictor of stress in the context of stressful cognitive tasks. Our proposed BCI system shows promise in evaluating mental stress in high-pressure work environments, particularly when utilizing an EEG-based BCI.
Figure 1. Cognitive Vigilance Task. (A) A critical number, 32, is shown in this trial. The number 32 is critical because its digits differ by 1; the other numbers are not critical because their digits do not differ by 0 or 1. Comparisons between numbers in different squares (e.g., 61 and 60) are not relevant in this task and do not constitute a critical number. (B) A no-go trial. In no-go trials (all of which included a critical number, here 89), a white star appeared at the bottom left of the screen, and participants were required to wait for the trial to time out. In both screenshots, the red annotations were not shown to participants during the actual task.
Figure 2. Multi-Modal Integration Task. (A) A trial in which all properties of the suspect match the rules, so the participant should respond by pressing the spacebar. (B) A no-go trial, to which the participant should respond by waiting for the trial to time out. In both screenshots, the red annotations were not shown to participants during the actual task.
Figure 3. Example of the experimental setup. The right-side screen let the operator monitor the EEG/ECG signals and the data acquisition process; the left-side screen displayed visual cues to the participant for the cognitive tasks. Each participant wore an EEG headband on their forehead and an ECG chest belt.
Figure 4. Heart rate (HR) and R-R interval plots for one participant during one short practice session (7 blocks). HR is measured in beats per minute (BPM) and the R-R interval in milliseconds (ms). The R-R interval is normalized to zero mean. The red lines mark the mean HR and mean R-R interval, respectively.
Figure 5. Diagram of the stress detection algorithm, indicating how it was trained and tested. The blue dashed box marks the blocks labelled "non-stressed"; the orange dashed box marks the blocks labelled "stressed".
Figure 6. Changes in subjective stress levels before and after the CVTs and MMITs. Error bars represent the standard error of the mean; significance levels were calculated using paired two-tailed t-tests (α = 0.05). (A) Performing the CVT did not significantly increase reported distress. (B) Performing the MMIT significantly increased reported distress.
Figure 7. Predicted stress levels of a participant performing the CVT and MMIT. Green points represent task blocks recorded during the MMIT; purple points represent blocks recorded during the CVT. There was an overall trend towards higher predicted stress levels, but block-to-block variability was considerable.
Figure 8. Stress detection accuracy for the CVT and MMIT, by model input and number of blocks used. Top: accuracy of the stress-detection model built from CVT data. Bottom: accuracy of the model built from MMIT data. The x-axis gives the number of blocks from the start and end of the session used to label stress for training and testing; the y-axis gives classification accuracy. Reference levels of 50% (chance) and 75% (minimum for a useful brain–machine interface) are shown as red dotted lines.
18 pages, 5504 KiB  
Article
Fatigue Driving State Detection Based on Spatial Characteristics of EEG Signals
by Wenwen Chang, Wenchao Nie, Renjie Lv, Lei Zheng, Jialei Lu and Guanghui Yan
Electronics 2024, 13(18), 3742; https://doi.org/10.3390/electronics13183742 - 20 Sep 2024
Viewed by 886
Abstract
Monitoring the driver’s physical and mental state with wearable EEG acquisition equipment, and in particular detecting fatigue and providing early warning, is a key issue in brain–computer interface research on human–machine intelligent fusion driving. Comparing and analyzing EEG data recorded in the waking (alert) state and the fatigue state during simulated driving, this paper proposes a brain functional network construction method based on the phase locking value (PLV) and the phase lag index (PLI), studies the relationships between brain regions, and quantitatively analyzes the network structure. The characteristic parameters of the brain functional network that differ significantly under fatigue are screened out to form feature vectors, which are then combined with machine learning algorithms to complete classification and identification. The experimental results show that this method can effectively distinguish between the alert and fatigue states: the recognition accuracy rates for all 52 subjects are above 70%, with the highest reaching 89.5%. Brain network topology analysis showed that connectivity between brain regions was weakened in the fatigue state, especially under the PLV method, and that phase synchronization in the delta and theta frequency bands was significantly weakened. These results provide a reference for understanding the interdependence of brain regions under fatigue and for the development of fatigue driving detection systems.
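The two connectivity measures named in the abstract can be sketched in a few lines: PLV is the magnitude of the mean phase-difference phasor, and PLI is the magnitude of the mean sign of the phase difference. Instantaneous phases are taken from the analytic (Hilbert) signal; band-pass filtering into the band of interest (e.g., delta or theta) is assumed to have happened beforehand.

```python
import numpy as np
from scipy.signal import hilbert

def plv_pli(x, y):
    """Phase synchrony between two equal-length channel signals.

    PLV = |mean(exp(i * dphi))|; PLI = |mean(sign(sin(dphi)))|,
    where dphi is the instantaneous phase difference obtained
    from the analytic signals.
    """
    dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
    plv = np.abs(np.mean(np.exp(1j * dphi)))
    pli = np.abs(np.mean(np.sign(np.sin(dphi))))
    return plv, pli
```

Computing this for every electrode pair gives the connectivity matrix that is later thresholded and binarized to form the brain network.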
(This article belongs to the Section Bioelectronics)
Figure 1. The roadmap of this study.
Figure 2. Experimental scene and paradigm.
Figure 3. Electrode location diagram.
Figure 4. (a) The PLV average matrix of the first group of experiments. (b) The PLV average matrix of the second group of experiments.
Figure 5. (a) The PLI average matrix of the first group of experiments. (b) The PLI average matrix of the second group of experiments.
Figure 6. (a) PLV matrix frequency distribution histogram and threshold. (b) PLI matrix frequency distribution histogram and threshold.
Figure 7. (a) The first group's PLV binarization matrix. (b) The second group's PLV binarization matrix.
Figure 8. (a) The first group's PLI binarization matrix. (b) The second group's PLI binarization matrix.
Figure 9. (a) First group of experiments: PLV-based functional brain network topology connections. (b) Second group of experiments: PLV-based functional brain network topology connections.
Figure 10. (a) First group of experiments: PLI-based functional brain network topology connections. (b) Second group of experiments: PLI-based functional brain network topology connections.
Figure 11. (a) Classification accuracy of brain network feature parameters based on PLV. (b) Classification accuracy of brain network feature parameters based on PLI.
3 pages, 535 KiB  
Abstract
Effect of Aesthetic Images on a Population with Mild Cognitive Decline: An Electroencephalography/Functional Near-Infrared Spectroscopy Study
by Livio Clemente, Marianna La Rocca, Marianna Delussi, Giusy Tancredi, Katia Ricci, Giuseppe Procida, Antonio Brunetti, Vitoantonio Bevilacqua and Marina de Tommaso
Proceedings 2024, 97(1), 228; https://doi.org/10.3390/proceedings2024097228 - 19 Sep 2024
Viewed by 574
Abstract
Neuroaesthetics is a relatively young field that connects neuroscience with empirical aesthetics and originates in the neurological theory of aesthetic experience. It investigates brain structures and activity during the phenomena of artistic perception and production and, at the same time, attempts to understand the influence of neurological pathologies on these mechanisms. For each participant (six subjects with mild cognitive decline and ten controls), electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS) data were acquired using a wearable EEG–fNIRS system during the execution of a P300 task.
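A P300 latency estimate of the kind reported in the figure can be sketched as a peak search in a post-stimulus window of the averaged ERP. The 250–500 ms window below is an assumed convention, not the study's stated choice.

```python
import numpy as np

def p300_latency(erp, fs, stim_idx, win=(0.25, 0.5)):
    """Estimate P300 latency (seconds after stimulus onset).

    Takes the time of the largest positive deflection inside an
    assumed post-stimulus search window (default 250-500 ms).
    `erp` is the averaged epoch, `fs` the sampling rate in Hz,
    `stim_idx` the stimulus-onset sample index.
    """
    lo = stim_idx + int(win[0] * fs)
    hi = stim_idx + int(win[1] * fs)
    peak = lo + int(np.argmax(erp[lo:hi]))   # largest positive peak
    return (peak - stim_idx) / fs
```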
(This article belongs to the Proceedings of XXXV EUROSENSORS Conference)
Figure 1. Preliminary results of the study: (a) IA group P300 latency under the different conditions (ugly dynamic = UD; beautiful dynamic = BD; ugly static = US; beautiful static = BS); (b) NA group P300 latency under the different conditions; (c) the control group shows elevated haemodynamic cortical activation in the left hemisphere.
22 pages, 1617 KiB  
Article
Combining Signals for EEG-Free Arousal Detection during Home Sleep Testing: A Retrospective Study
by Safa Boudabous, Juliette Millet and Emmanuel Bacry
Diagnostics 2024, 14(18), 2077; https://doi.org/10.3390/diagnostics14182077 - 19 Sep 2024
Viewed by 902
Abstract
Introduction: Accurately detecting arousal events during sleep is essential for evaluating sleep quality and diagnosing sleep disorders, such as sleep apnea/hypopnea syndrome. While the American Academy of Sleep Medicine guidelines associate arousal events with electroencephalogram (EEG) signal variations, EEGs are often not recorded during home sleep testing (HST) using wearable devices or smartphone applications. Objectives: The primary objective of this study was to explore the potential of alternatively relying on combinations of easily measurable physiological signals during HST for arousal detection where EEGs are not recorded. Methods: We conducted a data-driven retrospective study following an incremental device-agnostic analysis approach, where we simulated a limited-channel setting using polysomnography data and used deep learning to automate the detection task. During the analysis, we tested multiple signal combinations to evaluate their potential effectiveness. We trained and evaluated the model on the Multi-Ethnic Study of Atherosclerosis dataset. Results: The results demonstrated that combining multiple signals significantly improved performance compared with single-input signal models. Notably, combining thoracic effort, heart rate, and a wake/sleep indicator signal achieved competitive performance compared with the state-of-the-art DeepCAD model using electrocardiogram as input with an average precision of 61.59% and an average recall of 56.46% across the test records. Conclusions: This study demonstrated the potential of combining easy-to-record HST signals to characterize the autonomic markers of arousal better. It provides valuable insights to HST device designers on signals that improve EEG-free arousal detection.
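The event-based precision and recall reported above require matching predicted arousal intervals against manually scored ones. Below is a minimal sketch of one plausible overlap-based matching scheme; this is hypothetical, as the study's exact event-scoring rules are not given in the abstract.

```python
def event_prf(true_events, pred_events):
    """Event-based precision/recall for interval detections.

    A predicted event counts as a hit if it overlaps any true event,
    and a true event counts as detected if any prediction overlaps it.
    Events are (start, end) pairs in seconds.
    """
    def overlaps(a, b):
        return a[0] < b[1] and b[0] < a[1]

    hit_pred = sum(any(overlaps(p, t) for t in true_events) for p in pred_events)
    hit_true = sum(any(overlaps(t, p) for p in pred_events) for t in true_events)
    precision = hit_pred / len(pred_events) if pred_events else 0.0
    recall = hit_true / len(true_events) if true_events else 0.0
    return precision, recall
```

Averaging these per-record scores over the test set would give the recordwise figures the paper reports.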
(This article belongs to the Special Issue Diagnosis of Sleep Disorders Using Machine Learning Approaches)
Figure 1. DL model architecture. (a) The structure of the convolutional block composing both the inception and residual blocks; (b) the structure of a residual block; (c) the final model architecture, composed of an inception block, two residual blocks, two LSTM layers, and a final fully connected layer with a sigmoid activation.
Figure 2. Illustrative example of the selected signals (Thor, DHR, Snore, WS, Pos) around an arousal event. The green shading indicates the manually scored arousal event.
Figure 3. Flowchart of the incremental analysis approach. It illustrates the first round, where models trained on a single input signal are evaluated, and the incremental subprocess, where models trained on combinations of input signals are evaluated to identify the best-performing one. In this subprocess, input combinations are formed by adding one signal from the candidate list to the best combination from the previous round.
Figure 4. Bar plot of the average recordwise event-based F1-scores of the models trained during the first experiment round. Error bars represent 50% percentile intervals. The model trained on the Thor signal achieves a significantly better score, exceeding 50% on average.
Figure 5. Bar plot of the average recordwise event-based F1-scores of the models trained during the second experiment round. Error bars represent 50% percentile intervals. Combining the Thor signal with one of the Pos, Pos_chg, WS, Snore, or DHR signals yields higher F1-scores; the highest score is obtained with the Thor and DHR signals.
Figure 6. Bar plot of the average recordwise event-based F1-scores of the models trained during the third experiment round using three-signal input combinations. Error bars represent 50% percentile intervals. The combination of the Thor, WS, and DHR signals yields the best F1-score among all tested combinations.
Figure 7. Bar plot of the average recordwise event-based F1-scores of all models trained during the incremental approach. Error bars represent 50% percentile intervals. The gold bar with bold borders represents the mean F1-score of the ECG-based DeepCAD model.
Figure 8. Bland–Altman analysis of calculated versus true ArI. Bland–Altman plots of ArIs calculated with the SoTA DeepCAD model and with our DL model trained on Thor+WS+DHR are superimposed to compare their estimation biases, plotted against the average of the two ArIs. Solid horizontal lines mark the mean estimation bias; dashed lines mark its 95% bounds (mean ± 1.96 SD). The bias distributions for the two models are shown on the right. The two models yield similar results in terms of ArI estimation.
Figure A1. Study diagram. The flowchart shows the exclusion criteria and the sample size of the final analysis.
Figure A2. (a) Critical difference (CD) diagrams showing the average ranks of models trained with different classification error rates; the lower the rank (farther right), the better the performance. A line crossing several models indicates no significant difference among them under the bootstrapping paired t-test with Holm–Bonferroni multiple test correction. (b) Error bar plot of the average event-based F1-scores of the models; error bars represent 50% percentile intervals.
13 pages, 9508 KiB  
Article
Preliminary Study on Gender Differences in EEG-Based Emotional Responses in Virtual Architectural Environments
by Zhubin Li, Kun Wang, Mingyue Hai, Pengyu Cai and Ya Zhang
Buildings 2024, 14(9), 2884; https://doi.org/10.3390/buildings14092884 - 12 Sep 2024
Cited by 1 | Viewed by 803
Abstract
In traditional cultural perceptions of gender, women are stereotyped as being more “emotional” than men. Although significant progress has been made in studying gender differences in emotional responses over the past few decades, there is still no consistent conclusion as to whether women are more emotional than men. In this study, we investigated gender differences in emotional responses between two groups of students (10 males and 10 females) in the same architectural environment, specifically a digital cultural tourism scenario. Participants viewed the “Time Tunnel” of the ancient city of Qingzhou in a VR simulation, and brainwave evoked potentials were recorded using wearable EEG devices. The results showed that females typically reported stronger emotional responses, as evidenced by higher arousal, lower valence, and stronger avoidance motivation, whereas males exhibited higher valence, lower arousal, and stronger comfort. The findings suggest that males have a more positive emotional response in virtual digital environments, whereas females are more sensitive and vulnerable to such environments, experiencing some discomfort. These findings can be used to guide the design and adaptation of virtual built environments.
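Valence/arousal estimation from frontal-area PSD can be illustrated with two widely used heuristics: arousal as the frontal beta/alpha power ratio and valence as frontal alpha asymmetry. These particular formulas are an assumption for illustration, not necessarily the authors' exact method.

```python
import numpy as np
from scipy.signal import welch

def band_power(sig, fs, lo, hi):
    """Approximate band power by summing Welch PSD bins in [lo, hi) Hz."""
    f, pxx = welch(sig, fs=fs, nperseg=min(len(sig), 256))
    band = (f >= lo) & (f < hi)
    return pxx[band].sum() * (f[1] - f[0])

def valence_arousal(left_frontal, right_frontal, fs):
    """Toy valence/arousal indices from two frontal channels.

    Arousal: (beta power) / (alpha power) over both channels.
    Valence: frontal alpha asymmetry, ln(right alpha) - ln(left alpha).
    """
    a_l = band_power(left_frontal, fs, 8, 13)    # alpha, left (e.g., F3/Fp1)
    a_r = band_power(right_frontal, fs, 8, 13)   # alpha, right (e.g., F4/Fp2)
    beta = (band_power(left_frontal, fs, 13, 30)
            + band_power(right_frontal, fs, 13, 30))
    arousal = beta / (a_l + a_r)
    valence = np.log(a_r) - np.log(a_l)
    return valence, arousal
```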
(This article belongs to the Special Issue Optimizing Living Environments for Mental Health)
Figure 1. “Time Tunnel” floor plan.
Figure 2. Internal scene of the “Time Tunnel”. Source: China Southwest Architecture Design Institute.
Figure 3. Experimental equipment and electrode point map. Sources: (a) https://everloyal.com.cn/; (b) adapted by the author from the IFCN.
Figure 4. Overview of the EEG signal measurement process.
Figure 5. Experimental flow chart.
Figure 6. Experimental photographs.
Figure 7. Data processing flow: (a) collecting time-series EEG data; (b) removing signal artifacts from the EEG data; (c) measuring the mean PSD over the frontal area; (d) emotion identification by calculating valence and arousal levels from the frontal-area mean PSD.
Figure 8. Low-dimensional data.
Figure 9. Changes in S-TAI by gender: (a) changes in the S-TAI scale for women; (b) changes in the S-TAI scale for men.
Figure 10. Gender differences in valence.
Figure 11. Gender differences in arousal.
Figure 12. Gender differences in valence–arousal.
Figure 13. Gender differences in emotions.
28 pages, 952 KiB  
Review
A Comprehensive Review of Hardware Acceleration Techniques and Convolutional Neural Networks for EEG Signals
by Yu Xie and Stefan Oniga
Sensors 2024, 24(17), 5813; https://doi.org/10.3390/s24175813 - 7 Sep 2024
Viewed by 1798
Abstract
This paper comprehensively reviews hardware acceleration techniques and the deployment of convolutional neural networks (CNNs) for analyzing electroencephalogram (EEG) signals across various application areas, including emotion classification, motor imagery, epilepsy detection, and sleep monitoring. Previous reviews on EEG have mainly focused on software solutions. However, these reviews often overlook key challenges associated with hardware implementation, such as scenarios that require a small size, low power, high security, and high accuracy. This paper discusses the challenges and opportunities of hardware acceleration for wearable EEG devices by focusing on these aspects. Specifically, this review classifies EEG signal features into five groups and discusses hardware implementation solutions for each category in detail, providing insights into the most suitable hardware acceleration strategies for various application scenarios. In addition, it explores the complexity of efficient CNN architectures for EEG signals, including techniques such as pruning, quantization, tensor decomposition, knowledge distillation, and neural architecture search. To the best of our knowledge, this is the first systematic review that combines CNN hardware solutions with EEG signal processing. By providing a comprehensive analysis of current challenges and a roadmap for future research, this paper provides a new perspective on the ongoing development of hardware-accelerated EEG systems.
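Of the compression techniques listed (pruning, quantization, tensor decomposition, knowledge distillation, neural architecture search), post-training quantization is the simplest to sketch. The symmetric int8 scheme below is a generic textbook example, not one drawn from any specific reviewed paper.

```python
import numpy as np

def quantize_int8(w):
    """Symmetric post-training 8-bit quantization of a weight tensor.

    Maps floats to int8 with a single per-tensor scale so that the
    largest-magnitude weight lands at +/-127. Returns the int8
    weights and the scale needed to dequantize.
    """
    scale = np.max(np.abs(w)) / 127.0 if np.any(w) else 1.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights for accuracy checks."""
    return q.astype(np.float32) * scale
```

On an FPGA or ASIC, the int8 weights cut memory traffic by 4x versus float32 and let the multiply-accumulate units run on narrow integers, which is the main source of the power savings such reviews discuss.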
(This article belongs to the Special Issue Sensors Fusion in Digital Healthcare Applications)
Figure 1. FPGA block diagram from [128].
Figure 2. ASIC block diagram from [129].
Figure 3. Results on different platforms from [127].
Figure 4. FPGA block diagram from [9].
Figure 5. Results on different platforms from [9].
24 pages, 3885 KiB  
Article
One-Channel Wearable Mental Stress State Monitoring System
by Lamis Abdul Kader, Fares Al-Shargie, Usman Tariq and Hasan Al-Nashash
Sensors 2024, 24(16), 5373; https://doi.org/10.3390/s24165373 - 20 Aug 2024
Viewed by 1678
Abstract
Assessments of stress can be performed using physiological signals, such as electroencephalograms (EEGs) and galvanic skin response (GSR). Commercialized systems that are used to detect stress with EEGs require a controlled environment with many channels, which prohibits their daily use. Fortunately, there is a rise in the utilization of wearable devices for stress monitoring, offering more flexibility. In this paper, we developed a wearable monitoring system that integrates both EEGs and GSR. The novelty of our proposed device is that it only requires one channel to acquire both physiological signals. Through sensor fusion, we achieved an improved accuracy, lower cost, and improved ease of use. We tested the proposed system experimentally on twenty human subjects. We estimated the power spectrum of the EEG signals and utilized five machine learning classifiers to differentiate between two levels of mental stress. Furthermore, we investigated the optimum electrode location on the scalp when using only one channel. Our results demonstrate the system’s capability to classify two levels of mental stress with a maximum accuracy of 70.3% when using EEGs alone and 84.6% when using fused EEG and GSR data. This paper shows that stress detection is reliable using only one channel on the prefrontal and ventrolateral prefrontal regions of the brain.
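Sensor fusion at the feature level, as described, amounts to concatenating per-modality feature vectors before classification. Below is a sketch with a hypothetical GSR feature set (mean level, slope, SCR peak count); the paper's actual GSR processing is not detailed in the abstract.

```python
import numpy as np
from scipy.signal import find_peaks

def gsr_features(gsr, fs):
    """Simple tonic/phasic GSR features (hypothetical set).

    Returns [mean skin-conductance level, linear trend (slope),
    number of skin-conductance-response peaks].
    """
    x = np.asarray(gsr, dtype=float)
    slope = np.polyfit(np.arange(len(x)) / fs, x, 1)[0]   # tonic trend
    peaks, _ = find_peaks(x - x.mean(), height=0.05, distance=fs)
    return np.array([x.mean(), slope, float(len(peaks))])

def fuse(eeg_feats, gsr_feats):
    """Feature-level fusion: concatenate per-modality feature vectors."""
    return np.concatenate([eeg_feats, gsr_feats])
```

The fused vector would then be fed to any of the five classifiers the paper compares.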
(This article belongs to the Section Biomedical Sensors)
Figure 1. Block diagram of the hardware system designed for mental state monitoring.
Figure 2. Fabricated PCB of the data acquisition system for EEG and GSR.
Figure 3. (A) Electrode positioning on Fp1 and Fp2 or F7 and F8. (B) Cursor/handle of the rail used to move the electrodes left and right. (C) The mechanical framework, divided into front and back parts. (D) Area used to fit the electrodes from behind, for secure positioning at the desired location.
Figure 4. Stroop color and word task: (a) instructions, (b) resting period, (c) Stroop stimulus, (d) trial feedback.
Figure 5. Experimental protocol for the control and stress conditions using the SCWT.
Figure 6. Validation of the wearable system's sensitivity to changes in GSR resistance and voltage at a constant current of 10 μA.
Figure 7. Framework of the method used for EEG analysis and stress detection.
Figure 8. Accuracy of the different machine learning classifiers for each band.
Figure 9. (A) Electrodes placed at Fp1 and Fp2. (B) Electrodes placed at F7 and F8.
Figure 10. p-value at each designated electrode position (Fp1 and Fp2 / F7 and F8) during the experiment.
Figure 11. Subjective data for the control and stress phases of the experiments.
Figure 12. Behavioral accuracy for the control and stress phases of the SCWT.
18 pages, 8360 KiB  
Article
A Method for the Spatial Interpolation of EEG Signals Based on the Bidirectional Long Short-Term Memory Network
by Wenlong Hu, Bowen Ji and Kunpeng Gao
Sensors 2024, 24(16), 5215; https://doi.org/10.3390/s24165215 - 12 Aug 2024
Viewed by 1341
Abstract
The precision of electroencephalograms (EEGs) significantly impacts the performance of brain–computer interfaces (BCIs). Currently, the majority of BCI research prioritizes lightweight design and a reduced electrode count to make systems more suitable for wearable applications. This paper introduces a deep learning-based bidirectional long short-term memory (BiLSTM) network designed to capture the inherent characteristics of EEG channels obtained from neighboring electrodes. It aims to predict EEG time series and thereby convert low-density EEG signals into high-density EEG signals. BiLSTM attends to the dependencies in time series data rather than to a fixed mathematical mapping, and the root mean square error can be kept below 0.4 μV, less than half the error of traditional methods. After expanding the BCI Competition III 3a dataset from 18 channels to 60 channels, we conducted classification experiments on four types of motor imagery tasks. Compared to the original low-density EEG signals (18 channels), the classification accuracy was around 82%, an increase of about 20%. Compared with real high-density signals, the increase in the error rate remained below 5%. The channel expansion thus yields a substantial improvement over the original low-density signals.
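The RMSE metric quoted above, together with the weighted-average baseline that the BiLSTM is compared against, can be sketched as follows. The uniform weights here are a placeholder; in practice inverse-distance weights based on electrode geometry would be the usual choice.

```python
import numpy as np

def rmse_uv(pred, true):
    """Root mean square error in microvolts, the metric the abstract
    reports (BiLSTM < 0.4 uV)."""
    pred, true = np.asarray(pred, float), np.asarray(true, float)
    return float(np.sqrt(np.mean((pred - true) ** 2)))

def neighbor_interpolate(neighbors, weights=None):
    """Traditional baseline: estimate a missing channel as a weighted
    average of neighboring channels.

    `neighbors` has shape (n_channels, n_samples); uniform weights are
    used when none are given.
    """
    neighbors = np.asarray(neighbors, dtype=float)
    if weights is None:
        weights = np.full(len(neighbors), 1.0 / len(neighbors))
    return weights @ neighbors   # weighted sum over channels
```

Replacing this per-sample weighted average with a sequence model is what lets the BiLSTM exploit temporal dependencies that a fixed spatial mapping cannot.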
Figure 1. The structure of the LSTM unit.
Figure 2. The structure of the LSTM unit and a simple BiLSTM model.
Figure 3. The BiLSTM network structure diagram.
Figure 4. Distribution map of electrode positions.
Figure 5. Four initial electrode combination methods (A–D).
Figure 6. Timing of the imagined movement paradigm.
Figure 7. Error of the proposed BiLSTM; the left column is the training set, the right column the test set.
Figure 8. Comparison of the 9- and 18-electrode combinations.
Figure 9. Detail magnification of the comparison between the 9- and 18-electrode combinations.
Figure 10. Comparison of the weighted average and BiLSTM.
Figure 11. Comparison of the cubic spline and BiLSTM.
Figure 12. EEG plot; the left column shows the data from a few electrodes, the middle column the estimates from BiLSTM, and the right column the true values.
Figure 13. Classification flow chart of the MI task.
Figure 14. Confusion matrix of the three types of data.