Search Results (8)

Search Parameters:
Keywords = OpenViBE

10 pages, 2965 KiB  
Article
Brain–Computer-Interface-Based Smart-Home Interface by Leveraging Motor Imagery Signals
by Simona Cariello, Dario Sanalitro, Alessandro Micali, Arturo Buscarino and Maide Bucolo
Inventions 2023, 8(4), 91; https://doi.org/10.3390/inventions8040091 - 18 Jul 2023
Cited by 3 | Viewed by 2659
Abstract
In this work, we propose a brain–computer-interface (BCI)-based smart-home interface that leverages motor imagery (MI) signals to operate home devices in real time. The idea behind MI-BCI is that different types of MI activity activate different brain regions. Therefore, after recording the user’s electroencephalogram (EEG) data, two approaches, i.e., Regularized Common Spatial Pattern (RCSP) and Linear Discriminant Analysis (LDA), analyze these data to classify the user’s imagined tasks, allowing the user to perform the intended action. In the proposed framework, EEG signals were recorded using the EMOTIV headset and OpenViBE, a free and open-source platform used for EEG signal feature extraction and classification. Once classified, the signals are converted into control commands, and KNX (“Konnex”), an open communication protocol for building automation, is used for task execution, i.e., the regulation of two switching devices. The experimental results from the training and testing stages provide evidence of the effectiveness of the classification of users’ intentions, which was subsequently used to operate the proposed home automation system, allowing users to operate two light bulbs.
(This article belongs to the Special Issue Recent Advances and New Trends in Signal Processing)
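The RCSP + LDA pipeline summarized in the abstract is standard enough to sketch. Below is a minimal Python illustration (numpy, scipy, scikit-learn) of regularized CSP followed by LDA on log-variance features; the shrinkage weight `alpha`, the filter count, and the data shapes are illustrative assumptions, not the paper’s actual parameters.

```python
# Minimal RCSP + LDA sketch for two-class motor imagery, assuming epochs
# X of shape (n_trials, n_channels, n_samples) and binary labels y.
import numpy as np
from scipy.linalg import eigh
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def rcsp_filters(X, y, alpha=0.05, n_filters=3):
    """Regularized CSP: shrink each class covariance toward the identity."""
    covs = []
    for c in (0, 1):
        trials = X[y == c]
        cov = np.mean([t @ t.T / np.trace(t @ t.T) for t in trials], axis=0)
        covs.append((1 - alpha) * cov + alpha * np.eye(cov.shape[0]))
    # Generalized eigendecomposition of (C0, C0 + C1); the extreme
    # eigenvectors maximize variance for one class and minimize it for the other.
    w, V = eigh(covs[0], covs[0] + covs[1])
    order = np.argsort(w)
    picks = np.r_[order[:n_filters], order[-n_filters:]]
    return V[:, picks].T                        # (2 * n_filters, n_channels)

def log_var_features(X, W):
    Z = np.einsum('fc,ncs->nfs', W, X)          # spatially filtered epochs
    var = Z.var(axis=2)
    return np.log(var / var.sum(axis=1, keepdims=True))

# W = rcsp_filters(X_train, y_train)
# clf = LinearDiscriminantAnalysis().fit(log_var_features(X_train, W), y_train)
# commands = clf.predict(log_var_features(X_test, W))  # -> KNX switch commands
```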
Show Figures

Figure 1: Experimental setup: a personal computer connected to two devices through KNX, i.e., two light bulbs.
Figure 2: Experimental paradigm: the phases of an MI trial. The three key stages (fixation cross, arrow cue, and MI task) are depicted to highlight the time intervals between them.
Figure 3: Arrow sequence during the training phase. (a) Right arrow; (b) Fixation cross; (c) Left arrow.
Figure 4: EEG topographical distribution of subject A during the training phase. (a) Fixation cross at 0 s; (b) Arrow cue at 2.75 s; (c) MI task starting at 4.25 s; (d) MI task at 5.25 s.
Figure 5: Neuroheadset and its spatial configuration. (a) Emotiv EPOC X; (b) Electrode configuration.
Figure 6: Adopted classification methodology.
Figure 7: Common spatial pattern map. The figure illustrates the set of CSP filters of a single participant in the study. The CSPs are optimized for the discrimination of left-hand motor imagery.
Figure 8: System architecture overview: from signal acquisition to the hardware devices.
Figure 9: Node-RED flow chart of the proposed software implementation.
17 pages, 926 KiB  
Article
Mind the Move: Developing a Brain-Computer Interface Game with Left-Right Motor Imagery
by Georgios Prapas, Kosmas Glavas, Katerina D. Tzimourta, Alexandros T. Tzallas and Markos G. Tsipouras
Information 2023, 14(7), 354; https://doi.org/10.3390/info14070354 - 21 Jun 2023
Cited by 5 | Viewed by 3615
Abstract
Brain-computer interfaces (BCIs) are becoming an increasingly popular technology, used in a variety of fields such as medicine, gaming, and lifestyle. This paper describes a 3D non-invasive BCI game that uses a Muse 2 EEG headband to acquire electroencephalogram (EEG) data and the OpenViBE platform to process the signals and classify them into three different mental states: left and right motor imagery and eye blink. The game was developed to assess user adjustment and improvement in a BCI environment after training. The classification algorithm used is a Multi-Layer Perceptron (MLP), which achieved 96.94% accuracy. A total of 33 subjects participated in the experiment and successfully controlled an avatar using mental commands to collect coins. The online metrics employed for this BCI system are the average game score, the average number of clusters, and the average user improvement.
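The processing chain summarized here (and detailed in the Figure 3 caption below: 8–40 Hz band-pass, 3 s epochs, log band energies, an MLP) can be sketched as follows. The paper names the bands but not their exact cutoffs, so the boundaries below, like the Muse 2 sampling rate constant, are assumptions for illustration.

```python
# Sketch of the offline feature extraction + MLP training described above.
import numpy as np
from scipy.signal import butter, sosfiltfilt
from sklearn.neural_network import MLPClassifier

FS = 256  # Muse 2 nominal sampling rate (assumed here)
BANDS = {"alpha": (8, 13), "beta1": (13, 21), "beta2": (21, 30),
         "gamma1": (30, 35), "gamma2": (35, 40)}  # illustrative cutoffs

def band_log_energy(epoch):
    """epoch: (n_channels, 3 * FS) samples -> log energy per channel per band."""
    feats = []
    for lo, hi in BANDS.values():
        sos = butter(4, [lo, hi], btype="bandpass", fs=FS, output="sos")
        filtered = sosfiltfilt(sos, epoch, axis=-1)
        feats.append(np.log((filtered ** 2).sum(axis=-1)))
    return np.concatenate(feats)

# X = np.array([band_log_energy(e) for e in epochs]); y = labels
# clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000).fit(X, y)
```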
Show Figures

Figure 1: The Muse 2 headband (right image). A screenshot from one recording is shown: the four EEG channels of the Muse 2 headband and their waves (left image).
Figure 2: This flowchart illustrates the two-step process of the proposed system. Initially, an offline processing phase is employed to train a classifier using EEG data. Following this, an online processing phase takes mental commands from the user, which are subsequently translated into in-game movement through the trained classifier.
Figure 3: Offline processing scenario to train the classifier. The first step imports EEG recordings using a CSV file reader; three separate boxes handle the three different EEG recordings (Left, Right and Blink) of the BCI system. The signals are then filtered to remove frequencies outside the range of 8–40 Hz. The filtered signals are epoched in time windows of 3 s and the EEG waves are calculated (Alpha, Beta 1, Beta 2, Gamma 1, Gamma 2). In the next step, the energy of the signals is calculated and a feature vector of all frequency-band energies is formulated on a logarithmic scale. Finally, all feature vectors from all the different EEG recordings are used to train the classifier.
Figure 4: Online scenario for the proposed BCI system. The acquisition client connects to the LSL stream from BlueMuse on a specific port and real-time data processing starts. With the channel selector, only the four EEG channels are included (TP9, TP10, AF7, AF8), and then the same process as in the offline scenario is applied: the signals are filtered, EEG waves and energy are calculated, and these features are fed to the classifier to classify the mental commands. Finally, an LSL stream is employed to transmit the classifier’s results to the game; this was accomplished through the LSL stream box, which facilitates the communication of data between OpenViBE and the game.
Figure 5: Screenshot from the gameplay. The avatar moves on the platform depending on the user’s mental commands.
Figure 6: Histogram presenting the different groups of users depending on their performance while testing the BCI game.
Figure 7: The average user improvement for MI commands before and after playing the game.
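The online scenario (Figure 4) pulls Muse 2 samples from the BlueMuse LSL stream before reusing the offline feature chain. A minimal acquisition-side sketch with pylsl; the channel ordering is an assumption and should be confirmed against the stream metadata.

```python
# Sketch: fill one 3 s epoch from an LSL EEG stream (e.g., BlueMuse).
from pylsl import StreamInlet, resolve_stream

streams = resolve_stream("type", "EEG")  # BlueMuse advertises an EEG-type stream
inlet = StreamInlet(streams[0])
KEEP = [0, 1, 2, 3]                      # assumed order: TP9, AF7, AF8, TP10

buffer = []
while len(buffer) < 3 * 256:             # 3 s at an assumed 256 Hz
    sample, ts = inlet.pull_sample(timeout=1.0)
    if sample is not None:
        buffer.append([sample[i] for i in KEEP])
# `buffer` can now be filtered and featurized exactly as in the offline sketch.
```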
14 pages, 826 KiB  
Article
Evaluation of the User Adaptation in a BCI Game Environment
by Kosmas Glavas, Georgios Prapas, Katerina D. Tzimourta, Nikolaos Giannakeas and Markos G. Tsipouras
Appl. Sci. 2022, 12(24), 12722; https://doi.org/10.3390/app122412722 - 12 Dec 2022
Cited by 7 | Viewed by 2132
Abstract
Brain-computer interface (BCI) technology is a developing field of study with numerous applications. The purpose of this paper is to discuss the use of brain signals as a direct communication pathway to an external device. In this work, Zombie Jumper is developed, a game built around two brain commands: imagining moving forward and blinking. The goal of the game is to jump over static or moving “zombie” characters in order to complete the level. To record the raw EEG data, a Muse 2 headband is used, and the OpenViBE platform is employed to process and classify the brain signals. The Unity engine is used to build the game, and the lab streaming layer (LSL) protocol is the connective link between Muse 2, OpenViBE and the Unity engine for this BCI-controlled game. A total of 37 subjects tested the game and played it at least 20 times. The average classification accuracy was 98.74%, ranging from 97.06% to 99.72%. Finally, playing the game for longer periods of time resulted in greater control.
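The Muse 2 → OpenViBE → Unity chain hinges on LSL carrying classifier decisions into the game. A minimal sketch of that hand-off with pylsl; the stream name, source id, and command codes are hypothetical, and the Unity side would open a matching LSL inlet.

```python
# Sketch: publish classified mental commands ("forward", "blink") over LSL.
from pylsl import StreamInfo, StreamOutlet

info = StreamInfo(name="BCI_Commands", type="Markers", channel_count=1,
                  nominal_srate=0, channel_format="int32",
                  source_id="zombie_jumper")   # all names hypothetical
outlet = StreamOutlet(info)

COMMANDS = {"forward": 0, "blink": 1}
outlet.push_sample([COMMANDS["blink"]])  # e.g., trigger a jump on a classified blink
```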
Show Figures

Figure 1: Major components of a BCI system.
Figure 2: Muse 2 headband with the corresponding electrodes.
Figure 3: Flowchart diagram of the proposed system. The left side presents the offline processing that trains the classifier. The right side presents the online processing that uses the trained classifier to translate the mental commands into in-game movement.
Figure 4: Offline processing scenario to train the classifier.
Figure 5: Real-time classification scenario.
Figure 6: Snapshot from the BCI-controlled game.
23 pages, 7666 KiB  
Article
Cross-Platform Implementation of an SSVEP-Based BCI for the Control of a 6-DOF Robotic Arm
by Eduardo Quiles, Javier Dadone, Nayibe Chio and Emilio García
Sensors 2022, 22(13), 5000; https://doi.org/10.3390/s22135000 - 2 Jul 2022
Cited by 18 | Viewed by 4418
Abstract
Robotics has been successfully applied in the design of collaborative robots for assistance to people with motor disabilities. However, man-machine interaction is difficult for those who suffer severe motor disabilities. The aim of this study was to test the feasibility of a low-cost robotic arm control system with an EEG-based brain-computer interface (BCI). The BCI system relies on the Steady State Visually Evoked Potentials (SSVEP) paradigm. A cross-platform application was developed in C++ and, together with the open-source software OpenViBE, was used to control a Stäubli TX60 robot arm. Communication between OpenViBE and the robot was carried out through the Virtual Reality Peripheral Network (VRPN) protocol. EEG signals were acquired with the 8-channel Enobio amplifier from Neuroelectrics. For the processing of the EEG signals, Common Spatial Pattern (CSP) filters and a Linear Discriminant Analysis (LDA) classifier were used. Five healthy subjects tried the BCI. This work allowed the communication and integration of a well-known BCI development platform, OpenViBE, with the specific control software of a robot arm such as the Stäubli TX60 using the VRPN protocol. It can be concluded from this study that it is possible to control the robotic arm with an SSVEP-based BCI with a reduced number of dry electrodes, facilitating the use of the system.
(This article belongs to the Special Issue Real-Life Wearable EEG-Based BCI: Open Challenges)
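The paper classifies SSVEP responses with CSP filters and an LDA classifier (much like the RCSP + LDA sketch for the first result above). For orientation, a simpler and commonly used alternative is to score power spectral density at each stimulation frequency and its harmonics; in the sketch below the band half-width and harmonic count are illustrative assumptions, and this is not the paper’s method.

```python
# Sketch: pick the SSVEP target by comparing PSD power at stimulus frequencies.
import numpy as np
from scipy.signal import welch

def ssvep_target(epoch, fs, stim_freqs, half_width=0.5, harmonics=2):
    """epoch: (n_channels, n_samples); returns the most likely stimulus frequency."""
    f, pxx = welch(epoch, fs=fs, nperseg=2 * fs)   # ~0.5 Hz resolution
    pxx = pxx.mean(axis=0)                         # average across channels
    scores = []
    for f0 in stim_freqs:
        power = sum(pxx[np.abs(f - h * f0) <= half_width].sum()
                    for h in range(1, harmonics + 1))
        scores.append(power)
    return stim_freqs[int(np.argmax(scores))]
```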
Show Figures

Figure 1: Robot Stäubli TX60. (a) Robot arm in the lab; (b) Degrees-of-freedom scheme.
Figure 2: SSVEP-BCI methodology for robotic arm control.
Figure 3: Timing of a single SSVEP trial.
Figure 4: Time duration for starting on a stimulation frequency and the resting period in one run.
Figure 5: Stimulus frequencies.
Figure 6: Electrode placement according to the international 10–20 system.
Figure 7: Signal processing procedure.
Figure 8: GUI for the control of the robotic arm.
Figure 9: Control signal flowchart.
Figure 10: Combination of softkeys and their functionality.
Figure 11: Program execution threads.
Figure 12: Position of the targets and order of appearance.
Figure 13: Stäubli Robotic Suite environment showing the rotation of the extreme joint of the robot.
Figure 14: SSVEP BCI control of the Stäubli robotic arm.
Figure 15: Evolution of the time to complete the task in each attempt of the five subjects.
Figure 16: Average total time to complete the task per subject.
Figure 17: Percentage of success of each subject for each attempt.
Figure 18: Distribution and average percentage of success of each subject.
Figure 19: Relationship between time and the number of total movements completed.
Figure 20: Distribution and average ITR of each subject.
25 pages, 1805 KiB  
Article
BioPyC, an Open-Source Python Toolbox for Offline Electroencephalographic and Physiological Signals Classification
by Aurélien Appriou, Léa Pillette, David Trocellier, Dan Dutartre, Andrzej Cichocki and Fabien Lotte
Sensors 2021, 21(17), 5740; https://doi.org/10.3390/s21175740 - 26 Aug 2021
Cited by 7 | Viewed by 4184
Abstract
Research on brain–computer interfaces (BCIs) has become more democratic in recent decades, and experiments using electroencephalography (EEG)-based BCIs have dramatically increased. The variety of protocol designs and the growing interest in physiological computing require parallel improvements in the processing and classification of both EEG signals and biosignals, such as electrodermal activity (EDA), heart rate (HR) or breathing. Although some EEG-based analysis tools are already available for online BCIs in a number of online BCI platforms (e.g., BCI2000 or OpenViBE), it remains crucial to perform offline analyses in order to design, select, tune, validate and test algorithms before using them online. Moreover, studying and comparing those algorithms usually requires expertise in programming, signal processing and machine learning, whereas numerous BCI researchers come from other backgrounds with limited or no training in such skills. Finally, existing BCI toolboxes are focused on EEG and other brain signals but usually do not include processing tools for other biosignals. Therefore, in this paper, we describe BioPyC, a free, open-source and easy-to-use Python platform for offline EEG and biosignal processing and classification. Based on an intuitive and well-guided graphical interface, four main modules allow the user to follow the standard steps of the BCI process without any programming skills: (1) reading different neurophysiological signal data formats, (2) filtering and representing EEG signals and biosignals, (3) classifying them, and (4) visualizing and performing statistical tests on the results. We illustrate the use of BioPyC on four studies, namely classifying mental tasks, cognitive workload, emotions and attention states from EEG signals.
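BioPyC’s four modules map onto a short offline script. The sketch below mirrors only steps (3) and (4): per-subject cross-validated accuracies for two classifiers, compared with a Wilcoxon signed-rank test. The classifiers and the data layout are assumptions for illustration, not BioPyC’s internals.

```python
# Sketch: compare two classification pipelines across subjects, then test
# the per-subject accuracy differences statistically.
import numpy as np
from scipy.stats import wilcoxon
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def compare_pipelines(data):
    """data: dict mapping subject id -> (X, y) feature/label arrays."""
    acc_a, acc_b = [], []
    for X, y in data.values():
        acc_a.append(cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean())
        acc_b.append(cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean())
    stat, p = wilcoxon(acc_a, acc_b)   # paired, non-parametric
    return np.mean(acc_a), np.mean(acc_b), p
```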
Show Figures

Figure 1: BioPyC data flow: the four main modules allow users to follow the standard BCI process for offline EEG and biosignal processing and classification.
Figure 2: Comparison of the main features of existing toolboxes with modules for EEG signal processing and classification. BioPyC values for each feature are written in black; values of features that are similar to those of BioPyC are written in green; and values of features that differ from those of BioPyC are written in grey. “opt” stands for “optional” in the figure.
Figure 3: Screenshot of BioPyC’s widgets, i.e., “select multiples” and buttons, at the step of selecting the type of data/signals to work on. In BioPyC, a blue button stands for the action to take, while the disabled orange ones stand for future actions: orange buttons turn blue when the previous action is done.
Figure 4: Screenshot of BioPyC’s choice of both calibration and evaluation types.
Figure 5: Classification accuracy of each algorithm, for each subject, on the “BCI competition IV data set 2a”, in both subject-specific and subject-independent calibrations.
Figure 6: Classification accuracy of each algorithm on the “BCI competition IV data set 2a”, in both subject-specific and subject-independent calibrations.
Figure 7: Average confusion matrices over all subjects for the classification of attention in the theta (4–8 Hz) and alpha (8–12 Hz) frequency bands for five attentional states, i.e., alertness (tonic), alertness (phasic), sustained, selective, and divided.
Figure 8: Classification accuracy of each algorithm on the workload data, in both subject-specific and subject-independent calibrations.
Figure 9: Classification accuracy of each algorithm on the valence data, in both subject-specific and subject-independent calibrations.
Figure 10: Classification accuracy of each algorithm on the arousal data, in both subject-specific and subject-independent calibrations.
14 pages, 727 KiB  
Article
Induction of Neural Plasticity Using a Low-Cost Open Source Brain-Computer Interface and a 3D-Printed Wrist Exoskeleton
by Mads Jochumsen, Taha Al Muhammadee Janjua, Juan Carlos Arceo, Jimmy Lauber, Emilie Simoneau Buessinger and Rasmus Leck Kæseler
Sensors 2021, 21(2), 572; https://doi.org/10.3390/s21020572 - 15 Jan 2021
Cited by 14 | Viewed by 4669
Abstract
Brain-computer interfaces (BCIs) have been proven useful for stroke rehabilitation, but a number of factors impede the use of this technology in rehabilitation clinics and in home use, the major ones being the usability and cost of the BCI system. The aims of this study were to develop a cheap 3D-printed wrist exoskeleton that can be controlled by a cheap open-source BCI (OpenViBE), and to determine if training with such a setup could induce neural plasticity. Eleven healthy volunteers imagined wrist extensions, which were detected from single-trial electroencephalography (EEG), and in response, the wrist exoskeleton replicated the intended movement. Motor-evoked potentials (MEPs) elicited using transcranial magnetic stimulation were measured before, immediately after, and 30 min after BCI training with the exoskeleton. The BCI system had a true positive rate of 86 ± 12% with 1.20 ± 0.57 false detections per minute. Compared to the measurement before the BCI training, the MEPs increased by 35 ± 60% immediately after and by 67 ± 60% 30 min after the BCI training. There was no association between BCI performance and the induction of plasticity. In conclusion, it is possible to detect imagined movements using an open-source BCI setup and control a cheap 3D-printed exoskeleton that, when combined with the BCI, can induce neural plasticity. These findings may promote the availability of BCI technology for rehabilitation clinics and home use. However, the usability must be improved, and further tests are needed with stroke patients.
(This article belongs to the Collection EEG-Based Brain–Computer Interface for a Real-Life Appliance)
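The reported performance figures (true positive rate of 86 ± 12%, 1.20 ± 0.57 false detections per minute) follow from simple bookkeeping over detection timestamps and cue windows. Below is a sketch of how such metrics can be computed, assuming detections and windows are given in seconds; this is an illustration, not the paper’s evaluation code.

```python
# Sketch: true positive rate and false detections per minute for a
# movement-detection BCI session.
def bci_performance(detections, cue_windows, session_minutes):
    """detections: detection timestamps (s); cue_windows: (start, end) pairs,
    one per imagined wrist extension."""
    hits = sum(1 for start, end in cue_windows
               if any(start <= t <= end for t in detections))
    tpr = 100.0 * hits / len(cue_windows)
    in_any_window = lambda t: any(s <= t <= e for s, e in cue_windows)
    false_dets = sum(1 for t in detections if not in_any_window(t))
    return tpr, false_dets / session_minutes
```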
Show Figures

Figure 1: Timeline of the experiment; the approximate duration of each block is indicated in parentheses. First, the subject was familiarized with transcranial magnetic stimulation (TMS) and motor imagination (MI), and the EEG cap was mounted. Next, the brain-computer interface (BCI) was calibrated, followed by the identification of the optimal stimulation site (hotspot) and intensity (RTh). The pre-intervention TMS, post-intervention TMS, and post-30 min intervention TMS were identical. After the pre-intervention TMS, the threshold for each subject was tested with an online BCI and changed if needed. Afterwards, the intervention started, and it was stopped when the subject reached 50 correct pairings between motor imagination (MI) and movement of the exoskeleton. The post-30 min intervention TMS started 30 min after the BCI intervention ended.
Figure 2: Motor-evoked potential (MEP) from a representative subject (post-intervention transcranial magnetic stimulation measurement for subject 1). The peak around 25 milliseconds is the stimulation artefact from the transcranial magnetic stimulation.
Figure 3: Overview of the hardware setup. The Arduino and Linear Actuator Control board were mounted on the exoskeleton. The EEG electrodes were connected through wires to the OpenBCI board, from which the signals were transmitted wirelessly to the PC running OpenViBE. Once an imagined wrist extension was detected, a trigger was sent wirelessly to the Arduino on the exoskeleton. The Arduino was connected to the Linear Actuator Control board with a wire. The Linear Actuator Control board was powered by a 12 V power supply. The motor was connected to the Linear Actuator Control board with a wire.
Figure 4: View of the 3D-printed exoskeleton. The illustration is not drawn to scale. The surfaces in contact with the forearm and hand were padded with foam to improve comfort. The exoskeleton was fixated to the subject’s hand and forearm with Velcro straps (A). ‘LAC’: Linear Actuator Control.
Figure 5: Summary of the MEP results. (a) Averaged MEP amplitudes (in mV) across the subjects; the vertical black line represents the standard deviation across subjects. The MEPs from the measurement 30 min after the intervention (Post 30) were significantly higher (denoted by *) than those from the measurement before the intervention (Pre). (b) MEP changes (in percent) from the measurement before the intervention to the measurement immediately after the intervention (Pre-Post) and 30 min after the intervention (Pre-Post 30).
34 pages, 19994 KiB  
Article
A Magnetoencephalographic/Encephalographic (MEG/EEG) Brain-Computer Interface Driver for Interactive iOS Mobile Videogame Applications Utilizing the Hadoop Ecosystem, MongoDB, and Cassandra NoSQL Databases
by Wilbert McClay
Diseases 2018, 6(4), 89; https://doi.org/10.3390/diseases6040089 - 28 Sep 2018
Cited by 6 | Viewed by 7729
Abstract
In Phase I, we collected data on five subjects, yielding over 90% positive performance in Magnetoencephalographic (MEG) mid- and post-movement activity. In addition, a driver was developed that substituted the actions of the Brain Computer Interface (BCI) for mouse button presses for real-time use in visual simulations. The process was interfaced to a flight visualization demonstration utilizing left or right brainwave thought movement: the user experiences the aircraft turning in the chosen direction, either in the demonstration or in the iOS Mobile Warfighter videogame application. The BCI’s data analytics of a subject’s MEG brain waves and flight visualization videogame performance analytics were stored and analyzed using the Hadoop Ecosystem as a quick-retrieval data warehouse. The Phase II portion of the project involves Emotiv electroencephalographic (EEG) wireless brain–computer interfaces (BCIs), which allow people to establish a novel communication channel between the human brain and a machine, in this case an iOS mobile application. The EEG BCI utilizes advanced and novel machine learning algorithms, as well as the Spark Directed Acyclic Graph (DAG), the Cassandra NoSQL database environment, and the competing NoSQL MongoDB database for housing BCI analytics of subjects’ responses and users’ intent, illustrated for both MEG and EEG brainwave signal acquisition. The wireless EEG signals acquired from OpenViBE and the Emotiv EPOC headset can be connected via Bluetooth to an iPhone utilizing a thin-client architecture. NoSQL databases were chosen because of their schema-less architecture and the MapReduce computational paradigm for housing a user’s brain signals from each referencing sensor. Thus, in the near future, if multiple users are playing on an online network connection and an MEG/EEG sensor fails, or if the connection between the smartphone and the webserver is lost due to low battery power or failed data transmission, it will not nullify the NoSQL document-oriented (MongoDB) or column-oriented (Cassandra) databases. Additionally, NoSQL databases have fast querying and indexing methodologies, which are well suited for online game analytics and technology. In Phase II, we collected data on five MEG subjects, yielding over 90% positive performance on iOS mobile applications written in Objective-C and C++; however, on EEG signals from three subjects with the Emotiv wireless headsets and (n < 10) subjects from the OpenViBE EEG database, the Variational Bayesian Factor Analysis (VBFA) algorithm yielded below 60% performance, and we are currently pursuing an extension of the VBFA algorithm to the time-frequency domain, referred to as VBFA-TF, to enhance EEG performance in the near future. The novel usage of the NoSQL databases Cassandra and MongoDB was the main enhancement of the Phase II BCI MEG/EEG brain signal data acquisition, queries, and rapid analytics, with MapReduce and Spark DAG demonstrating future implications for next-generation biometric MEG/EEG NoSQL databases.
(This article belongs to the Section Neuro-psychiatric Disorders)
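The MongoDB storage described in the abstract leans on the 12-byte BSON ObjectId, whose leading four bytes encode a creation timestamp (see Figures 10 and 11 below). A minimal pymongo sketch of storing epochs and querying them by that timestamp; the database, collection, and field names are hypothetical.

```python
# Sketch: store per-epoch brain signal documents and query by ObjectId time.
from datetime import datetime, timezone
from bson.objectid import ObjectId
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")   # assumed local instance
epochs = client["bci"]["meg_eeg_epochs"]            # hypothetical names

# One document per epoch; the ObjectId in _id doubles as a creation timestamp.
epochs.insert_one({
    "subject": "S01",
    "sensor": "AF3",
    "samples": [4120.5, 4119.8, 4121.2],            # placeholder values
    "label": "left",
})

# Range query on the embedded timestamp: all epochs since a given instant.
since = ObjectId.from_datetime(datetime(2018, 9, 1, tzinfo=timezone.utc))
for doc in epochs.find({"_id": {"$gte": since}}):
    print(doc["subject"], doc["label"])
```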
Show Figures

Figure 1: Phase II, MongoDB MEG Brain Computer Interface Database(s).
Figure 2: Phase II, magnetoencephalography brain-computer interface (MEG BCI) with Apple iOS mobile applications stored in MongoDB and Cassandra.
Figure 3: Yongwook Chae, “EYE-BRAIN INTERFACE (ERI) SYSTEM AND METHOD FOR CONTROLLING SAME”, US2018/0196511.
Figure 4: University of California, San Francisco (UCSF) MEG scanner with Superconducting Quantum Interference Device (SQUID) detectors.
Figure 5: Phase I, “A Real-Time Magnetoencephalography Brain-Computer Interface Using Interactive Three-Dimensional (3D) Visualization and the Hadoop Ecosystem”, Journal of Brain Sciences, 2015.
Figure 6: Phase I, “A Real-Time Magnetoencephalography Brain-Computer Interface Using Interactive 3D-Visualization and the Hadoop Ecosystem”, flowchart process of BCI analytics in the Hadoop Ecosystem.
Figure 7: Phase I, “A Real-Time Magnetoencephalography Brain-Computer Interface Using Interactive 3D-Visualization and the Hadoop Ecosystem”, Pig analysis of MEG subject performance on Warfighter.
Figure 8: (a) Phase II, MongoDB Magnetoencephalography Brain-Computer Interface Database. (b) Phase II, Variational Bayesian Factor Analysis (VBFA) machine learning algorithm. (c) Phase II, MEG subject brainwave data and VBFAgeneratorCTF training matrices in MongoDB database(s). (d) Phase II, C code testVBFA function on MEG subject brainwave data.
Figure 9: Phase II, MongoDB Magnetoencephalography Brain-Computer Interface Database storage of MEG subject Variational Bayesian Factor Analysis training matrices and MEG subject performance and metadata.
Figure 10: MEG brainwave data acquisition in MongoDB with a 12-byte BSON timestamp representing the ObjectID for epoch-trial performance for an MEG subject.
Figure 11: (a) MEG brainwave data acquisition in MongoDB, with a 12-byte BSON ObjectID representing the subject’s training matrices acquired during VBFA machine learning algorithm training on MEG brainwaves. (b) MEG brainwave data acquisition in MongoDB, with subject brainwaves controlling the flight of the Warfighter simulation. (c) Nazzy Ironman subject MEG Brain Computer Interface to the Warfighter Flight Simulator iOS mobile applications, yielding over 90% performance on MEG subject brain signal data. (d) Nazzy Ironman subject MEG Brain Computer Interface to the Warfighter Flight Simulator iOS mobile applications stored in MongoDB databases, yielding over 90% performance on subject data, demonstrated in Figures 9–11.
Figure 12: (a) Nazzy IronMan with the Frozen videogame and iOS Warfighter mobile game for the Brain Computer Interface project, with Emotiv/OpenViBE wireless electroencephalography (EEG) brain signal data, using machine learning algorithms to classify brain signals in iOS videogame applications and storing EEG brain signal data in the NoSQL database MongoDB. (b) Nazzy IronMan with the Frozen project, with Emotiv wireless EEG brain signal data, using machine learning algorithms to classify brain signals in the iOS Frozen videogame and storing EEG brain signal data in the NoSQL database MongoDB.
Figure 13: (a) Emotiv EPOC headset, features, and Brain Computer Interface applications. (b) Utilization of Matlab FIR (Finite Impulse Response) and IIR (Infinite Impulse Response) bandpass and lowpass filters on wireless EEG signals.
Figure 14: Nazzy IronMan Brain Computer Interface cloud provider facility with Cassandra NoSQL database(s).
Figure 15: Nazzy IronMan Brain Computer Interface Cassandra cloud security architecture strategy.
Figure 16: Emotiv and OpenViBE EEG sensor array stored in a Cassandra NoSQL database.
Figure 17: OpenViBE EEG sensor array stored in a Cassandra NoSQL KEYSPACE (database) with SimpleStrategy and a replication factor of 1.
Figure 18: OpenViBE EEG sensor array stored in a Cassandra NoSQL KEYSPACE (database) with SimpleStrategy and a replication factor of 1, displaying the primary key and all attributes for the keyspace eeg_motor_imagery_openvibe and the table eeg_1_signal, with Cassandra statistics.
Figure 19: OpenViBE EEG sensor array stored in a Cassandra NoSQL KEYSPACE (database) with SimpleStrategy, with the table eeg_1_signal importing 317,825 rows of EEG brain signal data.
Figure 20: OpenViBE EEG sensor array stored in a Cassandra NoSQL KEYSPACE (database) with SimpleStrategy, with the stimulation table eeg_signal_1_stimulation_table importing EEG brain signal data (e.g., time, identifier, duration).
Figure 21: MongoDB Brain Computer Interface cloud security restraints.
Figure 22: Java tokenization of the OpenViBE EEG sensor array inputted into a MongoDB collection utilizing db.openVibeSignal.find() queries.
Figure 23: Usage of the NoSQL database MongoDB for wireless EEG signal storage and retrieval with the MongoDB BSON timestamp and the EEG signal electrode array.
Figure 24: Java program for the Emotiv and OpenViBE EEG sensor array channel inserting a document into a MongoDB collection using the Java class BasicDBObject.
Figure 25: OpenViBE EEG sensor array Java program for brainwave signal stimulation codes for time, stimulation code, and duration.
Figure 26: Wireless EEG Java stimulation code dictionary to input EEG signal patterns in MongoDB.
Figure 27: Stimulation codes have to match the acquired EEG signal patterns in MongoDB.
Figure 28: MapReduce in MongoDB for signal processing and EEG data analytics.
Figure 29: (a) iOS mobile application of the Warfighter videogame using OpenGL ES 2.0 (Khronos Group, Beaverton, Oregon, USA, https://www.khronos.org/about/) and GLKit with the UITapGestureRecognizer class to fire a projectile. (b) iOS mobile application of the Warfighter videogame using OpenGL ES 2.0 and GLKit with aerial targets using the addTarget method. (c) Display of the iOS mobile application of the Warfighter videogame using OpenGL ES 2.0 and GLKit with aerial targets using the addTarget method (close-up).
Figure 30: iOS mobile application of the Warfighter videogame using OpenGL ES 2.0 and GLKit to evade or chase aerial targets.
Figure 31: (a) iOS mobile application of the Warfighter videogame using OpenGL ES 2.0 and GLKit to evade or chase aerial targets. (b) The iOS mobile application of the Warfighter videogame using OpenGL ES 2.0 and GLKit to evade or chase aerial targets can be interfaced to MEG subject brain signal data with over 90% classification performance. (c) Nazzy IronMan with the Apple iOS Frozen videogame application can be interfaced to MEG subject brain signal data with over 90% classification performance.
Figure 32: iOS mobile application of the Warfighter videogame using OpenGL ES 2.0 and GLKit for online users’ game analytics and dynamic biometrics.
Figure 33: Nazzy Ironman MEG/EEG Virtual LAN (VLAN) base unit for security authentication.
Figure 34: MEG/EEG cryptographic key authentication utilizing MEG/EEG brainwaves with Cassandra and MongoDB NoSQL databases.
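The Cassandra captions above name a keyspace (eeg_motor_imagery_openvibe) created with SimpleStrategy and a replication factor of 1, plus a table eeg_1_signal. A minimal sketch with the Python cassandra-driver reproducing that layout; the column schema and sample values are assumptions, since the captions list only the keyspace, table, and replication settings.

```python
# Sketch: recreate the keyspace/table layout named in the figure captions.
from cassandra.cluster import Cluster

session = Cluster(["127.0.0.1"]).connect()   # assumed local node
session.execute(
    "CREATE KEYSPACE IF NOT EXISTS eeg_motor_imagery_openvibe "
    "WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}"
)
session.set_keyspace("eeg_motor_imagery_openvibe")
session.execute(
    "CREATE TABLE IF NOT EXISTS eeg_1_signal ("
    " time double, channel text, value double,"
    " PRIMARY KEY (channel, time))"           # assumed schema
)
session.execute(
    "INSERT INTO eeg_1_signal (time, channel, value) VALUES (%s, %s, %s)",
    (0.00390625, "O1", 12.7),                 # placeholder sample
)
```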
12 pages, 1743 KiB  
Article
Direct Assessment of Alcohol Consumption in Mental State Using Brain Computer Interfaces and Grammatical Evolution
by Katerina D. Tzimourta, Ioannis Tsoulos, Thanasis Bilero, Alexandros T. Tzallas, Markos G. Tsipouras and Nikolaos Giannakeas
Inventions 2018, 3(3), 51; https://doi.org/10.3390/inventions3030051 - 27 Jul 2018
Cited by 13 | Viewed by 5905
Abstract
Alcohol consumption affects the function of the brain, and long-term excessive alcohol intake can lead to severe brain disorders. Wearable electroencephalogram (EEG) recording devices combined with Brain Computer Interface (BCI) software may serve as a tool for alcohol-related brain wave assessment. In this paper, a method for mental state assessment from alcohol-related EEG recordings is proposed. EEG recordings are acquired with the Emotiv EPOC+ after consumption of three separate doses of alcohol. Data from the four stages (alcohol-free and three dose levels) are processed using the OpenViBE platform. Spectral and statistical features are calculated, and Grammatical Evolution is employed for discrimination across the four classes. The obtained accuracy reached 89.95%, which renders the proposed approach suitable for direct assessment of a driver’s mental state for road safety and accident avoidance in a potential in-vehicle smart system.
(This article belongs to the Special Issue Frontiers in Wearable Devices)
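The feature stage (spectral plus statistical features per EEG channel) is straightforward to sketch; the Grammatical Evolution classifier itself is not shown. The band boundaries, sampling rate, and choice of statistics below are illustrative assumptions.

```python
# Sketch: spectral band powers plus channel statistics as one feature vector.
import numpy as np
from scipy.signal import welch
from scipy.stats import kurtosis, skew

BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}   # conventional ranges

def epoch_features(epoch, fs=128):
    """epoch: (n_channels, n_samples); fs assumed 128 Hz for the EPOC+."""
    f, pxx = welch(epoch, fs=fs, nperseg=2 * fs)
    spectral = [pxx[:, (f >= lo) & (f < hi)].sum(axis=1)
                for lo, hi in BANDS.values()]
    stats = [epoch.mean(axis=1), epoch.std(axis=1),
             skew(epoch, axis=1), kurtosis(epoch, axis=1)]
    return np.concatenate(spectral + stats)     # one vector per epoch
```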
Show Figures

Figure 1: Flowchart of the proposed BCI system.
Figure 2: The OpenViBE training scenario of the analysis.
Figure 3: The EmotivPRO software for device setup and EEG monitoring.
Figure 4: The timeline of the experiment. Each dose is 50 mL of whisky (40% alc/vol). The total time at the end of each stage is shown in the orange box.