Mind the Move: Developing a Brain-Computer Interface Game with Left-Right Motor Imagery
Figure 1. The Muse 2 headband (right image) and a screenshot from one recording showing the four EEG channels of the Muse 2 headband and their waves (left image).
Figure 2. Flowchart of the two-step process of the proposed system. An offline processing phase first trains a classifier on EEG data; an online processing phase then takes mental commands from the user, which are translated into in-game movement by the trained classifier.
Figure 3. Offline processing scenario to train the classifier. The scenario first imports EEG recordings using a CSV file reader; three separate boxes handle the three different EEG recordings (Left, Right and Blink) of the BCI system. The signals are then filtered to remove frequencies outside the 8–40 Hz range. The filtered signals are epoched in time windows of 3 s and the EEG waves are calculated (Alpha, Beta 1, Beta 2, Gamma 1, Gamma 2). Next, the energy of the signals is calculated and a feature vector of all frequency-band energies is formed on a logarithmic scale. Finally, the feature vectors from all the different EEG recordings are used to train the classifier.
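The feature-extraction step described above (8–40 Hz filtering, 3 s epochs, log-scale band energies) can be sketched as follows. This is a minimal stand-in, not the OpenViBE filter chain itself: the 256 Hz sampling rate and the exact Beta/Gamma sub-band edges are assumptions, and a simple FFT-based band power substitutes for the scenario's signal-processing boxes.

```python
import numpy as np

FS = 256  # assumed sampling rate of the Muse 2 headband, in Hz

# Illustrative sub-band edges within the paper's 8-40 Hz range; the exact
# Beta/Gamma boundaries are assumptions, not taken from the paper.
BANDS = {
    "alpha":  (8, 12),
    "beta1":  (12, 20),
    "beta2":  (20, 30),
    "gamma1": (30, 35),
    "gamma2": (35, 40),
}

def band_energies(epoch, fs=FS):
    """Log-scale energy per band for one epoch of shape (n_channels, n_samples)."""
    freqs = np.fft.rfftfreq(epoch.shape[-1], d=1.0 / fs)
    power = np.abs(np.fft.rfft(epoch, axis=-1)) ** 2
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        # sum spectral power in the band per channel, then take the log
        feats.append(np.log(power[:, mask].sum(axis=-1) + 1e-12))
    return np.concatenate(feats)  # 4 channels x 5 bands = 20 features

epoch = np.random.default_rng(0).standard_normal((4, 3 * FS))  # one 3-s window
fv = band_energies(epoch)
```

One such 20-dimensional vector per 3 s window, across the three recordings, would form the training set for the classifier.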
Figure 4. Online scenario for the proposed BCI system. The acquisition client connects to the LSL stream from BlueMuse on a specific port and real-time data processing starts. The channel selector keeps only the four EEG channels (TP9, TP10, AF7, AF8), and then the same process as in the offline scenario is applied: the signals are filtered, the EEG waves and their energies are calculated, and these features are fed into the classifier to classify the mental commands. Finally, an LSL stream is employed to transmit the classifier’s results to the game. This is accomplished through the LSL stream box, which facilitates the communication of data between OpenViBE and the game.
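The channel-selection step of the online loop can be sketched as below. This is a hardware-free stand-in: the five-entry channel order (including an AUX channel) and the stub data generator are assumptions, whereas in the real system chunks arrive from the BlueMuse stream through a pylsl `StreamInlet`.

```python
import numpy as np

# Assumed channel order of the Muse 2 LSL stream published by BlueMuse;
# the fifth (AUX) entry is an assumption for illustration.
CHANNELS = ["TP9", "AF7", "AF8", "TP10", "AUX"]
KEEP = ["TP9", "TP10", "AF7", "AF8"]  # the four EEG electrodes used

def select_channels(chunk):
    """Keep only the EEG electrodes from a (n_channels, n_samples) chunk."""
    idx = [CHANNELS.index(name) for name in KEEP]
    return chunk[idx]

def stream_stub(n_windows, fs=256, win_s=3):
    """Stand-in for the LSL inlet: yields 3-s windows of raw 'samples'."""
    rng = np.random.default_rng(0)
    for _ in range(n_windows):
        yield rng.standard_normal((len(CHANNELS), fs * win_s))

for chunk in stream_stub(3):
    eeg = select_channels(chunk)  # shape (4, 768)
    # ...band-pass to 8-40 Hz, compute log band energies, classify, and
    # send the predicted command to the game over a second LSL stream.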
Figure 5. Screenshot from the gameplay. The avatar moves on the platform depending on the user’s mental commands.
Figure 6. Histogram presenting the different groups of users depending on their performance while testing the BCI game.
Figure 7. The average user improvement for MI commands before and after playing the game.
Abstract
1. Introduction
- Increased immersion: one of the key advantages of using BCI in gaming is that it has the potential to increase immersion by allowing players to directly control game characters with their thoughts. This could lead to a more immersive and realistic gaming experience.
- Increased interactivity: another benefit of using BCI in gaming is that it could allow for increased interactivity between players and game characters/environments. For example, if a player’s EEG signals indicate that the player is feeling scared, this could trigger a change in the game environment (e.g., an enemy appearing), which would then require the player to react accordingly (e.g., by fighting back).
- Accessibility: another potential advantage of using BCI in gaming is that it could make games more accessible for people with disabilities who may not be able to use traditional input devices such as keyboards or controllers.
2. Related Work
3. Materials
3.1. Muse 2 Headband
3.2. BlueMuse
3.3. Lab Streaming Layer
3.4. OpenViBE
4. Methods
4.1. Classification
- It is a powerful and flexible algorithm that can be used to classify EEG data with high accuracy.
- It can be used to classify EEG data in a non-linear manner, allowing for more accurate classifications than linear methods.
- It is also capable of dealing with large datasets and can be used to classify EEG data from multiple subjects.
- It can also be used to perform unsupervised learning, which can help reduce the cost of labeling EEG data.
- It is able to generalize well to new data and can be used to classify EEG data from different individuals.
- It is computationally efficient and can be used to classify EEG data in real-time.
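As a concrete counterpart to the properties listed above, an MLP could be trained on the log band-energy feature vectors with scikit-learn. This is a sketch under stated assumptions: the hidden-layer size and iteration budget are not the paper's configuration, and the data are synthetic, using only the 97-vectors-per-class count reported in the dataset table and a 20-dimensional feature length for scale.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for the real features: 97 vectors per class with 20
# band-energy features each; the separated class means are purely illustrative.
X = np.vstack([rng.normal(loc=c, scale=1.0, size=(97, 20)) for c in range(3)])
y = np.repeat([0, 1, 2], 97)

# Hidden-layer size and iteration budget are assumptions, not the paper's setup.
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validated accuracy
```

On well-separated synthetic classes like these the cross-validated accuracy is near-perfect; real EEG features are far noisier, which is where the per-subject differences in the results tables come from.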
4.2. Online Processing
4.3. Dataset
4.4. Game Design
- If the user is glancing right and imagining moving their right hand, the sample is put in the first category and the in-game avatar slides right.
- If the user is glancing left and imagining moving their left hand, the sample is put in the second category and the in-game avatar slides left.
- If the user blinks, the sample goes to the third category and the in-game avatar jumps.
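The three-way mapping above amounts to a small dispatch table from classifier output to avatar action; the numeric class labels and action names below are assumptions for illustration, not the game's actual encoding.

```python
# Hypothetical class labels -> in-game actions (names are illustrative).
ACTIONS = {
    0: "slide_right",  # right glance + right-hand motor imagery
    1: "slide_left",   # left glance + left-hand motor imagery
    2: "jump",         # eye blink
}

def to_action(predicted_class):
    """Translate a classifier output into an avatar command."""
    return ACTIONS[predicted_class]
```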
5. Results
6. Discussion
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
Abbreviation | Meaning |
---|---|
BCI | Brain–computer interface |
HCI | Human–computer interface |
EEG | Electroencephalography |
EOG | Electrooculography |
LSL | Lab streaming layer |
MLP | Multi-layer perceptron |
MI | Motor Imagery |
XR | Extended Reality |
References
- Saha, S.; Mamun, K.A.; Ahmed, K.; Mostafa, R.; Naik, G.R.; Darvishi, S.; Khandoker, A.H.; Baumert, M. Progress in brain computer interface: Challenges and opportunities. Front. Syst. Neurosci. 2021, 15, 578875. [Google Scholar] [CrossRef] [PubMed]
- Wolpaw, J.R. Brain-computer interfaces (BCIs) for communication and control. In Proceedings of the 9th International ACM SIGACCESS Conference on Computers and Accessibility, Tempe, AZ, USA, 15–17 October 2007; pp. 1–2. [Google Scholar]
- Birbaumer, N. Breaking the silence: Brain–computer interfaces (BCI) for communication and motor control. Psychophysiology 2006, 43, 517–532. [Google Scholar] [CrossRef] [PubMed]
- Dornhege, G.; Millán, J.d.R.; Hinterberger, T.; McFarland, D.J.; Muller, K.R. Toward Brain-Computer Interfacing; Citeseer: State College, PA, USA, 2007; Volume 63. [Google Scholar]
- Kalagi, S.; Machado, J.; Carvalho, V.; Soares, F.; Matos, D. Brain computer interface systems using non-invasive electroencephalogram signal: A literature review. In Proceedings of the 2017 International Conference on Engineering, Technology and Innovation (ICE/ITMC), Madeira, Portugal, 27–29 June 2017; pp. 1578–1583. [Google Scholar]
- Abiri, R.; Borhani, S.; Sellers, E.W.; Jiang, Y.; Zhao, X. A comprehensive review of EEG-based brain–computer interface paradigms. J. Neural Eng. 2019, 16, 011001. [Google Scholar] [CrossRef] [PubMed]
- Teplan, M. Fundamentals of EEG measurement. Meas. Sci. Rev. 2002, 2, 1–11. [Google Scholar]
- Georgiev, D.D.; Georgieva, I.; Gong, Z.; Nanjappan, V.; Georgiev, G.V. Virtual reality for neurorehabilitation and cognitive enhancement. Brain Sci. 2021, 11, 221. [Google Scholar] [CrossRef]
- Wen, D.; Fan, Y.; Hsu, S.H.; Xu, J.; Zhou, Y.; Tao, J.; Lan, X.; Li, F. Combining brain–computer interface and virtual reality for rehabilitation in neurological diseases: A narrative review. Ann. Phys. Rehabil. Med. 2021, 64, 101404. [Google Scholar] [CrossRef]
- Robinson, N.; Mane, R.; Chouhan, T.; Guan, C. Emerging trends in BCI-robotics for motor control and rehabilitation. Curr. Opin. Biomed. Eng. 2021, 20, 100354. [Google Scholar] [CrossRef]
- Alrajhi, W.; Alaloola, D.; Albarqawi, A. Smart home: Toward daily use of BCI-based systems. In Proceedings of the 2017 International Conference on Informatics, Health & Technology (ICIHT), Riyadh, Saudi Arabia, 21–23 February 2017; pp. 1–5. [Google Scholar]
- Brunner, C.; Birbaumer, N.; Blankertz, B.; Guger, C.; Kübler, A.; Mattia, D.; Millán, J.D.R.; Miralles, F.; Nijholt, A.; Opisso, E.; et al. BNCI Horizon 2020: Towards a roadmap for the BCI community. Brain-Comput. Interfaces 2015, 2, 1–10. [Google Scholar] [CrossRef]
- Marshall, D.; Coyle, D.; Wilson, S.; Callaghan, M. Games, gameplay, and BCI: The state of the art. IEEE Trans. Comput. Intell. AI Games 2013, 5, 82–99. [Google Scholar] [CrossRef]
- Vasiljevic, G.A.M.; de Miranda, L.C. Brain–computer interface games based on consumer-grade EEG Devices: A systematic literature review. Int. J. Hum.–Comput. Interact. 2020, 36, 105–142. [Google Scholar] [CrossRef]
- Kerous, B.; Skola, F.; Liarokapis, F. EEG-based BCI and video games: A progress report. Virtual Real. 2018, 22, 119–135. [Google Scholar] [CrossRef]
- Plass-Oude Bos, D.; Reuderink, B.; van de Laar, B.; Gürkök, H.; Mühl, C.; Poel, M.; Nijholt, A.; Heylen, D. Brain-computer interfacing and games. In Brain-Computer Interfaces: Applying Our Minds to Human-Computer Interaction; Springer: Cham, Switzerland, 2010; pp. 149–178. [Google Scholar]
- Stamps, K.; Hamam, Y. Towards inexpensive BCI control for wheelchair navigation in the enabled environment–a hardware survey. In Proceedings of the Brain Informatics: International Conference, BI 2010, Toronto, ON, Canada, 28–30 August 2010; pp. 336–345. [Google Scholar]
- Hjørungdal, R.M.; Sanfilippo, F.; Osen, O.; Rutle, A.; Bye, R.T. A game-based learning framework for controlling brain-actuated wheelchairs. In Proceedings of the 30th European Conference on Modelling and Simulation, Regensburg, Germany, 31 May–3 June 2016. [Google Scholar]
- Alchalcabi, A.E.; Eddin, A.N.; Shirmohammadi, S. More attention, less deficit: Wearable EEG-based serious game for focus improvement. In Proceedings of the 2017 IEEE 5th International Conference on Serious Games and Applications for Health (SeGAH), Perth, WA, Australia, 2–4 April 2017; pp. 1–8. [Google Scholar]
- Fiałek, S.; Liarokapis, F. Comparing Two Commercial Brain Computer Interfaces for Serious Games and Virtual Environments. In Emotion in Games; Springer: Cham, Switzerland, 2016; pp. 103–117. [Google Scholar]
- Djamal, E.C.; Abdullah, M.Y.; Renaldi, F. Brain computer interface game controlling using fast fourier transform and learning vector quantization. J. Telecommun. Electron. Comput. Eng. 2017, 9, 71–74. [Google Scholar]
- Joselli, M.; Binder, F.; Clua, E.; Soluri, E. Mindninja: Concept, Development and Evaluation of a Mind Action Game Based on EEGs. In Proceedings of the 2014 Brazilian Symposium on Computer Games and Digital Entertainment, Porto Alegre, Brazil, 12–14 November 2014; pp. 123–132. [Google Scholar] [CrossRef]
- Glavas, K.; Prapas, G.; Tzimourta, K.D.; Giannakeas, N.; Tsipouras, M.G. Evaluation of the User Adaptation in a BCI Game Environment. Appl. Sci. 2022, 12, 12722. [Google Scholar] [CrossRef]
- Interaxon’s Muse 2. Available online: https://choosemuse.com/muse-2/ (accessed on 1 June 2023).
- Garcia-Moreno, F.M.; Bermudez-Edo, M.; Rodríguez-Fórtiz, M.J.; Garrido, J.L. A CNN-LSTM deep Learning classifier for motor imagery EEG detection using a low-invasive and low-Cost BCI headband. In Proceedings of the 2020 16th International Conference on Intelligent Environments (IE), Madrid, Spain, 20–23 July 2020; pp. 84–91. [Google Scholar]
- Chaudhary, M.; Mukhopadhyay, S.; Litoiu, M.; Sergio, L.E.; Adams, M.S. Understanding brain dynamics for color perception using wearable eeg headband. arXiv 2020, arXiv:2008.07092. [Google Scholar]
- Pu, L.; Lion, K.M.; Todorovic, M.; Moyle, W. Portable EEG monitoring for older adults with dementia and chronic pain-A feasibility study. Geriatr. Nurs. 2021, 42, 124–128. [Google Scholar] [CrossRef] [PubMed]
- Prapas, G.; Glavas, K.; Tzallas, A.T.; Tzimourta, K.D.; Giannakeas, N.; Tsipouras, M.G. Motor Imagery Approach for BCI Game Development. In Proceedings of the 2022 7th South-East Europe Design Automation, Computer Engineering, Computer Networks and Social Media Conference (SEEDA-CECNSM), Ioannina, Greece, 23–25 September 2022; pp. 1–5. [Google Scholar] [CrossRef]
- Kowaleski, J. BlueMuse. 2019. Available online: https://github.com/kowalej/BlueMuse (accessed on 10 October 2022).
- Kothe, C. Lab Streaming-Layer. 2018. Available online: https://github.com/sccn/labstreaminglayer (accessed on 8 May 2022).
- Marsland, S. Machine Learning: An Algorithmic Perspective; Chapman and Hall/CRC: Boca Raton, FL, USA, 2011. [Google Scholar]
- Raj, P.; Evangeline, P. The Digital Twin Paradigm for Smarter Systems and Environments: The Industry Use Cases; Academic Press: Cambridge, MA, USA, 2020. [Google Scholar]
- Miladinović, A.; Barbaro, A.; Valvason, E.; Ajčević, M.; Accardo, A.; Battaglini, P.P.; Jarmolowska, J. Combined and Singular Effects of Action Observation and Motor Imagery Paradigms on Resting-State Sensorimotor Rhythms. In Proceedings of the XV Mediterranean Conference on Medical and Biological Engineering and Computing—MEDICON 2019, Coimbra, Portugal, 26–28 September 2019; pp. 1129–1137. [Google Scholar]
Feature Vectors for Class 1 per Subject | Feature Vectors for Class 2 per Subject | Feature Vectors for Class 3 per Subject |
---|---|---|
97 | 97 | 97 |
Subjects | Right Motor Imagery (TPR/Precision) (%) | Left Motor Imagery (TPR/Precision) (%) | Blink (TPR/Precision) (%) | Overall (Acc %) | ROC Area |
---|---|---|---|---|---|
1 | 76.6/93.4 | 94.5/79.7 | 100.0/100.0 | 90.3 | 0.946 |
2 | 97.2/100.0 | 100.0/96.7 | 100.0/100.0 | 99 | 0.997 |
3 | 99.3/100.0 | 100/98.7 | 100.0/100.0 | 99.7 | 1 |
4 | 95.2/89.8 | 89.7/94.6 | 100.0/100.0 | 94.9 | 0.993 |
5 | 99.3/98.7 | 98.6/99.4 | 100.0/100.0 | 99.3 | 0.997 |
6 | 84.1/85.0 | 85.5/84.6 | 100.0/100.0 | 89.8 | 0.968 |
7 | 84.8/98.5 | 99.3/84.4 | 99.3/99.0 | 94.4 | 0.993 |
8 | 100.0/99.0 | 99.3/98.7 | 100.0/100.0 | 99.7 | 1 |
9 | 95.2/93.5 | 94.6/95.0 | 99.3/100.0 | 96.5 | 0.997 |
10 | 99.3/97.7 | 97.5/98.8 | 100.0/100.0 | 99.1 | 0.998 |
11 | 78.6/94.8 | 96.2/81.3 | 100.0/100.0 | 91.1 | 0.982 |
12 | 96.6/98.4 | 97.9/100.0 | 100.0/100.0 | 98.1 | 0.999 |
13 | 91.7/77.5 | 73.8/90.4 | 99.3/99.7 | 88.2 | 0.935 |
14 | 97.9/91.3 | 91.0/97.5 | 100.0/100.0 | 96.3 | 0.997 |
15 | 99.0/100.0 | 100.0/99.0 | 100.0/100.0 | 99.6 | 1 |
16 | 100.0/98.0 | 97.9/100.0 | 100.0/100.0 | 99.3 | 0.997 |
17 | 91.7/100.0 | 100.0/92.4 | 100.0/100.0 | 97.2 | 0.997 |
18 | 93.8/92.9 | 92.8/93.8 | 100.0/100.0 | 95.5 | 0.994 |
19 | 100.0/100.0 | 100.0/100.0 | 100.0/100.0 | 100 | 1 |
20 | 97.9/96.9 | 96.6/97.9 | 100.0/100.0 | 98.1 | 0.994 |
21 | 95.2/98.5 | 99.3/93.8 | 99.3/100.0 | 97.9 | 0.997 |
22 | 100.0/95.2 | 95.2/100.0 | 100.0/100.0 | 98.3 | 0.994 |
23 | 97.2/94.6 | 95.2/96.4 | 100.0/100.0 | 97.4 | 0.998 |
24 | 99.3/98.7 | 100.0/99.0 | 98.6/100.0 | 99.3 | 1 |
25 | 96.6/100.0 | 100.0/96.3 | 99.3/100.0 | 98.5 | 0.994 |
26 | 99.3/99.3 | 100.0/99.0 | 99.3/100.0 | 99.5 | 1 |
27 | 99.0/100.0 | 100.0/99.0 | 100.0/100.0 | 99.6 | 1 |
28 | 100.0/83.6 | 80.4/100.0 | 100.0/100.0 | 93.4 | 0.982 |
29 | 99.0/96.0 | 96.0/98.8 | 100.0/100.0 | 98.3 | 1 |
30 | 99.0/95.0 | 94.8/99.8 | 100.0/100.0 | 97.9 | 0.993 |
31 | 95.9/96.9 | 96.9/95.9 | 100.0/100.0 | 97.6 | 0.998 |
32 | 91.8/93.7 | 93.8/91.9 | 100.0/100.0 | 95.2 | 0.993 |
33 | 97.9/99.0 | 99.0/98.0 | 100.0/100.0 | 98.9 | 0.996 |
Subjects | Average Game Score | Average Coin Clusters |
---|---|---|
1 | 28.7 (57.4%) | 10.6 (62.3%) |
2 | 36.8 (73.6%) | 13.2 (77.6%) |
3 | 30.2 (60.4%) | 10.9 (64.1%) |
4 | 28.6 (57.2%) | 10.4 (61.1%) |
5 | 39.4 (78.8%) | 13.8 (81.1%) |
6 | 29.6 (59.2%) | 11.0 (64.7%) |
7 | 31.4 (62.8%) | 11.3 (66.4%) |
8 | 21.0 (42.0%) | 8.2 (48.2%) |
9 | 25.1 (50.2%) | 9.3 (54.7%) |
10 | 27.6 (55.2%) | 10.1 (59.4%) |
11 | 24.3 (48.6%) | 8.5 (50.0%) |
12 | 25.5 (51.0%) | 9.5 (55.8%) |
13 | 32.0 (64.0%) | 11.5 (67.6%) |
14 | 35.0 (70.0%) | 12.5 (73.5%) |
15 | 21.3 (42.6%) | 7.8 (45.8%) |
16 | 26.4 (52.8%) | 9.5 (55.8%) |
17 | 27.9 (55.8%) | 10.3 (60.5%) |
18 | 25.8 (51.6%) | 9.2 (54.1%) |
19 | 31.4 (62.8%) | 11.3 (66.4%) |
20 | 27.6 (55.2%) | 10.0 (58.8%) |
21 | 23.3 (46.6%) | 8.2 (48.2%) |
22 | 27.9 (55.8%) | 10.4 (61.1%) |
23 | 31.5 (63.0%) | 11.4 (67.0%) |
24 | 25.3 (50.6%) | 9.0 (52.9%) |
25 | 24.8 (49.6%) | 8.6 (50.5%) |
26 | 24.2 (48.4%) | 8.2 (48.2%) |
27 | 28.6 (57.2%) | 10.2 (60.0%) |
28 | 22.3 (44.6%) | 8.4 (49.4%) |
29 | 31.5 (63.0%) | 11.4 (67.0%) |
30 | 22.6 (45.2%) | 8.8 (51.7%) |
31 | 23.2 (46.4%) | 8.4 (49.4%) |
32 | 28.4 (56.8%) | 10.8 (63.5%) |
33 | 23.8 (47.6%) | 8.8 (51.7%) |
Subjects | Left MI (TPR/Precision) (%) (Before Training) | Left MI (TPR/Precision) (%) (After Training) | Right MI (TPR/Precision) (%) (Before Training) | Right MI (TPR/Precision) (%) (After Training) |
---|---|---|---|---|
1 | 73.3/66.8 | 80.0/80.0 | 66.7/71.4 | 80.0/80.0 |
2 | 86.7/81.3 | 93.3/87.5 | 80.0/88.5 | 86.7/92.9 |
3 | 80.0/70.6 | 86.7/72.2 | 66.7/76.9 | 66.7/83.3 |
4 | 73.3/73.3 | 73.3/78.6 | 73.3/73.3 | 80.0/75.0 |
5 | 93.3/82.4 | 100.0/88.2 | 80.0/92.3 | 86.7/100 |
6 | 80.0/75.0 | 80.0/80.0 | 73.3/78.6 | 80.0/80.0 |
7 | 86.7/81.3 | 93.3/93.3 | 80.0/85.7 | 93.3/93.3 |
8 | 46.7/50.0 | 60.0/56.3 | 53.3/50.0 | 53.3/57.1 |
9 | 66.7/71.4 | 73.3/78.6 | 73.3/68.8 | 80.0/75.0 |
10 | 73.3/68.8 | 80.0/80.0 | 66.7/71.4 | 80.0/80.0 |
11 | 66.7/62.5 | 73.3/66.8 | 60.0/64.3 | 66.7/71.4 |
12 | 66.7/66.7 | 80.0/75.0 | 66.7/66.7 | 73.3/78.6 |
13 | 80.0/80.0 | 93.3/87.5 | 80.0/80.0 | 86.7/92.9 |
14 | 86.7/92.9 | 93.3/100 | 93.3/87.5 | 100.0/93.8 |
15 | 60.0/52.9 | 53.3/57.1 | 46.7/53.8 | 60.0/56.3 |
16 | 66.7/62.5 | 73.3/64.7 | 60.0/64.3 | 60.0/69.2 |
17 | 80.0/75.0 | 80.0/80.0 | 73.3/78.6 | 80.0/80.0 |
18 | 73.3/68.8 | 80.0/75.0 | 66.7/71.4 | 73.3/78.6 |
19 | 86.7/92.9 | 93.3/87.5 | 93.3/87.5 | 86.7/92.9 |
20 | 80.0/75.0 | 80.0/80.0 | 73.3/78.6 | 80.0/80.0 |
21 | 53.3/57.1 | 60.0/64.3 | 60.0/56.3 | 66.7/62.5 |
22 | 66.7/62.5 | 73.3/73.3 | 60.0/64.3 | 73.3/73.3 |
23 | 86.7/72.2 | 100.0/78.9 | 66.7/83.3 | 73.3/100 |
24 | 100.0/75.0 | 100.0/88.2 | 66.7/100 | 86.7/100 |
25 | 93.3/87.5 | 93.3/87.5 | 86.7/92.9 | 86.7/92.9 |
26 | 66.7/90.9 | 80.0/70.6 | 53.3/73.7 | 66.7/76.9 |
27 | 100.0/78.9 | 100.0/100.0 | 73.3/100.0 | 100.0/100.0 |
28 | 40.0/100 | 53.3/88.9 | 100.0/62.5 | 93.3/66.7 |
29 | 100.0/100.0 | 100.0/100.0 | 100.0/100.0 | 100.0/100.0 |
30 | 0.0/- | 20.0/100.0 | 100.0/50.0 | 100.0/55.6 |
31 | 100.0/50.0 | 100.0/60.0 | 0.0/- | 33.3/100.0 |
32 | 66.7/90.9 | 86.7/100.0 | 93.3/73.7 | 100.0/88.2 |
33 | 53.3/72.7 | 80.0/100 | 80.0/63.2 | 100.0/83.3 |
Authors | Subjects | EEG Device | Mental Commands | Reps per Subj | Experiment Duration (per Subj) | Evaluation Metrics |
---|---|---|---|---|---|---|
Hjørungdal et al. [18] | 3 | Emotiv EPOC+ | 4 | - | - | Time to Complete the Task |
Fiałek and Liarokapis [20] | 31 | NeuroSky MindWave, Emotiv EPOC+ | 2 | - | - | Avg Rating Values: Learnability, Satisfaction, Performance, Effort
Alchalabi et al. [19] | 4 | Emotiv EPOC+ | 2 | 1 | - | Avg Focus (0.38), Avg Stress (0.49), Avg Relaxation (0.32), Avg Excitement (0.25), Avg Engagement (0.65)
Djamal et al. [21] | 20 | NeuroSky MindWave | 2 | 4 | - | Average Accuracy of Training and Testing Data
Joselli et al. [22] | 11 | NeuroSky MindWave | 1 | 5 | 10 min | Avg Player Score (163.36), Avg Missed Cuts (8), Avg Attention Level (74), Avg Stress Level (41.27), Avg Engagement Level (75.73), Avg Evolution Attention (1.70)
Glavas et al. [23] | 38 | Muse 2 Headband | 2 | 20 | 40 min | Classification Accuracy (98.75%), Avg Game Score 1 (52.70%), Avg Game Score 2 (70.35%), Improvement
This work | 33 | Muse 2 Headband | 3 | 20 | 55 min | Classification Accuracy (96.94%), Avg Game Score 27.6 (55.3%), Avg Number of Clusters 10.04 (59%), Avg Improvement 7.5%
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Prapas, G.; Glavas, K.; Tzimourta, K.D.; Tzallas, A.T.; Tsipouras, M.G. Mind the Move: Developing a Brain-Computer Interface Game with Left-Right Motor Imagery. Information 2023, 14, 354. https://doi.org/10.3390/info14070354