DOI: 10.1145/3038439.3038446
research-article

The Expectation Based Eye-Brain-Computer Interface: An Attempt of Online Test

Published: 13 March 2017

Abstract

In this preliminary study, we tested online a new Eye-Brain-Computer Interface (EBCI) for selecting positions on a screen by combining gaze-based control with a passive brain-computer interface (BCI). This hybrid BCI was trained offline to recognize the electroencephalogram (EEG) patterns recorded during gaze dwells intentionally used to make moves in a computer game; the patterns were presumably related to expectation of the interface feedback. In the online test, 500 ms gaze dwells led to actions each time the BCI classified them as intentional. When the BCI made a miss, a participant could still communicate the intention by prolonging the dwell up to 1000 ms. Although playing the game was possible, it was found that defining the ground truth for such an online system is not trivial and that further efforts will be needed to evaluate the performance of the expectation-based EBCI reliably.
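The selection scheme described in the abstract can be summarized as a simple decision rule: a 500 ms dwell triggers an action only if the EEG classifier labels it intentional, while a dwell prolonged to 1000 ms triggers the action unconditionally. The sketch below illustrates that rule; all names and thresholds other than the two dwell durations are illustrative, and the EEG classification itself is represented by a boolean input rather than the authors' actual classifier.

```python
# Illustrative sketch of the dwell-plus-EEG selection rule from the abstract:
# a 500 ms dwell selects if the EEG classifier says "intentional"; a 1000 ms
# dwell selects regardless, serving as the fallback when the BCI misses.

SHORT_DWELL_MS = 500    # dwell length at which the EEG classifier is consulted
LONG_DWELL_MS = 1000    # fallback dwell length that always triggers the action

def should_select(dwell_ms, eeg_says_intentional):
    """Return True if the current gaze dwell should trigger a selection.

    dwell_ms: how long gaze has rested on the target, in milliseconds.
    eeg_says_intentional: output of the (hypothetical) trained EEG classifier.
    """
    if dwell_ms >= LONG_DWELL_MS:
        return True   # fallback: a prolonged dwell always selects
    if dwell_ms >= SHORT_DWELL_MS and eeg_says_intentional:
        return True   # EEG-confirmed short dwell selects
    return False

# A 500 ms dwell that the classifier misses is not selected yet,
# but the same gaze position selects once the dwell reaches 1000 ms.
print(should_select(500, False))   # False
print(should_select(500, True))    # True
print(should_select(1000, False))  # True
```

This mirrors the abstract's point that the 1000 ms fallback keeps the interface usable even when the BCI misclassifies an intentional dwell, at the cost of slower selections.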




Published In

BCIforReal '17: Proceedings of the 2017 ACM Workshop on An Application-oriented Approach to BCI out of the laboratory
March 2017
50 pages
ISBN:9781450349017
DOI:10.1145/3038439
Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. brain-computer interfaces
  2. eye tracking
  3. eye-brain-computer interfaces
  4. gaze interaction
  5. human-computer interfaces

Qualifiers

  • Research-article

Funding Sources

  • Russian Science Foundation

Conference

IUI'17

Acceptance Rates

BCIforReal '17 Paper Acceptance Rate: 8 of 12 submissions, 67%.
Overall Acceptance Rate: 8 of 12 submissions, 67%.



Cited By

  • (2020) Experiments with Neural Net Object Detection System YOLO on Small Training Datasets for Intelligent Robotics. Advanced Technologies in Robotics and Intelligent Systems, 157-162. DOI: 10.1007/978-3-030-33491-8_19
  • (2019) Recognition Algorithm for Biological and Criminalistics Objects. Biologically Inspired Cognitive Architectures 2019, 283-294. DOI: 10.1007/978-3-030-25719-4_36
  • (2019) Factographic Information Retrieval for Biological Objects. Biologically Inspired Cognitive Architectures 2019, 277-282. DOI: 10.1007/978-3-030-25719-4_35
  • (2018) Classification of the gaze fixations in the eye-brain-computer interface paradigm with a compact convolutional neural network. Procedia Computer Science 145, 293-299. DOI: 10.1016/j.procs.2018.11.062
