EEG-Based Brain Computer Interface: Real-time Control
Abstract
In this thesis, the development of a graphical user interface for a brain-computer interface (BCI) system is discussed. The system is based on electroencephalographic (EEG) signals from motor imagery (MI). Consumer-grade BCI applications are currently very limited, since most applications target medical use or games; this restricts the potential of BCI, as it applies only to those specific environments. The goal of this thesis is to show that a low-cost, consumer-grade BCI can be built with the required quality and usability, for uses ranging from educational to personal. This is achieved by demonstrating that MI signals (in this thesis, imagined movement of the left hand, right hand, tongue, and feet) can be distinguished using a machine learning algorithm. The results are presented by displaying EEG signals and through demos that show the purpose of the BCI system, such as moving a simulated computer cursor. In order to make the system available to a wider public, the interface adapts to each individual user through personalized machine learning models. Because EEG patterns are distinctly personal, the accuracy of classifying MI-EEG signals with a personalized model is much higher than with a model trained solely on a public dataset. For more advanced personal use, the interface contains everything needed to work with EEG signals, including real-time data streaming, so the user can directly observe changes in the EEG signals and detect potential errors in the measurement setup.
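The personalized-versus-public-model comparison described above can be sketched as follows. This is an illustrative example, not the thesis implementation: the features, class means, and the distribution shift that stands in for inter-subject EEG variability are all assumptions, and a simple linear discriminant replaces whatever classifier the thesis actually uses.

```python
# Hypothetical sketch: 4-class motor-imagery classification from
# synthetic band-power-like features, comparing a model trained on
# the user's own data with one trained on shifted "public" data.
# All names, dimensions, and numbers here are illustrative assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
CLASSES = ["left_hand", "right_hand", "tongue", "feet"]

def simulate(n_per_class, offset=0.0):
    """Synthetic 8-channel features; `offset` mimics the inter-subject
    distribution shift that hurts non-personalized models."""
    X, y = [], []
    for k, _ in enumerate(CLASSES):
        mean = np.zeros(8)
        mean[2 * k : 2 * k + 2] = 1.0  # class-specific active channels
        shift = offset * rng.standard_normal(8)
        X.append(rng.normal(mean + shift, 0.7, size=(n_per_class, 8)))
        y += [k] * n_per_class
    return np.vstack(X), np.array(y)

# "Personal" data from one user, and shifted "public" data from others.
Xp, yp = simulate(100)
Xtr, Xte, ytr, yte = train_test_split(Xp, yp, random_state=0)
Xpub, ypub = simulate(100, offset=0.8)

personal = LinearDiscriminantAnalysis().fit(Xtr, ytr)
generic = LinearDiscriminantAnalysis().fit(Xpub, ypub)
print("personalized model accuracy:", personal.score(Xte, yte))
print("public-only model accuracy :", generic.score(Xte, yte))
```

Under these synthetic assumptions, the model fitted to the user's own data scores well on that user's held-out trials, while the model fitted to the shifted distribution tends to degrade, which is the effect the abstract attributes to personalized models.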