Electroencephalography (EEG) is a safe, non-invasive method of monitoring the brain's electrical activity that can be used for brain-computer interfaces (BCIs). However, the usability of EEG in everyday BCIs is limited because clinical EEG systems rely on wet electrodes that must be placed across the scalp by a trained technician. Recently, it has been demonstrated that EEG signals can be recorded from dry electrodes placed inside the ear canal (in-ear EEG), yet to perform the signal classification necessary for BCIs these systems must overcome the reduced spatial coverage and reduced SNR of the recorded EEG signals. In this technical report, a wireless, multielectrode, user-generic Ear EEG system is used to record voluntary eye blink events. Although eye blinks are an ocular artifact in EEG signals, eye blink event classification is a component of many EEG-based BCIs, enabling user choice selection and drowsiness detection. Here, classification of this signal is demonstrated with four machine learning classifier models: logistic regression, support vector machine, random forest, and an artificial neural network. A combination of temporal, spectral, and spatial features available to the Ear EEG system is implemented and analyzed to optimize classification results across these models and to demonstrate the feasibility of more complex signal classification with in-ear EEG recordings. The result of this work is a comparison of four eye blink classifiers for the Ear EEG system, each with sensitivity above 95% and specificity above 98%. The model that achieves the highest eye blink classification results is a random forest classifier with 100% sensitivity and 99.5% specificity.
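The comparison described above can be illustrated with a minimal sketch. The snippet below is not the report's actual pipeline: it uses synthetic stand-in features (real features would be the temporal, spectral, and spatial features extracted from Ear EEG epochs) and scikit-learn defaults for the four classifier families named in the abstract, then computes the sensitivity and specificity metrics used to compare them.

```python
# Hypothetical sketch of the four-classifier comparison; synthetic data,
# not the report's actual Ear EEG features or tuned models.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
n_epochs, n_features = 400, 6  # assumed epoch count and feature dimension

# Synthetic stand-in: blink epochs have larger feature amplitudes than rest.
X_blink = rng.normal(2.0, 1.0, size=(n_epochs // 2, n_features))
X_rest = rng.normal(0.0, 1.0, size=(n_epochs // 2, n_features))
X = np.vstack([X_blink, X_rest])
y = np.array([1] * (n_epochs // 2) + [0] * (n_epochs // 2))

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "support vector machine": SVC(),
    "random forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "artificial neural network": MLPClassifier(
        hidden_layer_sizes=(16,), max_iter=2000, random_state=0
    ),
}

for name, model in models.items():
    model.fit(X_tr, y_tr)
    tn, fp, fn, tp = confusion_matrix(y_te, model.predict(X_te)).ravel()
    sensitivity = tp / (tp + fn)  # true positive rate (blink detected)
    specificity = tn / (tn + fp)  # true negative rate (rest correctly kept)
    print(f"{name}: sensitivity={sensitivity:.3f}, specificity={specificity:.3f}")
```

Sensitivity and specificity, rather than raw accuracy, are the natural metrics here because a BCI that misses blinks (low sensitivity) or triggers spuriously on rest epochs (low specificity) fails in qualitatively different ways.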