Developing Electroencephalogram Signal Processing Techniques for Brain Computer Interface Applications

Abstract

Decoding multi-class motor imagery EEG activity has long been a challenge in the development of Brain-Computer Interface (BCI) systems. Deep learning has recently emerged as a powerful approach for BCI development using EEG activity; however, EEG analysis and classification must be robust, automated, and accurate. Currently available BCI systems perform well for binary task identification, whereas multi-class classification of EEG activity remains a challenging task. Because artifacts from multiple sources occur frequently, an adequate artifact-removal methodology must be adopted prior to feature estimation in BCI applications.

This study proposes a novel artifact-removal methodology that jointly applies Fast-Power Independent Component Analysis and the General Linear Chirplet Transform to automatically identify and reject artifactual sources. After the ongoing Electroencephalogram activity is separated into independent components, a Katz-Fractal Sparsity criterion identifies the artifactual components. The identified components are then treated by a General Linear Chirplet Transform-based EEG denoising method to recover useful cerebral information leaked into the artifactual sources, and inverse Independent Component Analysis yields artifact-corrected, clean EEG activity for further analysis. The effectiveness of the developed methodology is validated on both simulated and empirical EEG datasets, and the experimental results establish the proposed method as a strong candidate for correcting non-cerebral artifacts and suppressing noise in EEG records.

Further, to enhance the performance of multi-class BCI systems, a novel channel selection and feature optimization methodology is proposed. First, the multi-channel EEG dataset is reduced to an optimal subset of channels using the developed MDA-SOGWO-based channel selection criterion. A discriminative feature set is then generated from the time, frequency, and FAWT-based time-frequency domains of the EEG data, and an informative feature set is constructed from the extracted features through the CCA-RFE-based feature selection criterion. Finally, the selected feature set is trained and validated using ELM, LDA, RF, and MLP classifiers. Evaluated on the multi-class motor imagery datasets of BCI Competition IV 2a and BCI Competition III 3a, the methodology yields classification accuracies of 85.11% and 92.46%, respectively, establishing it as an efficient approach for future BCI system design.

Afterwards, a hybrid AttentionNet ensemble voting classifier model is developed for efficient feature extraction and classification. The Time-Frequency Representation (TFR) of the multi-class EEG activity is generated using the Transient Extracting Transform (TET), and the resulting TFR spectrogram images are input to the AttentionNet ensemble voting classifier model for training and classification. The model is evaluated with different fusion strategies, namely feature-level and score-level fusion of layers, on two MI-BCI datasets, BCI Competition IV 2a and BCI Competition III 3a, yielding highest classification accuracies of 88.14% and 93.13%, respectively. The results obtained on a large multi-class MI-BCI dataset confirm that the proposed hybrid AttentionNet ensemble voting classifier model significantly outperforms conventional algorithms and achieves its highest classification accuracy with feature-level fusion of layers. The developed framework aids in identifying different tasks from multi-class MI-BCI EEG activity.
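The artifact-rejection pipeline in the abstract (ICA decomposition, fractal-dimension screening of components, back-projection) can be sketched as below. This is a minimal illustration under stated assumptions, not the thesis implementation: it substitutes scikit-learn's standard FastICA for the Fast-Power ICA variant, uses the plain Katz fractal dimension in place of the Katz-Fractal Sparsity criterion, and simply zeroes suspect components instead of applying the chirplet-transform denoising step; the threshold value is illustrative.

```python
import numpy as np
from sklearn.decomposition import FastICA


def katz_fd(x):
    """Katz fractal dimension of a 1-D signal.

    FD = log10(n) / (log10(n) + log10(d / L)), where L is the total
    curve length, d the maximum distance from the first sample, and
    n the number of steps.
    """
    n = len(x) - 1
    L = np.abs(np.diff(x)).sum()          # total curve length
    d = np.max(np.abs(x - x[0]))          # max excursion from first sample
    return np.log10(n) / (np.log10(n) + np.log10(d / L))


def remove_artifact_components(eeg, fd_threshold=2.0):
    """Zero ICA components whose Katz FD exceeds a threshold (illustrative).

    eeg: array of shape (n_channels, n_samples).
    Returns the back-projected, artifact-suppressed EEG of the same shape.
    """
    ica = FastICA(n_components=eeg.shape[0], random_state=0, max_iter=500)
    sources = ica.fit_transform(eeg.T).T              # (n_components, n_samples)
    fds = np.array([katz_fd(s) for s in sources])
    sources[fds > fd_threshold] = 0.0                 # reject artifactual ICs
    return ica.inverse_transform(sources.T).T         # back-project clean EEG
```

In the thesis pipeline the flagged components would instead be denoised with the General Linear Chirplet Transform before inverse ICA, so leaked cerebral information is recovered rather than discarded.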
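The feature-selection-and-classification stage (informative feature subset, then training with LDA/RF-style classifiers) can be illustrated with a hedged stand-in: plain Recursive Feature Elimination replaces the thesis's CCA-RFE criterion, a random matrix stands in for the time-, frequency-, and FAWT-domain features, and the MDA-SOGWO channel-selection step is omitted entirely, since it is a bespoke metaheuristic not available in standard libraries.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE
from sklearn.model_selection import cross_val_score

# Placeholder feature matrix standing in for the extracted time-,
# frequency-, and FAWT-based features: 200 trials x 40 features,
# with four motor-imagery classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 40))
y = rng.integers(0, 4, size=200)

# Recursive Feature Elimination keeps the 10 highest-ranked features
# (a plain RFE stand-in for the thesis's CCA-RFE criterion).
selector = RFE(LinearDiscriminantAnalysis(), n_features_to_select=10)
X_sel = selector.fit_transform(X, y)

# Validate the selected feature set with two of the classifiers named in
# the abstract (LDA and Random Forest) under 5-fold cross-validation.
lda_acc = cross_val_score(LinearDiscriminantAnalysis(), X_sel, y, cv=5).mean()
rf_acc = cross_val_score(RandomForestClassifier(random_state=0), X_sel, y, cv=5).mean()
```

With real discriminative features in place of the random matrix, the same scaffolding reproduces the train/validate loop the abstract describes; ELM and MLP classifiers would slot in alongside LDA and RF.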
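Finally, the spectrogram-image input to the AttentionNet model can be sketched with an ordinary STFT spectrogram as a stand-in for the Transient Extracting Transform, which has no standard library implementation. The 250 Hz sampling rate matches BCI Competition IV 2a; the 10 Hz tone is a synthetic placeholder for mu-band motor-imagery activity.

```python
import numpy as np
from scipy.signal import spectrogram

fs = 250                      # sampling rate of BCI Competition IV 2a
t = np.arange(0, 4, 1 / fs)   # one 4-second trial

# Synthetic single-channel trial: 10 Hz mu-band oscillation plus noise.
x = np.sin(2 * np.pi * 10 * t) \
    + 0.5 * np.random.default_rng(0).normal(size=t.size)

# STFT spectrogram as a stand-in for the TET time-frequency representation.
f, seg_t, Sxx = spectrogram(x, fs=fs, nperseg=128, noverlap=96)

# dB-scaled image of shape (n_freqs, n_segments), the kind of 2-D input
# that would be fed to the AttentionNet ensemble classifier.
tfr_image = 10 * np.log10(Sxx + 1e-12)
```

TET sharpens transient components relative to a plain STFT, so the thesis's TFR images would be crisper than this sketch, but the image-shaped (frequency x time) input to the CNN is the same idea.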
