EEG-TCNTransformer: A Temporal Convolutional Transformer for Motor Imagery Brain–Computer Interfaces
- Authors
Anh Hoang Phuc Nguyen, Oluwabunmi Oyefisayo, Maximilian Achim Pfeffer, and Sai Ho Ling
- Subjects
brain–computer interface, motor imagery, electroencephalography, convolutional neural network, transformer, self-attention, Applied mathematics. Quantitative methods, T57-57.97
- Abstract
In brain–computer interface motor imagery (BCI-MI) systems, convolutional neural networks (CNNs) have traditionally dominated as the deep learning method of choice, demonstrating significant advancements in state-of-the-art studies. Recently, Transformer models with attention mechanisms have emerged as a sophisticated technique, enhancing the capture of long-term dependencies and intricate feature relationships in BCI-MI. This research investigates the performance of EEG-TCNet and EEG-Conformer models, which are trained and validated using various hyperparameters and bandpass filters during preprocessing to assess improvements in model accuracy. Additionally, this study introduces EEG-TCNTransformer, a novel model that integrates the convolutional architecture of EEG-TCNet with a series of self-attention blocks employing a multi-head structure. EEG-TCNTransformer achieves an accuracy of 83.41% without the application of bandpass filtering.
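To make the described architecture concrete, the sketch below shows one plausible reading of the abstract: an EEG-TCNet-style convolutional front end followed by a stack of multi-head self-attention blocks, applied to raw (unfiltered) EEG. All specifics here are assumptions for illustration only, not the authors' implementation: the kernel sizes, pooling, embedding width `d_model`, head count, the 22-channel/4-class setup (typical of BCI Competition IV-2a), and the class name `EEGTCNTransformerSketch` itself.

```python
# Illustrative sketch only: layer sizes, kernel sizes, and the dataset shape
# (22 channels, 4 classes) are assumptions, not the paper's exact configuration.
import torch
import torch.nn as nn


class EEGTCNTransformerSketch(nn.Module):
    def __init__(self, n_channels=22, n_classes=4, d_model=32,
                 n_heads=4, n_attn_blocks=3):
        super().__init__()
        # Temporal + spatial convolutions, loosely in the style of EEG-TCNet front ends
        self.conv = nn.Sequential(
            nn.Conv2d(1, d_model, kernel_size=(1, 25), padding=(0, 12)),  # temporal filtering
            nn.Conv2d(d_model, d_model, kernel_size=(n_channels, 1)),     # spatial filtering
            nn.BatchNorm2d(d_model),
            nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 15), stride=(1, 15)),            # temporal downsampling
        )
        # Series of multi-head self-attention blocks over the temporal tokens
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads,
            dim_feedforward=4 * d_model, batch_first=True)
        self.attn = nn.TransformerEncoder(encoder_layer, num_layers=n_attn_blocks)
        self.classifier = nn.LazyLinear(n_classes)

    def forward(self, x):
        # x: (batch, 1, channels, time samples) -- raw EEG, no bandpass filtering applied
        feats = self.conv(x)                        # (batch, d_model, 1, time')
        tokens = feats.squeeze(2).transpose(1, 2)   # (batch, time', d_model)
        tokens = self.attn(tokens)                  # multi-head self-attention blocks
        return self.classifier(tokens.flatten(1))   # class logits


# Example: a batch of 8 trials, 22 channels x 1000 samples each
logits = EEGTCNTransformerSketch()(torch.randn(8, 1, 22, 1000))
print(logits.shape)  # torch.Size([8, 4])
```

The convolutional stage plays the role the abstract assigns to the EEG-TCNet-derived feature extractor, while the transformer encoder supplies the multi-head self-attention needed to capture long-term temporal dependencies; the actual block counts and dimensions reported in the paper may differ.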
- Published
2024