1. Magnetoencephalogram-based brain-computer interface for hand-gesture decoding using deep learning.
- Author
- Bu Y, Harrington DL, Lee RR, Shen Q, Angeles-Quinto A, Ji Z, Hansen H, Hernandez-Lucas J, Baumgartner J, Song T, Nichols S, Baker D, Rao R, Lerman I, Lin T, Tu XM, and Huang M
- Subjects
- Humans, Magnetoencephalography, Gestures, Electroencephalography methods, Algorithms, Brain-Computer Interfaces, Deep Learning
- Abstract
Advancements in deep learning algorithms over the past decade have led to extensive developments in brain-computer interfaces (BCI). A promising imaging modality for BCI is magnetoencephalography (MEG), which is a non-invasive functional imaging technique. The present study developed a MEG sensor-based BCI neural network to decode Rock-Paper-Scissors gestures (MEG-RPSnet). Unique preprocessing pipelines in tandem with convolutional neural network deep-learning models accurately classified gestures. On a single-trial basis, we found an average classification accuracy of 85.56% in 12 subjects. Our MEG-RPSnet model outperformed two state-of-the-art neural network architectures for electroencephalogram-based BCI as well as a traditional machine learning method, and demonstrated performance equivalent to or better than machine learning methods that employed invasive, electrocorticography-based BCI on the same task. In addition, MEG-RPSnet classification performance using an intra-subject approach exceeded that of a model trained with a cross-subject approach. Remarkably, we also found that when using only central-parietal-occipital regional sensors or occipitotemporal regional sensors, the deep learning model achieved classification performance similar to that of the whole-brain sensor model. The MEG-RPSnet model also distinguished neuronal features of individual hand gestures with very good accuracy. Altogether, these results show that non-invasive MEG-based BCI applications hold promise for future BCI developments in hand-gesture decoding.
- Published
- 2023
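
To make the decoding setup concrete, below is a minimal sketch of a convolutional classifier for single-trial MEG sensor data of the kind the abstract describes. The sensor count, trial length, layer sizes, and the GestureCNN name are illustrative assumptions; this is not the published MEG-RPSnet architecture or its preprocessing pipeline.

```python
# Hypothetical sketch: a compact CNN that classifies single-trial MEG data
# (sensors x time samples) into three gestures (rock, paper, scissors).
# All dimensions and layer choices are assumptions for illustration only.
import torch
import torch.nn as nn

N_SENSORS = 204   # assumed number of MEG sensor channels
N_SAMPLES = 500   # assumed time samples per single trial
N_CLASSES = 3     # rock, paper, scissors

class GestureCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            # temporal convolution over each sensor's time course
            nn.Conv2d(1, 16, kernel_size=(1, 25), padding=(0, 12)),
            nn.BatchNorm2d(16),
            nn.ReLU(),
            # spatial convolution mixing information across all sensors
            nn.Conv2d(16, 32, kernel_size=(N_SENSORS, 1)),
            nn.BatchNorm2d(32),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=(1, 4)),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Dropout(0.5),
            nn.Linear(32 * (N_SAMPLES // 4), N_CLASSES),
        )

    def forward(self, x):
        # x: (batch, 1, sensors, time samples) -> (batch, n_classes) logits
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = GestureCNN()
    trials = torch.randn(8, 1, N_SENSORS, N_SAMPLES)  # a batch of 8 trials
    print(model(trials).shape)                        # torch.Size([8, 3])
```

In practice, such a model would be trained per subject (the intra-subject setting the abstract reports) or on pooled data from multiple subjects (the cross-subject setting), with the input restricted to regional sensor subsets when comparing against the whole-brain sensor model.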