
Temporal-channel cascaded transformer for imagined handwriting character recognition.

Authors :
Zhou, Wenhui
Wang, Yuhan
Mo, Liangyan
Li, Changsheng
Xu, Mingyue
Kong, Wanzeng
Dai, Guojun
Source :
Neurocomputing. Mar 2024, Vol. 573.
Publication Year :
2024

Abstract

Neuroelectric signals recorded by micro-electrodes reflect the spontaneous and rhythmic activities of brain neurons. Numerous deep learning frameworks have been designed for neuroelectric signal decoding tasks, most of them based on convolutional neural networks (CNNs) and recurrent neural networks (RNNs). However, neither CNNs nor RNNs can perceive the global dependencies of neural activities in both the time and channel dimensions. To address this issue, this paper presents a temporal-channel cascaded transformer network to decode the neural activities of imagined handwriting movements, which can perform imagined handwriting character recognition from spiking activity recorded by two micro-electrode arrays (MEAs). Specifically, we design a temporal-channel cascaded framework and a dense residual transformer encoder structure, which promote the hierarchical learning and fusion of temporal and channel features. In addition, a mutual learning strategy over multiple class tokens is proposed to improve classification performance. We conduct performance evaluation experiments on a single-character handwriting-imagination dataset and a sentence handwriting-imagination dataset, both drawn from the public Handwriting BCI dataset. The comparison results demonstrate the superiority of the proposed framework and strategy. In particular, on the imagined single-character recognition task, our model achieves a recognition accuracy of 95.78%, an improvement of +2% over existing state-of-the-art models. [ABSTRACT FROM AUTHOR]
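To make the cascaded idea concrete, the following is a minimal, hypothetical sketch of the temporal-channel cascade described in the abstract: binned spike counts are attended over once with time steps as tokens and once with electrode channels as tokens, each stage carrying its own class token, and the two class-token outputs are fused into a feature vector. All function names, dimensions, and the simplified tied-weight attention are illustrative assumptions, not the authors' implementation (the dense residual encoder and the mutual-learning loss are omitted).

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    # Simplified single-head self-attention with Q = K = V = X (sketch only).
    scores = softmax(X @ X.T / np.sqrt(X.shape[1]), axis=-1)
    return scores @ X

def temporal_channel_cascade(spikes, d_model=8, seed=0):
    """spikes: (T, C) binned spike counts from the MEAs (hypothetical shape).

    Temporal stage: time steps are tokens. Channel stage: channels are tokens.
    Each stage prepends a class token; the two class tokens are concatenated.
    """
    rng = np.random.default_rng(seed)
    T, C = spikes.shape
    # Temporal stage: embed each time step across channels, then attend in time.
    W_t = rng.standard_normal((C, d_model)) * 0.1
    Xt = spikes @ W_t                                # (T, d_model)
    Ht = self_attention(np.vstack([np.zeros((1, d_model)), Xt]))
    # Channel stage: embed each channel across time, then attend over channels.
    W_c = rng.standard_normal((T, d_model)) * 0.1
    Xc = spikes.T @ W_c                              # (C, d_model)
    Hc = self_attention(np.vstack([np.zeros((1, d_model)), Xc]))
    # Fuse the two class-token outputs into one feature vector.
    return np.concatenate([Ht[0], Hc[0]])            # (2 * d_model,)

# Toy input: 20 time bins over 96 electrode channels of Poisson spike counts.
spikes = np.random.default_rng(1).poisson(1.0, (20, 96)).astype(float)
feat = temporal_channel_cascade(spikes)
```

In a real model the tied-weight attention would be replaced by learned query/key/value projections and stacked encoder layers, but the token layout (time tokens in one stage, channel tokens in the next, class tokens read out at the end) follows the cascade the abstract describes.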

Details

Language :
English
ISSN :
0925-2312
Volume :
573
Database :
Academic Search Index
Journal :
Neurocomputing
Publication Type :
Academic Journal
Accession number :
175164787
Full Text :
https://doi.org/10.1016/j.neucom.2024.127243