
SkipSNN: Efficiently Classifying Spike Trains with Event-attention

Authors:
Yin, Hang
Su, Yao
Liu, Liping
Hartvigsen, Thomas
Dai, Xin
Kong, Xiangnan
Publication Year:
2024

Abstract

Spike train classification has recently become an important topic in the machine learning community, where each spike train is a binary event sequence characterized by \emph{temporal sparsity of signals of interest} and \emph{temporal noise}. A promising model for this task should follow the design principle of performing intensive computation only when signals of interest appear. Such tasks therefore mainly use Spiking Neural Networks (SNNs), since SNNs account for the temporal sparsity of spike trains. However, the basic mechanism of SNNs ignores the temporal-noise issue, which makes them computationally expensive and power-hungry when analyzing spike trains on resource-constrained platforms. As an event-driven model, an SNN neuron reacts to every input signal, making it difficult to quickly locate signals of interest. In this paper, we introduce an event-attention mechanism that enables SNNs to dynamically highlight the useful signals in the original spike trains. To this end, we propose SkipSNN, which extends existing SNN models by learning to mask out noise: it skips membrane potential updates, shortening the effective size of the computational graph. This process is analogous to how people open and close their eyes to filter the information they see. We evaluate SkipSNN on various neuromorphic tasks and demonstrate that it achieves significantly better computational efficiency and classification accuracy than other state-of-the-art SNNs.

Comment: Published as a research paper at IEEE BigData 2024
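The core idea in the abstract, updating the membrane potential only at timesteps the model deems informative, can be illustrated with a gated leaky integrate-and-fire (LIF) step. The following is a minimal PyTorch sketch, not the paper's implementation: the class name SkipGatedLIF, the per-timestep sigmoid gate, and the straight-through gating trick are all illustrative assumptions layered on a standard discrete-time LIF neuron.

```python
import torch
import torch.nn as nn

class SkipGatedLIF(nn.Module):
    """Hypothetical sketch of an LIF layer with a learned skip gate.

    The gate decides per timestep whether to perform the membrane update;
    skipped steps carry the potential forward with a detached graph,
    loosely mirroring the abstract's "skip updates, shorten the graph" idea.
    """

    def __init__(self, n_in, n_out, tau=2.0, threshold=1.0):
        super().__init__()
        self.fc = nn.Linear(n_in, n_out)
        self.gate = nn.Linear(n_in, 1)   # scores whether this timestep is worth processing
        self.decay = 1.0 - 1.0 / tau     # membrane leak factor
        self.threshold = threshold

    def forward(self, x_seq):
        # x_seq: (T, B, n_in) binary spike inputs
        T, B, _ = x_seq.shape
        v = torch.zeros(B, self.fc.out_features, device=x_seq.device)
        spikes = []
        for t in range(T):
            x = x_seq[t]
            # Hard skip decision; the straight-through estimator keeps it trainable.
            g_soft = torch.sigmoid(self.gate(x))           # (B, 1)
            g_hard = (g_soft > 0.5).float()
            g = g_hard + g_soft - g_soft.detach()
            # When g == 0, the update (and its autograd graph) is masked out.
            v = g * (self.decay * v + self.fc(x)) + (1 - g) * v.detach()
            # Hard threshold; real SNN training would use a surrogate gradient here.
            s = (v >= self.threshold).float()
            v = v - s * self.threshold                     # soft reset after firing
            spikes.append(s)
        return torch.stack(spikes)                         # (T, B, n_out)
```

The straight-through trick lets the binary gate be trained end-to-end with the rest of the network; how SkipSNN actually parameterizes its skip policy, and how it trades accuracy against the number of skipped updates, is specified in the paper itself.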

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2411.05806
Document Type:
Working Paper