High-accuracy deep ANN-to-SNN conversion using quantization-aware training framework and calcium-gated bipolar leaky integrate and fire neuron
- Source: Frontiers in Neuroscience, Vol 17 (2023)
- Publication Year: 2023
- Publisher: Frontiers Media S.A.
Abstract
- Spiking neural networks (SNNs) have attracted intensive attention due to their efficient event-driven computing paradigm. Among SNN training methods, ANN-to-SNN conversion is generally regarded as achieving state-of-the-art recognition accuracy. However, many existing ANN-to-SNN techniques require lengthy post-conversion steps, such as threshold balancing and weight renormalization, to compensate for the inherent behavioral discrepancy between artificial and spiking neurons. In addition, they need a long temporal window to encode and process enough spikes to approximate the real-valued activations of ANN neurons, leading to high inference latency. To overcome these challenges, we propose a calcium-gated bipolar leaky integrate-and-fire (Ca-LIF) spiking neuron model that better approximates the function of the ReLU neurons widely adopted in ANNs. We also propose a quantization-aware training (QAT)-based framework that leverages an off-the-shelf QAT toolkit for easy ANN-to-SNN conversion, exporting the learned ANN weights directly to SNNs with no post-conversion processing. We benchmarked our method on typical deep network structures with time-step lengths ranging from 8 to 128. Compared with prior work, our converted SNNs achieve competitively high accuracy while requiring relatively few inference time steps.
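
The abstract's core intuition, that a rate-coded spiking neuron approximates a ReLU activation quantized to the resolution of its time window, can be illustrated with a short sketch. The snippet below is a generic leaky integrate-and-fire rate model paired with a QAT-style clamp-and-quantize surrogate; it is not the paper's Ca-LIF neuron (the calcium gating and bipolar mechanics are not reproduced), and all names and parameters (`lif_rate`, `fake_quant_relu`, `threshold`, `leak`, `timesteps`) are illustrative assumptions.

```python
import torch

def lif_rate(x, threshold=1.0, leak=1.0, timesteps=32):
    """Firing rate of a leaky integrate-and-fire neuron driven by a
    constant input x over `timesteps` steps (rate coding).
    With leak=1.0 this reduces to a plain integrate-and-fire neuron."""
    v = torch.zeros_like(x)        # membrane potential
    spikes = torch.zeros_like(x)   # accumulated spike count
    for _ in range(timesteps):
        v = leak * v + x                   # leaky integration of the input
        fired = (v >= threshold).float()   # spike where v crosses threshold
        spikes += fired
        v = v - fired * threshold          # soft reset: subtract the threshold
    return spikes / timesteps              # rate in [0, 1] ~ relu(x)/threshold

def fake_quant_relu(x, threshold=1.0, timesteps=32):
    """QAT-style surrogate: clamp and quantize a ReLU output to the
    `timesteps` discrete levels a rate-coded spike train can express."""
    y = torch.clamp(x, 0.0, threshold)
    return torch.round(y / threshold * timesteps) / timesteps

x = torch.linspace(-1.5, 1.5, 7)
print(lif_rate(x, timesteps=128))        # approximates clamp(relu(x), 0, 1)
print(fake_quant_relu(x, timesteps=128)) # nearly identical quantized values
```

This also suggests why training the ANN against such a quantized activation, as QAT toolkits do, could let the learned weights transfer to the SNN without post-conversion threshold balancing: the ANN already sees the same discrete levels the spike train can express, and longer time windows simply give a finer rate resolution.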
Details
- Language: English
- ISSN: 1662-453X
- Volume: 17
- Database: Directory of Open Access Journals
- Journal: Frontiers in Neuroscience
- Publication Type: Academic Journal
- Accession Number: edsdoj.7020f7939de4ec59637ffc828109d03
- Document Type: article
- Full Text: https://doi.org/10.3389/fnins.2023.1141701