1. Training Spiking Neural Networks with Local Tandem Learning
- Authors
Yang, Qu; Wu, Jibin; Zhang, Malu; Chua, Yansong; Wang, Xinchao; Li, Haizhou
- Subjects
FOS: Computer and information sciences; Computer Science - Neural and Evolutionary Computing (cs.NE)
- Abstract
Spiking neural networks (SNNs) are shown to be more biologically plausible and energy efficient than their predecessors. However, an efficient and generalized training method for deep SNNs is still lacking, especially for deployment on analog computing substrates. In this paper, we put forward a generalized learning rule, termed Local Tandem Learning (LTL). The LTL rule follows the teacher-student learning approach by mimicking the intermediate feature representations of a pre-trained ANN. By decoupling the learning of network layers and leveraging highly informative supervisor signals, we demonstrate rapid network convergence within five training epochs on the CIFAR-10 dataset while maintaining low computational complexity. Our experimental results also show that SNNs trained in this way achieve accuracies comparable to those of their teacher ANNs on the CIFAR-10, CIFAR-100, and Tiny ImageNet datasets. Moreover, the proposed LTL rule is hardware friendly: it can be easily implemented on-chip to perform fast parameter calibration and to provide robustness against the notorious device non-ideality issues. It therefore opens up a myriad of opportunities for training and deploying SNNs on ultra-low-power mixed-signal neuromorphic computing chips.
- Comment
Accepted by NeurIPS 2022
- Published
2022
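To make the abstract's core idea concrete, below is a minimal sketch of layer-wise teacher-student training in the spirit of LTL: each SNN layer is trained with a local loss to match the corresponding activation of a pre-trained ANN, and gradients are detached between layers so the layers learn independently. All names and details here (`LIFLayer`, the surrogate gradient, the time window `T`, rate coding) are illustrative assumptions, not the authors' released implementation.

```python
# Sketch of layer-wise teacher-student (tandem-style) training.
# Assumptions: pre-trained ANN teacher with Linear+ReLU layers,
# SNN student with LIF neurons, rate-coded inputs over T steps.
import torch
import torch.nn as nn

T = 8  # number of simulation time steps (assumed)

class LIFLayer(nn.Module):
    """Leaky integrate-and-fire layer with a surrogate gradient (sketch)."""
    def __init__(self, in_f, out_f, tau=2.0, v_th=1.0):
        super().__init__()
        self.fc = nn.Linear(in_f, out_f)
        self.tau, self.v_th = tau, v_th

    def forward(self, x_seq):                     # x_seq: (T, batch, in_f)
        v = torch.zeros_like(self.fc(x_seq[0]))   # membrane potential
        spikes = []
        for x in x_seq:
            v = v + (self.fc(x) - v) / self.tau   # leaky integration
            s = (v >= self.v_th).float()          # hard spike in forward
            # straight-through trick: sigmoid surrogate in backward only
            sg = torch.sigmoid(5.0 * (v - self.v_th))
            s = s + sg - sg.detach()
            v = v - s * self.v_th                 # soft reset
            spikes.append(s)
        return torch.stack(spikes)                # (T, batch, out_f)

def local_tandem_step(snn_layers, ann_layers, x, optimizers):
    """One step: each SNN layer mimics its ANN counterpart's activation;
    losses are local because spike outputs are detached between layers."""
    x_seq = x.unsqueeze(0).repeat(T, 1, 1)        # simple rate-style coding
    ann_in = x
    for snn_l, ann_l, opt in zip(snn_layers, ann_layers, optimizers):
        with torch.no_grad():
            ann_out = torch.relu(ann_l(ann_in))   # teacher's layer target
        spk = snn_l(x_seq)
        rate = spk.mean(0)                        # firing rate ~ activation
        loss = nn.functional.mse_loss(rate, ann_out)
        opt.zero_grad(); loss.backward(); opt.step()
        x_seq = spk.detach()                      # decouple layer learning
        ann_in = ann_out
```

Because each layer's loss depends only on that layer's own spikes and a fixed teacher target, the updates are local, which is what makes an on-chip calibration scheme of this kind plausible on mixed-signal hardware; the exact loss, coding, and neuron model used in the paper may differ from this sketch.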