
Continuous learning of spiking networks trained with local rules.

Authors :
Antonov, D.I.
Sviatov, K.V.
Sukhov, S.
Source :
Neural Networks. Nov 2022, Vol. 155, p. 512-522. 11p.
Publication Year :
2022

Abstract

Artificial neural networks (ANNs) experience catastrophic forgetting (CF) during sequential learning. In contrast, the brain can learn continuously without any signs of catastrophic forgetting. Spiking neural networks (SNNs) are the next generation of ANNs, with many features borrowed from biological neural networks. Thus, SNNs potentially promise better resilience to CF. In this paper, we study the susceptibility of SNNs to CF and test several biologically inspired methods for mitigating catastrophic forgetting. The SNNs are trained with biologically plausible local training rules based on spike-timing-dependent plasticity (STDP). Local training prohibits the direct use of CF prevention methods based on gradients of a global loss function. We developed and tested a method to determine the importance of synapses (weights) based on stochastic Langevin dynamics without the need for gradients. Several other methods of catastrophic forgetting prevention adapted from analog neural networks were tested as well. The experiments were performed on freely available datasets in the SpykeTorch environment.

• Coding information in spikes does not prevent catastrophic forgetting during continuous learning.
• Pseudo-rehearsal with random samples provokes fast unlearning of previous knowledge in convolutional spiking neural networks.
• Saving a small number of samples from previous datasets can effectively decrease catastrophic forgetting.
• A stochastic process of weight diffusion can establish the importance of the weights in spiking networks trained with local rules.

[ABSTRACT FROM AUTHOR]
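The gradient-free importance idea summarized above can be illustrated with a minimal sketch. This is not the authors' actual algorithm; it is a hypothetical stand-in that captures the core intuition: without access to loss gradients, a weight's importance can be estimated by stochastically perturbing it (as Langevin dynamics would) and measuring how strongly the network's output responds. The `forward` function, weight values, and input below are all illustrative assumptions.

```python
import numpy as np

def estimate_importance(weights, forward, x, sigma=0.01, n_trials=20, seed=0):
    """Gradient-free sensitivity estimate of per-weight importance.

    Each weight is perturbed with small Gaussian noise (a crude stand-in
    for Langevin-style weight diffusion), and the average output change
    per unit of perturbation is recorded as that weight's importance.
    No gradients of a global loss are required.
    """
    rng = np.random.default_rng(seed)
    base = forward(weights, x)
    importance = np.zeros_like(weights, dtype=float)
    for i in range(weights.size):
        acc = 0.0
        for _ in range(n_trials):
            delta = rng.normal(0.0, sigma)
            perturbed = weights.astype(float).copy()
            perturbed.ravel()[i] += delta
            # output change per unit weight perturbation
            acc += abs(forward(perturbed, x) - base) / abs(delta)
        importance.ravel()[i] = acc / n_trials
    return importance

# Toy usage: a linear "network" whose output is w @ x. The weight
# multiplying the larger input component should come out as more
# important, since perturbing it moves the output more.
w = np.array([0.5, 0.5])
x = np.array([2.0, 0.1])
imp = estimate_importance(w, lambda w_, x_: float(w_ @ x_), x)
```

In a continual-learning setting, such importance scores would then be used to protect high-importance weights (e.g., by damping their updates) when training on a new task.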

Details

Language :
English
ISSN :
0893-6080
Volume :
155
Database :
Academic Search Index
Journal :
Neural Networks
Publication Type :
Academic Journal
Accession number :
159743943
Full Text :
https://doi.org/10.1016/j.neunet.2022.09.003