
Backpropagation-based learning techniques for deep spiking neural networks: a survey

Authors :
Manon Dampfhoffer
Thomas Mesquida
Alexandre Valentian
Lorena Anghel
Affiliations :
Département Systèmes et Circuits Intégrés Numériques (DSCIN), Laboratoire d'Intégration des Systèmes et des Technologies (CEA-LIST), Direction de Recherche Technologique (DRT), Commissariat à l'énergie atomique et aux énergies alternatives (CEA)
SPINtronique et TEchnologie des Composants (SPINTEC), Centre National de la Recherche Scientifique (CNRS), Institut de Recherche Interdisciplinaire de Grenoble (IRIG), Direction de Recherche Fondamentale (DRF), Commissariat à l'énergie atomique et aux énergies alternatives (CEA), Université Grenoble Alpes (UGA)
Source :
IEEE Transactions on Neural Networks and Learning Systems, in press, ⟨10.1109/TNNLS.2023.3263008⟩
Publication Year :
2023
Publisher :
HAL CCSD, 2023.

Abstract

With the adoption of smart systems, artificial neural networks (ANNs) have become ubiquitous. Conventional ANN implementations have high energy consumption, limiting their use in embedded and mobile applications. Spiking neural networks (SNNs) mimic the dynamics of biological neural networks by distributing information over time through binary spikes. Neuromorphic hardware has emerged to leverage the characteristics of SNNs, such as asynchronous processing and high activation sparsity. Therefore, SNNs have recently gained interest in the machine learning community as a brain-inspired alternative to ANNs for low-power applications. However, the discrete representation of the information makes the training of SNNs by backpropagation-based techniques challenging. In this survey, we review training strategies for deep SNNs targeting deep learning applications such as image processing. We start with methods based on the conversion from an ANN to an SNN and compare these with backpropagation-based techniques. We propose a new taxonomy of spiking backpropagation algorithms into three categories, namely, spatial, spatiotemporal, and single-spike approaches. In addition, we analyze different strategies to improve accuracy, latency, and sparsity, such as regularization methods, training hybridization, and tuning of the parameters specific to the SNN neuron model. We highlight the impact of input encoding, network architecture, and training strategy on the accuracy–latency tradeoff. Finally, in light of the remaining challenges for accurate and efficient SNN solutions, we emphasize the importance of joint hardware–software codevelopment.
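The core difficulty the abstract points to, namely that the binary, non-differentiable spike function blocks standard backpropagation, is what the surveyed spiking backpropagation methods work around, typically by substituting a smooth derivative for the spike during the backward pass. The following is a minimal illustrative sketch, not taken from the survey, assuming a leaky integrate-and-fire (LIF) neuron and a fast-sigmoid-style surrogate gradient in PyTorch; the function names, constants, and tensor shapes are our own assumptions.

# Hypothetical minimal sketch (not from the paper): a LIF layer unrolled over time
# and trained with a surrogate gradient, the idea behind spatiotemporal
# backpropagation approaches. All names and constants are illustrative.
import torch


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, smooth surrogate in the backward pass."""

    @staticmethod
    def forward(ctx, membrane_potential):
        ctx.save_for_backward(membrane_potential)
        return (membrane_potential > 0).float()  # binary spike

    @staticmethod
    def backward(ctx, grad_output):
        (membrane_potential,) = ctx.saved_tensors
        # Fast-sigmoid-style surrogate derivative replaces the Dirac delta.
        surrogate = 1.0 / (1.0 + 10.0 * membrane_potential.abs()) ** 2
        return grad_output * surrogate


def lif_forward(inputs, weight, beta=0.9, threshold=1.0):
    """Unroll a single LIF layer over time; inputs has shape (time, batch, features)."""
    mem = torch.zeros(inputs.shape[1], weight.shape[0])
    spikes = []
    for x_t in inputs:
        mem = beta * mem + x_t @ weight.t()          # leaky integration of weighted input
        spk = SurrogateSpike.apply(mem - threshold)  # fire when the threshold is crossed
        mem = mem - spk * threshold                  # soft reset after a spike
        spikes.append(spk)
    return torch.stack(spikes)


# Toy usage: gradients flow through the spike nonlinearity thanks to the surrogate.
T, B, n_in, n_out = 20, 4, 10, 5
w = torch.randn(n_out, n_in, requires_grad=True)
out = lif_forward(torch.rand(T, B, n_in), w)
out.sum().backward()
print(w.grad.shape)  # torch.Size([5, 10])

Running the toy usage shows nonzero gradients reaching the weights despite the hard threshold in the forward pass, which is the essential trick that makes backpropagation through binary spikes possible.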

Details

Language :
English
ISSN :
2162-237X
Database :
OpenAIRE
Journal :
IEEE Transactions on Neural Networks and Learning Systems, in press, ⟨10.1109/TNNLS.2023.3263008⟩
Accession number :
edsair.doi.dedup.....dde62e7a5bf8280326de72d70c60be96
Full Text :
https://doi.org/10.1109/TNNLS.2023.3263008