
An active memristor based rate-coded spiking neural network.

Authors :
Amin Fida, Aabid
Khanday, Farooq A.
Mittal, Sparsh
Source :
Neurocomputing. May 2023, Vol. 533, p. 61-71. 11 p.
Publication Year :
2023

Abstract

• Physical behaviors of memristive systems can be related to the bio-physical dynamics of biological neural elements.
• Spiking behaviors of LIF neurons made of memristive elements, subject to input stimuli, can be extrapolated to develop on-chip learning algorithms.
• Rate coding is a viable alternative to temporal or population coding for in-hardware SNNs.
• It is possible to perform non-linear functions such as XOR using a single neuron in SNNs.
• A hybrid approach relying on ANN-like gradient calculation can be used for learning in SNNs.

Neuromorphic computing is a novel computing paradigm that aims to mimic the behavior of biological neural networks to solve complex problems efficiently. While CMOS-based neurons and synapses have been developed, they are limited in their ability to demonstrate bio-realistic dynamics. This, coupled with the fact that a huge number of these individual devices is required to build neurons and synapses, limits the scaling and power efficiency of such systems. A viable answer to this problem is neuromemristive systems based on memristor devices, which exhibit physical behaviors that can be related to the bio-physical dynamics of synapses and neurons. In this paper, a rate-coded all-memristive spiking neural network (SNN) is presented. The proposed SNN is built with an active memristor neuron based on vanadium dioxide (VO2) coupled with a non-volatile memristor synapse. The results are validated by first simulating spiking versions of two Boolean functions, viz. AND and XOR gates, in SPICE. With features extracted from the small neural nets, a large-scale 3-layer spiking neural network is then simulated in Python, yielding a validation accuracy of 87% on the MNIST dataset of handwritten digits. One of the prime features of this work is the realization of the XOR function using a single neuron, which is not possible without two layers of neurons in traditional neural networks.
Another significant contribution is the utilization of a gradient-based learning approach for online training of a large-scale SNN. For this, we use the inherent activation function (Sigmoid/ReLU) of the proposed neuron design. [ABSTRACT FROM AUTHOR]
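The rate coding the abstract refers to can be illustrated with a minimal leaky integrate-and-fire (LIF) neuron: a stronger input current drives the membrane potential to threshold more often, so information is carried in the spike rate. The sketch below is a generic discrete-time LIF model, not the paper's VO2 memristor neuron; all parameters (tau, v_th, v_reset) are illustrative assumptions.

```python
def lif_spike_count(i_in, steps=1000, dt=1e-3, tau=0.02, v_th=1.0, v_reset=0.0):
    """Simulate a discrete-time LIF neuron and count output spikes.

    i_in  : constant input current (arbitrary units)
    steps : number of simulation steps of length dt seconds
    tau   : membrane time constant; v_th / v_reset : threshold and reset
    """
    v = v_reset
    spikes = 0
    for _ in range(steps):
        # Leaky integration: dv/dt = (-v + i_in) / tau  (forward Euler)
        v += dt * (-v + i_in) / tau
        if v >= v_th:       # threshold crossing emits a spike
            spikes += 1
            v = v_reset     # hard reset after firing
    return spikes

# Rate coding: a larger input produces more spikes in the same window.
low = lif_spike_count(1.5)
high = lif_spike_count(3.0)
```

With these parameters the neuron fires roughly twice as often for the stronger input, which is the monotone input-to-rate mapping that makes rate-coded SNNs trainable with ANN-like gradient methods.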

Details

Language :
English
ISSN :
0925-2312
Volume :
533
Database :
Academic Search Index
Journal :
Neurocomputing
Publication Type :
Academic Journal
Accession number :
162593402
Full Text :
https://doi.org/10.1016/j.neucom.2023.02.038