
Memristor-Based Multilayer Neural Networks With Online Gradient Descent Training.

Authors :
Soudry, Daniel
Di Castro, Dotan
Gal, Asaf
Kolodny, Avinoam
Kvatinsky, Shahar
Source :
IEEE Transactions on Neural Networks & Learning Systems; Oct 2015, Vol. 26, Issue 10, p2408-2421, 14p
Publication Year :
2015

Abstract

Learning in multilayer neural networks (MNNs) relies on continuous updating of large matrices of synaptic weights by local rules. Such locality can be exploited for massive parallelism when implementing MNNs in hardware. However, these update rules require a multiply and accumulate operation for each synaptic weight, which is challenging to implement compactly using CMOS. In this paper, a method for performing these update operations simultaneously (incremental outer products) using memristor-based arrays is proposed. The method is based on the fact that, given a voltage pulse, the conductivity of a memristor increments approximately in proportion to the pulse magnitude multiplied by the pulse duration, provided the increment is sufficiently small. The proposed method uses a synaptic circuit composed of a small number of components per synapse: one memristor and two CMOS transistors. This circuit is expected to consume between 2% and 8% of the area and static power of previous CMOS-only hardware alternatives. Such a circuit can compactly implement hardware MNNs trainable by scalable algorithms based on online gradient descent (e.g., backpropagation). The utility and robustness of the proposed memristor-based circuit are demonstrated on standard supervised learning tasks. [ABSTRACT FROM PUBLISHER]
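The abstract's key idea is that an entire outer-product weight update can be applied to a memristor crossbar in one step: encode one update factor in the pulse magnitude applied to each row and the other in the pulse duration applied to each column. The following is a minimal numerical sketch of that idea, not the authors' circuit; the linear small-increment model dG ≈ k·V·dt and the constant k are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch of the incremental outer-product update.
# Assumed memristor model: for small increments, a pulse of magnitude V and
# duration dt changes conductance by dG ≈ k * V * dt.
# Driving row i with magnitude ~ delta[i] and pulsing column j for a signed
# duration ~ x[j] makes every synapse (i, j) update by k * delta[i] * x[j],
# i.e., the whole array receives the outer product delta ⊗ x at once --
# exactly the form of an online gradient-descent (backpropagation) update.

rng = np.random.default_rng(0)

k = 1e-3                            # assumed device sensitivity constant
G = rng.uniform(0.4, 0.6, (3, 4))   # conductances ~ synaptic weights (3 rows, 4 columns)

x = rng.uniform(-1, 1, 4)           # presynaptic input vector
delta = rng.uniform(-1, 1, 3)       # backpropagated error vector

V = delta[:, None]                  # per-row pulse magnitude
dt = x[None, :]                     # per-column signed pulse duration
                                    # (sign would be pulse polarity in hardware)

G += k * V * dt                     # broadcast product == rank-1 outer-product update

# Equivalent explicit form of the same gradient-descent step:
assert np.allclose(k * V * dt, k * np.outer(delta, x))
```

The point of the sketch is that no per-synapse multiply-accumulate circuitry is needed: the multiplication delta[i] * x[j] happens physically at each device, which is why the paper's synapse needs only one memristor and two transistors.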

Details

Language :
English
ISSN :
2162-237X
Volume :
26
Issue :
10
Database :
Complementary Index
Journal :
IEEE Transactions on Neural Networks & Learning Systems
Publication Type :
Periodical
Accession number :
110171915
Full Text :
https://doi.org/10.1109/TNNLS.2014.2383395