
Stochastic Gradient Descent Inspired Training Technique for a CMOS/Nano Memristive Trainable Threshold Gate Array.

Authors :
Manem, H.
Rajendran, J.
Rose, G. S.
Source :
IEEE Transactions on Circuits & Systems. Part I: Regular Papers; May 2012, Vol. 59, Issue 5, p1051-1060, 10p
Publication Year :
2012

Abstract

Neuromorphic computing is an attractive avenue of research for processing and learning complex real-world data. With technology migration into the nano and molecular scales, several area- and power-efficient approaches to the design and implementation of artificial neural networks have been proposed. The discovery of the memristor has further enabled the realization of denser nanoscale logic and memory systems by facilitating the implementation of multilevel logic. Specifically, the innate reconfigurability of memristors can be exploited to realize synapses in artificial neural networks. This work focuses on the development of a variation-tolerant training methodology to efficiently reconfigure memristive synapses in a Trainable Threshold Gate Array (TTGA) system. The training process is inspired by the gradient descent machine learning algorithm commonly used to train artificial threshold neural networks (perceptrons). The design and CMOS/Nano implementation of the TTGA system from trainable perceptron-based threshold gates is detailed, and results are provided to showcase the training process and performance characteristics of the proposed system. Also shown are results for training a 1T1M (1 transistor, 1 memristor) multilevel memristive memory and its performance characteristics. [ABSTRACT FROM PUBLISHER]
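The abstract describes a perceptron-style, gradient-descent-inspired update of memristive synaptic weights. As a rough illustration of that general idea only (not the authors' circuit-level training scheme), the sketch below trains a threshold gate whose weights are bounded, quantized "conductances". The threshold value, number of conductance levels, learning rate, and the quantization itself are assumptions chosen for the example.

```python
import numpy as np

# Illustrative perceptron-style training of a threshold gate with bounded,
# multilevel "conductance" weights. All numeric values are assumptions for
# illustration, not parameters from the paper.

def quantize(g, g_min, g_max, levels):
    """Snap conductance values to one of `levels` evenly spaced states."""
    step = (g_max - g_min) / (levels - 1)
    return np.clip(np.round((g - g_min) / step) * step + g_min, g_min, g_max)

def train_threshold_gate(X, t, levels=8, g_min=0.0, g_max=1.0,
                         theta=1.5, lr=0.2, epochs=50, seed=0):
    rng = np.random.default_rng(seed)
    g = rng.uniform(g_min, g_max, size=X.shape[1])   # synaptic conductances
    for _ in range(epochs):
        for x, target in zip(X, t):
            y = 1 if g @ x >= theta else 0           # threshold-gate output
            # Perceptron / gradient-descent-style update driven by the error,
            # followed by quantization to the nearest allowed conductance level.
            g = quantize(g + lr * (target - y) * x, g_min, g_max, levels)
        if all((1 if g @ x >= theta else 0) == target for x, target in zip(X, t)):
            break
    return g

# Example: learn a 2-input AND gate with threshold theta = 1.5
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
t = np.array([0, 0, 0, 1])
g = train_threshold_gate(X, t)
print("trained conductances:", g)
print("gate outputs:", [1 if g @ x >= 1.5 else 0 for x in X])
```

In this toy version the error signal only ever needs to raise the two conductances until their sum exceeds the threshold, so training converges quickly; the paper's contribution lies in making such updates variation-tolerant in actual CMOS/Nano hardware.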

Details

Language :
English
ISSN :
1549-8328
Volume :
59
Issue :
5
Database :
Complementary Index
Journal :
IEEE Transactions on Circuits & Systems. Part I: Regular Papers
Publication Type :
Periodical
Accession number :
101316711
Full Text :
https://doi.org/10.1109/TCSI.2012.2190665