
Training of Quantized Deep Neural Networks using a Magnetic Tunnel Junction-Based Synapse

Authors :
Greenberg-Toledo, Tzofnat
Perach, Ben
Hubara, Itay
Soudry, Daniel
Kvatinsky, Shahar
Source :
Semicond. Sci. Technol. 36 114003 (2021)
Publication Year :
2019

Abstract

Quantized neural networks (QNNs) are being actively researched as a solution to the computational complexity and memory intensity of deep neural networks. This has sparked efforts to develop algorithms that support both inference and training with quantized weight and activation values without sacrificing accuracy. A recent example is the GXNOR framework for stochastic training of ternary (TNN) and binary (BNN) neural networks. In this paper, we show how magnetic tunnel junction (MTJ) devices can be used to support QNN training. We introduce a novel hardware synapse circuit that uses the stochastic switching behavior of the MTJ to support the quantized update. The proposed circuit enables processing near memory (PNM) of QNN training, thereby reducing data movement. We simulated MTJ-based stochastic training of a TNN on the MNIST, SVHN, and CIFAR10 datasets and achieved accuracies of 98.61%, 93.99%, and 82.71%, respectively (less than 1% degradation compared to the GXNOR algorithm). We evaluated the performance potential of the synapse array and showed that the proposed synapse circuit can train ternary networks in situ, achieving 18.3 TOPS/W for the feedforward phase and 3 TOPS/W for the weight update.

Comment: Published in Semiconductor Science and Technology, Vol. 36
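
The stochastic quantized update mentioned in the abstract is the mechanism the MTJ synapse realizes in hardware. As a rough software illustration only (not the authors' circuit or the exact GXNOR algorithm; the function name, the step size delta, and the probability rule are assumptions in the spirit of stochastic rounding to discrete weight levels), a sketch might look like:

import numpy as np

def stochastic_ternary_update(w, grad, lr, delta=1.0, rng=None):
    # Hypothetical sketch: weights live on the ternary grid {-delta, 0, +delta}.
    # The continuous update -lr*grad is applied probabilistically: its fractional
    # part (in units of delta) gives the chance of moving one extra discrete level,
    # mimicking a stochastic switching event.
    rng = np.random.default_rng() if rng is None else rng
    step = -lr * grad / delta                  # desired step in units of delta
    k = np.floor(np.abs(step))                 # whole discrete levels to move
    frac = np.abs(step) - k                    # fractional remainder
    jump = k + (rng.random(w.shape) < frac)    # extra level with probability frac
    w_new = w + np.sign(step) * jump * delta
    return np.clip(w_new, -delta, delta)       # keep weights in the ternary range

# Example: nudging a small ternary weight vector with a gradient
w = np.array([-1.0, 0.0, 1.0, 0.0])
g = np.array([0.4, -0.9, 0.2, 0.05])
print(stochastic_ternary_update(w, g, lr=0.5))

Per the abstract, in the proposed synapse the role of the pseudo-random draw is played by the intrinsic stochastic switching of the MTJ device rather than a software random number generator.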

Details

Database :
arXiv
Journal :
Semicond. Sci. Technol. 36 114003 (2021)
Publication Type :
Report
Accession number :
edsarx.1912.12636
Document Type :
Working Paper
Full Text :
https://doi.org/10.1088/1361-6641/ac251b