
MTJ-Based Hardware Synapse Design for Quantized Deep Neural Networks

Authors:
Greenberg-Toledo, Tzofnat
Perach, Ben
Soudry, Daniel
Kvatinsky, Shahar
Publication Year:
2019
Publisher:
arXiv, 2019.

Abstract

Quantized neural networks (QNNs) are being actively researched as a solution to the computational complexity and memory intensity of deep neural networks. This has sparked efforts to develop algorithms that support both inference and training with quantized weight and activation values, without sacrificing accuracy. A recent example is the GXNOR framework for stochastic training of ternary and binary neural networks. In this paper, we introduce a novel hardware synapse circuit that uses magnetic tunnel junction (MTJ) devices to support GXNOR training. Our solution enables processing near memory (PNM) of QNNs and can therefore further reduce data movement to and from memory. We simulated MTJ-based stochastic training of a ternary neural network (TNN) on the MNIST and SVHN datasets and achieved accuracies of 98.61% and 93.99%, respectively.
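The stochastic ternary training the abstract refers to can be illustrated in software. The sketch below is a hypothetical NumPy model (not the paper's circuit or code): a real-valued update is split into an integer part, applied deterministically, and a fractional part, applied with probability equal to its magnitude, mimicking how an MTJ's probabilistic switching can realize stochastic rounding of ternary weights in {-1, 0, +1}. All names here are illustrative assumptions.

```python
import numpy as np

def stochastic_ternary_update(weights, grads, lr, rng):
    """Stochastically update ternary weights constrained to {-1, 0, +1}.

    The real-valued step -lr*grads is decomposed into an integer part
    (applied deterministically) and a fractional remainder (applied with
    probability equal to its magnitude) -- a software stand-in for the
    probabilistic switching behavior of an MTJ device.
    """
    delta = -lr * grads
    int_part = np.trunc(delta)                      # deterministic component
    frac_part = delta - int_part                    # in (-1, 1)
    flip = rng.random(weights.shape) < np.abs(frac_part)  # stochastic firing
    step = int_part + flip * np.sign(frac_part)
    # Ternary weights stay saturated within [-1, +1]
    return np.clip(weights + step, -1, 1)

rng = np.random.default_rng(0)
w = np.zeros(4)
# A full-magnitude gradient drives all weights to +1 deterministically
w = stochastic_ternary_update(w, np.full(4, -1.0), lr=1.0, rng=rng)
```

Averaged over many updates, the expected weight change equals the real-valued step, which is the property that lets stochastic, low-precision devices train without storing full-precision weights.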

Details

Database:
OpenAIRE
Accession number:
edsair.doi...........a3117fda711c03c138a841bd3ca527ab
Full Text:
https://doi.org/10.48550/arxiv.1912.12636