MTJ-Based Hardware Synapse Design for Quantized Deep Neural Networks
- Author
- Greenberg-Toledo, Tzofnat, Perach, Ben, Soudry, Daniel, and Kvatinsky, Shahar
- Subjects
- FOS: Computer and information sciences, Emerging Technologies (cs.ET), Hardware Architecture (cs.AR), Neural and Evolutionary Computing (cs.NE), Machine Learning (cs.LG)
- Abstract
Quantized neural networks (QNNs) are being actively researched as a solution to the computational complexity and memory intensity of deep neural networks. This has sparked efforts to develop algorithms that support both inference and training with quantized weight and activation values without sacrificing accuracy. A recent example is the GXNOR framework for stochastic training of ternary and binary neural networks. In this paper, we introduce a novel hardware synapse circuit that uses magnetic tunnel junction (MTJ) devices to support GXNOR training. Our solution enables processing near memory (PNM) of QNNs and can therefore further reduce data movement to and from memory. We simulated MTJ-based stochastic training of a ternary neural network (TNN) on the MNIST and SVHN datasets and achieved accuracies of 98.61% and 93.99%, respectively.
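For intuition, GXNOR-style training keeps weights in a discrete ternary state space {-1, 0, +1} and moves between states stochastically, with transition probabilities derived from continuous-valued update signals; in the paper, that randomness is meant to come from the intrinsic switching stochasticity of the MTJ devices rather than a software RNG. The following is a minimal NumPy sketch of the unbiased stochastic-rounding idea only, not the paper's circuit or the exact GXNOR update rule; `stochastic_ternarize` is a hypothetical name introduced here for illustration.

```python
import numpy as np

def stochastic_ternarize(w, rng):
    """Unbiased stochastic rounding of values in [-1, 1] to {-1, 0, +1}.

    Each entry jumps to sign(w) with probability |w| and falls to 0
    otherwise, so the rounding is unbiased: E[output] == w. This mimics
    the kind of probabilistic transition between discrete weight states
    used in GXNOR-style training (assumption: software RNG stands in
    for the MTJ switching stochasticity described in the paper).
    """
    w = np.clip(w, -1.0, 1.0)
    jump = rng.random(w.shape) < np.abs(w)  # transition with prob. |w|
    return np.sign(w) * jump

rng = np.random.default_rng(0)
weights = np.array([0.3, -0.8, 0.05, 1.0])
print(stochastic_ternarize(weights, rng))  # array of -1/0/+1 values
```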
- Published
- 2019