
Energy-Efficient Neural Network Acceleration in the Presence of Bit-Level Memory Errors.

Authors :
Kim, Sung
Howe, Patrick
Moreau, Thierry
Alaghi, Armin
Ceze, Luis
Sathe, Visvesh S.
Source :
IEEE Transactions on Circuits and Systems I: Regular Papers, Vol. 65, Issue 12, Dec. 2018, pp. 4285-4298, 14p
Publication Year :
2018

Abstract

As a result of the increasing demand for deep neural network (DNN)-based services, efforts to develop hardware accelerators for DNNs are growing rapidly. However, while highly efficient accelerators for convolutional DNNs (Conv-DNNs) have been developed, less progress has been made with regard to fully-connected DNNs. Based on an analysis of bit-level SRAM errors, we propose memory-adaptive training with in-situ canaries (MATIC), a methodology that enables aggressive voltage scaling of accelerator weight memories to improve the energy efficiency of DNN accelerators. To enable accurate operation under voltage overscaling, MATIC combines the characteristics of SRAM bit failures with the error resilience of neural networks in a memory-adaptive training (MAT) process. Furthermore, PVT-related voltage margins are eliminated by using bit-cells from synaptic weights as in-situ canaries to track runtime environmental variation. Demonstrated on a low-power DNN accelerator fabricated in 65 nm CMOS, MATIC enables up to 3.3x energy reduction versus the nominal voltage, or 18.6x application error reduction. We also perform a simulation study that extends MAT to Conv-DNNs, and characterize the accuracy impact of bit failure statistics. Finally, we develop a weight refinement algorithm to improve the performance of MAT, and show that it improves absolute accuracy by 0.8-1.3% or reduces training time by 5-10x. [ABSTRACT FROM AUTHOR]
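The core of memory-adaptive training is exposing the network, during training, to the same bit failures the voltage-overscaled SRAM will produce at runtime. A minimal sketch of such a fault-injection model, assuming int8-quantized weights and independent per-bit failures at a given bit-error rate (the function names and the independence assumption are illustrative, not the authors' exact procedure):

```python
import numpy as np

def inject_bit_errors(weights_q, ber, rng, n_bits=8):
    """Flip random bits in int8-quantized weights at bit-error rate `ber`.

    Illustrative model of voltage-overscaled SRAM failures: each stored
    bit is assumed to fail independently with probability `ber`.
    """
    w = weights_q.astype(np.uint8).copy()  # reinterpret as raw bit patterns
    for bit in range(n_bits):
        # Sample which cells fail at this bit position, then XOR the flips in.
        mask = rng.random(w.shape) < ber
        w[mask] ^= np.uint8(1 << bit)
    return w.astype(np.int8)

def mat_forward(x, weights_q, scale, ber, rng):
    """Forward pass with faulty weights, so training adapts to the errors."""
    w_faulty = inject_bit_errors(weights_q, ber, rng)
    return x @ (w_faulty.astype(np.float32) * scale)
```

Applying the fault model only on the forward pass (while updating the underlying fault-free weights) lets gradient descent find weight configurations that remain accurate despite the profiled failure statistics.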

Details

Language :
English
ISSN :
1549-8328
Volume :
65
Issue :
12
Database :
Complementary Index
Journal :
IEEE Transactions on Circuits and Systems I: Regular Papers
Publication Type :
Periodical
Accession number :
132683341
Full Text :
https://doi.org/10.1109/TCSI.2018.2839613