
Examining the Role and Limits of Batchnorm Optimization to Mitigate Diverse Hardware-noise in In-memory Computing

Authors:
Bhattacharjee, Abhiroop
Moitra, Abhishek
Kim, Youngeun
Venkatesha, Yeshwanth
Panda, Priyadarshini
Source:
Great Lakes Symposium on VLSI 2023 (GLSVLSI 2023) conference
Publication Year:
2023

Abstract

In-Memory Computing (IMC) platforms such as analog crossbars are gaining attention because they accelerate low-precision Deep Neural Networks (DNNs) with high area- and compute-efficiency. However, the intrinsic non-idealities in crossbars, which are often non-deterministic and non-linear, degrade the performance of the deployed DNNs. Beyond quantization errors, the non-idealities most frequently encountered during inference include crossbar circuit-level parasitic resistances and device-level non-idealities such as stochastic read noise and temporal drift. In this work, our goal is to closely examine the distortions that these non-idealities cause in the dot-product operations of analog crossbars, and to explore the feasibility of a nearly training-less solution: crossbar-aware fine-tuning of batchnorm parameters in real time to mitigate the impact of the non-idealities. This reduces hardware costs, in terms of memory and training energy, compared with IMC noise-aware retraining of the DNN weights on crossbars.

Comment: Accepted at the Great Lakes Symposium on VLSI 2023 (GLSVLSI 2023).
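To make the core idea concrete, here is a minimal NumPy sketch of batchnorm recalibration on a noisy crossbar. The layer sizes, the multiplicative read-noise model, and the identity affine parameters are all illustrative assumptions, not the paper's actual setup; the point is only that re-estimating batchnorm statistics from noisy outputs can restore normalization without retraining any weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy layer: ideal dot-product vs. a crossbar with stochastic read noise.
# All shapes and the noise model are illustrative assumptions.
W = rng.normal(size=(64, 32))      # weights mapped to crossbar conductances
X = rng.normal(size=(1024, 64))    # a calibration batch of activations

ideal = X @ W                                   # noise-free dot products
noise = 1.0 + 0.1 * rng.normal(size=W.shape)    # multiplicative read noise
noisy = X @ (W * noise)                         # distorted crossbar outputs

# Batchnorm statistics learned on ideal hardware match `ideal`, not `noisy`.
mu_ideal, var_ideal = ideal.mean(0), ideal.var(0)
gamma, beta = np.ones(32), np.zeros(32)  # learned affine params (identity here)

def batchnorm(y, mu, var, gamma, beta, eps=1e-5):
    """Standard batchnorm: normalize with (mu, var), then affine-transform."""
    return gamma * (y - mu) / np.sqrt(var + eps) + beta

# Stale BN: normalize the noisy outputs with the ideal statistics.
stale = batchnorm(noisy, mu_ideal, var_ideal, gamma, beta)

# Crossbar-aware recalibration: re-estimate the BN statistics from the
# noisy outputs themselves, with no retraining of the DNN weights.
mu_noisy, var_noisy = noisy.mean(0), noisy.var(0)
recal = batchnorm(noisy, mu_noisy, var_noisy, gamma, beta)

# Recalibrated outputs are standardized again; stale ones drift.
print(abs(recal.mean()), abs(recal.std() - 1.0))  # both near zero
print(abs(stale.std() - 1.0))                     # larger residual drift
```

Because only the batchnorm statistics (and, in the paper's full method, the affine parameters) are updated, the memory and energy footprint is far smaller than retraining the crossbar-mapped weights.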

Details

Database:
arXiv
Journal:
Great Lakes Symposium on VLSI 2023 (GLSVLSI 2023) conference
Publication Type:
Report
Accession number:
edsarx.2305.18416
Document Type:
Working Paper
Full Text:
https://doi.org/10.1145/3583781.3590241