
Variability-Aware Training and Self-Tuning of Highly Quantized DNNs for Analog PIM

Authors :
Deng, Zihao
Orshansky, Michael
Publication Year :
2021

Abstract

DNNs deployed on analog processing-in-memory (PIM) architectures are subject to fabrication-time variability. We developed a new joint variability- and quantization-aware DNN training algorithm for highly quantized analog PIM-based models that is significantly more effective than prior work. It outperforms variability-oblivious and post-training quantized models on multiple computer vision datasets/models. For low-bitwidth models and high variation, the gain in accuracy is up to 35.7% for ResNet-18 over the best alternative. We demonstrate that, under a realistic pattern of within- and between-chip components of variability, training alone is unable to prevent large DNN accuracy loss (of up to 54% on CIFAR-100/ResNet-18). We introduce a self-tuning DNN architecture that dynamically adjusts layer-wise activations during inference and is effective in reducing accuracy loss to below 10%.

Comment: This is the preprint version of our paper accepted at DATE 2022.
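The core idea of variability-aware training can be illustrated with a minimal sketch: quantize the weights, then inject multiplicative noise into the quantized weights during the training forward pass so the network learns parameters robust to the perturbation analog PIM hardware applies at inference. The uniform symmetric quantizer and the lognormal noise model below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(w, bits=4):
    # Uniform symmetric quantization to a low bitwidth (illustrative choice).
    scale = np.max(np.abs(w)) / (2 ** (bits - 1) - 1)
    return np.round(w / scale) * scale

def apply_variability(w, sigma=0.1):
    # Model fabrication-time conductance variation as multiplicative
    # lognormal noise; the paper's variability model may differ.
    return w * rng.lognormal(mean=0.0, sigma=sigma, size=w.shape)

def noisy_forward(x, w, bits=4, sigma=0.1, training=True):
    # Variability-aware forward pass: noise is injected only during
    # training, so gradients see the same perturbation statistics the
    # deployed analog hardware would impose.
    wq = quantize(w, bits)
    if training:
        wq = apply_variability(wq, sigma)
    return x @ wq

w = rng.normal(size=(8, 4))
x = rng.normal(size=(2, 8))
y = noisy_forward(x, w, training=True)
print(y.shape)  # (2, 4)
```

In a full training loop this noisy forward pass would be combined with a straight-through estimator for the quantizer; the self-tuning component described in the abstract would additionally rescale each layer's activations at inference to compensate for the chip's actual variation.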

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2111.06457
Document Type :
Working Paper