
Prospects for Analog Circuits in Deep Networks

Authors:
Liu, Shih-Chii
Strachan, John Paul
Basu, Arindam
Publication Year:
2021

Abstract

Operations typically used in machine learning algorithms (e.g. adds and softmax) can be implemented by compact analog circuits. Analog Application-Specific Integrated Circuit (ASIC) designs that implement these algorithms using techniques such as charge-sharing circuits and subthreshold transistors achieve very high power efficiencies. With the recent advances in deep learning algorithms, focus has shifted to hardware digital accelerator designs that implement the prevalent matrix-vector multiplication operations. Power in these designs is usually dominated by the memory access power of off-chip DRAM needed for storing the network weights and activations. Emerging dense non-volatile memory technologies can help to provide on-chip memory, and analog circuits are well suited to implement the needed matrix-vector multiplication operations coupled with in-memory computing approaches. This paper presents a brief review of analog designs that implement various machine learning algorithms. It then presents an outlook for the use of analog circuits in low-power deep network accelerators suitable for edge or tiny machine learning applications.

Comment: 6 pages, 4 figures
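The core operation the abstract refers to, matrix-vector multiplication realized with analog in-memory computing, can be sketched numerically. In a memristive crossbar, weights are stored as conductances, inputs are applied as voltages, and output currents sum along each row wire by Kirchhoff's current law. The sketch below is illustrative only; all names and values are assumptions, not from the paper.

```python
# Idealized model of one analog crossbar read.
# G[i][j] is the conductance (stored weight) at row i, column j;
# v[j] is the input voltage (activation) on column j.
# Kirchhoff's current law gives the row output current:
#   i_out[i] = sum_j G[i][j] * v[j]
# i.e. the crossbar computes a matrix-vector product in place.

def crossbar_mvm(G, v):
    """Return output currents of an ideal crossbar: G @ v."""
    return [sum(g_ij * v_j for g_ij, v_j in zip(row, v)) for row in G]

# Example: a 2x3 conductance matrix and a 3-element voltage vector
# (arbitrary units, purely for illustration).
G = [[1.0, 0.5, 0.0],
     [0.0, 2.0, 1.0]]
v = [1.0, 1.0, 2.0]
print(crossbar_mvm(G, v))  # [1.5, 4.0]
```

Because the multiply-accumulate happens where the weights are stored, no DRAM traffic is needed for the weight matrix, which is the power bottleneck the abstract identifies in digital accelerators.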

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2106.12444
Document Type:
Working Paper