
Efficient Mitchell’s Approximate Log Multipliers for Convolutional Neural Networks.

Authors :
Kim, Min Soo
Barrio, Alberto A. Del
Oliveira, Leonardo Tavares
Hermida, Roman
Bagherzadeh, Nader
Source :
IEEE Transactions on Computers; May 2019, Vol. 68 Issue 5, p660-675, 16p
Publication Year :
2019

Abstract

This paper proposes energy-efficient approximate multipliers based on Mitchell's log multiplication, optimized for performing inferences on convolutional neural networks (CNNs). Various design techniques are applied to the log multiplier, including a fully parallel leading-one detector (LOD), efficient shift amount calculation, and exact zero computation. Additionally, truncation of the operands is studied to create a customizable log multiplier that further reduces energy consumption. The paper also proposes using one's complement to handle negative numbers, as an approximation of the two's complement used in prior works. The viability of the proposed designs is supported by detailed formal analysis as well as experimental results on CNNs. The experiments also provide insights into the effect of approximate multiplication in CNNs, identifying the importance of minimizing the range of error. The proposed customizable design at w = 8 saves up to 88 percent energy compared to the exact fixed-point multiplier at 32 bits, with a performance degradation of just 0.2 percent on the ImageNet ILSVRC2012 dataset. [ABSTRACT FROM AUTHOR]
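As background to the abstract, the classic Mitchell algorithm approximates log2 of a number N = 2^k(1+x) as k + x (the leading-one position plus the fractional remainder), turns multiplication into an addition of these approximate logarithms, and applies the same linear approximation in reverse for the antilog. The following is a minimal software sketch of that basic algorithm only; it does not reflect the paper's hardware optimizations (parallel LOD, operand truncation, one's-complement sign handling), and the function name is illustrative:

```python
def mitchell_multiply(a, b):
    """Approximate a * b for positive integers using Mitchell's
    log multiplication: log2(2^k * (1+x)) is approximated as k + x."""

    def approx_log2(n):
        k = n.bit_length() - 1          # leading-one position (the LOD's job in hardware)
        x = (n - (1 << k)) / (1 << k)   # fractional remainder in [0, 1)
        return k + x

    s = approx_log2(a) + approx_log2(b)  # add the approximate logs
    k = int(s)                           # integer part -> shift amount
    x = s - k                            # fractional part
    return (1 << k) * (1 + x)            # approximate antilog, same linear model

# Example: 3 * 3 = 9 exactly, but Mitchell gives 8 (the error is
# always an underestimate, bounded at roughly 11 percent).
print(mitchell_multiply(3, 3))   # -> 8.0
print(mitchell_multiply(4, 8))   # -> 32.0 (exact when both operands are powers of two)
```

Because the integer part of the log sum becomes a shift amount and the fractional parts are simply added, the hardware needs no multiplier array at all, which is the source of the energy savings the abstract quantifies.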

Details

Language :
English
ISSN :
00189340
Volume :
68
Issue :
5
Database :
Complementary Index
Journal :
IEEE Transactions on Computers
Publication Type :
Academic Journal
Accession number :
135863696
Full Text :
https://doi.org/10.1109/TC.2018.2880742