1. Efficient Mitchell’s Approximate Log Multipliers for Convolutional Neural Networks.
- Authors
- Kim, Min Soo; Del Barrio, Alberto A.; Oliveira, Leonardo Tavares; Hermida, Roman; and Bagherzadeh, Nader
- Subjects
- CONVOLUTIONAL neural networks; ARTIFICIAL neural networks; ENERGY consumption; DESIGN techniques; COMPUTER vision
- Abstract
This paper proposes energy-efficient approximate multipliers based on Mitchell's log multiplication, optimized for performing inference on convolutional neural networks (CNNs). Various design techniques are applied to the log multiplier, including a fully parallel leading-one detector (LOD), efficient shift-amount calculation, and exact zero computation. Additionally, truncation of the operands is studied to create a customizable log multiplier that further reduces energy consumption. The paper also proposes using the one's complement to handle negative numbers, as an approximation of the two's complement used in prior works. The viability of the proposed designs is supported by detailed formal analysis as well as experimental results on CNNs. The experiments also provide insights into the effect of approximate multiplication in CNNs, identifying the importance of minimizing the range of error. The proposed customizable design at $w$ = 8 saves up to 88 percent of the energy of an exact 32-bit fixed-point multiplier, with a performance degradation of just 0.2 percent on the ImageNet ILSVRC2012 dataset.
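To make the abstract's core idea concrete, here is a minimal floating-point sketch of the basic (untruncated) Mitchell log multiplication: each operand is split by a leading-one detector into a characteristic $k$ and a fractional mantissa $x$, the approximate logs $k + x$ are added, and the antilog is taken by cases on the fraction sum. This is an illustration of the classic algorithm only, not the paper's optimized hardware design; the function name `mitchell_mul` and the use of Python floats (rather than fixed-point shift logic) are choices made here for clarity.

```python
def mitchell_mul(a: int, b: int) -> int:
    """Approximate a * b for positive integers via Mitchell's log multiplication.

    Mitchell approximates log2(n) for n = 2^k * (1 + x), x in [0, 1),
    as k + x; the product's log is the sum of the two approximate logs.
    """
    # Leading-one detection: characteristic k = position of the MSB
    k1, k2 = a.bit_length() - 1, b.bit_length() - 1
    # Fractional mantissas x = n / 2^k - 1, each in [0, 1)
    x1 = a / (1 << k1) - 1.0
    x2 = b / (1 << k2) - 1.0
    # log2(a*b) ~= (k1 + x1) + (k2 + x2); antilog by cases on the fraction sum
    s = x1 + x2  # s in [0, 2)
    if s < 1.0:
        # No carry into the characteristic: result ~= 2^(k1+k2) * (1 + s)
        return round((1 << (k1 + k2)) * (1.0 + s))
    # Carry: log is (k1+k2+1) + (s-1), so result ~= 2^(k1+k2+1) * s
    return round((1 << (k1 + k2 + 1)) * s)
```

When both operands are exact powers of two the mantissas are zero and the result is exact; otherwise the approximation underestimates the true product, which is consistent with the abstract's emphasis on analyzing and bounding the range of error.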
- Published
- 2019