
Combining Learning and Optimization for Transprecision Computing

Authors :
Borghesi, Andrea
Tagliavini, Giuseppe
Lombardi, Michele
Benini, Luca
Milano, Michela
Source :
Proceedings of the 17th ACM International Conference on Computing Frontiers, May 2020, Pages 10-18
Publication Year :
2020

Abstract

The growing demands of the worldwide IT infrastructure stress the need for reduced power consumption, which is addressed in so-called transprecision computing by improving energy efficiency at the expense of precision. For example, reducing the number of bits for some floating-point operations leads to higher efficiency, but also to a non-linear decrease of the computation accuracy. Depending on the application, small errors can be tolerated, thus allowing the precision of the computation to be fine-tuned. Finding the optimal precision for all variables while respecting an error bound is a complex task, which is tackled in the literature via heuristics. In this paper, we report on a first attempt to address the problem by combining a Mathematical Programming (MP) model and a Machine Learning (ML) model, following the Empirical Model Learning methodology. The ML model learns the relation between the precision of the variables and the output error; this information is then embedded in the MP model, which minimizes the number of bits. An additional refinement phase is then added to improve the quality of the solution. The experimental results demonstrate an average speedup of 6.5% and a 3% increase in solution quality compared to the state-of-the-art. In addition, experiments on a hardware platform capable of mixed-precision arithmetic (PULPissimo) show the benefits of the proposed approach, with energy savings of around 40% compared to fixed-precision execution.
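
To make the Empirical Model Learning idea concrete, the sketch below is a deliberately simplified illustration, not the authors' implementation: a linear surrogate stands in for the ML model of the precision-to-error relation (the paper may well use a more expressive learner), and it is embedded as a constraint in a MILP that minimizes the total number of bits. All identifiers (run_benchmark, N_VARS, ERROR_BOUND) are hypothetical placeholders; the refinement phase mentioned in the abstract is omitted.

```python
# Hedged sketch of the learn-then-optimize loop, under the assumptions stated above.
import numpy as np
from sklearn.linear_model import LinearRegression
import pulp

N_VARS = 4             # hypothetical number of program variables to tune
BITS_MIN, BITS_MAX = 4, 53
ERROR_BOUND = 1e-6     # hypothetical tolerated output error


def run_benchmark(bits):
    # Placeholder for an actual mixed-precision run of the target kernel;
    # here we fake an error that shrinks exponentially with the assigned bits.
    return float(np.exp(-0.5 * np.asarray(bits, dtype=float)).sum())


# 1. Sample training data: random precision assignments and observed errors.
rng = np.random.default_rng(0)
X = rng.integers(BITS_MIN, BITS_MAX + 1, size=(200, N_VARS))
y = np.log([run_benchmark(b) for b in X])   # fit log-error to keep the model linear

# 2. ML step: learn the surrogate precision -> error relation.
surrogate = LinearRegression().fit(X, y)

# 3. MP step: embed the learned relation in a MILP minimizing the total bits.
prob = pulp.LpProblem("min_bits", pulp.LpMinimize)
bits = [pulp.LpVariable(f"bits_{i}", BITS_MIN, BITS_MAX, cat="Integer")
        for i in range(N_VARS)]
prob += pulp.lpSum(bits)                                   # objective: fewer bits
predicted_log_error = (float(surrogate.intercept_)
                       + pulp.lpSum(float(surrogate.coef_[i]) * bits[i]
                                    for i in range(N_VARS)))
prob += predicted_log_error <= float(np.log(ERROR_BOUND))  # learned error constraint
prob.solve(pulp.PULP_CBC_CMD(msg=False))

solution = [int(b.value()) for b in bits]
print("precision assignment:", solution)
print("actual error:", run_benchmark(solution))
```

In the paper's setting the learned model and the MP formulation are more sophisticated, and a refinement phase corrects cases where the surrogate under- or over-estimates the true error; the sketch only shows how a learned relation can be turned into an optimization constraint.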

Details

Database :
arXiv
Journal :
Proceedings of the 17th ACM International Conference on Computing Frontiers, May 2020, Pages 10-18
Publication Type :
Report
Accession number :
edsarx.2002.10890
Document Type :
Working Paper
Full Text :
https://doi.org/10.1145/3387902.3392615