
Graph neural networks with configuration cross-attention for tensor compilers

Authors :
Khizbullin, Dmitrii
de Andrade, Eduardo Rocha
Nguyen, Thanh Hau
Ferreira, Matheus Pedroza
Pugh, David R.
Publication Year :
2024

Abstract

With the recent popularity of neural networks comes the need for efficient serving of inference workloads. A neural network inference workload can be represented as a computational graph whose nodes are operators transforming multidimensional tensors. The tensors can be transposed and/or tiled in a combinatorially large number of ways, with some configurations leading to accelerated inference. We propose TGraph, a neural graph architecture that screens for fast configurations of the target computational graph, thus representing an artificial intelligence (AI) tensor compiler in contrast to traditional heuristics-based compilers. The proposed solution improves mean Kendall's $\tau$ across the layout collections of TpuGraphs from 29.8% (the reliable baseline) to 67.4% (TGraph). We estimate the potential CO$_2$ emission reduction associated with our work to be equivalent to over 50% of the total household emissions in the areas hosting AI-oriented data centers.
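The evaluation metric mentioned in the abstract, Kendall's $\tau$, measures how well a model's predicted ranking of configurations agrees with their true ordering (e.g., by measured runtime). As an illustrative sketch (not the authors' code), the tau-a variant can be computed by counting concordant and discordant pairs:

```python
from itertools import combinations

def kendall_tau(pred, actual):
    """Kendall's tau-a between two score lists of equal length.

    A pair (i, j) is concordant when pred and actual order the two
    items the same way, discordant when they disagree. Tau ranges
    from -1 (fully reversed ranking) to +1 (identical ranking).
    """
    assert len(pred) == len(actual) and len(pred) > 1
    n = len(pred)
    concordant = discordant = 0
    for i, j in combinations(range(n), 2):
        sign = (pred[i] - pred[j]) * (actual[i] - actual[j])
        if sign > 0:
            concordant += 1
        elif sign < 0:
            discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

# Perfectly ordered predictions give tau = 1.0; reversed give -1.0.
print(kendall_tau([1, 2, 3, 4], [0.1, 0.2, 0.3, 0.4]))  # → 1.0
print(kendall_tau([1, 2, 3, 4], [4, 3, 2, 1]))          # → -1.0
```

A higher mean tau across a collection of graphs means the model more reliably places the truly fast configurations ahead of slow ones, which is what matters when the compiler can only benchmark a few top candidates.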

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2405.16623
Document Type :
Working Paper