
Mirage: A Multi-Level Superoptimizer for Tensor Programs

Authors:
Wu, Mengdi
Cheng, Xinhao
Liu, Shengyu
Shi, Chunan
Ji, Jianan
Ao, Kit
Velliengiri, Praveen
Miao, Xupeng
Padon, Oded
Jia, Zhihao
Publication Year:
2024

Abstract

We introduce Mirage, the first multi-level superoptimizer for tensor programs. A key idea in Mirage is $\mu$Graphs, a uniform representation of tensor programs at the kernel, thread block, and thread levels of the GPU compute hierarchy. $\mu$Graphs enable Mirage to discover novel optimizations that combine algebraic transformations, schedule transformations, and generation of new custom kernels. To navigate the large search space, Mirage introduces a pruning technique based on abstraction that significantly reduces the search space and provides a certain optimality guarantee. To ensure that the optimized $\mu$Graph is equivalent to the input program, Mirage introduces a probabilistic equivalence verification procedure with strong theoretical guarantees. Our evaluation shows that Mirage outperforms existing approaches by 1.1-2.9$\times$ even for DNNs that are widely used and heavily optimized. Mirage is publicly available at https://github.com/mirage-project/mirage.
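The probabilistic equivalence verification mentioned in the abstract can be illustrated with a small sketch: check whether two tensor programs agree on random inputs drawn from a finite field, so that agreement across many trials makes inequivalence unlikely. This is a minimal illustration under assumed details, not Mirage's actual verifier; the prime `P`, the example programs, and the trial count are all illustrative choices.

```python
import numpy as np

P = 10007  # illustrative small prime; all arithmetic is done modulo P

def prog_a(A, B, C):
    # Reference program: (A @ B) @ C, reduced mod P after each matmul
    return (A @ B % P) @ C % P

def prog_b(A, B, C):
    # Candidate rewrite: A @ (B @ C) -- equivalent by associativity
    return A @ (B @ C % P) % P

def probably_equivalent(f, g, shapes, trials=8, seed=0):
    """Declare f and g equivalent if they agree on random inputs from Z_P.
    A disagreement is a definitive counterexample; agreement on all
    trials makes inequivalence unlikely (a Schwartz-Zippel-style bound)."""
    rng = np.random.default_rng(seed)
    for _ in range(trials):
        args = [rng.integers(0, P, size=s, dtype=np.int64) for s in shapes]
        if not np.array_equal(f(*args) % P, g(*args) % P):
            return False
    return True

shapes = [(4, 5), (5, 6), (6, 3)]
print(probably_equivalent(prog_a, prog_b, shapes))  # prints True
```

A genuinely inequivalent rewrite (say, swapping an operand) would almost surely be caught on the first random trial; the paper's contribution is making such random testing a sound verification procedure with provable error bounds.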

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2405.05751
Document Type:
Working Paper