1. Relax: Composable Abstractions for End-to-End Dynamic Machine Learning
- Authors
Lai, Ruihang, Shao, Junru, Feng, Siyuan, Lyubomirsky, Steven S., Hou, Bohan, Lin, Wuwei, Ye, Zihao, Jin, Hongyi, Jin, Yuchen, Liu, Jiawei, Jin, Lesheng, Cai, Yaxing, Jiang, Ziheng, Wu, Yong, Park, Sunghyun, Srivastava, Prakalp, Roesch, Jared G., Mowry, Todd C., and Chen, Tianqi
- Subjects
Computer Science - Machine Learning, Computer Science - Artificial Intelligence, Computer Science - Programming Languages
- Abstract
Dynamic shape computations have become critical in modern machine learning workloads, especially in emerging large language models. The success of these models has driven demand for deploying them to a diverse set of backend environments. In this paper, we present Relax, a compiler abstraction for optimizing end-to-end dynamic machine learning workloads. Relax introduces first-class symbolic shape annotations to track dynamic shape computations globally across the program. It also introduces a cross-level abstraction that encapsulates computational graphs, loop-level tensor programs, and library calls in a single representation to enable cross-level optimizations. We build an end-to-end compilation framework using the proposed approach to optimize dynamic shape models. Experimental results on large language models show that Relax delivers performance competitive with state-of-the-art hand-optimized systems across platforms and enables deployment of emerging dynamic models to a broader set of environments, including mobile phones, embedded devices, and web browsers.
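To make the abstract's "first-class symbolic shape annotations" concrete, below is a minimal sketch in the TVMScript Relax frontend of Apache TVM, where Relax was upstreamed. The function name, tensor shapes, and the symbolic dimension `n` are illustrative assumptions, not code from the paper, and exact op names may vary across TVM versions.

```python
# A minimal sketch assuming the TVMScript Relax frontend from Apache TVM;
# shapes, names, and the op used here are illustrative only.
from tvm.script import relax as R


@R.function
def main(x: R.Tensor(("n", 784), "float32"),
         w: R.Tensor((784, 128), "float32")) -> R.Tensor(("n", 128), "float32"):
    # "n" is a symbolic dimension tracked globally across the program,
    # so the output shape ("n", 128) stays related to the input batch size.
    with R.dataflow():
        y = R.matmul(x, w)
        R.output(y)
    return y
```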
- Published
2023