
An Intermediate Representation for Optimizing Machine Learning Pipelines

Authors :
Kunft, Andreas (author)
Katsifodimos, A (author)
Schelter, Sebastian (author)
Breß, Sebastian (author)
Rabl, Tilmann (author)
Markl, Volker (author)
Publication Year :
2019

Abstract

Machine learning (ML) pipelines for model training and validation typically include preprocessing, such as data cleaning and feature engineering, prior to training an ML model. Preprocessing combines relational algebra and user-defined functions (UDFs), while model training uses iterations and linear algebra. Current systems are tailored to either of the two. As a consequence, preprocessing and ML steps are optimized in isolation. To enable holistic optimization of ML training pipelines, we present Lara, a declarative domain-specific language for collections and matrices. Lara's intermediate representation (IR) reflects on the complete program, i.e., UDFs, control flow, and both data types. Two views on the IR enable diverse optimizations. Monads enable operator pushdown and fusion across type and loop boundaries. Combinators provide the semantics of domain-specific operators and optimize data access and cross-validation of ML algorithms. Our experiments on preprocessing pipelines and selected ML algorithms show the effects of our proposed optimizations on dense and sparse data, achieving speedups of up to an order of magnitude.
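To illustrate the kind of program the abstract describes, the following is a minimal sketch in plain Scala (not Lara's actual API; all names such as Click, featurize, and train are hypothetical). It shows a pipeline whose first half is relational-style preprocessing with a UDF over a collection and whose second half is an iterative, linear-algebra-style training loop, i.e., the two program parts that Lara's IR is designed to optimize jointly rather than in isolation.

object PipelineSketch {
  // Collection side: raw records cleaned and feature-engineered via a UDF.
  final case class Click(userId: Int, url: String, durationMs: Long)

  def featurize(c: Click): Array[Double] =
    Array(c.durationMs.toDouble, if (c.url.startsWith("https")) 1.0 else 0.0)

  // Matrix side: a simple gradient-descent loop over the featurized data.
  def train(x: Array[Array[Double]], y: Array[Double], iters: Int): Array[Double] = {
    val n = x.length
    val d = x.head.length
    var w = Array.fill(d)(0.0)
    for (_ <- 1 to iters) {
      val grad = Array.fill(d)(0.0)
      for (i <- 0 until n) {
        val err = x(i).zip(w).map { case (a, b) => a * b }.sum - y(i)
        for (j <- 0 until d) grad(j) += err * x(i)(j) / n
      }
      w = w.zip(grad).map { case (wj, gj) => wj - 0.1 * gj }
    }
    w
  }

  def main(args: Array[String]): Unit = {
    val clicks = Seq(Click(1, "https://a", 1200), Click(2, "http://b", 300))
    // Relational-style preprocessing: filter + map with a UDF ...
    val x = clicks.filter(_.durationMs > 0).map(featurize).toArray
    val y = Array(1.0, 0.0)
    // ... followed by iterative training using linear-algebra operations.
    println(train(x, y, iters = 100).mkString(", "))
  }
}

In a system like the one described, an IR that sees both halves of such a program could, for example, push filters below feature extraction or fuse the per-record UDF with the construction of the training matrix; the sketch above only illustrates the shape of the input program, not those optimizations.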

Details

Database :
OAIster
Notes :
English
Publication Type :
Electronic Resource
Accession number :
edsoai.on1149837416
Document Type :
Electronic Resource
Full Text :
https://doi.org/10.14778/3342263.3342633