
Extending Models Via Gradient Boosting: An Application to Mendelian Models

Authors:
Huang, Theodore
Idos, Gregory
Hong, Christine
Gruber, Stephen
Parmigiani, Giovanni
Braun, Danielle
Publication Year:
2021

Abstract

Improving existing widely adopted prediction models is often a more efficient and robust path to progress than training new models from scratch. Existing models may (a) incorporate complex mechanistic knowledge, (b) leverage proprietary information, and (c) have surmounted barriers to adoption. Compared to model training, model improvement and modification receive little attention. In this paper we propose a general approach to model improvement: we combine gradient boosting with any previously developed model to improve model performance while retaining important existing characteristics. To exemplify, we consider the context of Mendelian models, which estimate the probability of carrying genetic mutations that confer susceptibility to disease by using family pedigrees and health histories of family members. Via simulations we show that integrating gradient boosting with an existing Mendelian model can produce an improved model that outperforms both that model and the model built using gradient boosting alone. We illustrate the approach on genetic testing data from the USC-Stanford Cancer Genetics Hereditary Cancer Panel (HCP) study.

Comment: 46 pages, 4 figures
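One common way to realize the idea described in the abstract is to treat the existing model's predictions as an offset and let gradient boosting learn only a correction on top of them. The sketch below is an assumption about how this could be set up, not the authors' implementation: names such as `mendelian_probs`, `X`, and `y` are hypothetical placeholders, and xgboost's `base_margin` mechanism is used as one possible way to anchor the boosting ensemble at the existing model's output.

```python
# Minimal sketch (assumed setup, not the paper's code): start the boosting
# ensemble at the existing Mendelian model's log-odds and fit trees to the
# residual signal in the covariates X.
import numpy as np
import xgboost as xgb

def boost_existing_model(X, y, mendelian_probs, num_rounds=100):
    """Fit a gradient-boosted correction on top of an existing model's output."""
    eps = 1e-6
    # Convert the existing model's carrier probabilities to log-odds so they
    # can serve as the starting point (base margin) of the boosted model.
    base_margin = np.log((mendelian_probs + eps) / (1 - mendelian_probs + eps))
    dtrain = xgb.DMatrix(X, label=y, base_margin=base_margin)
    params = {"objective": "binary:logistic", "eta": 0.05, "max_depth": 3}
    return xgb.train(params, dtrain, num_boost_round=num_rounds)

def predict_combined(booster, X_new, mendelian_probs_new):
    """Combine the existing model's probabilities with the boosted correction."""
    eps = 1e-6
    base_margin = np.log((mendelian_probs_new + eps) / (1 - mendelian_probs_new + eps))
    dnew = xgb.DMatrix(X_new, base_margin=base_margin)
    # Predictions pass the sum of the base margin and the boosted trees'
    # output through the logistic link, yielding updated carrier probabilities.
    return booster.predict(dnew)
```

With zero boosting rounds this construction reduces to the existing model, so the boosted trees only adjust its predictions where the data support a correction; this is one way the combined model can retain the existing model's characteristics while improving performance.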

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2105.06559
Document Type:
Working Paper