Sample selection bias in evaluation of prediction performance of causal models.

Authors :
Long, James P.
Ha, Min Jin
Source :
Statistical Analysis & Data Mining. Feb 2022, Vol. 15, Issue 1, p5-14. 10p.
Publication Year :
2022

Abstract

Causal models are notoriously difficult to validate because they make untestable assumptions regarding confounding. New scientific experiments offer the possibility of evaluating causal models using prediction performance. Prediction performance measures are typically robust to violations of causal assumptions. However, prediction performance does depend on the selection of training and test sets. Biased training sets can lead to optimistic assessments of model performance. In this work, we revisit the prediction performance of several recently proposed causal models tested on the genetic perturbation data set of Kemmeren. We find that sample selection bias is likely a key driver of model performance. We propose using a less‐biased evaluation set for assessing prediction performance and compare models on this new set. In this setting, the causal models have similar or worse performance compared to standard association‐based estimators such as Lasso. Finally, we compare the performance of causal estimators in simulation studies that reproduce the Kemmeren structure of genetic knockout experiments but without any sample selection bias. These results provide an improved understanding of the performance of several causal models and offer guidance on how future studies should use the Kemmeren data. [ABSTRACT FROM AUTHOR]
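The core phenomenon the abstract describes can be illustrated with a minimal toy sketch (not the Kemmeren analysis itself): if the set used to evaluate a model is selected in a way that favors samples the model already fits well, measured error looks far better than it is on an unbiased held-out set. Here ordinary least squares stands in for the predictive models compared in the paper; all data, the selection threshold of 0.5, and the model are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 500, 5
X = rng.normal(size=(n, p))
beta = np.array([1.0, 0.5, 0.0, 0.0, 0.0])
y = X @ beta + rng.normal(size=n)  # linear signal plus unit-variance noise

# Fit a plain least-squares model on a random half of the data.
perm = rng.permutation(n)
train, test = perm[:250], perm[250:]
bhat = np.linalg.lstsq(X[train], y[train], rcond=None)[0]

def mse(idx):
    resid = y[idx] - X[idx] @ bhat
    return float(np.mean(resid ** 2))

# Unbiased evaluation: every held-out sample.
mse_unbiased = mse(test)

# Biased evaluation: keep only held-out samples the model already
# predicts well (absolute residual below 0.5). This mimics, in
# exaggerated form, an evaluation set selected non-randomly.
easy = test[np.abs(y[test] - X[test] @ bhat) < 0.5]
mse_biased = mse(easy)

print(f"unbiased MSE: {mse_unbiased:.3f}, biased MSE: {mse_biased:.3f}")
```

By construction the biased subset's squared residuals are all below 0.25, so its mean squared error is guaranteed to undercut the unbiased figure (roughly the noise variance, near 1). The paper's point is that a subtler version of this effect, arising from how the Kemmeren training and evaluation sets were chosen, can inflate the apparent performance of causal estimators.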

Details

Language :
English
ISSN :
19321864
Volume :
15
Issue :
1
Database :
Academic Search Index
Journal :
Statistical Analysis & Data Mining
Publication Type :
Academic Journal
Accession number :
154579538
Full Text :
https://doi.org/10.1002/sam.11559