1. Simple Rules to Guide Expert Classifications
- Author
- Sharad Goel, Jongbin Jung, Connor Concannon, Ravi Shroff, and Daniel G. Goldstein
- Subjects
- Statistics and Probability, Economics and Econometrics, Computer Science, Machine Learning, Linear Models, Judicial Decisions, Heuristics, Predictive Modelling, Statistics, Probability and Uncertainty, Social Sciences (miscellaneous)
- Abstract
Summary. Judges, doctors and managers are among those decision makers who must often choose a course of action under limited time, with limited knowledge and without the aid of a computer. Because data-driven methods typically outperform unaided judgements, resource-constrained practitioners can benefit from simple, statistically derived rules that can be applied mentally. In this work, we formalize long-standing observations about the efficacy of improper linear models to construct accurate yet easily applied rules. To test the performance of this approach, we conduct a large-scale evaluation in 22 domains and focus in detail on one: judicial decisions to release or detain defendants while they await trial. In these domains, we find that simple rules rival the accuracy of complex prediction models that base decisions on considerably more information. Further, comparing simple rules with unaided judicial decisions, we find that the rules substantially outperform the human experts. To conclude, we present an analytical framework that sheds light on why simple rules perform as well as they do.
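The "improper linear models" the abstract refers to are linear rules whose weights are not fit to optimize accuracy, for example, unit weights that keep only the sign of each predictor's relationship with the outcome. A minimal sketch of the idea on synthetic data (this is an illustration of the general technique, not the authors' exact rule-construction procedure, and the data here are invented, not the pretrial dataset from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: binary outcome driven by three standardized features
n = 5000
X = rng.standard_normal((n, 3))
true_w = np.array([1.0, 0.6, 0.3])
y = (X @ true_w + rng.standard_normal(n) > 0).astype(int)

# "Proper" model: least-squares weights (a linear probability model)
w_ols, *_ = np.linalg.lstsq(X, y - y.mean(), rcond=None)

# "Improper" model: unit weights, just the sign of each feature's
# correlation with the outcome -- easy to apply mentally
w_unit = np.sign([np.corrcoef(X[:, j], y)[0, 1] for j in range(3)])

def accuracy(w):
    scores = X @ w
    threshold = np.median(scores)  # flag the top-scoring half as positive
    return ((scores > threshold) == y).mean()

print(f"OLS weights accuracy:  {accuracy(w_ols):.3f}")
print(f"Unit weights accuracy: {accuracy(w_unit):.3f}")
```

On data like this, the unit-weighted rule typically gives up only a few points of accuracy relative to the fitted weights, which is the long-standing observation the paper formalizes.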
- Published
- 2020