The Fairness of Credit Scoring Models
- Authors
Christophe Pérignon, Christophe Hurlin, Sébastien Saurin (Université d'Orléans; HEC Paris)
- Subjects
Computer Science - Machine Learning (cs.LG); Statistics - Machine Learning (stat.ML); Quantitative Finance - Risk Management (q-fin.RM); FOS: Computer and information sciences; FOS: Economics and business; JEL: C10 - Econometric and Statistical Methods and Methodology: General; JEL: C38 - Classification Methods, Cluster Analysis, Principal Components, Factor Models; JEL: C55 - Large Data Sets: Modeling and Analysis; JEL: G21 - Banks, Depository Institutions, Micro Finance Institutions, Mortgages; JEL: G29 - Financial Institutions and Services: Other; [SHS.GESTION] Humanities and Social Sciences / Business administration
- Abstract
In credit markets, screening algorithms discriminate between good-type and bad-type borrowers. This is their raison d'être. However, in doing so, they often also discriminate between individuals sharing a protected attribute (e.g., gender, age, race) and the rest of the population. In this paper, we show how to test (1) whether there exists a statistically significant difference in rejection rates or interest rates, called a lack of fairness, between the protected group and the rest of the population, and (2) whether this difference stems solely from differences in creditworthiness. When condition (2) is not met, the screening algorithm does not comply with the fair-lending principle and can be deemed illegal. Our framework provides guidance on how algorithmic fairness can be monitored by lenders, controlled by their regulators, and improved for the benefit of protected groups.
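To make condition (1) concrete, the sketch below illustrates one standard way to test for a statistically significant gap in rejection rates between a protected group and the rest of the population: a two-proportion z-test. This is only a minimal illustration of the idea, not the fairness test developed in the paper; the function name `rejection_rate_gap_test` and the applicant counts are hypothetical.

```python
import numpy as np
from scipy import stats

def rejection_rate_gap_test(rejected_protected, n_protected,
                            rejected_unprotected, n_unprotected):
    """Two-proportion z-test for a difference in rejection rates
    between the protected group and the rest of the population.
    (Illustrative only; not the test proposed in the paper.)"""
    p1 = rejected_protected / n_protected
    p2 = rejected_unprotected / n_unprotected
    # Pooled rejection rate under the null hypothesis of equal rates
    p_pool = (rejected_protected + rejected_unprotected) / (n_protected + n_unprotected)
    se = np.sqrt(p_pool * (1 - p_pool) * (1 / n_protected + 1 / n_unprotected))
    z = (p1 - p2) / se
    p_value = 2 * stats.norm.sf(abs(z))  # two-sided p-value
    return p1 - p2, z, p_value

# Hypothetical counts: 180 rejections out of 1,000 protected applicants
# vs. 130 rejections out of 2,000 unprotected applicants.
gap, z, p = rejection_rate_gap_test(180, 1000, 130, 2000)
print(f"rejection-rate gap = {gap:.3f}, z = {z:.2f}, p-value = {p:.4f}")
```

A significant gap by itself establishes only a lack of fairness in outcomes; checking condition (2), i.e., whether the gap is fully explained by creditworthiness, requires conditioning on borrower risk as the paper's framework does.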
- Published
- 2021