A bias detection tree approach for detecting disparities in a recommendation model's errors.
- Source :
- User Modeling & User-Adapted Interaction; Mar 2023, Vol. 33 Issue 1, p43-79, 37p
- Publication Year :
- 2023
Abstract
- Many of the current recommendation systems are considered to be black boxes that are tuned to optimize some global objective function. However, their error distribution may differ dramatically among different combinations of attributes, and such algorithms may propagate hidden data biases. Identifying potential disparities in an algorithm's functioning is essential for building recommendation systems in a fair and responsible way. In this work, we propose a model-agnostic technique to automatically detect the combinations of user and item attributes correlated with unequal treatment by the recommendation model. We refer to this technique as the Bias Detection Tree. In contrast to the existing works in this field, our method automatically detects disparities related to combinations of attributes without any a priori knowledge about protected attributes, assuming that relevant metadata is available. Our results on five public recommendation datasets show that the proposed technique can identify hidden biases in terms of four kinds of metrics for multiple collaborative filtering models. Moreover, we adapt a minimax model selection technique to control the trade-off between the global and the worst-case optimizations and improve the recommendation model's performance for biased attributes. [ABSTRACT FROM AUTHOR]
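The abstract does not specify the algorithm's internals, but the core idea — searching over user/item metadata attributes for subgroups whose recommendation error deviates from the global average — can be illustrated with a minimal, hypothetical sketch. The attribute names, data, and one-level split below are invented for illustration; the paper's actual Bias Detection Tree builds on combinations of attributes and multiple metrics.

```python
from statistics import mean

def find_error_disparities(records, attributes, min_size=2):
    """One-level sketch of disparity detection: for each attribute,
    group per-interaction errors by attribute value and rank groups
    by how far their mean error sits above the global mean error."""
    global_err = mean(r["error"] for r in records)
    findings = []
    for attr in attributes:
        groups = {}
        for r in records:
            groups.setdefault(r[attr], []).append(r["error"])
        for value, errs in groups.items():
            if len(errs) >= min_size:
                gap = mean(errs) - global_err  # positive = worse-than-average treatment
                findings.append((attr, value, round(gap, 3)))
    # the largest positive gap marks the subgroup the model treats worst
    return sorted(findings, key=lambda f: -f[2])

# toy data: hypothetical per-interaction absolute errors with metadata
data = [
    {"gender": "f", "genre": "sci-fi", "error": 0.9},
    {"gender": "f", "genre": "sci-fi", "error": 0.8},
    {"gender": "f", "genre": "drama",  "error": 0.3},
    {"gender": "m", "genre": "drama",  "error": 0.2},
    {"gender": "m", "genre": "sci-fi", "error": 0.4},
    {"gender": "m", "genre": "drama",  "error": 0.2},
]
worst = find_error_disparities(data, ["gender", "genre"])[0]
```

A full tree-based version would apply such splits recursively, so that leaves correspond to *combinations* of attributes rather than single values.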
- Subjects :
- RECOMMENDER systems
GLOBAL optimization
TREES
METADATA
Details
- Language :
- English
- ISSN :
- 09241868
- Volume :
- 33
- Issue :
- 1
- Database :
- Complementary Index
- Journal :
- User Modeling & User-Adapted Interaction
- Publication Type :
- Academic Journal
- Accession number :
- 162033150
- Full Text :
- https://doi.org/10.1007/s11257-022-09334-x