
Group Fairness Refocused: Assessing the Social Impact of ML Systems

Authors :
Hertweck, Corinna; https://orcid.org/0000-0002-7639-2771
Loi, Michele; https://orcid.org/0000-0002-7053-4724
Heitz, Christoph; https://orcid.org/0000-0002-6683-4150
Source :
Hertweck, Corinna; Loi, Michele; Heitz, Christoph (2024). Group Fairness Refocused: Assessing the Social Impact of ML Systems. In: 11th IEEE Swiss Conference on Data Science (SDS), Zürich, 30 May 2024 - 31 May 2024. Institute of Electrical and Electronics Engineers, 189-196.
Publication Year :
2024

Abstract

Fairness as a property of a prediction-based decision system is a question of its impact on the lives of affected people, which is only partially captured by standard fairness metrics. In this paper, we present a formal framework for the impact assessment of prediction-based decision systems based on the paradigm of group fairness. We generalize the equality requirements of standard fairness criteria to the concept of equality of expected impact, and we show that standard fairness criteria can be interpreted as special cases of this generalization. Furthermore, we provide a systematic and practical method for determining the necessary utility functions for modeling the impact. We conclude with a discussion of possible extensions of our approach.
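The equality requirement described in the abstract can be illustrated informally. As a minimal sketch (the notation here is assumed for illustration and is not taken from the paper): let $D$ denote the decision, $Y$ the relevant outcome, $A$ the group attribute, and $u$ a utility function modeling the impact of the decision on the affected person. Equality of expected impact between two groups $a$ and $b$ then reads

\[
\mathbb{E}\bigl[\,u(D, Y)\mid A = a\,\bigr] \;=\; \mathbb{E}\bigl[\,u(D, Y)\mid A = b\,\bigr].
\]

Under the assumed special case $u(D, Y) = D$ with a binary decision $D \in \{0, 1\}$, this reduces to statistical parity, $P(D = 1 \mid A = a) = P(D = 1 \mid A = b)$, which is consistent with the abstract's claim that standard fairness criteria can be read as special cases of the generalized equality requirement.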

Details

Database :
OAIster
Notes :
application/pdf, info:doi/10.5167/uzh-265825, English
Publication Type :
Electronic Resource
Accession number :
edsoai.on1482459793
Document Type :
Electronic Resource