Fairness Under Cover: Evaluating the Impact of Occlusions on Demographic Bias in Facial Recognition
- Publication Year :
- 2024
Abstract
- This study investigates the effects of occlusions on the fairness of face recognition systems, focusing on demographic biases. Using the Racial Faces in the Wild (RFW) dataset with synthetically added realistic occlusions, we evaluate their effect on the performance of face recognition models trained on the BUPT-Balanced and BUPT-GlobalFace datasets. We observe increases in the dispersion of FMR, FNMR, and accuracy, alongside decreases in fairness according to Equalized Odds, Demographic Parity, STD of Accuracy, and Fairness Discrepancy Rate. Additionally, we use a pixel attribution method to understand the importance of occlusions in model predictions, proposing a new metric, Face Occlusion Impact Ratio (FOIR), which quantifies the extent to which occlusions affect model performance across different demographic groups. Our results indicate that occlusions exacerbate existing demographic biases, with models placing higher importance on occlusions in an unequal fashion, particularly affecting African individuals more severely.
Comment: Accepted at ECCV Workshop FAILED
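The abstract does not give the formal definitions of the measures it names, so the sketch below is illustrative only: it assumes the "STD of Accuracy" fairness notion is the standard deviation of per-group verification accuracy, and it assumes a FOIR-style score can be approximated as the share of pixel-attribution mass falling inside the occluded region. The function names, the 112x112 image size, and the toy values are hypothetical and are not the authors' implementation.

```python
import numpy as np

def accuracy_std(accuracies_by_group):
    """Spread of verification accuracy across demographic groups
    (one reading of the 'STD of Accuracy' measure in the abstract)."""
    return float(np.std(list(accuracies_by_group.values())))

def occlusion_impact_ratio(attribution_map, occlusion_mask):
    """Illustrative FOIR-style score (assumption, not the paper's formula):
    fraction of total pixel-attribution mass on occluded pixels."""
    attribution = np.abs(attribution_map)
    total = attribution.sum()
    if total == 0:
        return 0.0
    return float(attribution[occlusion_mask.astype(bool)].sum() / total)

# Toy usage with made-up numbers:
accs = {"African": 0.91, "Asian": 0.94, "Caucasian": 0.96, "Indian": 0.95}
print(accuracy_std(accs))

attr = np.random.rand(112, 112)          # e.g. a saliency map from a pixel-attribution method
mask = np.zeros((112, 112), dtype=bool)  # synthetic occlusion region (e.g. sunglasses)
mask[40:60, 20:90] = True
print(occlusion_impact_ratio(attr, mask))
```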
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.2408.10175
- Document Type :
- Working Paper