Ghosts in the Machine: How Past and Present Biases Haunt Algorithmic Tenant Screening Systems
- Author
- Rhoades, Gary
- Subjects
- Amici curiae; Algorithmic bias; Sex discrimination; Human facial recognition software; Tenants; Fair Housing Act of 1968 (U.S.); Poor communities
- Abstract
The article discusses how algorithmic tenant screening systems in housing can perpetuate biases and discrimination, despite the Fair Housing Act's aim of ending such unfair treatment. The algorithms used in these systems typically weigh factors such as criminal records, credit reports, and civil court records to predict the suitability of prospective tenants. However, these algorithms can still be influenced by biases related to race and source of income. Lawsuits have been filed against companies that provide these screening systems, highlighting the need for reform and accountability in addressing algorithmic bias. Efforts have been made to promote transparency, data accuracy, and the inclusion of housing vouchers in the screening process. The article emphasizes the importance of combining objective data and technology with human common sense to address the civil rights issues associated with tenant screening.
- Published
- 2024