Accounting for diversity in AI for medicine.
- Source :
- Computer Law & Security Review, Nov 2022, Vol. 47
- Publication Year :
- 2022
Abstract
- In healthcare, gender and sex considerations are crucial because they affect individuals' health and disease differences. Yet, most algorithms deployed in the healthcare context neither consider these aspects nor provide for bias detection. Omitting these dimensions from algorithms used in medicine is a major concern, as neglecting them will inevitably produce suboptimal results and generate errors that may lead to misdiagnosis and potential discrimination. This paper explores how current algorithmic-based systems may reinforce gender biases and affect marginalized communities in healthcare-related applications. To do so, we bring together notions and reflections from computer science, queer media studies, and legal scholarship to better understand the magnitude of failing to consider gender and sex differences in the use of algorithms for medical purposes. Our goal is to illustrate the potential impact that algorithmic bias may have on discrimination, safety, and privacy for patients in increasingly automated medicine. This is necessary because, by rushing the deployment of AI technologies that do not account for diversity, we risk even less safe and less adequate healthcare delivery. By promoting attention to privacy, safety, diversity, and inclusion in algorithmic developments with health-related outcomes, we ultimately aim to inform the global Artificial Intelligence (AI) governance landscape and practice on the importance of integrating gender and sex considerations into the development of algorithms, so as to avoid exacerbating existing prejudices or creating new ones. [ABSTRACT FROM AUTHOR]
Details
- Language :
- English
- ISSN :
- 2212-473X
- Volume :
- 47
- Database :
- Academic Search Index
- Journal :
- Computer Law & Security Review
- Publication Type :
- Academic Journal
- Accession number :
- 161016563
- Full Text :
- https://doi.org/10.1016/j.clsr.2022.105735