Racist Technology in Action: The “underdiagnosis bias” in AI algorithms for health: Chest radiographs

This study builds upon work in algorithmic bias and bias in healthcare. The use of AI-based diagnostic tools has been motivated by a global shortage of radiologists, and by research showing that AI algorithms can match specialist performance, particularly in medical imaging. Yet the topic of AI-driven underdiagnosis has remained relatively unexplored.

The authors conducted a systematic study of underdiagnosis bias in AI-based chest X-ray prediction models (designed to predict diagnostic labels from X-ray images) across three large public radiology datasets, as well as a multi-source dataset that combines the three on their shared disease labels.

The authors observed that female patients, patients under 20 years old, Black patients, Hispanic patients, and patients of lower socioeconomic status (using Medicaid insurance as a proxy) receive higher rates of algorithmic underdiagnosis than other groups. These effects persist for intersectional subgroups, e.g. Black female patients. In other words, these groups are at a higher risk of being falsely flagged as healthy by AI-based diagnostic tools, and of receiving no clinical treatment.
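The kind of per-group disparity described above can be made concrete with a small audit metric. The sketch below (not the authors' code; group names and data are hypothetical) computes, for each subgroup, the fraction of patients who truly have a finding but whom the model labels as healthy:

```python
# Minimal sketch of a per-subgroup underdiagnosis audit (illustrative only).
# "Underdiagnosis rate" here: share of truly unhealthy patients the model
# flags as healthy ("no finding"), computed separately for each group.
from collections import defaultdict

def underdiagnosis_rates(records):
    """records: iterable of (group, has_finding, predicted_healthy) tuples.
    Returns {group: fraction of patients with a finding predicted healthy}."""
    missed = defaultdict(int)   # sick patients the model called healthy
    sick = defaultdict(int)     # all patients with a true finding
    for group, has_finding, predicted_healthy in records:
        if has_finding:
            sick[group] += 1
            if predicted_healthy:
                missed[group] += 1
    return {g: missed[g] / sick[g] for g in sick}

# Hypothetical toy data: (group, truly has a finding, model says "healthy")
data = [
    ("A", True, False), ("A", True, True), ("A", True, False), ("A", True, False),
    ("B", True, True), ("B", True, True), ("B", True, False), ("B", True, False),
]
print(underdiagnosis_rates(data))  # → {'A': 0.25, 'B': 0.5}
```

A gap like the one in this toy output (group B missed twice as often as group A) is exactly the kind of disparity the study reports for under-served subgroups.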

The authors stress that underdiagnosis (falsely claiming that a patient is healthy) “leads to no clinical treatment when a patient needs it most, and could be harmful.”

These findings demonstrate a concrete way in which deployed algorithms can escalate existing systemic health inequities, particularly in the absence of robust audits of performance disparities across subpopulations. Given the pace at which algorithms are moving from the lab to real-world deployment, regulators and policymakers must genuinely consider the ethical concerns around access to medical treatment for racialised, under-served subpopulations, and the effective and ethical deployment of these models.

See: Underdiagnosis bias of artificial intelligence algorithms applied to chest radiographs in under-served patient populations in Nature Medicine.
