In Spain, an algorithm used by police to ‘combat’ gender violence determines whether women live or die

Lobna Hemid. Stefany González Escarraman. Eva Jaular (and her 11-month-old baby). The lives of these three women and an infant, among many others, were ended by gender-related killings in Spain. As reported by The New York Times (linked below), all of them had been classified as “low” or “negligible” risk by VioGén, despite having reported abuse to the police. In Lobna Hemid’s case, after she reported her husband’s abuse and was assessed as “low risk” by VioGén, the police provided her with minimal protection; weeks later, her husband stabbed her to death.

VioGén, an algorithmic tool developed by Spain’s Interior Ministry and used by law enforcement and in judicial proceedings, evaluates each case through a 35-question survey and assigns one of five risk levels: negligible, low, medium, high, or extreme. Protection measures are allocated accordingly, yet misclassifications have had fatal consequences. Around 92,000 cases are currently active in the system, 83% of them classified as low or negligible risk. Many women in these categories nonetheless continue to face violence, and since 2007 at least 247 women have been killed by their partners after being assessed by VioGén.
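To make the basic mechanism concrete, here is a minimal, purely illustrative sketch in Python of how a questionnaire-based tool can map answers to one of five risk buckets. This is not VioGén’s actual method: its weighting of the 35 indicators and its cut-offs are not public, so the unweighted sum and the thresholds below are invented for illustration only.

```python
from enum import Enum

class RiskLevel(Enum):
    NEGLIGIBLE = 0
    LOW = 1
    MEDIUM = 2
    HIGH = 3
    EXTREME = 4

# Hypothetical cut-offs on a normalised score in [0, 1].
# VioGén's real weights and thresholds are not publicly documented.
THRESHOLDS = [
    (0.10, RiskLevel.NEGLIGIBLE),
    (0.25, RiskLevel.LOW),
    (0.50, RiskLevel.MEDIUM),
    (0.75, RiskLevel.HIGH),
]

def classify(answers: list[int], num_questions: int = 35) -> RiskLevel:
    """Map yes/no (1/0) answers to a risk level by a normalised sum.

    Illustrative only: the real system reportedly combines weighted
    indicators with officer judgement, none of which is modelled here.
    """
    if len(answers) != num_questions:
        raise ValueError(f"expected {num_questions} answers, got {len(answers)}")
    score = sum(answers) / num_questions
    for cutoff, level in THRESHOLDS:
        if score <= cutoff:
            return level
    return RiskLevel.EXTREME

# A sparse set of affirmative answers lands in a low bucket, which is
# exactly the kind of outcome the article describes as having fatal
# consequences when the protection measures scale with the bucket.
print(classify([1] * 5 + [0] * 30))  # RiskLevel.LOW
```

The point of the sketch is only that any fixed mapping from questionnaire scores to protection levels will misclassify some victims; the article’s concern is what happens when those misclassifications go unexamined.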

Unsurprisingly, victims are often unaware of the algorithm’s role in their cases, and the Spanish government has not disclosed comprehensive data on its effectiveness. External audits of the algorithm have been refused. While predictive algorithms are increasingly used to address domestic violence in countries like Germany and the U.S., Spain is employing such a system nationwide, excluding the Basque Country and Catalonia.

The use of VioGén and the lack of accountability by the Spanish government sit within a broader trend of governments deploying algorithmic risk scoring and predictive systems across policing, welfare and other areas. These decision-making systems have life-and-death consequences for many, especially racialised, gendered, low-income and otherwise vulnerable people.

In Spain, these algorithmic “experiments” have been touted as improving effectiveness (despite evidence to the contrary) and as a way to manage law enforcement resources (see also: the Netherlands, the UK). But experimenting comes at a cost: people’s lives.

Such a reductionist and dehumanising response to gender-based violence reflects the underlying classist, sexist and racist criminal system in Spain (and beyond). Pouring more money into algorithmic systems that have not been proven to work, and expanding the power of law enforcement to deal with such violence, will do more harm than good to women, especially sex workers, immigrants, Black and brown, transgender, gender-diverse, refugee and undocumented people.

It was International Women’s Day several days ago, on 8 March. Femicide persists as a universal problem. The intersectional feminist struggle continues to resist and refuse a patriarchal, classist and racist system of psychologists, coders, law enforcers, consultants, and politicians who dictate whether we live or die.

See: “An Algorithm Told Police She Was Safe. Then Her Husband Killed Her.” at The New York Times.

Image from the original article.
