Amnesty report (yet again) exposes racist AI in UK police forces

Amnesty International UK’s report Automated Racism (published February 2025, PDF) reveals that almost three-quarters of UK police forces use discriminatory predictive policing systems that perpetuate racial profiling. At least 33 forces deploy AI tools that predict crime locations or profile individuals as future criminals based on biased historical data, entrenching racism and inequality.

The report exhaustively demonstrates that these systems systematically target Black and racialised communities: in areas using predictive tools, Black people are stopped and searched up to four times more often than white people.

Sacha Deshmukh, Chief Executive at Amnesty International UK, said:

The use of predictive policing tools violates human rights. The evidence that this technology keeps us safe just isn’t there, the evidence that it violates our fundamental rights is clear as day. We are all much more than computer-generated risk scores.

This is not the first time UK police have been shown to use harmful and racist technologies. Until last year, the Metropolitan Police used the ‘gangs matrix’ to risk-profile, and subsequently surveil, harm and stigmatise, mainly young Black men. The system was scrapped only after years of tireless campaigning, reporting, and legal action.

It has been shown time and again that these predictive policing systems are racist and routinely violate the fundamental human rights of racialised communities, including the presumption of innocence, privacy, and freedom of association. People are profiled as criminals without evidence, creating what Amnesty calls “automated racism”: treating entire communities as potential threats.

The Dutch police also employ predictive policing with enthusiasm. The Amsterdam police, for example, proudly uses CAS (“Criminality Anticipation System”).

Deshmukh added: “These systems have been built with discriminatory data and serve only to supercharge racism.” The organisation calls for an immediate ban on these technologies and demands transparency about their use.

See: UK: Police forces ‘supercharging racism’ with crime predicting tech at Amnesty.
