Racist Technology in Action: Dutch Probation Service uses discriminatory algorithm for predicting recidivism

The Dutch Inspectorate of Justice and Security reviewed the algorithm that predicts whether a suspect or an offender will reoffend (to help with sentencing) and found it lacking. Not only was it implemented incorrectly, but it was also discriminatory. The Probation Service has suspended the use of the algorithm for now while it assesses whether it can fix its problems.

It is probably the most widely known case of algorithmic bias: COMPAS, an algorithm used in the US to predict recidivism. ProPublica showed in 2016 (!) how the algorithm was hugely discriminatory, harming Black people.

You would think that knowing about this case would lead to maximum care, and at least some restraint, in applying similar algorithms here in the Netherlands. Alas, no: one of the algorithms used by the Probation Service uses a neighbourhood score and income as variables to make its predictions. It is blatantly obvious that using these variables could lead to a form of ethnic profiling, which is why the Netherlands Institute for Human Rights has prohibited their use.

It is time to fundamentally question these machine-learning-based predictions as a legitimate basis for public decision-making. Individualised statistical ‘reasoning’ will always lead to injustice: an algorithm will necessarily look at only a small set of variables and will not take an individual’s specific circumstances into account.

You can download the full research report here.

See: Risicovol algoritmegebruik door reclassering (“Risky use of algorithms by the probation service”) at Inspectie Justitie en Veiligheid.

Image from the research report.
