Shocking report by the Algemene Rekenkamer: state algorithms are a shitshow

The Algemene Rekenkamer (Netherlands Court of Audit) looked into nine different algorithms used by the Dutch state. It found that only three of them fulfilled the most basic of requirements.

The Rekenkamer selected algorithms that have an impact on Dutch citizens, that carry a risk of incorrect use, and that are in actual use. They also made sure to look at different types of algorithms across different domains. They then used an assessment framework to evaluate the algorithms on governance, privacy, data and model, and IT management. The results are summarised in a table in the report.

The tone of the report is very matter-of-fact, but the results are truly shocking. When it comes to bias, for example, organisations like the Dutch Police, the Migration Directorate at the Ministry of Justice, and the National Office for Identity Data have not even properly checked whether the models behind their algorithms contain bias.

The predictive policing system in use by the police does not get a single passing score on any element of the assessment framework. Unbelievably, the police have chosen to dig in and dispute the report. In their letter to the government, they write that the ‘Crime Anticipation System’ is low risk in their book, and so they see no problem in continuing to use the system. The Rekenkamer calls this reaction ‘worrying’, as the algorithm could negatively impact real people.

Let’s hope that other organisations are less foolhardy and will follow the Algemene Rekenkamer’s recommendation to continuously check algorithms for bias during both the design and the implementation phases, making sure that there are no unwanted systemic deviations for specific individuals or groups.
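The report does not spell out how such a check should be implemented, so what follows is only a minimal sketch of one common approach: a demographic parity comparison, here in Python with pandas. The idea is to compute the rate at which a model flags people in each demographic group and look at the spread. The column names and the toy data are illustrative assumptions, not anything taken from the report.

```python
import pandas as pd

# Toy data (illustrative, not from the report): one row per person the
# model scored, with the model's decision ('flagged') and a protected
# attribute ('group').
predictions = pd.DataFrame({
    "group":   ["A", "A", "A", "B", "B", "B", "B"],
    "flagged": [1, 0, 1, 0, 0, 1, 0],
})

# Selection rate: the share of each group that the model flags.
rates = predictions.groupby("group")["flagged"].mean()
print(rates)  # group A is flagged 67% of the time, group B 25%

# Demographic parity gap: the largest spread between any two groups.
# A gap near 0 means the model flags all groups at similar rates; a
# large gap is a signal that warrants investigation.
gap = rates.max() - rates.min()
print(f"demographic parity gap: {gap:.2f}")  # 0.42
```

A real audit would of course go further, for example by comparing false positive rates per group and tracking these numbers over time, but even this simple comparison can surface the kind of unchecked disparity the Rekenkamer is warning about.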

See: Diverse algoritmes Rijk voldoen niet aan basisvereisten (‘Various central government algorithms do not meet basic requirements’) at the Algemene Rekenkamer, or download the full report.
