The Rekenkamer Rotterdam (the city's Court of Audit) looked at how Rotterdam uses predictive algorithms and whether that use could lead to ethical problems. In its report, it describes how the city lacks a proper overview of the algorithms it uses, how there is no coordination and thus no one takes responsibility when things go wrong, and how one particular fraud-detection algorithm did not use sensitive data (like nationality) directly, but did include so-called proxy variables for ethnicity, like low literacy, which can correlate with ethnicity. According to the Rekenkamer, this could lead to unfair treatment, or as we would call it: ethnic profiling.
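To make the proxy-variable point concrete, here is a minimal, purely hypothetical sketch in Python (toy data and a toy model, not the system the Rekenkamer audited): dropping the sensitive attribute does not help much if a correlated proxy stays in the calculation.

```python
# Hypothetical illustration of a proxy variable; all numbers are made up.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Sensitive attribute (e.g. a minority/majority group label) that the
# model is NOT allowed to see.
group = rng.integers(0, 2, n)

# Proxy feature: a "low literacy" flag that happens to correlate with group.
low_literacy = (rng.random(n) < np.where(group == 1, 0.6, 0.2)).astype(int)

# Historical "caught fraud" labels that, in this toy world, were skewed by
# past enforcement focusing on people flagged as low-literacy.
fraud = (rng.random(n) < np.where(low_literacy == 1, 0.10, 0.03)).astype(int)

# Train a "fraud risk" model without the sensitive attribute...
X = low_literacy.reshape(-1, 1)
model = LogisticRegression().fit(X, fraud)
risk = model.predict_proba(X)[:, 1]

# ...yet the average risk score still differs by group, because the proxy
# carries group information into the prediction.
print("mean risk score, group 0:", risk[group == 0].mean())
print("mean risk score, group 1:", risk[group == 1].mean())
```

In this sketch the model never sees the group label, but its risk scores still land harder on one group, which is exactly the kind of indirect unfair treatment the Rekenkamer warns about.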
Unfortunately, Rotterdam is quite typical in its approach: it uses these algorithms to try to find benefits fraud, it has run the algorithm for more than four years as 'a pilot' without evaluating whether the new algorithmic approach works better than the old non-algorithmic approach, and when told that this could lead to biased decision-making, it tells us we shouldn't worry because there is always a human in the loop.
However, the city does say it will take all the Rekenkamer's advice on board. This means it should soon have a complete and public registry of all the algorithms in use, and a single team responsible for ensuring that none of those algorithms has an unjust impact. Other cities should take heed.
See: Gekleurde technologie ('Coloured Technology', in Dutch) at the Rekenkamer Rotterdam, or read the full report here.
The Rekenkamer's report recommends the Data Ethics Decision Aid (DEDA) as a tool to help you recognize ethical issues when working on data projects. It forces you and your team to ask the harder questions. Make sure to take a look at it.
See: Data Ethics Decision Aid (DEDA) at the Utrecht Data School.