As part of a series of investigative reporting by Lighthouse Reports and WIRED, Gabriel Geiger has revealed findings about the use of welfare fraud algorithms in Denmark. This reporting follows the increasing use of algorithmic systems to detect welfare fraud across European cities, or at least of those systems that are currently known about.
Denmark’s fraud detection system is reportedly more sophisticated and far-reaching than many others in terms of the level of surveillance and ethnic profiling it involves. Under Annika Jacobsen, head of the Danish Public Benefits Administration’s data mining unit, the agency has tripled the number of state databases it can access (from three to nine) and compiled deeply private information on people to build models that analyse and predict who may be committing fraud. The agency uses variables such as nationality, a variable whose use has been ruled illegal in the Netherlands, as well as whether someone, or their “family relations”, might be connected to a non-EU country. According to the report, other variables include “welfare recipients’ marital status [including “presumed partner”], the length of their marriage, who they live with, the size of their house, their income, whether they’ve ever lived outside Denmark, their call history with the Public Benefits Administration, and whether their children are Danish residents.” These proxies have, time and again, been shown to be discriminatory.
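To make concrete why such variables function as discriminatory proxies, here is a minimal, purely hypothetical sketch in Python. Nothing about the agency’s real model is public beyond the variable list above: the `Recipient` fields mirror the reported variables, but the scoring function and all of its weights are invented for illustration only.

```python
# Purely illustrative sketch, NOT the Public Benefits Administration's actual
# model. It shows how a risk score built on the reported proxy variables
# (nationality, "family relations" abroad, residence history, household makeup)
# mechanically flags people along ethnic lines. All weights are invented.

from dataclasses import dataclass

@dataclass
class Recipient:
    non_eu_nationality: bool      # reported proxy variable
    family_ties_outside_eu: bool  # reported proxy variable
    years_lived_abroad: float     # reported proxy variable
    household_size: int           # reported proxy variable

def risk_score(r: Recipient) -> float:
    """Hypothetical weighted sum; thresholding it decides who is investigated."""
    score = 0.0
    score += 2.0 if r.non_eu_nationality else 0.0
    score += 1.5 if r.family_ties_outside_eu else 0.0
    score += 0.3 * r.years_lived_abroad
    score += 0.2 * max(0, r.household_size - 4)
    return score

# Two recipients identical in every respect except the ethnicity-linked proxies:
a = Recipient(False, False, 0.0, 5)
b = Recipient(True, True, 0.0, 5)
print(risk_score(a))  # 0.2 -> below any plausible investigation threshold
print(risk_score(b))  # 3.7 -> flagged, purely because of the proxy attributes
```

However the real system weighs its inputs, the structural point holds: once nationality and foreign family ties are features, two otherwise identical people receive different risk scores, and the resulting investigations fall disproportionately on racialised communities.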
The building of such a system has also been spurred by the private sector consultancy Deloitte, through a narrative of cost savings and efficiency, alongside anti-immigration sentiment and rhetoric. The system highlights how technology and political agendas are intertwined (also evident in the Netherlands’ crime prevention programme, the Top400), with significant and devastating consequences for the poorest, marginalised and racialised populations, who are often disproportionately harmed. As Victoria Adelmant, director of the Digital Welfare State and Human Rights Project, says,
The ideology that underlies these algorithmic systems, and the very intrusive surveillance and monitoring of people who receive welfare, is a deep suspicion of the poor.
The sophistication of these algorithmic systems may vary, but the underlying logic of racism, classism, and sexism continues to be mediated and perpetuated through the development and use of such welfare fraud systems. In Denmark, the use of Palantir since 2015 to police people living in “ghettoised” communities, and the “ghetto package” housing policies of 2018, make it clear whose lives are deemed deserving, and whose aren’t.
See How Denmark’s Welfare State Became a Surveillance Nightmare at WIRED.
Image by Katherine Lam, from the original WIRED article.