Algorithm to help find fraudulent students turns out to be racist

DUO is the Dutch organisation for administering student grants. It uses an algorithm to help decide which students get a home visit to check for fraudulent behaviour. Turns out they basically only check students of colour, and they have no clue why.

If you are a student and you don’t live with your parents, you get more state support than when you live at home. Private enforcers work for DUO, doing around 25,000 checks per year to see whether students actually live where they say they do. When these enforcers suspect something is wrong, the burden of proof for where they live shifts to the student.

A homemade algorithm applying a risk profile without any scientific merit picks out the students to check. The Hoger Onderwijs Persbureau, NOS op 3 and Investico investigated how this profile works. Check out this video for an accessible (Dutch) explanation of their results:

Two presenters for NOS op 3 at the start of the video, titled “Uitgediept, discriminatie” (“In depth: discrimination”), with a magnifying glass highlighting “fraudeur” in a letter from the government.

The journalists spoke to 32 lawyers who had supported hundreds of cases of students who said they were falsely accused of fraud. According to these lawyers, nearly all of these students have a migration or bicultural background. DUO says that it doesn’t select on ethnicity, yet it seems clear that it uses criteria that function as a proxy for exactly this characteristic. In the more than ten years that DUO has used the algorithm, it has never evaluated the risk profiles. It also has no policies to prevent bias.

As a result of this investigation, the Minister of Education decided to immediately stop the use of the algorithm and switch to using random samples (other organisations should take note). The Dutch data protection authority will investigate the profiling algorithm that DUO has used.
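The article doesn’t describe the replacement procedure in any technical detail, but a uniform random sample is easy to illustrate. Below is a minimal, purely hypothetical Python sketch (the function name, student IDs and numbers are all made up, not DUO’s actual system) of what selecting roughly 25,000 students at random, rather than by an opaque risk score, could look like:

```python
import random

def select_students_for_check(student_ids, n_checks, seed=None):
    """Pick students for a home visit by uniform random sampling.

    Every student has the same probability of being selected, so no
    single (proxy) characteristic can dominate who gets checked.
    """
    rng = random.Random(seed)
    return rng.sample(student_ids, k=min(n_checks, len(student_ids)))

# Hypothetical illustration: 25,000 checks drawn from a made-up population.
population = list(range(1, 500_001))  # stand-in student IDs, not real data
selected = select_students_for_check(population, n_checks=25_000, seed=42)
print(len(selected), selected[:5])
```

The point of a random draw, as opposed to ranking by a risk profile, is that selection probabilities cannot silently correlate with ethnicity or its proxies.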

The child benefits scandal, a terrible algorithm checking for fraud in Rotterdam, ethnic profiling for visa applications, an ignored audit report pointing out how the Dutch state’s algorithms don’t fulfil the most basic requirements: the list just keeps growing. When will we finally and collectively learn that we shouldn’t use algorithms to classify people?

See: ‘Ik dacht gewoon: ik pak je’ (‘I just thought: I’ll get you’) at De Groene Amsterdammer.

Image by Charlotte van Hacht from the original article.
