Dutch Institute for Human Rights: Use of anti-cheating software can be algorithmic discrimination (i.e. racist)

Dutch student Robin Pocornie filed a complaint with the Dutch Institute for Human Rights. The surveillance software that her university used had trouble recognising her as a human being because of her skin colour. After a hearing, the Institute has now ruled that Robin has presented enough evidence to assume that she was indeed discriminated against. The ball is now in the court of the VU (her university) to prove that the software treated everybody the same.

The Institute made a historic (intermediate) judgement

In its judgement, the Institute cites human rights case law showing that it is important to look at whether skin colour has played any role in the authentication or verification processes. If this is the case, then this is already disadvantageous in itself, regardless of whether Robin lost any time doing the exam or had any other negative material effects. In simple language: if the software has more trouble ‘seeing’ a person with a darker skin colour, then this is already illegal.

The Institute also cites case law to argue that a more general proof of discrimination can suffice whenever it is very difficult for the complainant to prove the specific case, for example because of a lack of access to the technology or the data. This was very much the case for Robin. For the Institute it is clear that Robin had problems (“face not found”) and that there is a wealth of academic literature showing that this type of technology doesn’t work equally well for people of all skin colours. Together, this is enough proof.

At the hearing, the VU argued that an audit commissioned by Proctorio (the software in question) has shown that skin colour does not influence how the software works. However, the report of this audit is not public. The Institute rightfully argues that this cannot count as evidence in any way. So they conclude:

That the woman has presented enough facts that can lead to the presumption that, by using the anti-cheating software, the Stichting Vrije Universiteit has made an indirect distinction on the basis of race.

You can read the full judgement (in Dutch) here: https://oordelen.mensenrechten.nl/oordeel/2022-146

There was a lot of media attention for the ruling

This was the first time that the Dutch Institute for Human Rights ruled that algorithmic discrimination was proven. In their press release, the chair of the Institute calls it an important moment in their history and a meaningful first step: “Somebody has managed to say: ‘Hey, what this algorithm is doing is weird. I suspect that this algorithm discriminates.’”

In the Racism and Technology Center press release, Robin said: “I am very happy that the law is on my side. It is hard to make this type of discrimination visible and the Institute has clearly understood this. I look forward to their final judgement with full confidence. I am also curious to hear the reaction of all the other Dutch institutes for higher education that have used this software. They too have students of colour in their classroom.”

Chair of the Center, Naomi Appelman, said: “We hope that this is a warning for all organisations who (want to) use facial recognition or detection in their processes. You can only do this if you are sure that you are not discriminating against anybody. Guarantees from the supplier of the software just aren’t good enough.”

Many news outlets wrote about the judgement: Volkskrant, NU.nl, NRC, Parool, Trouw, AT5, and Noordhollands Dagblad. Student news sites from the VU, the University of Twente and the Radboud University published stories about the case. There were some English language outlets that gave attention to the case too: NL Times and DutchNews.nl. Robin was a guest in the Bits of Freedom podcast, and both Robin and Naomi appeared on the daily news show on Radio 1.

VU is doing Proctorio’s dirty work

As said, the ball is in the VU’s court. They can choose to stop paying a lawyer (and use the money for educational purposes instead), apologise to their students of colour, and cease the use of Proctorio. However, this is not what they seem to be planning to do. They told the NU.nl journalist that they want to use the next ten weeks to prove that the surveillance software they used did not discriminate.

Through this, they seem intent on doing the dirty work for Proctorio, a litigious American company that is not transparent and refuses to give insight into how its software works. A truly questionable choice.

Next steps in the case

The VU now has ten weeks to deliver the technical proof that the software doesn’t discriminate. If and when they do, Robin and the Center have the opportunity to reply to that proof. Only then will the Institute for Human Rights make their final ruling. They will not only rule on whether the VU discriminated against Robin by using Proctorio, they will also judge whether the VU had proper complaints procedures in place for people experiencing this type of discrimination.

If the Institute rules that Proctorio is discriminatory, then all the other Dutch institutes that used the software have some explaining and reckoning to do. The Center will likely campaign to that effect. Do let us know if you want to help with that.
