Judgement of the Dutch Institute for Human Rights shows how difficult it is to legally prove algorithmic discrimination

On October 17th, the Netherlands Institute for Human Rights ruled that the Vrije Universiteit Amsterdam (VU) did not discriminate against bioinformatics student Robin Pocornie on the basis of race by using anti-cheating software. However, according to the Institute, the VU did discriminate on the grounds of race in how it handled her complaint.

The Institute for Human Rights ignores the fact that the software discriminates

Robin filed a complaint against the VU in July 2022 because the anti-cheating software she was obliged to use at home during Covid did not recognize her as a person due to her dark skin colour. She was not the only student of colour with this problem: students worldwide had to take exams with a lamp shining directly in their face in order to be recognized.

In its previous interim judgement, the Institute found that Pocornie had made it sufficiently plausible that the software is discriminatory. However, in its final assessment, the Institute decided to only look at Pocornie’s individual experience. The Institute is of the opinion that the VU “does not have to demonstrate in this case that there was no discrimination at all – against any student – on the grounds of race in the use of Proctorio’s software.” This means that the Institute has not looked at the question of whether the software treats people equally in general.

The judgement shows how difficult it is to legally prove that an algorithm discriminates, even though there is an overwhelming amount of scientific evidence that facial detection works less well for people with dark skin. The Institute seems to share this insight: although, according to them, it has not been conclusively proven in this specific case that discrimination took place, this does not rule out “that the use of Proctorio or comparable AI software in other situations may lead to discrimination.”

RTL News: Black people are recognized less often by anti-cheating software Proctorio

A week before the judgement, further evidence of Proctorio’s problems emerged from research done by RTL in the Netherlands. RTL replicated Lucy Satheesan’s earlier research, checking the software against faces with many different skin tones, this time using the most recent version of Proctorio’s browser plug-in.

Their conclusions are clear: Black people are recognized less often by the software. Do watch their (Dutch) news item explaining their research:
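The core of an audit like Satheesan’s and RTL’s is simple: run the face detection over portraits grouped by skin tone and compare the detection rate per group. The sketch below shows that comparison step only, with hypothetical stand-in data; the real audits ran Proctorio’s actual browser plug-in against a labelled face dataset, which is not reproduced here.

```python
# Minimal sketch of a detection-rate comparison across skin tone groups.
# The data below is hypothetical, for illustration only; a real audit
# would record, for each portrait, whether the proctoring software
# detected a face.

from collections import defaultdict

def detection_rates(results):
    """results: list of (skin_tone_group, was_detected) pairs.
    Returns the fraction of detected faces per group."""
    detected = defaultdict(int)
    total = defaultdict(int)
    for group, ok in results:
        total[group] += 1
        if ok:
            detected[group] += 1
    return {g: detected[g] / total[g] for g in total}

# Hypothetical audit outcomes: each pair is (group, face was detected?).
results = [
    ("light", True), ("light", True), ("light", True), ("light", False),
    ("dark", True), ("dark", False), ("dark", False), ("dark", False),
]

rates = detection_rates(results)
print(rates)
```

A gap between the groups’ rates is exactly the kind of disparity the audits reported: the software detects lighter-skinned faces more reliably than darker-skinned ones.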

They don’t come more shameless than Proctorio

In the meantime, Proctorio reared its ugly head once again. They create and sell software that is based on distrust and is privacy-invading at its core, yet their tagline is “privacy and integrity for today’s digital learning”. Their software has been shown again and again to work less well for people of colour than for people with lighter skin. Yet Proctorio thought it necessary to try and score some PR points by cancelling their contract with the VU:

In light of the findings from the Dutch National Human Rights Institute, which highlighted issues regarding Vrije Universiteit Amsterdam’s (VU) handling of a student’s racial discrimination complaint, we have chosen to terminate our Software-As-A-Service Agreement with the VU.

… 🤯

Let’s hope that the thirteen other Dutch educational institutions that have used Proctorio will now realise that Proctorio isn’t a reliable partner. They should show solidarity with the VU by cancelling their Proctorio contracts with immediate effect. If they need a little more convincing, then they only need to read about all the disgusting things that Mike Olsen, Proctorio’s CEO, has done in recent years.

Let’s start trusting Black women more than we trust big tech

Proctorio’s shameful behaviour underlines the point that Nani Jansen Reventlow makes in her column in the Volkskrant. She chides the Institute for Human Rights for choosing to believe the untruths of big tech over the experiences of a woman of colour. Through its judgement, the Institute keeps in place a dynamic that minimises individually felt harms while ignoring the effects of systemic racism.

Robin continues to work on raising awareness about technology and discrimination

Robin is of course disappointed by the judgement; she had expected a different outcome. For her the facts remain as they are: she had to take her exams with a light in her face, while her white fellow students did not. It is painful that, as the Institute ruled, her own university discriminated against her in how it handled her complaint. She is happy, however, with all the attention the case has received. In recent months, partially as a result of the case, educational institutions have started to think much more about whether the technology they use works the same for everyone.

Robin will continue her fight against racism, also within technology. As a computer scientist and as a person of color she understands very well how institutional racism can manifest itself in technology. So she will use that knowledge to keep fighting for a society in which everyone is treated equally.
