Dutch Institute for Human Rights: Use of anti-cheating software can be algorithmic discrimination (i.e. racist)

Dutch student Robin Pocornie filed a complaint with the Dutch Institute for Human Rights. The surveillance software that her university used had trouble recognising her as a human being because of her skin colour. After a hearing, the Institute has now ruled that Robin presented enough evidence to presume that she was indeed discriminated against. The ball is now in the court of the VU (her university) to prove that the software treated everyone the same.

Continue reading “Dutch Institute for Human Rights: Use of anti-cheating software can be algorithmic discrimination (i.e. racist)”

Anti-cheating software at the VU discriminates

Before an exam, anti-cheating software checks whether you really are a human being. But what if the system doesn't recognise you because you have a dark skin colour? That is what happened to student Robin Pocornie, who took her case to the College voor de Rechten van de Mens. Together with Naomi Appelman of the Racism and Technology Centre, who supported Robin in her case, she talks about what happened.

By Naomi Appelman, Natasja Gibbs and Robin Pocornie for NPO Radio 1 on December 12, 2022

Presumption of algorithmic discrimination successfully substantiated for the first time

A student has succeeded in presenting sufficient facts to establish a presumption of algorithmic discrimination. The woman complains that the Vrije Universiteit discriminated against her by deploying anti-cheating software. This software uses face-detection algorithms. The software failed to detect her when she had to log in for exams. The woman suspects that this is due to her dark skin colour. The university has been given ten weeks to demonstrate that the software did not discriminate. This is evident from the interim ruling published by the College voor de Rechten van de Mens.

From College voor de Rechten van de Mens on December 9, 2022

Dutch student files complaint with the Netherlands Institute for Human Rights about the use of racist software by her university

During the pandemic, Dutch student Robin Pocornie had to take her exams with a light pointed straight at her face. Her White fellow students didn't have to do that. Her university's surveillance software discriminated against her, and that is why she has filed a complaint (read the full complaint in Dutch) with the Netherlands Institute for Human Rights.

Continue reading “Dutch student files complaint with the Netherlands Institute for Human Rights about the use of racist software by her university”

Student reports discrimination by anti-cheating software to the College voor de Rechten van de Mens

A student at the Vrije Universiteit Amsterdam (VU) is filing a complaint with the College voor de Rechten van de Mens (pdf). When using the anti-cheating software for exams, she was only recognised when she shone a lamp directly into her face. According to her, the VU should have checked in advance whether students with a black skin colour would be recognised as well as white students.

From NU.nl on July 15, 2022

Student goes to the College voor de Rechten van de Mens over the VU's use of racist software

During the coronavirus pandemic, student Robin Pocornie had to take exams with a lamp pointed directly at her face. Her White fellow students did not. The VU's surveillance software discriminated against her, which is why she is filing a complaint with the College voor de Rechten van de Mens today.

Continue reading “Student goes to the College voor de Rechten van de Mens over the VU's use of racist software”

Racist Technology in Action: U.S. universities using race in their risk algorithms as a predictor for student success

An investigation by The Markup in March 2021 revealed that some universities in the U.S. are using software with a risk algorithm that takes a student's race as one of the factors to predict and evaluate how successful that student may be. Several universities have described race as a “high impact predictor”. The investigation found large disparities in how the software treated students of different races, with Black students deemed to be at four times higher risk than their White peers.

Continue reading “Racist Technology in Action: U.S. universities using race in their risk algorithms as a predictor for student success”
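To make the mechanism concrete, here is a minimal, entirely hypothetical sketch in Python of how a risk model that includes race as an input feature produces different risk scores for students with identical academic records. The feature names and weights are invented for illustration; they are not taken from the software The Markup investigated or from any real vendor's model.

```python
# Hypothetical illustration only: a toy linear "student success" risk model
# in which race is one of the input features. All names and weights below
# are invented; they do not describe any real vendor's software.

WEIGHTS = {
    "gpa": -0.8,            # higher GPA lowers the risk score
    "credits_behind": 0.5,  # falling behind on credits raises it
    "is_black": 1.2,        # race as a "predictor": the discriminatory part
}

def risk_score(gpa: float, credits_behind: int, is_black: bool) -> float:
    """Toy risk score: higher means the student is deemed more 'at risk'."""
    return (
        WEIGHTS["gpa"] * gpa
        + WEIGHTS["credits_behind"] * credits_behind
        + WEIGHTS["is_black"] * float(is_black)
    )

# Two students with identical academic records get different scores:
print(round(risk_score(gpa=3.2, credits_behind=1, is_black=False), 2))  # -2.06
print(round(risk_score(gpa=3.2, credits_behind=1, is_black=True), 2))   # -0.86
```

The point of the sketch is that the disparity is built into the model by construction: nothing a student does academically can change the race term, so two otherwise identical students are scored differently by design.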

Opinion: Biden must act to get racism out of automated decision-making

Despite Biden’s announced commitment to advancing racial justice, not a single appointee to the task force has focused experience on civil rights and liberties in the development and use of AI. That has to change. Artificial intelligence, invisible but pervasive, affects vast swaths of American society and will affect many more. Biden must ensure that racial equity is prioritized in AI development.

By ReNika Moore for Washington Post on August 9, 2021

Moses Namara

Working to break down the barriers keeping young Black people from careers in AI.

By Abby Ohlheiser for MIT Technology Review on June 30, 2021

Racist Technology in Action: Proctoring software disadvantaging students of colour in the Netherlands

In an opinion piece in Het Parool, the Racism and Technology Center wrote about how Dutch universities use proctoring software with facial recognition technology that systematically disadvantages students of colour (see the English translation of the opinion piece). Earlier, the Center wrote about the racial bias of these systems, which has led to Black students being excluded from exams or labelled as frauds because the software did not properly recognise their faces. Despite the clear proof that Proctorio disadvantages students of colour, the University of Amsterdam still used Proctorio extensively during this June's exam weeks.

Continue reading “Racist Technology in Action: Proctoring software disadvantaging students of colour in the Netherlands”

The pandemic showed remote proctoring to be worse than useless

Before covid, “remote proctoring” tools were a niche product, invasive tools that spied on students who needed to take high-stakes tests but couldn’t get to campus or a satellite test-taking room. But the lockdown meant that all students found themselves in this position.

By Cory Doctorow for Pluralistic on June 24, 2021

Online proctoring excludes and discriminates

The use of software to automatically detect cheating on online exams – online proctoring – has been the go-to solution for many schools and universities in response to the COVID-19 pandemic. In this article, Shea Swauger addresses some of the potential discriminatory, privacy and security harms that can impact groups of students across class, gender, race, and disability lines. Swauger provides a critique of how these technologies encode “normal” bodies – cisgender, White, able-bodied, neurotypical, male – as the standard, and how students who do not (or cannot) conform are punished by them.

Continue reading “Online proctoring excludes and discriminates”

Robot Teachers, Racist Algorithms, and Disaster Pedagogy

I have volunteered to be a guest speaker in classes this Fall. It’s really the least I can do to help teachers and students through another tough term. I spoke tonight in Dorothy Kim’s class “Race Before Race: Premodern Critical Race Studies.” Here’s a bit of what I said…

By Audrey Watters for Hack Education on September 3, 2020

Digital Ethics in Higher Education: 2020

New technologies, especially those relying on artificial intelligence or data analytics, are exciting but also present ethical challenges that deserve our attention and action. Higher education can and must lead the way.

By John O’Brien for EDUCAUSE Review on May 18, 2020
