In a roundtable on artificial intelligence in the Dutch Parliament, Quirine Eijkman spoke on behalf of the Netherlands Institute for Human Rights about Robin Pocornie’s case against the discriminatory use of Proctorio at the VU (Vrije Universiteit Amsterdam).
Dutch Institute for Human Rights: Use of anti-cheating software can be algorithmic discrimination (i.e. racist)
Dutch student Robin Pocornie filed a complaint with the Netherlands Institute for Human Rights. The surveillance software that her university used had trouble recognising her as a human being because of her skin colour. After a hearing, the Institute has now ruled that Robin has presented enough evidence to presume that she was indeed discriminated against. The ball is now in the court of the VU (her university) to prove that the software treated everybody the same.
Anti-cheating software at the VU discriminates
Anti-cheating software checks before an exam whether you really are a human being. But what if the system doesn’t recognise you because you have a dark skin colour? That is what happened to student Robin Pocornie, who took her case to the Netherlands Institute for Human Rights (College voor de Rechten van de Mens). Together with Naomi Appelman of the Racism and Technology Centre, who assisted Robin in her case, she talks about what happened.
By Naomi Appelman, Natasja Gibbs and Robin Pocornie for NPO Radio 1 on December 12, 2022
VU must prove that its anti-cheating software did not discriminate against Black student
The Vrije Universiteit Amsterdam (VU) must demonstrate that its anti-cheating software did not discriminate against a student because of her dark skin colour. She has made it sufficiently plausible that this is what happened.
By Afran Groenewoud for NU.nl on December 9, 2022
First time a presumption of algorithmic discrimination has been successfully substantiated
A student has succeeded in presenting sufficient facts to establish a presumption of algorithmic discrimination. The woman complains that the Vrije Universiteit discriminated against her by deploying anti-cheating software. This software uses face detection algorithms. The software failed to detect her when she had to log in for her exams. The woman suspects that this is due to her dark skin colour. The university now has ten weeks to demonstrate that the software did not discriminate. This follows from the interim ruling published by the Institute.
From College voor de Rechten van de Mens on December 9, 2022
Human rights institute: discrimination by an algorithm deemed ‘plausible’ for the first time, VU must prove otherwise
It is ‘plausible’ that the algorithm in the anti-cheating software discriminated against a student at the Vrije Universiteit (VU), says the Netherlands Institute for Human Rights. It is now up to the VU to prove otherwise.
By Fleur Damen for Volkskrant on December 9, 2022
Students Are Fighting Remote Exam Surveillance — and Winning
The use of remote proctoring services by schools is facing challenges from students in court, and from lawmakers concerned about privacy and surveillance.
By Kristy P. Kennedy for Teen Vogue on October 20, 2022
ExamSoft’s remote bar exam sparks privacy and facial recognition concerns
To administer bar exams in 20 different states next week, ExamSoft is using facial recognition and collecting the biometric data of legal professionals.
By Khari Johnson for VentureBeat on September 29, 2020
How Big Tech Is Importing India’s Caste Legacy to Silicon Valley
Graduates from the Indian Institutes of Technology are highly sought after by employers. They can also bring problems from home.
By Saritha Rai for Bloomberg on March 11, 2021
Dutch student files complaint with the Netherlands Institute for Human Rights about the use of racist software by her university
During the pandemic, Dutch student Robin Pocornie had to take her exams with a light pointing straight at her face. Her White fellow students didn’t have to do that. Her university’s surveillance software discriminated against her, and that is why she has filed a complaint (read the full complaint in Dutch) with the Netherlands Institute for Human Rights.
Student reports discrimination by anti-cheating software to the Netherlands Institute for Human Rights
A student at the Vrije Universiteit Amsterdam (VU) has filed a complaint with the Netherlands Institute for Human Rights (pdf). When using the anti-cheating software for exams, she was only recognised when she shone a lamp into her face. According to her, the VU should have checked in advance whether students with a dark skin colour would be recognised as well as White students.
From NU.nl on July 15, 2022
Student takes her case to the Netherlands Institute for Human Rights over the VU’s use of racist software
During the corona pandemic, student Robin Pocornie had to take exams with a lamp pointed directly at her face. Her White fellow students did not. The VU’s surveillance software discriminated against her, which is why she is filing a complaint today with the Netherlands Institute for Human Rights.
Accused of Cheating by an Algorithm, and a Professor She Had Never Met
An unsettling glimpse at the digitization of education.
By Kashmir Hill for The New York Times on May 27, 2022
‘Smart’ technologies to detect racist chants at Dutch football matches
The KNVB (Royal Dutch Football Association) is taking a tech approach to tackling racist fan behaviour during matches, an approach that runs a great risk of falling into the techno-solutionism trap.
How our world is designed for the ‘reference man’ and why proctoring should be abolished
We believe that software used for monitoring students during online tests (so-called proctoring software) should be abolished because it discriminates against students with a darker skin colour.
Racist Technology in Action: U.S. universities using race in their risk algorithms as a predictor for student success
An investigation by The Markup in March 2021 revealed that some universities in the U.S. are using software with a risk algorithm that uses a student’s race as one of the factors to predict and evaluate how successful that student may be. Several universities have described race as a “high impact predictor”. The investigation found large disparities in how the software treated students of different races, with Black students deemed at four times higher risk than their White peers.
Questioning the ethics of online proctoring
Instead of relying on a ‘technological fix,’ we need to ask what drives students to cheat in the first place.
By Kari Zacharias and Ketra Schmitt for University Affairs on December 3, 2021
Opinion: Biden must act to get racism out of automated decision-making
Despite Biden’s announced commitment to advancing racial justice, not a single appointee to the task force has focused experience on civil rights and liberties in the development and use of AI. That has to change. Artificial intelligence, invisible but pervasive, affects vast swaths of American society and will affect many more. Biden must ensure that racial equity is prioritized in AI development.
By ReNika Moore for Washington Post on August 9, 2021
Brazil’s embrace of facial recognition worries Black communities
Activists say the biometric tools, developed principally around white datasets, risk reinforcing racist practices.
By Charlotte Peet for Rest of World on October 22, 2021
The use of racist technology is not inevitable, but a choice we make
Last month, we wrote a piece in Lilith Mag that builds on some of the examples we have previously highlighted – the Dutch childcare benefits scandal, the use of online proctoring software, and popular dating app Grindr – to underscore two central ideas.
Technology can be racist and we should talk about that
The past year has been filled with examples of technologies being racist. Yet how to fight this is hardly part of the societal debate in the Netherlands. This must change. Making these racist technologies visible is the first step towards acknowledging that technology can indeed be racist.
Opinion: Stop government algorithms that lead to discrimination and exclusion
Government agencies use countless ‘blacklists’ of potential fraudsters. This can lead to (indirect) ethnic profiling and, after the childcare benefits scandal, to new dramas.
By Nani Jansen Reventlow for Volkskrant on July 15, 2021
Moses Namara
Working to break down the barriers keeping young Black people from careers in AI.
By Abby Ohlheiser for MIT Technology Review on June 30, 2021
Racist Technology in Action: Proctoring software disadvantaging students of colour in the Netherlands
In an opinion piece in Parool, the Racism and Technology Center wrote about how Dutch universities use proctoring software with facial recognition technology that systematically disadvantages students of colour (see the English translation of the opinion piece). Earlier, the Center wrote about the racial bias of these systems, which has led to Black students being excluded from exams or labeled as frauds because the software did not properly recognise their faces. Despite the clear proof that Proctorio disadvantages students of colour, the University of Amsterdam still used Proctorio extensively during this June’s exam weeks.
Call to the University of Amsterdam: Stop using racist proctoring software
The University of Amsterdam can no longer justify the use of proctoring software for remote examinations now that we know that it has a negative impact on people of colour.
Call to the UvA: stop using racist proctoring software
The UvA can no longer justify deploying proctoring for its exams, now that it is clear that this surveillance software has a negative impact on people of colour in particular.
Opinion: ‘UvA, don’t obscure the racism of proctoring with fine words’
Surveillance software disadvantages people of colour, research shows. So why is the UvA still using it, ask Naomi Appelman, Jill Toh and Hans de Zwart.
By Hans de Zwart, Jill Toh and Naomi Appelman for Het Parool on July 6, 2021
The pandemic showed remote proctoring to be worse than useless
Before covid, “remote proctoring” tools were a niche product, invasive tools that spied on students who needed to take high-stakes tests but couldn’t get to campus or a satellite test-taking room. But the lockdown meant that all students found themselves in this position.
By Cory Doctorow for Pluralistic on June 24, 2021
Proctorio Is Using Racist Algorithms to Detect Faces
A student researcher has reverse-engineered the controversial exam software—and discovered a tool infamous for failing to recognize non-white faces.
By Todd Feathers for VICE on April 8, 2021
Online proctoring excludes and discriminates
The use of software to automatically detect cheating on online exams – online proctoring – has been the go-to solution for many schools and universities in response to the COVID-19 pandemic. In this article, Shea Swauger addresses some of the potential discriminatory, privacy and security harms that can impact groups of students across class, gender, race, and disability lines. Swauger provides a critique on how technologies encode “normal” bodies – cisgender, white, able-bodied, neurotypical, male – as the standard and how students who do not (or cannot) conform, are punished by it.
Our Bodies Encoded: Algorithmic Test Proctoring in Higher Education
Cheating is not a technological problem, but a social and pedagogical problem. Technology is often blamed for creating the conditions in which cheating proliferates and is then offered as the solution to the problem it created; both claims are false.
By Shea Swauger for Hybrid Pedagogy on April 2, 2020
Robot Teachers, Racist Algorithms, and Disaster Pedagogy
I have volunteered to be a guest speaker in classes this Fall. It’s really the least I can do to help teachers and students through another tough term. I spoke tonight in Dorothy Kim’s class “Race Before Race: Premodern Critical Race Studies.” Here’s a bit of what I said…
By Audrey Watters for Hack Education on September 3, 2020
Digital Ethics in Higher Education: 2020
New technologies, especially those relying on artificial intelligence or data analytics, are exciting but also present ethical challenges that deserve our attention and action. Higher education can and must lead the way.
By John O’Brien for EDUCAUSE Review on May 18, 2020
UK ditches exam results generated by biased algorithm after student protests
The UK government has said that students in England and Wales will no longer receive exam results based on a controversial algorithm. The system developed by exam regulator Ofqual was accused of being biased.
By Jon Porter for The Verge on August 17, 2020
England A-level downgrades hit pupils from disadvantaged areas hardest
Analysis also shows pupils at private schools benefited most from algorithm.
By Niamh McIntyre and Richard Adams for The Guardian on August 13, 2020
Who won and who lost: when A-levels meet the algorithm
Disadvantaged students among those more likely to have received lower grades than predicted.
By Cath Levett, Niamh McIntyre, Pamela Duncan and Rhi Storer for The Guardian on August 13, 2020