We believe that software used for monitoring students during online tests (so-called proctoring software) should be abolished because it discriminates against students with a darker skin colour. Continue reading “How our world is designed for the ‘reference man’ and why proctoring should be abolished”
An investigation by The Markup in March 2021 revealed that some universities in the U.S. are using software and a risk algorithm that use the race of students as one of the factors to predict and evaluate how successful a student may be. Several universities have described race as a “high impact predictor”. The investigation found large disparities in how the software treated students of different races, with Black students deemed four times as high a risk as their White peers. Continue reading “Racist Technology in Action: U.S. universities using race in their risk algorithms as a predictor for student success”
Instead of relying on a ‘technological fix,’ we need to ask what drives students to cheat in the first place.
By Kari Zacharias and Ketra Schmitt for University Affairs on December 3, 2021
Despite Biden’s announced commitment to advancing racial justice, not a single appointee to the task force has focused experience on civil rights and liberties in the development and use of AI. That has to change. Artificial intelligence, invisible but pervasive, affects vast swaths of American society and will affect many more. Biden must ensure that racial equity is prioritized in AI development.
By ReNika Moore for Washington Post on August 9, 2021
Activists say the biometric tools, developed principally around white datasets, risk reinforcing racist practices.
By Charlotte Peet for Rest of World on October 22, 2021
Last month, we wrote a piece in Lilith Mag that builds on some of the examples we have previously highlighted – the Dutch childcare benefits scandal, the use of online proctoring software, and popular dating app Grindr – to underscore two central ideas. Continue reading “The use of racist technology is not inevitable, but a choice we make”
The past year has been filled with examples of technologies being racist. Yet how we can fight this is hardly part of the societal debate in the Netherlands. This must change. Making these racist technologies visible is the first step towards acknowledging that technology can indeed be racist. Continue reading “Technology can be racist and we should talk about that”
Government agencies use countless ‘blacklists’ of potential fraudsters. This can lead to (indirect) ethnic profiling and to new tragedies, in the wake of the childcare benefits scandal.
By Nani Jansen Reventlow for Volkskrant on July 15, 2021
Working to break down the barriers keeping young Black people from careers in AI.
By Abby Ohlheiser for MIT Technology Review on June 30, 2021
In an opinion piece in Parool, the Racism and Technology Center wrote about how Dutch universities use proctoring software with facial recognition technology that systematically disadvantages students of colour (see the English translation of the opinion piece). Earlier, the center wrote about the racial bias of these systems, which has led to Black students being excluded from exams or labeled as frauds because the software did not properly recognise their faces as faces. Despite the clear proof that Proctorio disadvantages students of colour, the University of Amsterdam still used Proctorio extensively in this June’s exam weeks. Continue reading “Racist Technology in Action: Proctoring software disadvantaging students of colour in the Netherlands”
The University of Amsterdam can no longer justify the use of proctoring software for remote examinations now that we know that it has a negative impact on people of colour. Continue reading “Call to the University of Amsterdam: Stop using racist proctoring software”
The UvA can no longer justify deploying proctoring for examinations, now that it is clear that this surveillance software has a disproportionately negative impact on people of colour. Continue reading “Oproep aan de UvA: stop het gebruik van racistische proctoringsoftware”
Research shows that surveillance software disadvantages people of colour. Why, then, is the UvA still using it, ask Naomi Appelman, Jill Toh and Hans de Zwart.
By Hans de Zwart, Jill Toh and Naomi Appelman for Het Parool on July 6, 2021
Before covid, “remote proctoring” tools were a niche product, invasive tools that spied on students who needed to take high-stakes tests but couldn’t get to campus or a satellite test-taking room. But the lockdown meant that all students found themselves in this position.
By Cory Doctorow for Pluralistic on June 24, 2021
A student researcher has reverse-engineered the controversial exam software—and discovered a tool infamous for failing to recognize non-white faces.
By Todd Feathers for VICE on April 8, 2021
The use of software to automatically detect cheating on online exams – online proctoring – has been the go-to solution for many schools and universities in response to the COVID-19 pandemic. In this article, Shea Swauger addresses some of the potential discriminatory, privacy, and security harms that can impact groups of students across class, gender, race, and disability lines. Swauger critiques how these technologies encode “normal” bodies – cisgender, white, able-bodied, neurotypical, male – as the standard, and how students who do not (or cannot) conform are punished by them. Continue reading “Online proctoring excludes and discriminates”
Cheating is not a technological problem, but a social and pedagogical problem. Technology is often blamed for creating the conditions in which cheating proliferates and is then offered as the solution to the problem it created; both claims are false.
By Shea Swauger for Hybrid Pedagogy on April 2, 2020
I have volunteered to be a guest speaker in classes this Fall. It’s really the least I can do to help teachers and students through another tough term. I spoke tonight in Dorothy Kim’s class “Race Before Race: Premodern Critical Race Studies.” Here’s a bit of what I said…
By Audrey Watters for Hack Education on September 3, 2020
New technologies, especially those relying on artificial intelligence or data analytics, are exciting but also present ethical challenges that deserve our attention and action. Higher education can and must lead the way.
By John O’Brien for EDUCAUSE Review on May 18, 2020
The UK government has said that students in England and Wales will no longer receive exam results based on a controversial algorithm. The system developed by exam regulator Ofqual was accused of being biased.
By Jon Porter for The Verge on August 17, 2020
Analysis also shows that pupils at private schools benefited most from the algorithm.
By Niamh McIntyre and Richard Adams for The Guardian on August 13, 2020