The University of Amsterdam can no longer justify the use of proctoring software for remote examinations now that we know it has a negative impact on people of colour.
After the Black Lives Matter protests, the University of Amsterdam (UvA) decided to make a statement too. They blacked out their Instagram page, declared their solidarity with #blackouttuesday, and wrote that they would not tolerate any form of racism. Yet, around the same time, they started rolling out racist software. To enable remote exams during the pandemic, they are using ‘proctoring software’, which runs on the student’s computer at home and checks that the student isn’t cheating. During the past exam weeks, proctoring was still being used extensively at the UvA.
With proctoring, students are required to install a form of spyware on their computer, which uses their webcam and microphone to watch for any strange behaviour, and which makes sure only the examination software is running. After a student has identified themselves on camera, they are no longer allowed to look away from their screen. Gazing out of the window to think something over is no longer an option. Some students nearly wet their pants because they aren’t allowed to use the toilet during the exam.
It is important to realise that students are not checked by human supervisors. Instead, the software uses artificial intelligence to monitor hundreds of students in real time. An algorithmic watchdog gives every student a score at the end of the exam, as a measure of how ‘aberrant’ their behaviour was. The video recordings of students with high scores can then be reviewed manually.
It isn’t surprising that students aren’t particularly fond of proctoring software. A group of students from the UvA went to court last year to try to stop its use. The court case examined whether the university should have asked the Student Council for consent before rolling out proctoring software, and whether the software complies with privacy legislation. The students have now lost their case in two different courts.
But one crucial detail hasn’t been mentioned by any of the parties in the court case: does the proctoring software treat everybody equally? Could it be that some students are flagged as potentially fraudulent more often than others?
Many anecdotes have been published about how students with a darker skin colour experience more stress during an exam because the proctoring software doesn’t recognize them as a human being. The New Yorker, for example, writes about the student Femi Yemi-Ese. For each of his seven online exams, Proctorio, the software that the UvA uses too, failed to recognize his face at the first attempt. This happened even though he uses a ring light and has to stare into its brightness during the exam. His white fellow students have none of these issues.
These problems aren’t just anecdotal. Researcher Lucy Satheesan found that Proctorio uses OpenCV, a free-to-use computer-vision library that includes face detection. With an ingenious experiment, Satheesan showed that OpenCV is better at recognizing white faces than faces with a darker skin colour. This means that Proctorio has a racist impact on certain students.
Educational institutions should treat all their students equally. As long as they use Proctorio, they don’t. You can’t promise to fight racism while at the same time using software that is demonstrably racist. It is unconscionable that the UvA has yet to find an alternative to a ‘solution’ that dehumanizes a subset of its students.