The use of software to automatically detect cheating on online exams – online proctoring – has been the go-to solution for many schools and universities in response to the COVID-19 pandemic. In this article, Shea Swauger addresses some of the potential discriminatory, privacy, and security harms that can affect students across class, gender, race, and disability lines. Swauger critiques how these technologies encode "normal" bodies – cisgender, white, able-bodied, neurotypical, male – as the standard, and how students who do not (or cannot) conform are punished by them.
Online proctoring uses facial recognition technologies to detect cheating based on characteristics of the person and the room – technologies that have consistently been shown to produce racist outcomes. For example, while using Proctorio, Black students have reported that the system was unable to detect their faces. As a result, students with dark skin have had to shine more light on themselves to verify their identities before taking an exam.
In the Netherlands, several universities have utilised online proctoring software, prompting a court case brought by students (who eventually lost). The case, while important, centred on privacy and data protection, leaving out issues of discrimination, inclusion, and exclusion. In the shift to online education, it is crucial to recognise that choices to adopt particular technological pedagogical tools are political choices that can reinforce exclusion and discrimination in our education systems.
See: Shea Swauger, "Our Bodies Encoded: Algorithmic Test Proctoring in Higher Education," in Hybrid Pedagogy.