The use of software to automatically detect cheating on online exams – online proctoring – has been the go-to solution for many schools and universities in response to the COVID-19 pandemic. In this article, Shea Swauger addresses some of the potential discriminatory, privacy, and security harms that can affect students across class, gender, race, and disability lines. Swauger critiques how these technologies encode “normal” bodies – cisgender, white, able-bodied, neurotypical, male – as the standard, and how students who do not (or cannot) conform are punished.
IBM is failing to increase diversity while successfully producing racist information technologies
Charlton McIlwain, author of the book Black Software, takes a good hard look at IBM in a longread for Logic magazine.
Race, tech, and medicine: Remarks from Dr. Dorothy Roberts and Dr. Ruha Benjamin
By Dorothy Roberts, Kim M Reynolds and Ruha Benjamin for Our Data Bodies Project on August 15, 2020
Our Bodies Encoded: Algorithmic Test Proctoring in Higher Education
Cheating is not a technological problem, but a social and pedagogical problem. Technology is often blamed for creating the conditions in which cheating proliferates and is then offered as the solution to the problem it created; both claims are false.
By Shea Swauger for Hybrid Pedagogy on April 2, 2020
Technology has codified structural racism – will the EU tackle racist tech?
The EU is preparing its ‘Action Plan’ to address structural racism in Europe. With digital high on the EU’s legislative agenda, it’s time we tackle racism perpetuated by technology, writes Sarah Chander.
By Sarah Chander for EURACTIV.com on September 3, 2020
Decode the Default
Technology has never been colorblind. It’s time to abolish notions of “universal” users of software.
From The Internet Health Report 2020 on January 1, 2021
Racism and “Smart Borders”
While many of us had our attention focused on the use of biometric surveillance technologies in managing the COVID-19 pandemic, a new UN report by Prof. E. Tendayi Achiume forcefully puts the spotlight on the racial and discriminatory dimensions of biometric surveillance technology in border enforcement.
Race, surveillance and tech
Today on the Attack Surface Lectures – a series of 8 panels at 8 indie bookstores that Tor Books and I ran in October to launch the third Little Brother novel: Race, Surveillance, and Tech, with Meredith Whittaker and Malkia Devich-Cyril, hosted by The Booksmith.
By Cory Doctorow for Pluralistic on November 18, 2020
Community Defense: Sarah T. Hamid on Abolishing Carceral Technologies
A conversation about how to break cages.
By Sarah T. Hamid for Logic on August 31, 2020
UN warns of impact of smart borders on refugees: ‘Data collection isn’t apolitical’
Special rapporteur on racism and xenophobia believes there is a misconception that biosurveillance technology is without bias.
By Katy Fallon for The Guardian on November 11, 2020
Dataminr Targets Communities of Color for Police
Insiders say Dataminr’s “algorithmic” Twitter search involves human staffers perpetuating confirmation biases.
By Sam Biddle for The Intercept on October 21, 2020
Friction-Free Racism
Surveillance capitalism turns a profit by making people more comfortable with discrimination.
By Chris Gilliard for Real Life on October 15, 2018