In Wired, Chris Gilliard strings together an incisive account of the racist history of surveillance: from the invention of the home security system to modern-day surveillance devices and technologies, such as Amazon's and Google's suites of security products.
A Black Woman Invented Home Security. Why Did It Go So Wrong?
Surveillance systems, no matter the intention, will always exist to serve power.
By Chris Gilliard for WIRED on November 14, 2021
Crowd-Sourced Suspicion Apps Are Out of Control
Technology rarely invents new societal problems. Instead, it digitizes them, supersizes them, and allows them to balloon and duplicate at the speed of light. That’s exactly the problem we’ve seen with location-based, crowd-sourced “public safety” apps like Citizen.
By Matthew Guariglia for Electronic Frontier Foundation (EFF) on October 21, 2021
Brazil’s embrace of facial recognition worries Black communities
Activists say the biometric tools, developed principally around white datasets, risk reinforcing racist practices.
By Charlotte Peet for Rest of World on October 22, 2021
A Detroit community college professor is fighting Silicon Valley’s surveillance machine. People are listening.
Chris Gilliard grew up with racist policing in Detroit. He sees a new form of oppression in the tech we use every day.
By Chris Gilliard and Will Oremus for Washington Post on September 17, 2021
Reinforce rights, not racism: Why we must fight biometric mass surveillance in Europe
Gwendoline Delbos-Corfield MEP in conversation with Laurence Meyer of the Digital Freedom Fund about the dangers of the increasing use of biometric mass surveillance – both within the EU and beyond – and the impact it can have on the lives of people who already face discrimination.
By Gwendoline Delbos-Corfield and Laurence Meyer for Greens/EFA on June 24, 2021
‘I don’t think you can have an anti-racist tech company at scale’
Surveillance expert Chris Gilliard reflects on 2020’s racial justice protests, the hypocrisy of tech companies’ commitments, and where we are one year later.
By Chris Gilliard and Katharine Schwab for Fast Company on June 16, 2021
Racist and classist predictive policing exists in Europe too
The enduring idea that technology can solve many of society's existing problems continues to permeate governments. For the EUObserver, Fieke Jansen and Sarah Chander illustrate some of the problematic and harmful uses of ‘predictive’ algorithmic systems by states and public authorities across the UK and Europe.
EU’s new AI law risks enabling Orwellian surveillance states
“Far from a ‘human-centred’ approach, the draft law in its current form runs the risk of enabling Orwellian surveillance states,” writes @sarahchander from @edri.
By Sarah Chander for Euronews on April 22, 2021
Online proctoring excludes and discriminates
The use of software to automatically detect cheating on online exams – online proctoring – has been the go-to solution for many schools and universities responding to the COVID-19 pandemic. In this article, Shea Swauger addresses some of the discriminatory, privacy, and security harms that can affect students across class, gender, race, and disability lines. Swauger critiques how these technologies encode “normal” bodies – cisgender, white, able-bodied, neurotypical, male – as the standard, and how students who do not (or cannot) conform are punished by them.
IBM is failing to increase diversity while successfully producing racist information technologies
Charlton McIlwain, author of the book Black Software, takes a good hard look at IBM in a longread for Logic magazine.
Race, tech, and medicine: Remarks from Dr. Dorothy Roberts and Dr. Ruha Benjamin
By Dorothy Roberts, Kim M Reynolds and Ruha Benjamin for Our Data Bodies Project on August 15, 2020
Our Bodies Encoded: Algorithmic Test Proctoring in Higher Education
Cheating is not a technological problem, but a social and pedagogical problem. Technology is often blamed for creating the conditions in which cheating proliferates and is then offered as the solution to the problem it created; both claims are false.
By Shea Swauger for Hybrid Pedagogy on April 2, 2020
Technology has codified structural racism – will the EU tackle racist tech?
The EU is preparing its ‘Action Plan’ to address structural racism in Europe. With digital high on the EU’s legislative agenda, it’s time we tackle racism perpetuated by technology, writes Sarah Chander.
By Sarah Chander for EURACTIV.com on September 3, 2020
Decode the Default
Technology has never been colorblind. It’s time to abolish notions of “universal” users of software.
From The Internet Health Report 2020 on January 1, 2021
Racism and “Smart Borders”
While much of our attention was focused on the use of biometric surveillance technologies to manage the COVID-19 pandemic, a new UN report by Professor E. Tendayi Achiume forcefully puts the spotlight on the racial and discriminatory dimensions of biometric surveillance technology in border enforcement.
Race, surveillance and tech
Today on the Attack Surface Lectures – a series of 8 panels at 8 indie bookstores that Tor Books and I ran to launch the third Little Brother novel in October: Race, Surveillance, and Tech, with Meredith Whittaker and Malkia Devich-Cyril, hosted by The Booksmith.
By Cory Doctorow for Pluralistic on November 18, 2020
Community Defense: Sarah T. Hamid on Abolishing Carceral Technologies
A conversation about how to break cages.
By Sarah T. Hamid for Logic on August 31, 2020
UN warns of impact of smart borders on refugees: ‘Data collection isn’t apolitical’
Special rapporteur on racism and xenophobia believes there is a misconception that biosurveillance technology is without bias.
By Katy Fallon for The Guardian on November 11, 2020
Dataminr Targets Communities of Color for Police
Insiders say Dataminr’s “algorithmic” Twitter search involves human staffers perpetuating confirmation biases.
By Sam Biddle for The Intercept on October 21, 2020
Friction-Free Racism
Surveillance capitalism turns a profit by making people more comfortable with discrimination.
By Chris Gilliard for Real Life on October 15, 2018
