We at the Racism and Technology Center stand in solidarity with the Palestinian people. We condemn the violence enacted against innocent people in Palestine and Israel, and grieve for all who have been killed, injured, or are still missing. Palestinian communities are being subjected to unlawful collective punishment in Gaza and the West Bank, including the ongoing bombings and the blockade of water, food and energy. We call for an end to the blockade and an immediate ceasefire.
How facial recognition algorithms discriminate against people with facial differences
In previous newsletters, we have discussed how facial recognition algorithms affect racialised people. However, another community heavily affected by the large-scale use of facial recognition tools is people with facial differences.
UK police facial recognition: Another chapter in a long story of racist Technology
The UK police admitted that their facial recognition technology has a significant racial bias.
Setting the record straight: Scientists show that the algorithm that Proctorio used is incredibly biased towards people with a darker skin colour
Do you remember when our Robin Pocornie filed a complaint with the Dutch Institute for Human Rights because Proctorio, the proctoring spyware she was forced to use while taking her exams from home, couldn’t find her face because it was “too dark”? (If not, read the dossier of her case.)
Easily developed facial recognition glasses outline how underprepared we are for privacy violations
Two Harvard University engineering students, Caine Ardayfio and AnhPhu Nguyen, developed real-time facial recognition glasses. They tested them on passengers in the Boston subway and easily identified a former journalist and some of his articles. A great way to strike up small talk or break the ice, you might think.
Racist Technology in Action: AI detection of emotion rates Black basketball players as ‘angrier’ than their White counterparts
In 2018, Lauren Rhue showed that two leading emotion detection software products had a racial bias against Black men: Face++ rated them as angrier, and Microsoft’s AI rated them as more contemptuous.
Robin Aisha Pocornie’s TEDx talk: “Error 404: Human Face Not Found”
Robin Aisha Pocornie’s case should by now be familiar to regular readers of our Center’s work. She has now told her story in her own voice at TEDxAmsterdam.
Dutch Higher Education continues to use inequitable proctoring software
In October last year, RTL news showed that Proctorio’s software, used to check that students aren’t cheating during online exams, works less well for students of colour. Five months later, RTL asked the twelve Dutch educational institutions on Proctorio’s client list whether they were still using the tool. Eight say they still do.
Automating apartheid in the Occupied Palestinian Territories
In this interview, Matt Mahmoudi explains the Amnesty report titled Automating Apartheid, which he contributed to. The report exposes how the Israeli authorities extensively use surveillance tools, facial recognition technologies, and networks of CCTV cameras to support, intensify and entrench their continued domination and oppression of Palestinians in the Occupied Palestinian Territories (OPT), particularly in Hebron and East Jerusalem. Facial recognition software is used by Israeli authorities to consolidate existing practices of discriminatory policing and segregation, violating Palestinians’ basic rights.
Judgement of the Dutch Institute for Human Rights shows how difficult it is to legally prove algorithmic discrimination
On October 17th, the Netherlands Institute for Human Rights ruled that the VU did not discriminate against bioinformatics student Robin Pocornie on the basis of race by using anti-cheating software. However, according to the Institute, the VU did discriminate on the grounds of race in how it handled her complaint.
Proctoring software uses fudge-factor for dark skinned students to adjust their suspicion score
Respondus, a vendor of online proctoring software, has been granted a patent for its “systems and methods for assessing data collected by automated proctoring.” The patent shows that, in its example method for calculating a risk score, the score is adjusted on the basis of people’s skin colour.
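As a purely illustrative sketch (the function name, flag and factor value below are hypothetical and not taken from the patent itself), such a skin-tone-dependent “fudge factor” could look something like this:

```python
# Hypothetical illustration only: NOT the actual method from the Respondus
# patent, just a minimal sketch of what adjusting a suspicion score by a
# skin-tone "fudge factor" could look like.

def adjusted_risk_score(base_score: float, skin_tone_dark: bool,
                        dark_skin_factor: float = 0.8) -> float:
    """Return a risk score, scaled down when the student has darker skin.

    The underlying idea is that face-detection failures are more common for
    darker-skinned students, so the vendor compensates by discounting their
    suspicion score rather than fixing the detection bias itself.
    """
    if skin_tone_dark:
        return base_score * dark_skin_factor
    return base_score

# The same raw behaviour produces different suspicion scores:
print(adjusted_risk_score(0.75, skin_tone_dark=False))  # 0.75
print(adjusted_risk_score(0.75, skin_tone_dark=True))   # 0.60
```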
Al Jazeera asks: Can AI eliminate human bias or does it perpetuate it?
In its online series on digital dilemmas, Al Jazeera takes a look at AI in relation to social inequities. Loyal readers of this newsletter will recognise many of the examples it touches on, such as how Stable Diffusion exacerbates and amplifies racial and gender disparities, or the Dutch childcare benefits scandal.
Another false facial recognition match: pregnant woman wrongfully arrested
Police in America use facial recognition software to match security footage of crimes to people. In the New York Times, Kashmir Hill describes another example of a wrong match leading to a wrongful arrest.
Current state of research: Face detection still has problems with darker faces
Scientific research on the quality of face detection systems keeps finding the same result: no matter how, when, or with which system the testing is done, faces of people with darker skin tones are detected less reliably than faces of people with lighter skin tones.
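A minimal sketch of how such a disparity is typically measured (the detector, dataset and skin-tone labels below are placeholders, not any particular study’s setup):

```python
# Sketch: run a face detector over labelled images and compare detection
# rates per skin-tone group. `detect_face` and `skin_tone_of` are
# placeholder callables standing in for a real detector and annotation.

from collections import defaultdict

def detection_rates(images, detect_face, skin_tone_of):
    """Return the fraction of images with a detected face, per skin-tone group."""
    found = defaultdict(int)
    total = defaultdict(int)
    for img in images:
        group = skin_tone_of(img)      # e.g. a Fitzpatrick or Monk category
        total[group] += 1
        if detect_face(img):           # True if the detector finds a face
            found[group] += 1
    return {group: found[group] / total[group] for group in total}

# A hypothetical outcome in line with what the literature keeps reporting
# would look like {'lighter': 0.98, 'darker': 0.91}: the same detector,
# systematically lower detection rates for darker skin tones.
```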
France wants to legalise mass surveillance for the Paris Olympics 2024: “Safety” and “security”, for whom?
Many governments use mass surveillance to support law enforcement in the name of safety and security. In France, Parliament (and, before it, the Senate) has approved the use of automated behavioural video surveillance at the 2024 Paris Olympics. Simply put, France wants to legalise mass surveillance at the national level, which can violate many rights, such as the freedom of assembly and association, privacy, and non-discrimination.
Representing skin tone, or Google’s hubris versus the simplicity of Crayola
Google wants to “help computers ‘see’ our world”, and one of its ways of battling how current AI and machine learning systems perpetuate biases is to introduce a more inclusive skin tone scale, the ‘Monk Skin Tone Scale’.
Attempts to eliminate bias through diversifying datasets? A distraction from the root of the problem
In this eloquent and haunting piece, Hito Steyerl weaves the eugenicist history of statistics together with its integration into machine learning. She explains why attempts to eliminate bias in facial recognition technology by diversifying datasets obscure the root of the problem: machine learning and automation are fundamentally reliant on extracting and exploiting human labour.
Doing an exam as if “driving at night with a car approaching from the other direction with its headlights on full-beam”
Robin Pocornie’s complaint against the VU for its use of Proctorio, which had trouble detecting her face because she is a person of colour, is part of a larger, international story, as an article in Wired shows.
First Dutch citizen proves that an algorithm discriminated against her on the basis of her skin colour
Robin Pocornie was featured in the Dutch current affairs programme EenVandaag. Professor Sennay Ghebreab and former Member of Parliament Kees Verhoeven provided expertise and commentary.
From Vrij Nederland: Do we get what we deserve?
Let me make the question specific to my own field: are decisions made by technology just? Do you deserve the decision that rolls out of the machine?
Dutch Institute for Human Rights speaks about Proctorio at Dutch Parliament
In a roundtable on artificial intelligence in the Dutch Parliament, Quirine Eijkman spoke on behalf of the Netherlands Institute for Human Rights about Robin Pocornie’s case against the discriminatory use of Proctorio at the VU.
Dutch Institute for Human Rights: Use of anti-cheating software can be algorithmic discrimination (i.e. racist)
Dutch student Robin Pocornie filed a complaint with the Dutch Institute for Human Rights. The surveillance software that her university used had trouble recognising her as a human being because of her skin colour. After a hearing, the Institute has now ruled that Robin presented enough evidence to assume that she was indeed discriminated against. The ball is now in the court of the VU (her university) to prove that the software treated everybody the same.
Dutch student files complaint with the Netherlands Institute for Human Rights about the use of racist software by her university
During the pandemic, Dutch student Robin Pocornie had to take her exams with a light pointing straight at her face. Her White fellow students didn’t have to do that. Her university’s surveillance software discriminated against her, and that is why she has filed a complaint (read the full complaint in Dutch) with the Netherlands Institute for Human Rights.
How our world is designed for the ‘reference man’ and why proctoring should be abolished
We believe that software used for monitoring students during online tests (so-called proctoring software) should be abolished because it discriminates against students with a darker skin colour.
Racist Technology in Action: Uber’s racially discriminatory facial recognition system firing workers
This example of racist technology in action combines a racist facial recognition system with exploitative working conditions and algorithmic management, a perfect illustration of how technology can exacerbate both economic precarity and racial discrimination.
Are we automating racism?
Vox host Joss Fong wanted to know… “Why do we think tech is neutral? How do algorithms become biased? And how can we fix these algorithms before they cause harm?”
Racist Technology in Action: Proctoring software disadvantaging students of colour in the Netherlands
In an opinion piece in Parool, the Racism and Technology Center wrote about how Dutch universities use proctoring software with facial recognition technology that systematically disadvantages students of colour (see the English translation of the opinion piece). Earlier, the Center wrote about the racial bias of these systems, which has led to Black students being excluded from exams or labelled as frauds because the software did not properly recognise their faces as faces. Despite the clear proof that Proctorio disadvantages students of colour, the University of Amsterdam still used Proctorio extensively in this June’s exam weeks.
Call to the University of Amsterdam: Stop using racist proctoring software
The University of Amsterdam can no longer justify the use of proctoring software for remote examinations now that we know that it has a negative impact on people of colour.
Racist Technology in Action: Amazon’s racist facial ‘Rekognition’
An already infamous example of racist technology is Amazon’s facial recognition system ‘Rekognition’, which had an enormous racial and gender bias. Researcher and founder of the Algorithmic Justice League Joy Buolamwini (the ‘poet of code’), together with Deborah Raji, meticulously reconstructed how accurate Rekognition was at identifying different types of faces. Buolamwini and Raji’s study has been extremely consequential in laying bare the racism and sexism in these facial recognition systems and was featured in the popular Coded Bias documentary.
Online proctoring excludes and discriminates
The use of software to automatically detect cheating on online exams – online proctoring – has been the go-to solution for many schools and universities in response to the COVID-19 pandemic. In this article, Shea Swauger addresses some of the potential discriminatory, privacy and security harms that can impact groups of students across class, gender, race, and disability lines. Swauger critiques how these technologies encode “normal” bodies – cisgender, white, able-bodied, neurotypical, male – as the standard, and how students who do not (or cannot) conform are punished by them.
Racism and “Smart Borders”
While many of us had our attention focused on the use of biometric surveillance technologies in managing the COVID-19 pandemic, a new UN report by Professor E. Tendayi Achiume forcefully puts the spotlight on the racial and discriminatory dimensions of biometric surveillance technology in border enforcement.
Racist technology in action: Cropping out the non-white
A recent, yet already classic, example of racist technology is Twitter’s photo-cropping machine learning algorithm. The algorithm was shown to consistently favour white faces in the cropped previews of pictures.
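A hypothetical sketch of the kind of pairwise test users ran to expose this (the cropping and face-box helpers are placeholders, not Twitter’s actual API):

```python
# Sketch: stack two portraits into one tall image, ask a saliency-based
# cropper for its preview crop, and record which face survives. Repeating
# this over many swapped pairs and counting how often the white face is
# kept is how the systematic preference was made visible.

def kept_face(crop_box, face_boxes):
    """Return the index of the face whose centre lies inside the crop, or None."""
    x0, y0, x1, y1 = crop_box
    for i, (fx0, fy0, fx1, fy1) in enumerate(face_boxes):
        cx, cy = (fx0 + fx1) / 2, (fy0 + fy1) / 2
        if x0 <= cx <= x1 and y0 <= cy <= y1:
            return i
    return None

# Example with made-up coordinates: a crop that keeps only the top face.
print(kept_face((0, 0, 400, 400), [(100, 50, 300, 350), (100, 450, 300, 750)]))  # 0
```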