Racist Technology in Action: “Race-neutral” traffic cameras have a racially disparate impact

Traffic cameras that are used to automatically hand out speeding tickets don’t look at the colour of the person driving the speeding car. Yet ProPublica has convincingly shown how cameras that don’t have a racial bias can still have a racially disparate impact.

Continue reading “Racist Technology in Action: “Race-neutral” traffic cameras have a racially disparate impact”

Racist Technology in Action: U.S. universities using race in their risk algorithms as a predictor for student success

An investigation by The Markup in March 2021 revealed that some universities in the U.S. are using software with a risk algorithm that includes a student’s race as one of the factors to predict and evaluate how successful that student may be. Several universities have described race as a “high impact predictor”. The investigation found large disparities in how the software treated students of different races, with Black students deemed to be at four times higher risk than their White peers.

Continue reading “Racist Technology in Action: U.S. universities using race in their risk algorithms as a predictor for student success”

Racist Technology in Action: Uber’s racially discriminatory facial recognition system firing workers

This example of racist technology in action combines racist facial recognition systems with exploitative working conditions and algorithmic management, showing how technology can exacerbate both economic precarity and racial discrimination.

Continue reading “Racist Technology in Action: Uber’s racially discriminatory facial recognition system firing workers”

Racist Technology in Action: Facebook labels black men as ‘primates’

During the reckoning prompted by the Black Lives Matter movement in the summer of 2020, the Daily Mail, a British tabloid, posted a video featuring black men in altercations with the police and white civilians. In the New York Times, Ryan Mac reports how Facebook users who watched that video saw an automated prompt asking whether they would like to “keep seeing videos about Primates,” despite the video having no connection to primates or monkeys.

Continue reading “Racist Technology in Action: Facebook labels black men as ‘primates’”

Racist Technology in Action: White preference in mortgage-approval algorithms

A very clear example of racist technology was exposed by Emmanuel Martinez and Lauren Kirchner in an article for The Markup. Algorithms used by a variety of American banks and lenders to automatically assess or advise on mortgages display clear racial disparities. In national data from the United States in 2019, they found that “loan applicants of color were 40%–80% more likely to be denied than their White counterparts. In certain metro areas, the disparity was greater than 250%.”

Continue reading “Racist Technology in Action: White preference in mortgage-approval algorithms”

Racist Technology in Action: Proctoring software disadvantaging students of colour in the Netherlands

In an opinion piece in Parool, the Racism and Technology Center wrote about how Dutch universities use proctoring software with facial recognition technology that systematically disadvantages students of colour (see the English translation of the opinion piece). Earlier, the Center wrote about the racial bias of these systems, which leads to black students being excluded from exams or labelled as frauds because the software does not properly recognise their faces. Despite the clear proof that Proctorio disadvantages students of colour, the University of Amsterdam still used Proctorio extensively during this June’s exam weeks.

Continue reading “Racist Technology in Action: Proctoring software disadvantaging students of colour in the Netherlands”

Racist Technology in Action: Predicting future criminals with a bias against Black people

In 2016, ProPublica investigated the fairness of COMPAS, a system used by the courts in the United States to assess the likelihood of a defendant committing another crime. COMPAS uses a risk assessment form to estimate this risk of reoffending, and judges are expected to take the resulting risk prediction into account when they decide on sentencing.

Continue reading “Racist Technology in Action: Predicting future criminals with a bias against Black people”

Racist Technology in Action: Amazon’s racist facial ‘Rekognition’

An already infamous example of racist technology is Amazon’s facial recognition system ‘Rekognition’, which had an enormous racial and gender bias. Researcher and founder of the Algorithmic Justice League Joy Buolamwini (the ‘poet of code’), together with Deborah Raji, meticulously reconstructed how accurate Rekognition was in identifying different types of faces. Buolamwini and Raji’s study has been extremely consequential in laying bare the racism and sexism in these facial recognition systems and was featured in the popular Coded Bias documentary.

Continue reading “Racist Technology in Action: Amazon’s racist facial ‘Rekognition’”

Racist technology in action: Gun, or electronic device?

The answer to that question depends on your skin colour, apparently. In an experiment that went viral on Twitter, AlgorithmWatch reporter Nicolas Kayser-Bril showed that Google Vision Cloud (a service based on a subset of AI known as “computer vision”, which focuses on automated image labelling) labelled an image of a dark-skinned individual holding a thermometer with the word “gun”, whilst a similar image with a lighter-skinned individual was labelled as holding an “electronic device”.

Continue reading “Racist technology in action: Gun, or electronic device?”
