Racist Technology in Action: Amazon’s racist facial ‘Rekognition’

An already infamous example of racist technology is Amazon’s facial recognition system ‘Rekognition’, which exhibited enormous racial and gender bias. Researcher and founder of the Algorithmic Justice League Joy Buolamwini (the ‘poet of code’), together with Deborah Raji, meticulously reconstructed how accurately Rekognition identified different types of faces. Buolamwini and Raji’s study has been extremely consequential in laying bare the racism and sexism in these facial recognition systems and was featured in the popular Coded Bias documentary.

Concretely, their research made stark discrepancies visible in how well the system identified white men as opposed to black women. This large discrepancy between races and sexes can be explained by the limited datasets these systems are trained on, which underrepresent black people and women, together with the fact that the people developing them are mainly white men. These discrepancies are not just relevant from an abstract perspective; they have grave real-life consequences. Facial recognition systems developed by big tech companies such as Amazon are increasingly used in extremely consequential settings, such as policing, where they have led to wrongful arrests, or on refugees. Amazon’s facial recognition systems are already in use by U.S. police. However, last year, at the height of the Black Lives Matter movement after the killing of George Floyd, Amazon announced a one-year moratorium on the use of its facial recognition systems for policing. In two months this moratorium will be lifted, and it is as yet unclear what will happen.
