Racist Technology in Action: Image recognition is still not capable of differentiating gorillas from Black people

If this title feels like déjà vu, it is because you most likely have, in fact, seen this before (perhaps even in our newsletter). The controversy first arose back in 2015, when Google released image recognition software that kept mislabelling Black people as gorillas (read here and here).

Not only has Google been unable to address this perpetuation of an anti-Black racist trope, opting instead in 2018 to simply delete the category of ‘gorilla’ (here and here); other companies have seemingly joined the trend. In 2021, Facebook had to issue a public apology after its software labelled Black men as ‘primates’.

After all this controversy and the several rounds of corporate apologies, the New York Times decided to check whether, eight years on, the tools from Google, Apple, Amazon and Microsoft had resolved the issue. To test the image recognition function in these different apps, the journalists curated a set of 44 images containing people, animals and common objects, and then used each tool to search within the set for ‘cats’, ‘kangaroos’ and finally ‘gorillas’.

Google and Apple failed to provide any search results for gorillas, while scoring much better on the other categories than the tools of Amazon and Microsoft. Most likely, Google and Apple “made the decision to turn off the ability to visually search for primates for fear of making an offensive mistake and labeling a person as an animal.”

Of course, not being able to search for monkeys in your photos will rarely cause much hindrance. The problem, though, is that if the software is unable to make this simple differentiation, it cannot be deemed reliable:

The issue raises larger questions about other unfixed, or unfixable, flaws lurking in services that rely on computer vision — a technology that interprets visual images — as well as other products powered by A.I.

As we rely more and more on these types of software, these unknown errors can have far-reaching and unintended consequences. Think, for example, of the oxygen meters that underperformed for Black people (read our piece here), or the exam software that doesn’t detect Black faces as well as white ones (read our piece here).

See: Google’s Photo App Still Can’t Find Gorillas. And Neither Can Apple’s. at The New York Times.

Image from the original article.
