Just upload a selfie to the “AI avatar app” Lensa and it will generate a digital portrait of you. Think, for example, of a slightly fitter or more beautiful version of yourself as an astronaut or the lead singer in a band. If you are a man, that is. As it turns out, for women, and especially women of Asian heritage, Lensa churns out pornified, sexualized and skimpily clothed avatars.
Melissa Heikkilä, senior reporter at MIT Technology Review, is herself of Asian heritage, and the generator did not hold back: “I got images of generic Asian women clearly modeled on anime or video-game characters. Or most likely porn, considering the sizable chunk of my avatars that were nude or showed a lot of skin. A couple of my avatars appeared to be crying.” In fact, almost a third of the images she generated (30 out of 100) were nude, topless or “in extremely skimpy clothes and overtly sexualized poses.” Meanwhile, the images of her white female colleagues were far less sexualized – although still miles away from the cool, tough and successful images produced for men.
This is yet another manifestation of a well-known problem. Online images of racialized women are disproportionately sexualized – a consequence of the fetishization of these women by the Western men who have created most online content. Safiya Noble already identified this significant bias in Google image searches back in 2009, when queries for “Black girls,” “Latina girls” or “Asian girls” linked almost exclusively to porn. In her 2018 book Algorithms of Oppression, she gives a compelling analysis of how access to online knowledge can reproduce social inequality.
Lensa’s racism and sexism are a consequence of this general bias in images available online. The system is trained on the enormous LAION-5B dataset, which scrapes its images from the internet. We can see clearly how social oppression and prejudice, together with global inequalities, can materialize in an innocuous-looking cartoon-generating app and reproduce racism and sexism.
Image by Melissa Heikkilä via Lensa