Stories about the hidden, exploitative racialised labour that fuels the development of technologies continue to surface, and this time the story concerns ChatGPT. Billy Perrigo, who previously reported on Meta’s content moderation sweatshop and on whistleblower Daniel Motaung, who took Meta to court, has shed light on how OpenAI relied on outsourced, exploitative labour in Kenya to make ChatGPT less toxic.
Employed by Sama (a contractor for the world’s largest tech companies), these workers, who earn less than $2 an hour, are expected to read and label between 150 and 250 passages of text per nine-hour shift. In interviews, they reveal how torturous and traumatic their work is, owing to constant exposure to violent, toxic and explicit content – all the dark parts of the web. A separate pilot project that Sama conducted for OpenAI (which also builds image-generation technology) involved collecting sexual and violent images, some of which were illegal under U.S. law. This led to the end of Sama and OpenAI’s partnership. As a consequence, workers were either laid off or moved to lower-paying workstreams, making it even harder for them to provide for themselves and their families. For all the “promises” of job creation through platform work, it seems simply to reproduce existing structural inequalities and continue the patterns of colonial history.
In the quest to develop “safer” AI systems in natural language processing, content moderation, or computer vision, it is clear that this “safety” is meant only for a select few. OpenAI’s idea to build an additional safety mechanism into ChatGPT – an automated tool to detect harmful content – ironically relies not on automation but on cheap, exploitable, racialised labour to do their dirty work, all in order to create, apparently, “the best artificial intelligence chatbot”.
See: “OpenAI Used Kenyan Workers on Less Than $2 Per Hour to Make ChatGPT Less Toxic”, Time (content warning: the article contains descriptions of sexual abuse).