Back in 2013, Harvard professor Latanya Sweeney was one of the first people to demonstrate racism (she called it ‘discrimination’) in online algorithms. She did this with her research on Google’s ad delivery practices.
In an article for the Communications of the ACM, Discrimination in Online Ad Delivery (PDF), she showed that online ads suggestive of arrest records appeared more often with searches for black-sounding names than with searches for white-sounding names.
In a very clever experiment, Sweeney first compiled lists of black-identifying and white-identifying full names. She then searched for these names on Google and on Reuters (which used Google’s ad delivery network at the time), and stored the ads displayed alongside the search results. These were mostly ads for companies that sell public records about people. By closely examining the ad templates that appeared, she showed that the word ‘arrest’ occurred significantly more often in ads for black-identifying names than in ads for white-identifying names.
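The core of such a comparison can be sketched with a standard two-proportion z-test: count how many ads per name group contain the word ‘arrest’ and test whether the two rates differ by more than chance. This is an illustrative sketch only; the counts below are hypothetical and the test is a generic choice, not necessarily the statistical method Sweeney used in her paper.

```python
import math

def two_proportion_ztest(hits_a, n_a, hits_b, n_b):
    """Z-statistic for the difference between two proportions
    (hits_a out of n_a versus hits_b out of n_b)."""
    p_a, p_b = hits_a / n_a, hits_b / n_b
    # Pooled proportion under the null hypothesis of equal rates
    p_pool = (hits_a + hits_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical counts for illustration only (NOT Sweeney's data):
# number of ads containing 'arrest', out of all ads collected per group.
z = two_proportion_ztest(hits_a=60, n_a=100, hits_b=40, n_b=100)
print(round(z, 2))  # → 2.83, above the 1.96 threshold for p < 0.05
```

A |z| above roughly 1.96 indicates a difference unlikely to arise by chance at the conventional 5% significance level.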
In her article, Sweeney was also one of the first people to suggest using algorithms to correct for these biases. She can therefore be considered one of the founders of the research field now known as ‘algorithmic fairness’.