Meta forced to change its advertising algorithm to address algorithmic discrimination

In his New York Times article, Mike Isaac describes how Meta is implementing a new system to automatically check whether the housing, employment and credit ads it hosts are shown to people equally. The move follows a 111,054 US dollar fine the US Justice Department imposed on Meta because its ad systems had been shown to discriminate against its users by, amongst other things, excluding Black people from seeing certain housing ads in predominantly white neighbourhoods. This is the outcome of a long process, which we have written about previously.

That Meta’s housing ad delivery system is discriminatory was already shown in a 2016 ProPublica investigation, and the company has repeatedly been criticised for similar discrimination, for example against women in job ads and on the basis of age in credit card ads. Following the fine, Meta is now, finally, vowing to take action by installing a system that will periodically check whether its ads are shown equally to protected classes of people. It will check this based on the demographics of age, gender and “estimated race or ethnicity”. Meta had previously already started restricting the demographics by which advertisers can target ads, with restrictions on age, gender and postal code.
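The details of Meta’s new system have not been published, but the basic idea of such a periodic check can be sketched as a demographic parity comparison: for each protected attribute, compare the rate at which an ad was actually delivered across groups and flag large gaps. The sketch below is purely illustrative; the names (delivery_rates, parity_gap, flag_for_review) and the 5% threshold are our assumptions, not Meta’s actual implementation.

```python
# Hypothetical sketch (not Meta's actual system): a periodic check that an
# ad was delivered at similar rates across demographic groups.
from collections import Counter

def delivery_rates(impressions, eligible_audience, group_of):
    """Share of each group's eligible audience that actually saw the ad."""
    shown = Counter(group_of(user) for user in impressions)
    eligible = Counter(group_of(user) for user in eligible_audience)
    return {g: shown[g] / eligible[g] for g in eligible}

def parity_gap(rates):
    """Largest difference in delivery rate between any two groups."""
    values = list(rates.values())
    return max(values) - min(values)

# Example: flag the ad for review if delivery rates diverge too much, e.g.
# across age brackets, genders, or estimated race/ethnicity groups.
# if parity_gap(delivery_rates(impressions, audience, lambda u: u.age_bracket)) > 0.05:
#     flag_for_review(ad_id)
```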

This latest controversy and the accompanying promises of betterment show us two important things. First, it takes a tremendous amount of public pressure as well as regulatory activity to make companies such as Meta change their ad systems, no matter how clearly discrimination has been proven. This is because Meta is fundamentally in the business of advertising, and changing its ad delivery system goes to the core of its business model.

Second, the system Meta will put in place reveals a deeper truth about algorithmic discrimination. Refraining from using race or gender as explicit variables or targets is not enough to ensure that an algorithmic system does not discriminate. These systems are trained on and work with data that reflect patterns of social injustice and inequality in our societies and are, as such, very likely to repeat those patterns. For example, a postal code can be a very accurate proxy for race as well as for socioeconomic status. This is most likely why Meta chose to actively check its system for discrimination instead of just being able to ‘fix the algorithm’.
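To make the proxy point concrete, here is a small, purely illustrative simulation (synthetic data, not drawn from any real system): an ad rule that never looks at group membership, only at postal code, still ends up showing the ad to one group far more often than the other when the population is residentially segregated.

```python
# Illustrative sketch of the proxy problem: even with race removed from the
# inputs, a variable like postal code can stand in for it almost perfectly.
import random

random.seed(0)

# Synthetic, deliberately segregated data: postal code 1 is mostly group A,
# postal code 2 is mostly group B (a caricature of residential segregation).
data = [("A", 1) if random.random() < 0.9 else ("B", 1) for _ in range(500)]
data += [("B", 2) if random.random() < 0.9 else ("A", 2) for _ in range(500)]

# A "group-blind" rule that only looks at postal code...
def show_ad(postal_code):
    return postal_code == 1

# ...still produces very different outcomes per group.
for group in ("A", "B"):
    members = [pc for g, pc in data if g == group]
    rate = sum(show_ad(pc) for pc in members) / len(members)
    print(f"group {group}: shown to {rate:.0%}")
# Roughly 90% of group A sees the ad versus about 10% of group B, even though
# the rule never mentions group membership at all.
```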

See: Meta Agrees to Alter Ad Technology in Settlement With U.S. at the New York Times.

Picture by Jim Wilson for the original New York Times article.
