Racist Technology in Action: Beauty is in the eye of the AI

Where people’s notion of beauty is often steeped in cultural preferences or plain prejudice, the objectivity of an AI system would surely allow it to access a more universal conception of beauty – or so thought the developers of Beauty.AI. Alex Zhavoronkov, who consulted on the development of the Beauty.AI system, described the dystopian motivation behind the system clearly: “Humans are generally biased and there needs to be a robot to provide an impartial opinion. Beauty.AI is the first step in a much larger story, in which a mobile app trained to evaluate perception of human appearance will evolve into a caring personal assistant to help users look their best and retain their youthful looks.”

In 2016, the coalition behind Beauty.AI set out to use this ‘impartial’ system to judge a global beauty contest. Contestants sent in a photo, which the system used to rate their beauty based on criteria such as symmetry, wrinkles, and age group. However, it did not stop there: skin colour was an explicit criterion, and the system apparently also inferred ethnicity and gender from the photos and used these to judge contestants’ beauty. Jordan Pearson, in a story for Motherboard, described the results:

Out of the 44 people that the algorithms judged to be the most “attractive,” all of the finalists were white except for six who were Asian. Only one finalist had visibly dark skin.

The clear racial preference of the system can be explained by the biased training data and the homogeneous group of system developers, as well as the fact that skin colour was used as a criterion to begin with. One can only imagine how this criterion was concretely weighted in the system. The problem of biased training data is one we have seen many times before in systems using any type of facial recognition software.

However, this example of a racist technology also points towards deeper problems. The basic notion that a concept such as beauty can be judged ‘objectively’ if only the right algorithm is found is fundamentally flawed, even if deeply problematic categories such as ethnicity or gender – which cannot be ascertained from appearance to begin with – are removed from the equation. And, even if it were technically possible, these should never factor into a conception of beauty.
