The answer to that question depends on your skin colour, apparently. An AlgorithmWatch reporter, Nicolas Kayser-Bril, conducted an experiment that went viral on Twitter, showing that Google Cloud Vision (a service built on the subset of AI known as “computer vision”, which focuses on automated image labelling) labelled an image of a dark-skinned hand holding a thermometer with the word “gun”, whilst the same image with lighter skin was labelled an “electronic device”.
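For context, a labelling request of the kind used in the experiment looks roughly like the following. This is a minimal sketch assuming Google's official google-cloud-vision Python client; the file name is illustrative, and credentials are assumed to be configured in the usual way.

```python
# Minimal sketch of an image-labelling request to the Cloud Vision API,
# using Google's official Python client (pip install google-cloud-vision).
# Assumes GOOGLE_APPLICATION_CREDENTIALS is set; the file path is a placeholder.
from google.cloud import vision


def label_image(path: str) -> None:
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    # label_detection returns ranked labels with confidence scores --
    # these machine-assigned tags are what the experiment compared.
    response = client.label_detection(image=image)
    for label in response.label_annotations:
        print(f"{label.description}: {label.score:.2f}")


label_image("hand_with_thermometer.jpg")  # hypothetical input image
```

The service returns only labels and confidence scores, with no explanation of why a given tag was chosen, which is part of what makes disparities like the one above hard to audit from the outside.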
Google has since apologised, citing accidental mislabelling, and has updated its algorithm. Yet this is nothing new. In 2015, Google had another incident in which a photo of two dark-skinned individuals was “accidentally” tagged as “gorillas”, a problem that still does not appear to be properly fixed. More broadly, computer-vision systems have repeatedly produced discriminatory and racist outputs. In reality, these “accidents” and labelling “errors” have tangible consequences for individuals.