Time and time again, big tech companies have shown their power to (mis)represent and (re)shape our digital world: from speech, to images, and most recently, to the emojis we use every day.
This Rest of World article highlights how Apple’s emoji keyboard – an emoji search bar in iMessage that lets users find emojis by typing related search terms – tends to reinforce Western stereotypes while misrepresenting other parts of the world, such as Africa and China. A search for ‘Africa’ returns a hut emoji, whereas a search for ‘Europe’ recommends a far more diverse set of options. Even if huts can be found in Africa, reducing an entire continent to a single monolithic image, and reinforcing Western stereotypes about certain places and cultures, remains problematic.
This raises questions about encoding existing biases into new technological systems and products – something not new to Apple (“Type ‘ceo’ into your iPhone keyboard for a sexist surprise”), or to Instagram for that matter (“A Search For “Dog” On Instagram Surfaces An Emoji For A Chinese Takeout Box”) – and about the opacity of Apple’s language processing systems and how their recommendations are generated.
Whether a technical fix – simply expanding the range of available emojis – is sufficient to address these representation issues remains murky. Unicode, a consortium made up largely (though not exclusively) of big tech companies, is responsible for standardising emojis and acts as the gatekeeper that approves or rejects new entries. The Rest of World article highlights how Afrocentric designers struggle to get their emojis approved.
These kinds of misrepresentation and reductionist views are not new; they have a long history in the media as well. They should remind us that encoding biases into new tech products can reinforce or shape our worldview in subtle ways, and that broader representation requires moving beyond technical fixes.