Google’s mistake with Gemini

You have probably heard that Google had to suspend its Gemini image feature after it showed people Black Nazis and female popes. Well, I have a simple explanation for what happened here: the folks at Google wanted to avoid an embarrassment that they had been involved with multiple times, and seen others get involved with, namely the “pale male” dataset problem. That problem arises especially at tech companies dominated by white men, and ironically, even more so at tech companies dominated by white men who are careful about privacy, because then they only collect pictures of people who give consent, which is typically people who work there! See for example this webpage, or Safiya Noble’s entire book.

By Cathy O’Neil for mathbabe on March 12, 2024

Google does performative identity politics, nonpologises, pauses their efforts, and will invariably move on to its next shitty moneymaking move

In a shallow attempt to do representation for representation’s sake, Google has managed to draw the ire of the right-wing internet by generating historically inaccurate and overly inclusive portraits of historical figures.

Racist Technology in Action: Image recognition is still not capable of differentiating gorillas from Black people

If this title feels like déjà vu, it is because you most likely have, in fact, seen this before (perhaps even in our newsletter). It was back in 2015 that the controversy first arose, when Google released image recognition software that kept mislabelling Black people as gorillas (read here and here).

Consensus and subjectivity of skin tone annotation for ML fairness

Skin tone is an observable characteristic that is subjective, perceived differently by individuals (e.g., depending on their location or culture) and thus is complicated to annotate. That said, the ability to reliably and accurately annotate skin tone is highly important in computer vision. This became apparent in 2018, when the Gender Shades study highlighted that computer vision systems struggled to detect people with darker skin tones, and performed particularly poorly for women with darker skin tones. The study highlights the importance for computer vision researchers and practitioners to evaluate their technologies across the full range of skin tones and at intersections of identities. Beyond evaluating model performance on skin tone, skin tone annotations enable researchers to measure diversity and representation in image retrieval systems, dataset collection, and image generation. For all of these applications, a collection of meaningful and inclusive skin tone annotations is key.

By Candice Schumann and Gbolahan O. Olanubi for Google AI Blog on May 15, 2023
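The kind of evaluation the excerpt calls for, checking performance across the full range of skin tones rather than in aggregate, is easy to sketch. The snippet below is a minimal illustration and not code from the post: it assumes you already have a per-image skin tone annotation (for example on a 1–10 scale such as the Monk Skin Tone scale, which is an assumption here) and simply disaggregates accuracy by tone group.

```python
from collections import defaultdict

def accuracy_by_skin_tone(predictions, labels, skin_tones):
    """Disaggregate accuracy by annotated skin tone group.

    predictions, labels: lists of model outputs and ground-truth labels.
    skin_tones: per-example annotations, e.g. values 1-10 on a scale
    like the Monk Skin Tone scale (the annotation scheme is assumed).
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for pred, label, tone in zip(predictions, labels, skin_tones):
        total[tone] += 1
        if pred == label:
            correct[tone] += 1
    # Per-group accuracy; a wide gap between groups is the kind of
    # disparity the Gender Shades study surfaced.
    return {tone: correct[tone] / total[tone] for tone in sorted(total)}

print(accuracy_by_skin_tone(
    predictions=[1, 0, 1, 1],
    labels=[1, 0, 0, 1],
    skin_tones=[2, 2, 9, 9],
))  # e.g. {2: 1.0, 9: 0.5}
```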

You Are Not a Parrot

You are not a parrot. And a chatbot is not a human. And a linguist named Emily M. Bender is very worried about what will happen when we forget this.

By Elizabeth Weil and Emily M. Bender for New York Magazine on March 1, 2023

Algorithmic power and African indigenous languages: search engine autocomplete and the global multilingual Internet

Predictive language technologies – such as Google Search’s Autocomplete – constitute forms of algorithmic power that reflect and compound global power imbalances between Western technology companies and multilingual Internet users in the global South. Increasing attention is being paid to predictive language technologies and their impacts on individual users and public discourse. However, there is a lack of scholarship on how such technologies interact with African languages. Addressing this gap, the article presents data from experimentation with autocomplete predictions/suggestions for gendered or politicised keywords in Amharic, Kiswahili and Somali. It demonstrates that autocomplete functions for these languages and how users may be exposed to harmful content due to an apparent lack of filtering of problematic ‘predictions’. Drawing on debates on algorithmic power and digital colonialism, the article demonstrates that global power imbalances manifest here not through a lack of online African indigenous language content, but rather in regard to the moderation of content across diverse cultural and linguistic contexts. This raises dilemmas for actors invested in the multilingual Internet between risks of digital surveillance and effective platform oversight, which could prevent algorithmic harms to users engaging with platforms in a myriad of languages and diverse socio-cultural and political environments.

By Peter Chonka, Stephanie Diepeveen and Yidnekachew Haile for SAGE Journals on June 22, 2022
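To give a sense of the experimentation the abstract describes, the sketch below queries Google’s autocomplete predictions programmatically via the unofficial suggestqueries endpoint. This is an assumption about method rather than the authors’ actual setup: the endpoint is undocumented, the `hl` language codes are only examples, and the keyword is a placeholder rather than one of the gendered or politicised terms from the study.

```python
import json
import urllib.parse
import urllib.request

def autocomplete_suggestions(query, hl):
    """Fetch Google autocomplete predictions for a query.

    Uses the unofficial, undocumented suggestqueries endpoint; this is an
    assumed method, not the one described in the paper.
    """
    url = (
        "https://suggestqueries.google.com/complete/search?client=firefox"
        f"&hl={hl}&q={urllib.parse.quote(query)}"
    )
    with urllib.request.urlopen(url, timeout=10) as resp:
        charset = resp.headers.get_content_charset() or "utf-8"
        data = json.loads(resp.read().decode(charset))
    return data[1]  # the second element holds the suggested completions

# Placeholder keyword; hl codes for Amharic, Kiswahili and Somali.
for lang in ("am", "sw", "so"):
    print(lang, autocomplete_suggestions("example keyword", hl=lang))
```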

Unfortunately, we still live in a world in which skin colour is a problem

‘Daddy, can I have the skin-colour one?’ Surprised, I looked up from the colouring page I was filling in to see my daughter pointing at a marker with a peach-like colour. Or perhaps it was more the colour of an apricot. In any case, the marker did not have her skin colour. My daughter may be two shades lighter than I am, but she is unmistakably brown.

By Ilyaz Nasrulla for Trouw on September 23, 2021

Long overdue: Google has improved its camera app to work better for Black people

The following short video by Vox shows how white skin has always been the norm in photography. Black people didn’t start to look good on film until the 1970s, when furniture makers complained to Kodak that its film didn’t render the difference between dark- and light-grained wood, and chocolate companies were upset that you couldn’t see the difference between dark and light chocolate.

Building a More Equitable Camera

Pictures are deeply personal and play an important role in shaping how people see you and how you see yourself. But historical biases in the medium of photography have carried through to some of today’s camera technologies, leading to tools that haven’t seen people of color as they want and ought to be seen.

From YouTube on May 18, 2021

Google blocks advertisers from targeting Black Lives Matter

In this piece for The Markup, Leon Yin and Aaron Sankin expose how Google bans advertisers from targeting terms such as “Black lives matter”, “antifascist” or “Muslim fashion”. At the same time, keywords such as “White lives matter” or “Christian fashion” are not banned. When they raised this striking discrepancy with Google, its response was to fix the discrepancies between religions and races by blocking all such terms, as well as by blocking even more social justice related keywords, such as “I can’t breathe” or “LGBTQ”. Blocking these terms for ad placement can reduce the revenue of YouTubers fighting for these causes. Yin and Sankin place this policy in stark contrast to Google’s support for the Black Lives Matter movement.

YouTube blocks advertisers from targeting

For a Markup feature, Leon Yin and Aaron Sankin compiled a list of “social and racial justice terms” with help from Color of Change, Media Justice, Mijente and Muslim Advocates, then checked if YouTube would let them target those terms for ads.

By Cory Doctorow for Pluralistic on April 10, 2021

The Fort Rodman Experiment

In 1965, IBM launched the most ambitious attempt ever to diversify a tech company. The industry still needs to learn the lessons of that failure.

By Charlton McIlwain for Logic on December 20, 2021

Filtering out the “Asians”

The article’s title speaks for itself: “Your iPhone’s Adult Content Filter Blocks Anything ‘Asian’”. Victoria Song has tested the claims made by The Independent: if you enable the “Limit Adult Websites” function in your iPhone’s Screen Time settings, you are blocked from seeing any Google search results for “Asian”. Related searches such as “Asian recipes” or “Southeast Asian” are also blocked by the adult content filter. There is no clarity or transparency about how search terms come to be classified as adult content, or whether the process is automated or done manually. Regardless of intention, the outcome and the lack of action by Google or Apple is unsurprising but disconcerting. It is not a mistake but a feature of their commercial practices and their disregard for the social harms of their business model.
