Tech Workers’ Testimonies: Stories of Suppression of Palestinian Advocacy in the Workplace

The Arab Center for the Advancement of Social Media has released a new report titled “Delete the Issue: Tech Worker Testimonies on Palestinian Advocacy and Workplace Suppression.” The report, the first of its kind, shares testimonies gathered from current and former employees at major technology companies, including Meta, Google, PayPal, Microsoft, LinkedIn, and Cisco. It highlights their experiences supporting Palestinian rights in the workplace and the companies’ efforts to restrict freedom of expression on the matter.

From 7amleh on November 11, 2024

Tech companies’ complicity in the ongoing genocide in Gaza and Palestine

As I write this piece, an Israeli airstrike has hit makeshift tents near Al-Aqsa Martyrs Hospital in Deir al-Balah, burning tents and people alive. The Israeli military bombed an aid distribution point in Jabalia, wounding 50 people who were waiting for flour. The entire north of Gaza has been besieged by the Israeli Occupying Forces for the past 10 days, trapping 400,000 Palestinians without food, water, or medical supplies. Every day since last October, Israel, with the help of its western allies, has intensified its assault on Palestine, each time pushing the boundaries of what is comprehensible. There are no moral or legal boundaries that Israel and its allies will not cross. The systematic ethnic cleansing of Palestine, which has been the basis of the settler-colonial Zionist project since its inception, has accelerated since 7 October 2023. From Palestine to Lebanon, Syria and Yemen, Israel and its allies continue their violence with impunity. Meanwhile, mainstream western news media are either silent in their reporting or complicit in abetting the ongoing destruction of the Palestinian people and their resistance.

Continue reading “Tech companies’ complicity in the ongoing genocide in Gaza and Palestine”

Tech workers demand Google and Amazon stop their complicity in Israel’s genocide against the Palestinian people

Since 2021, thousands of Amazon and Google tech workers have been organising against Project Nimbus, Google and Amazon’s shared US$1.2 billion contract with the Israeli government and military. In all that time, there has been no response from management or executives. Their organising efforts have accelerated since 7 October 2023, with the ongoing genocide in Gaza and the occupied Palestinian territories by the Israeli state.

Continue reading “Tech workers demand Google and Amazon stop their complicity in Israel’s genocide against the Palestinian people”

Google’s mistake with Gemini

You have probably heard that Google had to suspend its Gemini image feature after it showed people Black Nazis and female popes. Well, I have a simple explanation for what happened here. The folks at Google wanted to avoid an embarrassment they’d been involved with multiple times, and had seen others get involved with: the “pale male” dataset problem. It happens especially at tech companies dominated by white men, and ironically, even more so at tech companies dominated by white men who are careful about privacy, because then they only collect pictures of people who give consent, which is typically the people who work there! See for example this webpage, or Safiya Noble’s entire book.

By Cathy O’Neil for mathbabe on March 12, 2024

Google does performative identity politics, nonpologises, pauses their efforts, and will invariably move on to its next shitty moneymaking move

In a shallow attempt to do representation for representation’s sake, Google has managed to draw the ire of the right-wing internet by generating historically inaccurate and overly inclusive portraits of historical figures.

Continue reading “Google does performative identity politics, nonpologises, pauses their efforts, and will invariably move on to its next shitty moneymaking move”

Racist Technology in Action: Image recognition is still not capable of differentiating gorillas from Black people

If this title feels like déjà vu, it is because you most likely have, in fact, seen this before (perhaps even in our newsletter). It was back in 2015 that the controversy first arose, when Google released image recognition software that kept mislabelling Black people as gorillas (read here and here).

Continue reading “Racist Technology in Action: Image recognition is still not capable of differentiating gorillas from Black people”

Consensus and subjectivity of skin tone annotation for ML fairness

Skin tone is an observable characteristic that is subjective, perceived differently by individuals (e.g., depending on their location or culture) and thus is complicated to annotate. That said, the ability to reliably and accurately annotate skin tone is highly important in computer vision. This became apparent in 2018, when the Gender Shades study highlighted that computer vision systems struggled to detect people with darker skin tones, and performed particularly poorly for women with darker skin tones. The study highlights the importance for computer vision researchers and practitioners to evaluate their technologies across the full range of skin tones and at intersections of identities. Beyond evaluating model performance on skin tone, skin tone annotations enable researchers to measure diversity and representation in image retrieval systems, dataset collection, and image generation. For all of these applications, a collection of meaningful and inclusive skin tone annotations is key.

By Candice Schumann and Gbolahan O. Olanubi for Google AI Blog on May 15, 2023
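
The disaggregated evaluation the post argues for is easy to picture in code. Below is a minimal sketch, assuming you already have per-image skin tone annotations on a 10-point scale such as the Monk Skin Tone (MST) scale plus model predictions; the function name and toy data are illustrative, not Google’s actual pipeline.

```python
# Minimal sketch: disaggregated accuracy by skin tone bucket.
# Assumes per-example skin tone annotations already exist (e.g., on
# the 10-point Monk Skin Tone scale); all names and data below are
# illustrative, not taken from Google's evaluation tooling.
from collections import defaultdict

def accuracy_by_skin_tone(labels, predictions, skin_tones):
    """Return {skin_tone: accuracy} over a labelled evaluation set."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for y, y_hat, tone in zip(labels, predictions, skin_tones):
        total[tone] += 1
        correct[tone] += int(y == y_hat)
    return {tone: correct[tone] / total[tone] for tone in sorted(total)}

# Toy usage: a binary detection-style task with MST annotations 1-10.
labels      = [1, 1, 1, 1, 0, 1, 1, 0]
predictions = [1, 1, 0, 1, 0, 0, 1, 0]
skin_tones  = [2, 2, 9, 3, 9, 10, 1, 5]
for tone, acc in accuracy_by_skin_tone(labels, predictions, skin_tones).items():
    print(f"MST tone {tone}: accuracy {acc:.2f}")
```

Reporting one number per tone bucket, rather than a single aggregate, is what lets a Gender Shades-style audit surface the disparities the post describes.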

You Are Not a Parrot

You are not a parrot. And a chatbot is not a human. And a linguist named Emily M. Bender is very worried about what will happen when we forget this.

By Elizabeth Weil and Emily M. Bender for New York Magazine on March 1, 2023

Algorithmic power and African indigenous languages: search engine autocomplete and the global multilingual Internet

Predictive language technologies – such as Google Search’s Autocomplete – constitute forms of algorithmic power that reflect and compound global power imbalances between Western technology companies and multilingual Internet users in the global South. Increasing attention is being paid to predictive language technologies and their impacts on individual users and public discourse. However, there is a lack of scholarship on how such technologies interact with African languages. Addressing this gap, the article presents data from experimentation with autocomplete predictions/suggestions for gendered or politicised keywords in Amharic, Kiswahili and Somali. It demonstrates that autocomplete is active for these languages, and shows how users may be exposed to harmful content due to an apparent lack of filtering of problematic ‘predictions’. Drawing on debates on algorithmic power and digital colonialism, the article demonstrates that global power imbalances manifest here not through a lack of online African indigenous language content, but rather in regard to the moderation of content across diverse cultural and linguistic contexts. This raises dilemmas for actors invested in the multilingual Internet between risks of digital surveillance and effective platform oversight, which could prevent algorithmic harms to users engaging with platforms in a myriad of languages and diverse socio-cultural and political environments.

By Peter Chonka, Stephanie Diepeveen and Yidnekachew Haile for SAGE Journals on June 22, 2022
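
To make the article’s experimental setup concrete, here is a minimal sketch of how one might collect autocomplete predictions for a keyword in a given language. It uses the widely known but unofficial and undocumented “suggestqueries” endpoint, whose parameters and response format may change at any time; treat this as an illustration of the method, not the authors’ actual tooling.

```python
# Minimal sketch: fetch Google Autocomplete suggestions for a keyword
# in a given interface language. The endpoint below is unofficial and
# undocumented; it is an assumption for illustration, not the study's
# published methodology.
import requests

def autocomplete(query: str, lang: str) -> list[str]:
    """Return autocomplete suggestions for `query` in language `lang`."""
    resp = requests.get(
        "https://suggestqueries.google.com/complete/search",
        params={"client": "firefox", "q": query, "hl": lang},
        timeout=10,
    )
    resp.raise_for_status()
    # Response shape: [query, [suggestion, suggestion, ...]]
    return resp.json()[1]

# e.g., a Kiswahili ("sw"), Amharic ("am"), or Somali ("so") keyword:
print(autocomplete("wanawake", "sw"))  # "wanawake" means "women"
```

Running such queries for gendered or politicised keywords and logging the returned ‘predictions’ is the kind of systematic probing the study reports, and it makes any absence of filtering directly observable.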

Unfortunately, we still live in a world in which skin colour is a problem

‘Daddy, can I have that skin colour?’ Surprised, I looked up from the colouring page I was filling in, to see my daughter pointing at a marker with a peach-like colour. Or maybe it was more the colour of an apricot. Anyway, the marker certainly did not have her skin colour. My daughter may be two shades lighter than I am, but she is unmistakably brown.

By Ilyaz Nasrulla for Trouw on September 23, 2021

Long overdue: Google has improved its camera app to work better for Black people

The following short video by Vox shows how white skin has always been the norm in photography. Black people didn’t start to look good on film until the 1970s, when furniture makers complained to Kodak that its film didn’t render the difference between dark- and light-grained wood, and chocolate companies were upset that you couldn’t see the difference between dark and light chocolate.

Continue reading “Long overdue: Google has improved its camera app to work better for Black people”

Building a More Equitable Camera

Pictures are deeply personal and play an important role in shaping how people see you and how you see yourself. But historical biases in the medium of photography have carried through to some of today’s camera technologies, leading to tools that haven’t seen people of color as they want and ought to be seen.

From YouTube on May 18, 2021
