An article in Nature shows how an AI approach can explain racial disparities in the experience of pain that standard radiographic measures of severity couldn’t see.
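The underlying study trained a deep learning model to predict patients’ self-reported pain scores directly from knee X-rays, instead of relying on a radiologist-assigned severity grade. A minimal sketch of that kind of setup, with random placeholder tensors standing in for the real radiographs and pain scores (this is an illustration, not the paper’s code):

```python
# Sketch: regress self-reported pain directly from knee radiographs,
# rather than from a radiologist-assigned severity grade.
import torch
import torch.nn as nn
from torchvision import models

# Placeholder data: random tensors stand in for X-ray images and
# 0-100 self-reported pain scores (e.g., KOOS-style scores).
xrays = torch.randn(32, 3, 224, 224)
pain_scores = torch.rand(32, 1) * 100

model = models.resnet18(weights=None)          # pretrained weights optional
model.fc = nn.Linear(model.fc.in_features, 1)  # regression head: one score

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

model.train()
for step in range(3):  # a few illustrative training steps
    optimizer.zero_grad()
    predicted = model(xrays)
    loss = loss_fn(predicted, pain_scores)
    loss.backward()
    optimizer.step()
    print(f"step {step}: MSE {loss.item():.1f}")
```

The point of such a model is that its predictions can then be compared, per patient group, against standard radiographic grades to see which better accounts for reported pain.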
Continue reading “Racist Technology in Action: Deep learning algorithm shows the pain is in the knee after all”
Why Stopping Algorithmic Inequality Requires Taking Race Into Account
Let us explain. With cats.
By Aaron Sankin and Natasha Uzcátegui-Liggett for The Markup on July 18, 2024
Encode Justice
Encode Justice is the world’s first and largest youth movement for safe, equitable AI. Powered by 1,000 young people across every inhabited continent, we believe AI must be steered in a direction that benefits society.
From Encode Justice
Tech For Palestine
A loose coalition of 4,000+ founders, engineers, product marketers, community builders, investors, and other tech folks working towards Palestinian freedom. We aim to end the dehumanisation of Palestinians within the tech community, and to bring voice to those who speak up.
From Tech For Palestine
Anti-discrimination agencies have launched a single point of contact for reporting discrimination in the Netherlands
There is now a single point of contact to report cases of discrimination. In a simple web form, you can make a report for yourself or somebody else, and you can do this anonymously if you want to.
Continue reading “Anti-discrimination agencies have launched a single point of contact for reporting discrimination in the Netherlands”
Los Angeles Becomes First US City to Outlaw Digital Discrimination
A Markup investigation in 2022 found households in L.A.’s poorest neighborhoods were disproportionately asked to pay high prices for slow internet service.
By Aaron Sankin for The Markup on February 1, 2024
Landelijk punt discriminatiezaken
Have you or someone around you ever experienced discrimination? Do report it. Your experience counts. We will listen to your story and help you. You are not alone in this!
From Landelijk meldpunt discriminatie
Community toolkit for change
Systemic Justice has designed a series of resources to help answer the question of how communities can use the courts, and to build the knowledge and power of communities and movements fighting for justice: ‘Strategic litigation: A guide for legal action’, ‘Words for justice: A glossary of essential legal terms’, and ‘How can we use the courts: A conversation starter’.
From Systemic Justice on November 13, 2023
Equal love: Dating App Breeze seeks to address Algorithmic Discrimination
In a world where swiping left or right is the main route to love, whose profiles dating apps show you can change the course of your life.
Continue reading “Equal love: Dating App Breeze seeks to address Algorithmic Discrimination”
These new tools could make AI vision systems less biased
Two new papers from Sony and Meta describe novel methods to make bias detection fairer.
By Melissa Heikkilä for MIT Technology Review on September 25, 2023
Civil society calls on EU to protect people’s rights in the AI Act ‘trilogue’ negotiations
As EU institutions start decisive meetings on the Artificial Intelligence (AI) Act, a broad civil society coalition is urging them to prioritise people and fundamental rights.
From European Digital Rights (EDRi) on July 12, 2023
How a New Generation Is Combatting Digital Surveillance
Younger voices are using technology to respond to the needs of marginalized communities and nurture Black healing and liberation.
By Kenia Hale, Nate File and Payton Croskey for Boston Review on June 2, 2022
Skin Tone Research @ Google
Introducing the Monk Skin Tone (MST) Scale, one of the ways we are moving AI forward with more inclusive computer vision tools.
From Skin Tone at Google
Consensus and subjectivity of skin tone annotation for ML fairness
Skin tone is an observable characteristic that is subjective: it is perceived differently by individuals (depending, for example, on their location or culture) and is thus complicated to annotate. That said, the ability to reliably and accurately annotate skin tone is highly important in computer vision. This became apparent in 2018, when the Gender Shades study highlighted that computer vision systems struggled to detect people with darker skin tones, and performed particularly poorly for women with darker skin tones. The study underlines how important it is for researchers and practitioners to evaluate their technologies across the full range of skin tones and at intersections of identities. Beyond evaluating model performance, skin tone annotations enable researchers to measure diversity and representation in image retrieval systems, dataset collection, and image generation. For all of these applications, a collection of meaningful and inclusive skin tone annotations is key; a minimal sketch of such a disaggregated evaluation follows this item.
By Candice Schumann and Gbolahan O. Olanubi for Google AI Blog on May 15, 2023
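The Gender Shades-style evaluation described above boils down to disaggregating a model’s metrics by skin tone group. A minimal sketch of that idea, using the MST scale’s tones 1–10 as group labels over made-up predictions (the records below are hypothetical, not Google’s data):

```python
# Sketch: disaggregate a vision model's accuracy by Monk Skin Tone (MST)
# annotation to surface performance gaps across skin tone groups.
from collections import defaultdict

# Hypothetical per-example records: (predicted label, true label, MST tone 1-10).
results = [
    ("face", "face", 2),
    ("face", "face", 3),
    ("face", "face", 9),
    ("no_face", "face", 9),
    ("no_face", "face", 10),
]

correct: dict[int, int] = defaultdict(int)
total: dict[int, int] = defaultdict(int)
for predicted, actual, tone in results:
    total[tone] += 1
    correct[tone] += int(predicted == actual)

for tone in sorted(total):
    print(f"MST tone {tone}: accuracy {correct[tone] / total[tone]:.0%} "
          f"(n={total[tone]})")
```

The same breakdown can be applied at intersections of identities (e.g., skin tone × gender), which is exactly where Gender Shades found the largest gaps.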
Data & Society Announces the Launch of its Algorithmic Impact Methods Lab
Lab will advance assessments of AI systems in the public interest.
From Data & Society on May 10, 2023
Stories of everyday life with AI in the global majority
This collection by the Data & Society Research Institute sheds an intimate and grounded light on the impact AI systems can have. The guiding question that connects all 13 non-fiction pieces in Parables of AI in/from the Majority World: An Anthology is: what stories can be told about a world in which solving societal issues is more and more dependent on AI-based and data-driven technologies? By narrating ordinary, everyday experiences in the majority world, the book, edited by Rigoberto Lara Guzmán, Ranjit Singh and Patrick Davison, slowly disentangles the global and unequally distributed impact of digital technologies.
Continue reading “Stories of everyday life with AI in the global majority”
She’s working to make German tech more inclusive
Nakeema Stefflbauer is bringing women from underrepresented backgrounds into the Berlin tech scene.
By Gouri Sharma and Nakeema Stefflbauer for MIT Technology Review on February 21, 2023
Meet The Former Black Twitter Workers Behind New Social Platform Spill
Social media app, Spill, designed by former Twitter employees, Alphonzo “Phonz” Terrell and DeVaris Brown, is becoming the chosen alternative for many.
By Kumba Kpakima for POCIT on December 21, 2022
HRDAG – Human Rights Data Analysis Group
We show that statistics have human consequences.
From Human Rights Data Analysis Group
Better Images of AI
We are a non-profit creating more realistic and inclusive images of artificial intelligence. Visit our growing repository available for anyone to use for free under CC licences, or just to use as inspiration for more helpful and diverse representations of AI.
From Better Images of AI
Auto-detecting racist language in housing documents
DoNotPay is a ‘robot lawyer’ service that allows its customers (regular citizens) to automatically do things like fight parking tickets, get refunds on flight tickets, or cancel their free trials. Earlier this year, it expanded its service to include finding, and helping to remove, racist language in housing documents.
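DoNotPay has not published how its detection works; one plausible minimal approach is pattern matching against the well-documented phrasing of historical racially restrictive covenants. A hedged sketch along those lines (the patterns, function name, and example deed are all hypothetical):

```python
# Sketch: flag candidate racially restrictive covenant language in a
# housing document with simple pattern matching. The phrases encoded
# below paraphrase historically common covenant wording.
import re

FLAGGED_PATTERNS = [
    r"shall not be (sold|leased|occupied).{0,80}\b(race|colou?r)\b",
    r"persons? of the (caucasian|white) race only",
    r"no person of any race other than",
]

def flag_restrictive_language(document: str) -> list[str]:
    """Return the sentences that match any flagged pattern."""
    hits = []
    for sentence in re.split(r"(?<=[.;])\s+", document):
        if any(re.search(p, sentence, re.IGNORECASE) for p in FLAGGED_PATTERNS):
            hits.append(sentence.strip())
    return hits

deed = ("The premises shall not be sold, leased, or occupied by any "
        "person of any race other than the Caucasian race.")
print(flag_restrictive_language(deed))
```

A production tool would need far more than keywords (OCR of scanned deeds, context to avoid false positives, and a legal removal workflow), which is presumably where a service like DoNotPay adds value.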
Continue reading “Auto-detecting racist language in housing documents”
Listen to Sennay Ghebreab for clarity about what AI should and shouldn’t do
Sennay Ghebreab, head of the Civic AI Lab, which aims to develop AI in a socially inclusive manner, was interviewed by Kustaw Bessems for the Volkskrant podcast Stuurloos (in Dutch).
Continue reading “Listen to Sennay Ghebreab for clarity about what AI should and shouldn’t do”
You can also do something good with artificial intelligence.
You could easily come to think that artificial intelligence is only something to be wary of: a powerful weapon in the hands of the government or of tech companies that are guilty of privacy violations, discrimination or unjust punishments. But we can in fact use algorithms to solve problems and work towards a more just world, computer scientist Sennay Ghebreab of the Civic AI Lab tells Kustaw Bessems. For that, though, we do need to understand the basics a little and have more of a say about it.
By Kustaw Bessems and Sennay Ghebreab for Volkskrant on September 11, 2022
AutoCheck workshops on Automated Decision-Making Systems and Discrimination
Understanding causes, recognizing cases, supporting those affected: documents for implementing a workshop.
By Waldemar Kesler for AlgorithmWatch on September 7, 2022
Californians Can Now Auto-Detect and Remove Racist Language in Housing Docs
DoNotPay, which bills itself as a “robot lawyer,” has developed an automated way for people to remove racist language from real estate documents.
By Maxwell Strachan for VICE on July 28, 2022
A guidebook on how to combat algorithmic discrimination
What is algorithmic discrimination, how is it caused and what can be done about it? These are the questions that are addressed in AlgorithmWatch’s newly published report Automated Decision-Making Systems and Discrimination.
Continue reading “A guidebook on how to combat algorithmic discrimination”
How to combat algorithmic discrimination? A guidebook by AutoCheck
We are faced with automated decision-making systems almost every day, and they might be discriminating without us even knowing about it. A new guidebook helps to better recognize such cases and support those affected.
From AlgorithmWatch on June 21, 2022
‘Smart’ technologies to detect racist chants at Dutch football matches
The KNVB (Royal Dutch Football Association) is taking a tech approach to tackling racist fan behaviour during matches, an approach that runs a great risk of falling into the techno-solutionism trap.
Continue reading “‘Smart’ technologies to detect racist chants at Dutch football matches”
Centring communities in the fight against injustice
In this interview with OneWorld, Nani Jansen Reventlow reflects on the harmful uses of technology, perpetuated by private and public actors, ranging from the Dutch child benefits scandal to the use of proctoring in education and ‘super SyRI’ in public services.
Continue reading “Centring communities in the fight against injustice”
Pilot with smart technology against discriminatory chants
With the aim of combating discriminatory chants in stadiums, a pilot with smart technology has been launched. Until now, the available video footage combined with audio recordings too often fell short as evidence. As part of ‘Ons voetbal is van iedereen’ (‘Our football belongs to everyone’), a joint plan of the Dutch national government and the football sector, the business community was challenged to come up with concrete solutions in collaboration with professional football organisations (Betaald Voetbal Organisaties, BVO). With this pilot, that challenge enters a new phase.
From KNVB.nl on June 1, 2022
Don’t miss this 4-part journalism series on ‘AI Colonialism’
The MIT Technology Review has written a four-part series on how the impact of AI is “repeating the patterns of colonial history.” The Review is careful not to directly compare the current situation with the colonialist capturing of land, extraction of resources, and exploitation of people. Yet, they clearly show that AI does further enrich the wealthy at the tremendous expense of the poor.
Continue reading “Don’t miss this 4-part journalism series on ‘AI Colonialism’”
Centering social injustice, de-centering tech
The Racism and Technology Center organised a panel titled Centering social injustice, de-centering tech: The case of the Dutch child benefits scandal and beyond at Privacy Camp 2022, a conference that brings together digital rights advocates, activists, academics and policymakers. Together with Merel Koning (Amnesty International), Nadia Benaissa (Bits of Freedom) and Sanne Stevens (Justice, Equity and Technology Table), the discussion used the Dutch child benefits scandal as an example to highlight issues of deeply rooted racism and discrimination in the public sector. The fixation on algorithms and automated decision-making systems tends to obscure these fundamental problems. Often, the use of technology by governments functions to normalise and rationalise existing racist and classist practices.
Continue reading “Centering social injustice, de-centering tech”
Nani Jansen Reventlow receives Dutch prize for championing privacy and digital rights
The Dutch digital rights NGO Bits of Freedom has awarded Nani Jansen Reventlow the “Felipe Rodriguez Award” for her outstanding work championing digital rights and her crucial efforts in decolonising the field. In this (Dutch language) podcast she is interviewed by Bits of Freedom’s Inge Wannet about her strategic litigation work and her ongoing fight to decolonise the digital rights field.
Continue reading “Nani Jansen Reventlow receives Dutch prize for championing privacy and digital rights”
Decolonising Digital Rights: The Challenge of Centring Relations and Trust
The Decolonising Digital Rights project is a collaborative design process to build a decolonising programme for the European digital rights field. Together, 30 participants are working to envision and build toward a decolonised field. This blog post charts the progress, learnings and challenges of the process so far.
By Laurence Meyer for Digital Freedom Fund on December 27, 2021
Two new technology initiatives focused on (racial) justice
We are happy to see that more and more attention is being paid to how technology intersects with problems around (racial) justice. Recently two new initiatives have launched that we would like to highlight.
Continue reading “Two new technology initiatives focused on (racial) justice”
Dutch Scientific Council knows: AI is neither neutral nor always rational
AI should be seen as a new system technology, according to The Netherlands Scientific Council for Government Policy, meaning that its impact is large, affects the whole of society, and is hard to predict. In their new Mission AI report, the Council lists five challenges for successfully embedding system technologies in society, leading to ten recommendations for governments.
Continue reading “Dutch Scientific Council knows: AI is neither neutral nor always rational”
Digital Rights for All: harmed communities should be front and centre
Earlier this month, Digital Freedom Fund kicked off a series of online workshops of the ‘Digital Rights for All’ programme. In this post, Laurence Meyer details the reasons for this initiative with the fundamental aim of addressing why individuals and communities most affected by the harms of technologies are not centred in the advocacy, policy, and strategic litigation work on digital rights in Europe, and how to tackle challenges around funding, sustainable collaborations and language barriers.
Continue reading “Digital Rights for All: harmed communities should be front and centre”
AI projects to tackle racial inequality in UK healthcare, says Javid
Health secretary signs up to hi-tech schemes countering health disparities and reflecting minority ethnic groups’ data.
By Andrew Gregory for The Guardian on October 20, 2021
Building a More Equitable Camera
Pictures are deeply personal and play an important role in shaping how people see you and how you see yourself. But historical biases in the medium of photography have carried through to some of today’s camera technologies, leading to tools that haven’t seen people of color as they want and ought to be seen.
From YouTube on May 18, 2021
Skin in the frame: black photographers welcome Google initiative
Attempt to tackle racial bias long overdue, say practitioners, but it’s not just about the equipment.
By Aamna Mohdin for The Guardian on May 28, 2021