Tech For Palestine

A loose coalition of 4,000+ founders, engineers, product marketers, community builders, investors, and other tech folks working towards Palestinian freedom. We aim to end the dehumanisation of Palestinians within the tech community, and to bring voice to those who speak up.

From Tech For Palestine

National contact point for discrimination cases

Have you or someone around you ever experienced discrimination? Do report it. Your experience counts. We will listen to your story and help you. You are not alone in this!

From Landelijk meldpunt discriminatie

Community toolkit for change

Systemic Justice has designed a series of resources to help answer this question and build the knowledge and power of communities and movements fighting for justice: ‘Strategic litigation: A guide for legal action’, ‘Words for justice: A glossary of essential legal terms’, and ‘How can we use the courts: A conversation starter’.

From Systemic Justice on November 13, 2023

Consensus and subjectivity of skin tone annotation for ML fairness

Skin tone is an observable characteristic that is subjective, perceived differently by individuals (e.g., depending on their location or culture), and thus complicated to annotate. That said, the ability to reliably and accurately annotate skin tone is highly important in computer vision. This became apparent in 2018, when the Gender Shades study highlighted that computer vision systems struggled to detect people with darker skin tones, and performed particularly poorly for women with darker skin tones. The study highlighted the importance for computer vision researchers and practitioners of evaluating their technologies across the full range of skin tones and at intersections of identities. Beyond evaluating model performance on skin tone, skin tone annotations enable researchers to measure diversity and representation in image retrieval systems, dataset collection, and image generation. For all of these applications, a collection of meaningful and inclusive skin tone annotations is key.

By Candice Schumann and Gbolahan O. Olanubi for Google AI Blog on May 15, 2023
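
To make the idea of evaluating across skin tones concrete, here is a minimal sketch (not from the post itself) of disaggregated evaluation: computing accuracy separately for each annotated skin-tone group, so that performance gaps do not vanish into a single aggregate number. The column names, the example data, and the Monk-Skin-Tone-style group labels are illustrative assumptions, not part of the original article.

import pandas as pd

# Hypothetical evaluation results: one row per image, with a skin-tone
# annotation, a ground-truth label, and a model prediction.
results = pd.DataFrame({
    "skin_tone":  ["MST-1", "MST-1", "MST-5", "MST-5", "MST-10", "MST-10"],
    "label":      [1, 0, 1, 1, 0, 1],
    "prediction": [1, 0, 1, 0, 0, 0],
})

# Accuracy disaggregated by skin-tone group, rather than a single
# overall score that can hide poor performance on darker skin tones.
per_group_accuracy = (
    results.assign(correct=lambda df: df["label"] == df["prediction"])
           .groupby("skin_tone")["correct"]
           .mean()
)
print(per_group_accuracy)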

Stories of everyday life with AI in the global majority

This collection by the Data & Society Research Institute sheds an intimate and grounded light on what impact AI-systems can have. The guiding question that connects all of the 13 non-fiction pieces in Parables of AI in/from the Majority world: An Anthology is what stories can be told about a world in which solving societal issues is more and more dependent on AI-based and data-driven technologies? The book, edited by Rigoberto Lara Guzmán, Ranjit Singh and Patrick Davison, through narrating ordinary, everyday experiences in the majority world, slowly disentangles the global and unequally distributed impact of digital technologies.

Continue reading “Stories of everyday life with AI in the global majority”

Better Images of AI

We are a non-profit creating more realistic and inclusive images of artificial intelligence. Visit our growing repository, available for anyone to use for free under CC licences, or simply as inspiration for more helpful and diverse representations of AI.

From Better Images of AI

You can also do something good with artificial intelligence.

It is easy to think that artificial intelligence is only something to be wary of: a powerful weapon in the hands of the government or of tech companies that are guilty of privacy violations, discrimination or unjust punishments. But we can actually use algorithms to solve problems and work towards a more just world, computer scientist Sennay Ghebreab of the Civic AI Lab tells Kustaw Bessems. For that, though, we need to understand the basics a little and have more of a say in it.

By Kustaw Bessems and Sennay Ghebreab for Volkskrant on September 11, 2022

Pilot with smart technology against discriminatory chanting

A pilot with smart technology has been launched with the aim of combating discriminatory chanting in stadiums. Until now, the available video footage combined with audio recordings too often fell short as evidence. As part of ‘Ons voetbal is van iedereen’ (‘Our football belongs to everyone’), a joint plan of the Dutch national government and the football sector, businesses were challenged to come up with concrete solutions in collaboration with professional football organisations (Betaald Voetbal Organisaties, BVO). With this pilot, that challenge enters a new phase.

From KNVB.nl on June 1, 2022

Don’t miss this 4-part journalism series on ‘AI Colonialism’

The MIT Technology Review has written a four-part series on how the impact of AI is “repeating the patterns of colonial history.” The Review is careful not to directly compare the current situation with the colonial capture of land, extraction of resources, and exploitation of people. Yet it clearly shows that AI further enriches the wealthy at the tremendous expense of the poor.

Continue reading “Don’t miss this 4-part journalism series on ‘AI Colonialism’”

Centering social injustice, de-centering tech

The Racism and Technology Center organised a panel titled Centering social injustice, de-centering tech: The case of the Dutch child benefits scandal and beyond at Privacy Camp 2022, a conference that brings together digital rights advocates, activists, academics and policymakers. Together with Merel Koning (Amnesty International), Nadia Benaissa (Bits of Freedom) and Sanne Stevens (Justice, Equity and Technology Table), the panel used the Dutch child benefits scandal as an example to highlight issues of deeply rooted racism and discrimination in the public sector. The fixation on algorithms and automated decision-making systems tends to obscure these fundamental problems. Often, the use of technology by governments functions to normalise and rationalise existing racist and classist practices.

Continue reading “Centering social injustice, de-centering tech”

Nani Jansen Reventlow receives Dutch prize for championing privacy and digital rights

The Dutch digital rights NGO Bits of Freedom has awarded Nani Jansen Reventlow the “Felipe Rodriguez Award” for her outstanding work championing digital rights and her crucial efforts in decolonising the field. In this (Dutch language) podcast she is interviewed by Bits of Freedom’s Inge Wannet about her strategic litigation work and her ongoing fight to decolonise the digital rights field.

Continue reading “Nani Jansen Reventlow receives Dutch prize for championing privacy and digital rights”

Decolonising Digital Rights: The Challenge of Centring Relations and Trust

The Decolonising Digital Rights project is a collaborative design process to build a decolonising programme for the European digital rights field. Together, 30 participants are working to envision and build toward a decolonised field. This blog post charts the progress, learnings and challenges of the process so far.

By Laurence Meyer for Digital Freedom Fund on December 27, 2021

Dutch Scientific Council knows: AI is neither neutral nor always rational

AI should be seen as a new system technology, according to The Netherlands Scientific Council for Government Policy, meaning that its impact is large, affects the whole of society, and is hard to predict. In their new Mission AI report, the Council lists five challenges for successfully embedding system technologies in society, leading to ten recommendations for governments.

Continue reading “Dutch Scientific Council knows: AI is neither neutral nor always rational”

Digital Rights for All: harmed communities should be front and centre

Earlier this month, Digital Freedom Fund kicked off a series of online workshops of the ‘Digital Rights for All’ programme. In this post, Laurence Meyer details the reasons for the initiative, whose fundamental aim is to address why the individuals and communities most affected by the harms of technologies are not centred in advocacy, policy, and strategic litigation work on digital rights in Europe, and how to tackle challenges around funding, sustainable collaboration and language barriers.

Continue reading “Digital Rights for All: harmed communities should be front and centre”

Building a More Equitable Camera

Pictures are deeply personal and play an important role in shaping how people see you and how you see yourself. But historical biases in the medium of photography have carried through to some of today’s camera technologies, leading to tools that haven’t seen people of color as they want and ought to be seen.

From YouTube on May 18, 2021
