The current wave of reporting on the AI bubble has one advantage: it also creates some space in the media to write about how AI reflects the existing inequities in our society.
Denmark’s welfare fraud system reflects a deeply racist and exclusionary society
As part of a series of investigative reports by Lighthouse Reports and WIRED, Gabriel Geiger has revealed some of the findings about the use of welfare fraud algorithms in Denmark. This follows the increasing use of algorithmic systems to detect welfare fraud across European cities, or at least the systems that are currently known.
Artificial intelligence must march in step with human rights
The Netherlands wants to be a frontrunner in the use of artificial intelligence in military situations. However, this technology can lead to racism and discrimination. In an open letter, critics call for a moratorium on the use of artificial intelligence. Initiator Oumaima Hajri explains why.
By Oumaima Hajri for De Kanttekening on February 22, 2023
An alliance against military AI
The past week the Dutch government hosted and organised the military AI conference REAIM 2023. Together with eight other NGOs, we signed an open letter, initiated by Oumaima Hajri, that calls on the Dutch government to stop promoting narratives of “innovation” and “opportunities” and, rather, centre the very real and often disparate human impact.
Alliance Against Military AI
Civil society organisations urge the Dutch government to immediately establish a moratorium on developing AI systems in the military domain.
By Oumaima Hajri for Alliantie tegen militaire AI on February 15, 2023
Profiting off Black bodies
Tiera Tanksley’s work seeks to better understand how forms of digitally mediated traumas, such as seeing images of Black people dead and dying on social media, are impacting Black girls’ mental and emotional wellness in the U.S. and Canada. Her fears were confirmed in her findings: Black girls report unprecedented levels of fear, depression, anxiety and chronic stress. Viewing Black people being killed by the state was deeply traumatic, with mental, emotional and physiological effects.
Amsterdam’s Top400 project stigmatises and over-criminalises youths
A critical, in-depth report on Top400 – a crime prevention project by the Amsterdam municipality – which targets and polices minors (between the ages of 12 and 23) has emphasised the stigmatising, discriminatory, and invasive effects of the Top400 on youths and their families.
Report: How police surveillance tech reinforces abuses of power
The UK organisation No Tech for Tyrants (NT4T) has published an extensive report on the use of surveillance technologies by the police in the UK, US, Mexico, Brazil, Denmark and India, in collaboration with researchers and activists from these countries. The report, titled “Surveillance Tech Perpetuates Police Abuse of Power”, examines the relation between policing and technology through in-depth case studies.
New research report: Top400: A top-down crime prevention strategy in Amsterdam
The advent of predictive policing systems demonstrates an increased interest in more novel forms of data processing for the purpose of crime control. These developments have been the subject of much controversy, as there are significant concerns about the role these technologies play in shaping life chances and opportunities for individuals and different groups in society.
By Fieke Jansen for Data Justice Lab on November 17, 2022
Criticism of Eberhard van der Laan’s Top 400: ‘Mothers still don’t know why their sons are on that list’
In Moeders – premiering Thursday at Idfa – Nirit Peled denounces the Top 400, a list of the names of Amsterdam youths who are at risk of sliding into serious crime.
By David Hielkema and Nirit Peled for Het Parool on November 9, 2022
Surveillance Tech Perpetuates Police Abuse of Power
Among global movements to reckon with police powers, a new report from UK research group No Tech For Tyrants unveils how police use surveillance technology to abuse power around the world.
From No Tech for Tyrants on November 7, 2022
The devastating consequences of risk-based profiling by the Dutch police
Diana Sardjoe writes for Fair Trials about how her sons were profiled by the Amsterdam police on the basis of risk models (a form of predictive policing) called ‘Top600’ (for adults) and ‘Top400’ (for people aged 12 to 23). Because of this profiling, her sons were “continually monitored and harassed by police.”
My sons were profiled by a racist predictive policing system — the AI Act must prohibit these systems
When I found out my sons were placed on lists called the ‘Top 600’ and the ‘Top 400’ by the local Amsterdam council, I thought I was finally getting help. The council says the purpose of these lists, created by predictive and profiling systems, is to identify and give young people who have been in contact with the police “extra attention from the council and organisations such as the police, local public health service and youth protection,” to prevent them from coming into contact with police again. This could not have been further from the truth.
By Diana Sardjoe for Medium on September 28, 2022
NoTechFor: Forced Assimilation
Following the 2015 terror attack in Denmark, the state amped up its data analytics capabilities for counter-terrorism within the police and the Danish Security and Intelligence Service (PET). Denmark – a country which hosts an established, normalised, and widely accepted public surveillance infrastructure, justified in service of public health and of greater centralisation and coordination between government and municipalities in the delivery of citizen services – also boasts an intelligence service with extraordinarily expansive surveillance capabilities that enjoys wide exemptions from data protection regulations.
From No Tech for Tyrants on July 13, 2020
Shocking report by the Algemene Rekenkamer: state algorithms are a shitshow
The Algemene Rekenkamer (Netherlands Court of Audit) looked into nine different algorithms used by the Dutch state. It found that only three of them fulfilled the most basic of requirements.
The Dutch government wants to continue to spy on activists’ social media
Investigative journalism by NRC brought to light that the Dutch NCTV (the National Coordinator for Counterterrorism and Security) uses fake social media accounts to track Dutch activists. The agency also targets activists working in the social justice or anti-discrimination space and tracks their work, sentiments and movements through their social media accounts. This is a clear example of how digital communication allows governments to intensify their surveillance and criminalisation of political opinions outside the mainstream.
Minneapolis police used fake social media profiles to surveil Black people
An alarming report outlines an extensive pattern of racial discrimination within the city’s police department.
By Sam Richards and Tate Ryan-Mosley for MIT Technology Review on April 27, 2022
How AI reinforces racism in Brazil
Author Tarcízio Silva on how algorithmic racism exposes the myth of “racial democracy.”
By Alex González Ormerod and Tarcízio Silva for Rest of World on April 22, 2022
Technology, Racism and Justice at Roma Day 2022
Our own Jill Toh recently presented at a symposium on the use of technology and how it intersects with racism in the context of housing and policing. She spoke on a panel organised in the context of World Roma Day 2022, titled “Technolution: Yearned-for Hopes or Old Injustices?”.
Bits of Freedom speaks to the Dutch Senate on discriminatory algorithms
In an official parliamentary investigative committee, the Dutch Senate is investigating how new regulation or law-making processes can help combat discrimination in the Netherlands. The investigative committee focuses on four broad domains: the labour market, education, social security and policing. As part of these wide-ranging investigative efforts, the Senate is hearing from a range of experts and civil society organisations. From the perspective of racist technology, one contribution stands out: Nadia Benaissa from Bits of Freedom highlighted the dangers of predictive policing and other uses of automated systems in law enforcement.
Racist Technology in Action: “Race-neutral” traffic cameras have a racially disparate impact
Traffic cameras that are used to automatically hand out speeding tickets don’t look at the colour of the person driving the speeding car. Yet, ProPublica has convincingly shown how cameras that don’t have a racial bias can still have a disparate racial impact.
The discrimination that hides in data
The Eerste Kamer (Dutch Senate) is investigating the effectiveness of legislation against discrimination. Last Friday we were invited to speak to the members of parliament about discrimination and algorithms. Below is the core of our story.
By Nadia Benaissa for Bits of Freedom on February 8, 2022
Predictive policing constrains our possibilities for better futures
In the context of the use of crime prediction software in policing, Chris Gilliard reiterated in WIRED how data-driven policing systems and programs are fundamentally premised on the assumption that historical data about crimes determines the future.
Crime Prediction Keeps Society Stuck in the Past
So long as algorithms are trained on racist historical data and outdated values, there will be no opportunities for change.
By Chris Gilliard for WIRED on January 2, 2022
Technologies of Black Freedoms: Calling On Black Studies Scholars, with SA Smythe
Refusing to see like a state.
By J. Khadijah Abdurahman and SA Smythe for Logic on December 25, 2022
The Humanities Can’t Save Big Tech From Itself
Hiring sociocultural workers to correct bias overlooks the limitations of these underappreciated fields.
By Elena Maris for WIRED on January 12, 2022
Predictive policing reinforces and accelerates racial bias
The Markup and Gizmodo, in a recent investigative piece, analysed 5.9 million crime predictions by PredPol, crime prediction software used by law enforcement agencies in the U.S. The results confirm the racist logic and impact of predictive policing on individuals and neighbourhoods. Compared to Whiter, middle- and upper-income neighbourhoods, Black, Latino and poor neighbourhoods were relentlessly targeted by the software, which recommended increased police presence. The fewer White residents who lived in an area – and the more Black and Latino residents who lived there – the more likely PredPol was to predict a crime there. Some neighbourhoods in their dataset were the subject of more than 11,000 predictions.
Dutch Data Protection Authority (AP) fines the tax agency for discriminatory data processing
The Dutch Data Protection Authority, the Autoriteit Persoonsgegevens (AP), has fined the Dutch Tax Agency 2.75 million euros for discriminatory data processing as part of the child benefits scandal.
Belastingdienst fined for discriminatory and unlawful practices
The Autoriteit Persoonsgegevens (AP) has imposed a fine of 2.75 million euros on the Belastingdienst (the Dutch tax agency). The AP did so because, for years, the Belastingdienst processed the (dual) nationality of childcare benefit applicants in an unlawful, discriminatory and therefore improper manner. These are serious violations of the privacy law, the General Data Protection Regulation (GDPR).
From Autoriteit Persoonsgegevens on December 7, 2021
Police linked innocent asylum seekers to criminal justice information
The police compared asylum seekers’ phone data with criminal justice information. That “did not square” with the privacy law, according to the police themselves.
By Martin Kuiper and Romy van der Poel for NRC on December 7, 2021
Crime Prediction Software Promised to Be Free of Biases. New Data Shows It Perpetuates Them
Millions of crime predictions left on an unsecured server show PredPol mostly avoided Whiter neighborhoods, targeted Black and Latino neighborhoods.
By Aaron Sankin, Annie Gilbertson, Dhruv Mehrotra and Surya Mattu for The Markup on December 2, 2021
Massive PredPol leak confirms that it drives racist policing
When you or I seek out evidence to back up our existing beliefs and ignore the evidence that shows we’re wrong, it’s called “confirmation bias.” It’s a well-understood phenomenon that none of us are immune to, and thoughtful people put a lot of effort into countering it in themselves.
By Cory Doctorow for Pluralistic on December 2, 2021
A Black Woman Invented Home Security. Why Did It Go So Wrong?
Surveillance systems, no matter the intention, will always exist to serve power.
By Chris Gilliard for WIRED on November 14, 2021
Revealed: the software that studies your Facebook friends to predict who may commit a crime
Voyager, which pitches its tech to police, has suggested indicators such as Instagram usernames that show Arab pride can signal inclination towards extremism.
By Johana Bhuiyan and Sam Levin for The Guardian on November 17, 2021
Amnesty’s grim warning against another ‘Toeslagenaffaire’
In its report of 25 October, Amnesty slams the Dutch government’s use of discriminatory algorithms in the child benefits scandal (toeslagenaffaire) and warns that the likelihood of such a scandal occurring again is very high. The report is aptly titled ‘Xenophobic machines – Discrimination through unregulated use of algorithms in the Dutch childcare benefits scandal’ and conducts a human rights analysis of a specific sub-element of the scandal: the use of algorithms and risk models. The report builds on the report of the Dutch data protection authority and several other government reports.
Crowd-Sourced Suspicion Apps Are Out of Control
Technology rarely invents new societal problems. Instead, it digitizes them, supersizes them, and allows them to balloon and duplicate at the speed of light. That’s exactly the problem we’ve seen with location-based, crowd-sourced “public safety” apps like Citizen.
By Matthew Guariglia for Electronic Frontier Foundation (EFF) on October 21, 2021
A Detroit community college professor is fighting Silicon Valley’s surveillance machine. People are listening.
Chris Gilliard grew up with racist policing in Detroit. He sees a new form of oppression in the tech we use every day.
By Chris Gilliard and Will Oremus for Washington Post on September 17, 2021
How Stereotyping and Bias Lingers in Product Design
Brands originally built on racist stereotypes have existed for more than a century. Now racial prejudice is also creeping into the design of tech products and algorithms.
From YouTube on September 15, 2021
Racist Technology in Action: Predicting future criminals with a bias against Black people
In 2016, ProPublica investigated the fairness of COMPAS, a system used by the courts in the United States to assess the likelihood of a defendant committing another crime. COMPAS uses a risk assessment form to estimate this risk of reoffending. Judges are expected to take the risk prediction into account when they decide on sentencing.
An automated policing program got this man shot twice
Chicago’s predictive policing program told a man he would be involved with a shooting, but it couldn’t determine which side of the gun he would be on. Instead, it made him the victim of a violent crime.
By Matt Stroud for The Verge on May 24, 2021