As part of a series of investigative reports by Lighthouse Reports and WIRED, Gabriel Geiger has revealed some of the findings about the use of welfare fraud algorithms in Denmark. This follows the increasing use of algorithmic systems to detect welfare fraud across European cities, or at least those systems that are currently known about.
Continue reading “Denmark’s welfare fraud system reflects a deeply racist and exclusionary society”
Because of the algorithm, all donations to mosques suddenly became suspect
For years, the Belastingdienst (the Dutch tax authority) flagged donations to Islamic institutions as suspect by default. This ‘institutional Islamophobia’ was fuelled by a scandal that is now finally before the courts.
By Marco de Vries for De Groene Amsterdammer on February 22, 2023
Chinese security firm advertises ethnicity recognition technology while facing UK ban
Campaigners concerned that ‘same racist technology used to repress Uyghurs is being marketed in Britain’.
By Alex Hern for The Guardian on December 4, 2022
Report: How police surveillance tech reinforces abuses of power
The UK organisation No Tech for Tyrants (NT4T) has published an extensive report on the use of surveillance technologies by the police in the UK, US, Mexico, Brazil, Denmark and India, in collaboration with researchers and activists from these countries. The report, titled “Surveillance Tech Perpetuates Police Abuse of Power”, examines the relationship between policing and technology through in-depth case studies.
Continue reading “Report: How police surveillance tech reinforces abuses of power”
Silencing Black women in tech journalism
In this op-ed, Sydette Harry unpacks how the tech sector, and tech journalism in particular, has largely failed to meaningfully listen to and account for the experiences of Black women, the group that most often bears the brunt of the harmful and racist effects of technological “innovations”. While the role of tech journalism is supposedly to hold the tech industry accountable through access and insight, it has repeatedly failed to include Black people in its reporting, neither by hiring Black writers nor by addressing them seriously as an audience. Rather, their experiences and culture are often co-opted, silenced, unreported, and pushed out of newsrooms.
Continue reading “Silencing Black women in tech journalism”
Minneapolis police used fake social media profiles to surveil Black people
An alarming report outlines an extensive pattern of racial discrimination within the city’s police department.
By Sam Richards and Tate Ryan-Mosley for MIT Technology Review on April 27, 2022
Bits of Freedom speaks to the Dutch Senate on discriminatory algorithms
Through an official parliamentary investigative committee, the Dutch Senate is examining how new regulation or law-making processes can help combat discrimination in the Netherlands. The committee focuses on four broad domains: the labour market, education, social security and policing. As part of these wide-ranging efforts, the Senate is hearing from a range of experts and civil society organisations. From the perspective of racist technology, one contribution stands out: Nadia Benaissa from Bits of Freedom highlighted the dangers of predictive policing and other uses of automated systems in law enforcement.
Continue reading “Bits of Freedom speaks to the Dutch Senate on discriminatory algorithms”
Facebook has finally stopped enabling racial profiling for targeted advertising
Around 2016, Facebook was still proud of its ability to target its customers’ ads to “Black affinity” and “White affinity” audiences. I then wrote an op-ed decrying this form of racial profiling that was enabled by Facebook’s data lust.
Continue reading “Facebook has finally stopped enabling racial profiling for targeted advertising”
Amnesty’s grim warning against another ‘Toeslagenaffaire’
In its report of 25 October, Amnesty slams the Dutch government’s use of discriminatory algorithms in the child benefits scandal (toeslagenaffaire) and warns that the likelihood of such a scandal occurring again is very high. The report is aptly titled ‘Xenophobic machines – Discrimination through unregulated use of algorithms in the Dutch childcare benefits scandal’ and offers a human rights analysis of a specific sub-element of the scandal: the use of algorithms and risk models. The analysis draws on the report of the Dutch data protection authority and several other government reports.
Continue reading “Amnesty’s grim warning against another ‘Toeslagenaffaire’”
Opinion: Stop government algorithms that lead to discrimination and exclusion
Government agencies use countless ‘blacklists’ of potential fraudsters. This can lead to (indirect) ethnic profiling and new dramas, after the child benefits scandal (toeslagenaffaire).
By Nani Jansen Reventlow for Volkskrant on July 15, 2021
The Dutch elections and racist tech
In last week’s Dutch parliamentary elections, digitisation and the impact of technology on society were definitely part of the political debate. However, racism in technology was, with the exception of BIJ1, hardly addressed explicitly, with most parties focusing on topics such as cybersecurity, the power of big tech, and privacy in their party programmes.
Continue reading “The Dutch elections and racist tech”
How the LAPD and Palantir Use Data to Justify Racist Policing
In a new book, a sociologist who spent months embedded with the LAPD details how data-driven policing techwashes bias.
By Mara Hvistendahl for The Intercept on January 30, 2021
How the Netherlands uses A.I. for ethnic profiling
China using artificial intelligence to oppress the Uyghurs: sounds far removed from your daily life? The Netherlands, too, tracks (and prosecutes) specific population groups with algorithms. As in Roermond, where cameras raise an alarm for cars with Eastern European licence plates.
By Florentijn van Rootselaar for OneWorld on January 14, 2021
Dataminr Targets Communities of Color for Police
Insiders say Dataminr’s “algorithmic” Twitter search involves human staffers perpetuating confirmation biases.
By Sam Biddle for The Intercept on October 21, 2020
Asymmetrical Power: The intransparency of the Dutch Police
In this interview with Jair Schalkwijk and Naomi Appelman, we try to bring some transparency to the use of facial recognition technologies in law enforcement.
By Margarita Osipian for The Hmm on October 8, 2020
‘In the Second World War, we did have something to hide’
What lessons about privacy can we draw today from the 1943 attack on the Amsterdam civil registry? ‘From a lack of freedom, you gain a clearer perspective on what freedom means.’
By Hans de Zwart for De Correspondent on May 8, 2014
Facebook, the largest country in the world, was built to profile (ethnically too)
Typhoon, a Black rapper, was pulled over by the police while driving a nice car. Since then, a debate about ethnic profiling has rightly erupted. Yet that debate rarely touches on the fact that the business models of Silicon Valley services are largely based on profiling, and that ethnic profiling is marketed there as an innovative advertising tool.
By Hans de Zwart for Bits of Freedom on June 23, 2016