Yet another Dutch government department has become embroiled in a row over the use of algorithms – this time the unit which assesses whether prisoners are likely to re-offend.
From DutchNews.nl on February 13, 2026
The Dutch probation service uses algorithms in an irresponsible way. Its most important algorithms have multiple flaws that can have negative consequences for society, for suspects, and for convicted persons. This is the finding of an investigation by the Inspectie Justitie en Veiligheid (Inspectie JenV).
From Inspectie Justitie en Veiligheid on February 12, 2026
Alvi Choudhury claiming damages against Thames Valley police after biased technology confused him with man looking ‘10 years younger’.
By Mark Wilding and Robert Booth for The Guardian on February 25, 2026
After tireless campaigning, advocacy, and legal action by the affected communities themselves, together with NGOs such as Bits of Freedom, PILP, and Controle Alt Delete, the Amsterdam municipality seems to have relented: the Top 400 and Top 600 will be no more.
Continue reading “The end of the Top400/600: What will come next?”

The EU’s ongoing negotiations on a new “Deportation Regulation” follow the Trump playbook. As terrifying ICE raids have continued across the US, European leaders have criticised Trump’s actions and immigration policies. Yet Europe is creating its own ICE-style systems: an expansion of its racist, inhumane, carceral migration infrastructure.
Continue reading “Reject the EU’s deportation regime: We Keep Us Safe campaign”

The UK police admitted that their facial recognition technology has a significant racial bias.
Continue reading “UK police facial recognition: Another chapter in a long story of racist Technology”

Testing showing racial bias against black and Asian people prompts watchdog to ask Home Office for explanation.
By Rachel Hall for The Guardian on December 5, 2025
Calls for review after technology found to return more false positives for ‘some demographic groups’ on certain settings.
By Rajeev Syal for The Guardian on December 5, 2025
The Dutch government is pulling out all the stops to fight fraud and prevent crime, including by unlawful means. This goes so far that even children are becoming targets of government surveillance.
By Nadia Benaissa for Volkskrant
Recent investigative reporting by Follow the Money has revealed significant concerns regarding the Netherlands’ extensive reliance on the immoral American data analytics firm Palantir Technologies for both police and military operations since 2010.
Continue reading “Dutch police and military secretly work with evil tech giant Palantir”

In March 2025, the European Commission presented a new proposal for a Return Regulation – more aptly named, the Deportation Regulation – to replace its current Return Directive.
Continue reading “New EU deportation law must be rejected”

Academic says sample size was too small to claim new sensitivity guidelines have removed racial, gender or age bias.
By Vikram Dodd for The Guardian on August 23, 2025
Laila Lalami’s prescient new novel follows a woman imprisoned by the government for her dreams.
By Sue Halpern for The New York Review of Books on July 31, 2025
This Playbook brings together facilitation tools we have developed and research we have done in cities across Europe.
From Justice, Equity and Technology Project on July 31, 2025
The Wetenschappelijke Adviesraad Politie (WARP) advises the chief of the national police on seven urgent challenges around digitalisation and AI in police work.
From Wetenschappelijke Adviesraad Politie on June 5, 2025
Amnesty International UK’s report Automated Racism (from last February, PDF), reveals that almost three-quarters of UK police forces use discriminatory predictive policing systems that perpetuate racial profiling. At least 33 deploy AI tools that predict crime locations and profile individuals as future criminals based on biased historical data, perpetuating and entrenching racism and inequality.
Continue reading “Amnesty report (yet again) exposes racist AI in UK police forces”

A machine learning system that predicts where white collar crimes will occur throughout the US.
From White Collar Crime Risk Zones on March 1, 2017
State department launches AI-assisted reviews of accounts to look for what it perceives as Hamas supporters.
From The Guardian on March 7, 2025
Lobna Hemid. Stefany González Escarraman. Eva Jaular (and her 11-month-old baby). The lives of these three women and an infant, amongst many others, tragically ended due to gender-related killings in Spain. As reported in this article, they were all classified as “low” or “negligible” risk by VioGén, despite reporting abuse to the police. In the case of Lobna Hemid, after reporting her husband’s abuse to the police and being assessed as “low risk” by VioGén, the police provided her with minimal protection, and weeks later, her husband stabbed her to death.
Continue reading “In Spain, an algorithm used by police to ‘combat’ gender violence determines whether women live or die”

The Dutch police and security services have a tradition of discriminating against Muslims, writes Evelyn Austin, director of Bits of Freedom. She fears that Muslims will once again bear the brunt if the police are given more powers to conduct online surveillance.
By Evelyn Austin for Het Parool on February 8, 2025
Let us explain. With cats.
By Aaron Sankin and Natasha Uzcátegui-Liggett for The Markup on July 18, 2024
Mass protests used to offer a degree of safety in numbers. Facial recognition technology changes the equation.
By Darren Loucaides for Rest of World on March 27, 2024
The Detroit Police Department arrested three people after bad facial recognition matches, a national record. But it’s adopting new policies that even the A.C.L.U. endorses.
By Kashmir Hill for The New York Times on June 29, 2024
Automated decision-making systems contain hidden discriminatory prejudices. We’ll explain the causes, possible consequences, and the reasons why existing laws do not provide sufficient protection against algorithmic discrimination.
By Pie Sombetzki for AlgorithmWatch on June 26, 2024
Three men falsely arrested based on face recognition technology have joined the fight against a California bill that aims to place guardrails around police use of the technology. They say it will still allow abuses and misguided arrests.
By Khari Johnson for The Markup on June 12, 2024
For more than a year now, the Dutch Ministry of Foreign Affairs has ignored advice from its experts and continued its use of discriminatory risk profiling of visa applicants.
Continue reading “Dutch Ministry of Foreign Affairs dislikes the conclusions of a solid report that marks their visa process as discriminatory so buys a shoddy report saying the opposite”

The ubiquitous availability of AI has made plagiarism detection software utterly useless, argues our Hans de Zwart in the Volkskrant.
Continue reading “AI detection has no place in education”

Since 2021, thousands of Amazon and Google tech workers have been organising against Project Nimbus, Google and Amazon’s shared USD$1.2 billion contract with the Israeli government and military. Since then, there has been no response from management or executives. Their organising efforts have accelerated since 7 October 2023, with the ongoing genocide in Gaza and the occupied Palestinian territories by the Israeli state.
Continue reading “Tech workers demand Google and Amazon to stop their complicity in Israel’s genocide against the Palestinian people”

In 2020, two NGOs finally forced the UK Home Office’s hand, compelling it to abandon its secretive and racist algorithm for sorting visitor visa applications. Foxglove and The Joint Council for the Welfare of Immigrants (JCWI) had been battling the algorithm for years, arguing that it is a form of institutionalized racism and calling it “speedy boarding for white people.”
Continue reading “Racist Technology in Action: The UK Home Office’s Sorting Algorithm and the Racist Violence of Borders”

Tracing the history of Telangana’s police state and its Brahminical investments.
By Aditya Rawat, Mrinalini R, Nikita Sonavane, Ramani Mohanakrishnan, and Vikas Yadav for Logic on December 13, 2023
According to Marc Schuilenburg, professor by special appointment of digital surveillance, we no longer have any secrets. In everything we do, something or someone is watching and registering our movements. We know it, yet we simply go along with it. That is how deeply digital surveillance has seeped into the capillaries of our society: ‘We often don’t even recognise it anymore.’
By Marc Schuilenburg and Sebastiaan Brommersma for Follow the Money on February 4, 2024
A large part of Israel’s economy and global influence are dependent on its military-technology complex that not only fuels the ongoing genocide in Gaza but is also exported to facilitate oppression around the world. In this thorough 2023 book, journalist Anthony Loewenstein makes explicit how Israel’s military industrial complex profits exorbitantly from exporting technologies “battle-tested” on occupied Gaza and the West-Bank.
Continue reading “On “The Palestine Laboratory””

A software company sold a New Jersey police department an algorithm that was right less than 1 percent of the time.
By Aaron Sankin and Surya Mattu for WIRED on October 2, 2023
We at the Racism and Technology Center stand in solidarity with the Palestinian people. We condemn the violence enacted against the innocent people in Palestine and Israel, and mourn alongside all who are dead, injured and still missing. Palestinian communities are being subjected to unlawful collective punishment in Gaza and the West Bank, including the ongoing bombings and the blockade of water, food and energy. We call for an end to the blockade and an immediate ceasefire.
Continue reading “Standing in solidarity with the Palestinian people”

The use of and reliance on machine translation tools in asylum seeking procedures has become increasingly common amongst government contractors and organisations working with refugees and migrants. This Guardian article highlights many of the issues documented by Respond Crisis Translation, a network of people who provide urgent interpretation services for migrants and refugees. The problems with machine translation tools occur throughout the asylum process, from border stations to detention centers to immigration courts.
Continue reading “Use of machine translation tools exposes already vulnerable asylum seekers to even more risks”

This collaborative investigative effort by Spotlight Bureau, Lighthouse Reports and Follow the Money dives into the story of a Moroccan-Dutch family in Veenendaal which was targeted for fraud by the Dutch government.
Continue reading “Racist Technology in Action: Flagged as risky simply for requesting social assistance in Veenendaal, The Netherlands”

Government fraud hunters working together under the banner of the Landelijke Stuurgroep Interventieteams select ‘verwonderadressen’ across the country: addresses where residents might be committing fraud. A reconstruction shows how a family in Veenendaal came into view and had inspectors show up at the door at three addresses. ‘We heard from the neighbours that they were watching our house from the bushes.’
By David Davidson for Follow the Money on September 6, 2023
The police are stopping ‘with immediate effect’ their use of the algorithm with which they predict whether someone will commit violence in the future. Earlier this week, Follow the Money revealed that the so-called Risicotaxatie Instrument Geweld is substandard on both ethical and statistical grounds.
By David Davidson for Follow the Money on August 25, 2023
Since 2015, the police have been using an algorithm to predict who will commit violence in the future. For Dutch people of Moroccan and Antillean descent, that likelihood was estimated to be higher because of their background. According to the police this no longer happens, but that does not resolve the dangers of the model. ‘This algorithm carries enormous risks.’
By David Davidson and Marc Schuilenburg for Follow the Money on August 23, 2023
Police in America are using facial recognition software to match security footage of crimes to people. Kashmir Hill describes for The New York Times another example of a wrong match leading to a wrongful arrest.
Continue reading “Another false facial recognition match: pregnant woman wrongfully arrested”