The US and Israeli war on Iran is a live testing ground for deploying AI in a large-scale war. The Wall Street Journal first reported that Anthropic’s Claude is central to the onslaught: assessing intelligence and identifying targets. This isn’t the first time Anthropic’s systems have been used to assist in illegal US operations: they were also deployed in Maduro’s kidnapping from Venezuela.
The Twilight Zone
Laila Lalami’s prescient new novel follows a woman imprisoned by the government for her dreams.
By Sue Halpern for The New York Review of Books on July 31, 2025
Dutch police also use controversial Palantir software
The Netherlands turns out to have secretly purchased the controversial Palantir software years ago. Caretaker Prime Minister Dick Schoof was involved in this in 2011 as director-general of the police. This emerges from documents recently made public through a request under the Open Government Act (Wet open overheid, Woo).
By Peter Olsthoorn for Volkskrant on August 22, 2025
AI was supposed to spare civilian lives in wartime. In reality, more people are dying
Artificial intelligence was supposed to reduce the number of civilian deaths in war. In reality, there are more. Because where people are reduced to data points, opening fire quickly feels objective and correct.
By Lauren Gould, Linde Arentze, and Marijn Hoijtink for De Groene Amsterdammer on July 24, 2024
NoTechFor: Forced Assimilation
Following the 2015 terror attack in Denmark, the state amped up its data analytics capabilities for counter-terrorism within the police and the Danish Security and Intelligence Service (PET). Denmark hosts an established, normalised, and widely accepted public surveillance infrastructure, justified in service of public health and of greater centralisation and coordination between government and municipalities in delivering citizen services. It also boasts an intelligence service with extraordinarily expansive surveillance capabilities that enjoys wide exemptions from data protection regulations.
From No Tech for Tyrants on July 13, 2020
Crime Prediction Keeps Society Stuck in the Past
So long as algorithms are trained on racist historical data and outdated values, there will be no opportunities for change.
By Chris Gilliard for WIRED on January 2, 2022
How the LAPD and Palantir Use Data to Justify Racist Policing
In a new book, a sociologist who spent months embedded with the LAPD details how data-driven policing techwashes bias.
By Mara Hvistendahl for The Intercept on January 30, 2021
Racism and “Smart Borders”
As many of us had our attention focused on the use of biometric surveillance technologies in managing the COVID-19 pandemic, in a new UN report prof. E. Tendayi Achiume forcefully puts the spotlight on the racial and discriminatory dimension of biometric surveillance technology in border enforcement.
UN warns of impact of smart borders on refugees: ‘Data collection isn’t apolitical’
Special rapporteur on racism and xenophobia believes there is a misconception that biosurveillance technology is without bias.
By Katy Fallon for The Guardian on November 11, 2020
Data-Informed Predictive Policing Was Heralded As Less Biased. Is It?
Critics say it merely techwashes injustice.
By Annie Gilbertson for The Markup on August 20, 2020
