‘Forget the control state, we live in a control society’

According to Marc Schuilenburg, professor by special appointment of digital surveillance, we no longer have any secrets. In everything we do, something or someone is watching and registering our movements. We know it, yet we simply go along with it. That is how deeply digital surveillance has seeped into the capillaries of our society: ‘We often don’t even recognise it anymore.’

By Marc Schuilenburg and Sebastiaan Brommersma for Follow the Money on February 4, 2024

On “The Palestine Laboratory”

A large part of Israel’s economy and global influence is dependent on its military-technology complex, which not only fuels the ongoing genocide in Gaza but is also exported to facilitate oppression around the world. In this thorough 2023 book, journalist Antony Loewenstein makes explicit how Israel’s military-industrial complex profits exorbitantly from exporting technologies “battle-tested” on occupied Gaza and the West Bank.

Continue reading “On “The Palestine Laboratory””

Standing in solidarity with the Palestinian people

We at the Racism and Technology Center stand in solidarity with the Palestinian people. We condemn the violence enacted against the innocent people in Palestine and Israel, and mourn alongside all who are dead, injured and still missing. Palestinian communities are being subjected to unlawful collective punishment in Gaza and the West Bank, including the ongoing bombings and the blockade of water, food and energy. We call for an end to the blockade and an immediate ceasefire.

Continue reading “Standing in solidarity with the Palestinian people”

Use of machine translation tools exposes already vulnerable asylum seekers to even more risks

The use of and reliance on machine translation tools in asylum-seeking procedures has become increasingly common amongst government contractors and organisations working with refugees and migrants. This Guardian article highlights many of the issues documented by Respond Crisis Translation, a network of people who provide urgent interpretation services for migrants and refugees. The problems with machine translation tools occur throughout the asylum process, from border stations to detention centers to immigration courts.

Continue reading “Use of machine translation tools exposes already vulnerable asylum seekers to even more risks”

Suspected because you live at a ‘verwonderadres’: ‘They kept insisting that I open the door’

Government fraud hunters working together under the banner of the Landelijke Stuurgroep Interventieteams select ‘verwonderadressen’ (addresses flagged as suspicious) across the country, where residents might be committing fraud. A reconstruction shows how a family in Veenendaal came into the authorities’ sights and had inspectors at the door at three addresses. ‘We heard from the neighbours that they were watching our house from the bushes.’

By David Davidson for Follow the Money on September 6, 2023

Dubious police algorithm ‘predicts’ who will commit violence in the future

Since 2015, the police have used an algorithm to predict who will commit violence in the future. For Dutch people of Moroccan and Antillean descent, that likelihood was estimated to be higher because of their background. According to the police this no longer happens, but that does not resolve the dangers of the model. ‘This algorithm carries enormous risks.’

By David Davidson and Marc Schuilenburg for Follow the Money on August 23, 2023

Dutch police used algorithm to predict violent behaviour without any safeguards

For many years, the Dutch police have used a risk modeling algorithm to predict the chance that an individual suspect will commit a violent crime. Follow the Money exposed the total lack of a moral, legal, and statistical justification for its use, and the police have now stopped using the system.

Continue reading “Dutch police used algorithm to predict violent behaviour without any safeguards”

France wants to legalise mass surveillance for the Paris Olympics 2024: “Safety” and “security”, for whom?

Many governments are using mass surveillance to support law enforcement for the purposes of safety and security. In France, the French Parliament (and before it, the French Senate) has approved the use of automated behavioural video surveillance at the 2024 Paris Olympics. Simply put, France wants to legalise mass surveillance at the national level, which can violate many rights, such as the freedom of assembly and association, privacy, and non-discrimination.

Continue reading “France wants to legalise mass surveillance for the Paris Olympics 2024: “Safety” and “security”, for whom?”

On Race, AI, and Representation Or, Why Democracy Now Needs To Redo Its June 1 Segment

On June 1, Democracy Now featured a roundtable discussion hosted by Amy Goodman and Nermeen Shaikh, with three experts on Artificial Intelligence (AI), about their views on AI in the world. They included Yoshua Bengio, a computer scientist at the Université de Montréal, long considered a “godfather of AI,” Tawana Petty, an organiser and Director of Policy at the Algorithmic Justice League (AJL), and Max Tegmark, a physicist at the Massachusetts Institute of Technology. Recently, the Future of Life Institute, of which Tegmark is president, issued an open letter calling “on all AI labs to immediately pause for at least 6 months the training of AI systems more powerful than GPT-4.” Bengio is a signatory on the letter (as is Elon Musk). The AJL has been around since 2016, and has (along with other organisations) been calling for a public interrogation of racialised surveillance technology, the use of police robots, and other ways in which AI can be directly responsible for bodily harm and even death.

By Yasmin Nair for Yasmin Nair on June 3, 2023

Enough is Enough. Tell Congress to Ban Federal Use of Face Recognition

Cities and counties across the country have banned government use of face surveillance technology, and many more are weighing proposals to do so. From Boston to San Francisco, Jackson, Mississippi to Minneapolis, elected officials and activists know that face surveillance gives police the power to track us wherever we go. It also disproportionately impacts people of color, turns us all into perpetual suspects, increases the likelihood of being falsely arrested, and chills people’s willingness to participate in First Amendment-protected activities. Even Amazon, known for operating one of the largest video surveillance networks in the history of the world, extended its moratorium on selling face recognition to police.

By Matthew Guariglia for Electronic Frontier Foundation (EFF) on April 4, 2023

Racist Technology in Action: You look similar to someone we didn’t like → Dutch visa denied

Ignoring earlier Dutch failures in automated decision making, and ignoring advice from its own experts, the Dutch Ministry of Foreign Affairs has decided to cut costs and cut corners by implementing a discriminatory profiling system to process visa applications.

Continue reading “Racist Technology in Action: You look similar to someone we didn’t like → Dutch visa denied”

Denmark’s welfare fraud system reflects a deeply racist and exclusionary society

As part of a series of investigative reports by Lighthouse Reports and WIRED, Gabriel Geiger has revealed some of the findings about the use of welfare fraud algorithms in Denmark. This fits a broader trajectory of European cities increasingly deploying algorithmic systems to detect welfare fraud, at least as far as such systems are currently known.

Continue reading “Denmark’s welfare fraud system reflects a deeply racist and exclusionary society”

Artificial intelligence must march in step with human rights

The Netherlands wants to be a frontrunner in the use of artificial intelligence in military situations. However, this technology can lead to racism and discrimination. In an open letter, critics call for a moratorium on the use of artificial intelligence. Initiator Oumaima Hajri explains why.

By Oumaima Hajri for De Kanttekening on February 22, 2023

Alliance Against Military AI

Civil society organisations urge the Dutch government to immediately establish a moratorium on developing AI systems in the military domain.

By Oumaima Hajri for Alliantie tegen militaire AI on February 15, 2023

Profiting off Black bodies

Tiera Tanksley’s work seeks to better understand how forms of digitally mediated traumas, such as seeing images of Black people dead and dying on social media, are impacting Black girls’ mental and emotional wellness in the U.S. and Canada. Her fears were confirmed in her findings: Black girls report unprecedented levels of fear, depression, anxiety and chronic stress. Viewing Black people being killed by the state was deeply traumatic, with mental, emotional and physiological effects.

Continue reading “Profiting off Black bodies”

Report: How police surveillance tech reinforces abuses of power

The UK organisation No Tech for Tyrants (NT4T) has published an extensive report on the use of surveillance technologies by the police in the UK, US, Mexico, Brazil, Denmark and India, in collaboration with researchers and activists from these countries. The report, titled “Surveillance Tech Perpetuates Police Abuse of Power”, examines the relation between policing and technology through in-depth case studies.

Continue reading “Report: How police surveillance tech reinforces abuses of power”

New research report: Top400: A top-down crime prevention strategy in Amsterdam

The advent of predictive policing systems demonstrates an increased interest in more novel forms of data processing for the purpose of crime control. These developments have been the subject of much controversy, as there are significant concerns about the role these technologies play in shaping life chances and opportunities for individuals and different groups in society.

By Fieke Jansen for Data Justice Lab on November 17, 2022

The devastating consequences of risk based profiling by the Dutch police

Diana Sardjoe writes for Fair Trials about how her sons were profiled by the Amsterdam police on the basis of risk models (a form of predictive policing) called ‘Top600’ (for adults) and ‘Top400’ (for people aged 12 to 23). Because of this profiling, her sons were “continually monitored and harassed by police.”

Continue reading “The devastating consequences of risk based profiling by the Dutch police”

My sons were profiled by a racist predictive policing system — the AI Act must prohibit these systems

When I found out my sons were placed on lists called the ‘Top 600’ and the ‘Top 400’ by the local Amsterdam council, I thought I was finally getting help. The council says the purpose of these lists, created by predictive and profiling systems, is to identify and give young people who have been in contact with the police “extra attention from the council and organisations such as the police, local public health service and youth protection,” to prevent them from coming into contact with police again. This could not have been further from the truth.

By Diana Sardjoe for Medium on September 28, 2022

NoTechFor: Forced Assimilation

Following the 2015 terror attack in Denmark, the state amped up its data analytics capabilities for counter-terrorism within the police and the Danish Security and Intelligence Service (PET). Denmark, a country which hosts an established, normalised, and widely accepted public surveillance infrastructure – justified in service of public health and of greater centralisation and coordination between government and municipalities in the delivery of citizen services – also boasts an intelligence service with extraordinarily expansive surveillance capabilities, which enjoys wide exemptions from data protection regulations.

From No Tech for Tyrants on July 13, 2020

The Dutch government wants to continue to spy on activists’ social media

Investigative journalism by NRC brought to light that the Dutch NCTV (the National Coordinator for Counterterrorism and Security) uses fake social media accounts to track Dutch activists. The agency also targets activists working in the social justice or anti-discrimination space and tracks their work, sentiments and movements through their social media accounts. This is a clear example of how digital communication allows governments to intensify their surveillance and criminalisation of political opinions outside the mainstream.

Continue reading “The Dutch government wants to continue to spy on activists’ social media”
