Let us explain. With cats
By Aaron Sankin and Natasha Uzcátegui-Liggett for The Markup on July 18, 2024
Mass protests used to offer a degree of safety in numbers. Facial recognition technology changes the equation.
By Darren Loucaides for Rest of World on March 27, 2024
The Detroit Police Department arrested three people after bad facial recognition matches, a national record. But it’s adopting new policies that even the A.C.L.U. endorses.
By Kashmir Hill for The New York Times on June 29, 2024
Automated decision-making systems contain hidden discriminatory prejudices. We’ll explain the causes, possible consequences, and the reasons why existing laws do not provide sufficient protection against algorithmic discrimination.
By Pia Sombetzki for AlgorithmWatch on June 26, 2024
Three men falsely arrested based on face recognition technology have joined the fight against a California bill that aims to place guardrails around police use of the technology. They say it will still allow abuses and misguided arrests.
By Khari Johnson for The Markup on June 12, 2024
For more than a year now, the Dutch Ministry of Foreign Affairs has ignored advice from its experts and continued its use of discriminatory risk profiling of visa applicants.
Continue reading “Dutch Ministry of Foreign Affairs dislikes the conclusions of a solid report that marks their visa process as discriminatory so buys a shoddy report saying the opposite”
The ubiquitous availability of AI has made plagiarism detection software utterly useless, argues our Hans de Zwart in the Volkskrant.
Continue reading “AI detection has no place in education”
Since 2021, thousands of Amazon and Google tech workers have been organising against Project Nimbus, Google and Amazon’s shared USD$1.2 billion contract with the Israeli government and military. Since then, there has been no response from management or executives. Their organising efforts have accelerated since 7 October 2023, with the ongoing genocide in Gaza and the occupied Palestinian territories by the Israeli state.
Continue reading “Tech workers demand Google and Amazon to stop their complicity in Israel’s genocide against the Palestinian people”
In 2020, two NGOs finally forced the UK Home Office’s hand, compelling it to abandon its secretive and racist algorithm for sorting visitor visa applications. Foxglove and The Joint Council for the Welfare of Immigrants (JCWI) had been battling the algorithm for years, arguing that it was a form of institutionalized racism and calling it “speedy boarding for white people.”
Continue reading “Racist Technology in Action: The UK Home Office’s Sorting Algorithm and the Racist Violence of Borders”
Tracing the history of Telangana’s police state and its Brahminical investments.
By Aditya Rawat, Mrinalini R, Nikita Sonavane, Ramani Mohanakrishnan, and Vikas Yadav for Logic on December 13, 2023
According to Marc Schuilenburg, professor by special appointment of digital surveillance, we no longer have any secrets. In everything we do, something or someone is watching and recording our movements. We know it, yet we simply go along with it. That is how deeply digital surveillance has seeped into the capillaries of our society: ‘We often don’t even recognise it anymore.’
By Marc Schuilenburg and Sebastiaan Brommersma for Follow the Money on February 4, 2024
A large part of Israel’s economy and global influence are dependent on its military-technology complex that not only fuels the ongoing genocide in Gaza but is also exported to facilitate oppression around the world. In this thorough 2023 book, journalist Anthony Loewenstein makes explicit how Israel’s military industrial complex profits exorbitantly from exporting technologies “battle-tested” on occupied Gaza and the West Bank.
Continue reading “On “The Palestine Laboratory””
A software company sold a New Jersey police department an algorithm that was right less than 1 percent of the time.
By Aaron Sankin and Surya Mattu for WIRED on October 2, 2023
We at the Racism and Technology Center stand in solidarity with the Palestinian people. We condemn the violence enacted against the innocent people in Palestine and Israel, and mourn alongside all who are dead, injured and still missing. Palestinian communities are being subjected to unlawful collective punishment in Gaza and the West Bank, including the ongoing bombings and the blockade of water, food and energy. We call for an end to the blockade and an immediate ceasefire.
Continue reading “Standing in solidarity with the Palestinian people”
The use of and reliance on machine translation tools in asylum seeking procedures has become increasingly common amongst government contractors and organisations working with refugees and migrants. This Guardian article highlights many of the issues documented by Respond Crisis Translation, a network of people who provide urgent interpretation services for migrants and refugees. The problems with machine translation tools occur throughout the asylum process, from border stations to detention centers to immigration courts.
Continue reading “Use of machine translation tools exposes already vulnerable asylum seekers to even more risks”
This collaborative investigative effort by Spotlight Bureau, Lighthouse Reports and Follow the Money dives into the story of a Moroccan-Dutch family in Veenendaal which was targeted for fraud investigation by the Dutch government.
Continue reading “Racist Technology in Action: Flagged as risky simply for requesting social assistance in Veenendaal, The Netherlands”
Government fraud hunters working together under the banner of the Landelijke Stuurgroep Interventieteams select ‘verwonderadressen’ (addresses flagged as suspicious) across the country where residents might be committing fraud. A reconstruction shows how a family in Veenendaal came into view and had inspectors show up at the door at three addresses. ‘We heard from the neighbours that they were watching our house from the bushes.’
By David Davidson for Follow the Money on September 6, 2023
The police are stopping ‘immediately’ with the algorithm they use to predict whether someone will commit violence in the future. Earlier this week, Follow the Money revealed that the so-called Risicotaxatie Instrument Geweld is substandard in both ethical and statistical terms.
By David Davidson for Follow the Money on August 25, 2023
Since 2015, the police have used an algorithm to predict who will commit violence in the future. For Dutch people of Moroccan and Antillean descent, that likelihood was estimated higher because of their background. According to the police this no longer happens, but that does not resolve the dangers of the model. ‘This algorithm carries enormous risks.’
By David Davidson and Marc Schuilenburg for Follow the Money on August 23, 2023
Police in America are using facial recognition software to match security footage of crimes to people. Kashmir Hill describes for The New York Times another example of a wrong match leading to a wrongful arrest.
Continue reading “Another false facial recognition match: pregnant woman wrongfully arrested”
For many years the Dutch police have used a risk modeling algorithm to predict the chance that an individual suspect will commit a violent crime. Follow the Money exposed the total lack of a moral, legal, and statistical justification for its use, and now the police have stopped using the system.
Continue reading “Dutch police used algorithm to predict violent behaviour without any safeguards”US government tests find even top-performing facial recognition systems misidentify blacks at rates 5 to 10 times higher than they do whites.
By Tom Simonite for WIRED on July 22, 2019
Porcha Woodruff thought the police who showed up at her door to arrest her for carjacking were joking. She is the first woman known to be wrongfully accused as a result of facial recognition technology.
By Kashmir Hill for The New York Times on August 6, 2023
Many governments are using mass surveillance to support law enforcement for the purposes of safety and security. In France, the Parliament (and, before it, the Senate) has approved the use of automated behavioural video surveillance at the 2024 Paris Olympics. Simply put, France wants to legalise mass surveillance at the national level, which can violate many rights, such as the freedom of assembly and association, privacy, and non-discrimination.
Continue reading “France wants to legalise mass surveillance for the Paris Olympics 2024: “Safety” and “security”, for whom?”
On June 1, Democracy Now featured a roundtable discussion hosted by Amy Goodman and Nermeen Shaikh with three experts on Artificial Intelligence (AI) about their views on AI in the world. They included Yoshua Bengio, a computer scientist at the Université de Montréal, long considered a “godfather of AI”; Tawana Petty, an organiser and Director of Policy at the Algorithmic Justice League (AJL); and Max Tegmark, a physicist at the Massachusetts Institute of Technology. Recently, the Future of Life Institute, of which Tegmark is president, issued an open letter calling “on all AI labs to immediately pause for at least 6 months the training of AI systems more powerful than GPT-4.” Bengio is a signatory on the letter (as is Elon Musk). The AJL has been around since 2016, and has (along with other organisations) been calling for a public interrogation of racialised surveillance technology, the use of police robots, and other ways in which AI can be directly responsible for bodily harm and even death.
By Yasmin Nair for Yasmin Nair on June 3, 2023
Chinese social media platforms, like Xiaohongshu, Kuaishou, and Douyin, are full of hundreds of accounts with American cop profile photos that aim to taunt Black users.
By Viola Zhou for Rest of World on May 23, 2023
Because of a bad facial recognition match and other hidden technology, Randal Reid spent nearly a week in jail, falsely accused of stealing purses in a state he said he had never even visited.
By Kashmir Hill and Ryan Mac for The New York Times on March 31, 2023
The application of face recognition technology in the criminal justice system threatens to perpetuate racial inequality.
By Alex Najibi for Science in the News on October 24, 2020
Cities and counties across the country have banned government use of face surveillance technology, and many more are weighing proposals to do so. From Boston to San Francisco, Jackson, Mississippi to Minneapolis, elected officials and activists know that face surveillance gives police the power to track us wherever we go. It also disproportionately impacts people of color, turns us all into perpetual suspects, increases the likelihood of being falsely arrested, and chills people’s willingness to participate in First Amendment-protected activities. Even Amazon, known for operating one of the largest video surveillance networks in the history of the world, extended its moratorium on selling face recognition to police.
By Matthew Guariglia for Electronic Frontier Foundation (EFF) on April 4, 2023
Ignoring earlier Dutch failures in automated decision making, and ignoring advice from its own experts, the Dutch Ministry of Foreign Affairs has decided to cut costs and cut corners by implementing a discriminatory profiling system to process visa applications.
Continue reading “Racist Technology in Action: You look similar to someone we didn’t like → Dutch visa denied”
The current wave of reporting on the AI bubble has one advantage: it also creates a bit of space in the media to write about how AI reflects the existing inequities in our society.
Continue reading “Work related to the Racism and Technology Center is getting media attention”
As part of a series of investigative reporting by Lighthouse Reports and WIRED, Gabriel Geiger has revealed some of the findings about the use of welfare fraud algorithms in Denmark. This comes amid the increasing use of algorithmic systems to detect welfare fraud across European cities, or at least those systems which are currently known.
Continue reading “Denmark’s welfare fraud system reflects a deeply racist and exclusionary society”
The Netherlands wants to be a frontrunner in the use of artificial intelligence in military situations. This technology, however, can lead to racism and discrimination. In an open letter, critics call for a moratorium on the use of artificial intelligence. Initiator Oumaima Hajri explains why.
By Oumaima Hajri for De Kanttekening on February 22, 2023
This past week the Dutch government hosted and organised the military AI conference REAIM 2023. Together with eight other NGOs, we signed an open letter, initiated by Oumaima Hajri, that calls on the Dutch government to stop promoting narratives of “innovation” and “opportunities” but, rather, centre the very real and often disparate human impact.
Continue reading “An alliance against military AI”
Civil society organisations urge the Dutch government to immediately establish a moratorium on developing AI systems in the military domain.
By Oumaima Hajri for Alliantie tegen militaire AI on February 15, 2023
Tiera Tanksley’s work seeks to better understand how forms of digitally mediated traumas, such as seeing images of Black people dead and dying on social media, are impacting Black girls’ mental and emotional wellness in the U.S. and Canada. Her fears were confirmed in her findings: Black girls report unprecedented levels of fear, depression, anxiety and chronic stress. Viewing Black people being killed by the state was deeply traumatic, with mental, emotional and physiological effects.
Continue reading “Profiting off Black bodies”
A critical, in-depth report on Top400 – a crime prevention project by the Amsterdam municipality that targets and polices minors (aged 12 to 23) – has emphasised the stigmatising, discriminatory, and invasive effects of the Top400 on youths and their families.
Continue reading “Amsterdam’s Top400 project stigmatises and over-criminalises youths”
The UK organisation No Tech for Tyrants (NT4T) has published an extensive report on the use of surveillance technologies by the police in the UK, US, Mexico, Brazil, Denmark and India, in collaboration with researchers and activists from these countries. The report, titled “Surveillance Tech Perpetuates Police Abuse of Power”, examines the relation between policing and technology through in-depth case studies.
Continue reading “Report: How police surveillance tech reinforces abuses of power”
The advent of predictive policing systems demonstrates an increased interest in more novel forms of data processing for the purpose of crime control. These developments have been the subject of much controversy, as there are significant concerns about the role these technologies play in shaping life chances and opportunities for individuals and different groups in society.
By Fieke Jansen for Data Justice Lab on November 17, 2022
In Moeders – premiering Thursday at Idfa – Nirit Peled criticises the Top 400, a list of names of Amsterdam youths who are at risk of sliding into serious crime.
By David Hielkema and Nirit Peled for Het Parool on November 9, 2022