Easily developed facial recognition glasses outline how underprepared we are for privacy violations

Two Harvard University engineering students, Caine Ardayfio and AnhPhu Nguyen, developed real-time facial recognition glasses. They tested them on passengers in the Boston subway and easily identified a former journalist and some of his articles. A great way to spark small talk or break the ice, you might think.


Surveilling Europe’s edges: when research legitimises border violence

In May 2024, Access Now’s Caterina Rodelli travelled across Greece to meet with local civil society organisations supporting migrant people and monitoring human rights violations, and to see first-hand how and where surveillance technologies are deployed at Europe’s borders. In the second instalment of a three-part blog series, she explains how EU-funded research projects on border surveillance are legitimising violent migration policies. Catch up on part one here.

By Caterina Rodelli for Access Now on September 25, 2024

Surveilling Europe’s edges: detention centres as a blueprint for mass surveillance

In May 2024, Access Now’s Caterina Rodelli travelled across Greece to meet with local civil society organisations supporting migrant people and monitoring human rights violations, and to see first-hand how and where surveillance technologies are deployed at Europe’s borders. In the third and final instalment of a three-part blog series, she explains how new migrant detention centres on the Greek island of Samos are shaping the blueprint for EU-wide mass surveillance.

By Caterina Rodelli for Access Now on October 2, 2024

Borders and Bytes

So-called “smart” borders are just more sophisticated sites of racialized surveillance and violence. We need abolitionist tools to counter them.

By Ruha Benjamin for Inquest on February 13, 2024

‘Forget the control state, we live in a control society’

According to Marc Schuilenburg, endowed professor of digital surveillance, we no longer have any secrets. In everything we do, something or someone is watching and recording our movements. We know it, yet we simply go along with it. That is how deeply digital surveillance has seeped into the capillaries of our society: ‘We often don’t even recognise it anymore.’

By Marc Schuilenburg and Sebastiaan Brommersma for Follow the Money on February 4, 2024

Automating apartheid in the Occupied Palestinian Territories

In this interview, Matt Mahmoudi explains the Amnesty report titled Automating Apartheid, which he contributed to. The report exposes how the Israeli authorities extensively use surveillance tools, facial recognition technologies, and networks of CCTV cameras to support, intensify and entrench their continued domination and oppression of Palestinians in the Occupied Palestinian Territories (OPT), including Hebron and East Jerusalem. Israeli authorities use facial recognition software to consolidate existing practices of discriminatory policing and segregation, violating Palestinians’ basic rights.


Standing in solidarity with the Palestinian people

We at the Racism and Technology Center stand in solidarity with the Palestinian people. We condemn the violence enacted against the innocent people in Palestine and Israel, and mourn alongside all who are dead, injured and still missing. Palestinian communities are being subjected to unlawful collective punishment in Gaza and the West Bank, including the ongoing bombings and the blockade of water, food and energy. We call for an end to the blockade and an immediate ceasefire.


France wants to legalise mass surveillance for the Paris Olympics 2024: “Safety” and “security”, for whom?

Many governments are using mass surveillance to support law enforcement in the name of safety and security. In France, the French Parliament (and before it, the French Senate) has approved the use of automated behavioural video surveillance at the 2024 Paris Olympics. Simply put, France wants to legalise mass surveillance at the national level, which can violate many rights, such as the freedom of assembly and association, privacy, and non-discrimination.


Your Voice is (Not) Your Passport

In summer 2021, sound artist, engineer, musician, and educator Johann Diedrick convened a panel on the intersection of racial bias, listening, and AI technology at Pioneer Works in Brooklyn, NY.

By Michelle Pfeifer for Sounding Out! on June 12, 2023

Mean Images

An artist considers a new form of machinic representation: the statistical rendering of large datasets, indexed to the probable rather than the real of photography; to the uncanny composite rather than the abstraction of the graph.

By Hito Steyerl for New Left Review on April 28, 2023

Dutch student files complaint with the Netherlands Institute for Human Rights about the use of racist software by her university

During the pandemic, Dutch student Robin Pocornie had to take her exams with a light shining straight at her face. Her White fellow students didn’t have to do that. Her university’s surveillance software discriminated against her, which is why she has filed a complaint (read the full complaint in Dutch) with the Netherlands Institute for Human Rights.


NoTechFor: Forced Assimilation

Following the 2015 terror attack in Denmark, the state amped up its data analytics capabilities for counter-terrorism within the police and the Danish Security and Intelligence Service (PET). Denmark hosts an established, normalised, and widely accepted public surveillance infrastructure, justified in the service of public health and of greater centralisation and coordination between government and municipalities in delivering citizen services. It also boasts an intelligence service with extraordinarily expansive surveillance capabilities, one that enjoys wide exemptions from data protection regulations.

From No Tech for Tyrants on July 13, 2020

The Dutch government wants to continue to spy on activists’ social media

Investigative journalism by NRC brought to light that the Dutch NCTV (the National Coordinator for Counterterrorism and Security) uses fake social media accounts to track Dutch activists. The agency also targets activists working in the social justice or anti-discrimination space, tracking their work, sentiments and movements through their social media accounts. This is a clear example of how digital communication allows governments to intensify their surveillance and criminalisation of political opinions outside the mainstream.


Crowd-Sourced Suspicion Apps Are Out of Control

Technology rarely invents new societal problems. Instead, it digitizes them, supersizes them, and allows them to balloon and duplicate at the speed of light. That’s exactly the problem we’ve seen with location-based, crowd-sourced “public safety” apps like Citizen.

By Matthew Guariglia for Electronic Frontier Foundation (EFF) on October 21, 2021

Racist and classist predictive policing exists in Europe too

The enduring idea that technology will be able to solve many of society’s existing problems continues to permeate governments. For the EUObserver, Fieke Jansen and Sarah Chander illustrate some of the problematic and harmful uses of ‘predictive’ algorithmic systems by states and public authorities across the UK and Europe.


Online proctoring excludes and discriminates

The use of software to automatically detect cheating on online exams – online proctoring – has been the go-to solution for many schools and universities in response to the COVID-19 pandemic. In this article, Shea Swauger addresses some of the potential discriminatory, privacy and security harms that can impact groups of students across class, gender, race, and disability lines. Swauger provides a critique of how these technologies encode “normal” bodies – cisgender, white, able-bodied, neurotypical, male – as the standard, and how students who do not (or cannot) conform are punished by them.

