D66 wants clarification on discriminatory anti-cheating software

D66 wants the full story on Proctorio. The anti-cheating software failed to recognise a student because of her dark skin colour, after which she decided to file a discrimination complaint with the Netherlands Institute for Human Rights. D66 wants to know how the cabinet will ensure that higher education does not use discriminatory software or technology in the future. This emerges from written questions submitted by Jeanet van der Laan (D66) to the Minister of Education, Culture and Science, Robbert Dijkgraaf.

By Anton Mous for VPNGids.nl on July 19, 2022

Easily developed facial recognition glasses show how unprepared we are for privacy violations

Two engineering students at Harvard University, Caine Ardayfio and AnhPhu Nguyen, developed real-time facial recognition glasses. They tested the glasses on passengers in the Boston subway and easily identified a former journalist and some of his articles. A great way to strike up small talk or break the ice, you might think.


Racist Technology in Action: AI detection of emotion rates Black basketball players as ‘angrier’ than their White counterparts

In 2018, Lauren Rhue showed that two leading emotion detection software products had a racial bias against Black men: Face++ rated them as angrier, and Microsoft AI rated them as more contemptuous.


Advocates speak out: Robin Pocornie

At this year's Big Brother Awards, extra attention was paid to the positive impact of champions of our internet freedom. The so-called Felipe Rodriquez Award – named after one of the founders of XS4ALL and a founding figure of the digital civil rights movement in the Netherlands – went to no fewer than five winners this year. With this award we want to inspire and motivate others to work together on our digital rights. In this interview series, we therefore gladly highlight the winners one by one. This time: Robin Pocornie. She won the award for raising the issue of racist anti-cheating software at the Vrije Universiteit Amsterdam.

By Lotje Beek, Lotte Houwing, and Robin Pocornie for Bits of Freedom on March 5, 2024

Dutch Higher Education continues to use inequitable proctoring software

In October last year, RTL Nieuws showed that Proctorio's software, used to check that students aren't cheating during online exams, works less well for students of colour. Five months later, RTL asked the twelve Dutch educational institutions on Proctorio's client list whether they were still using the tool. Eight say they still do.


The Network Society, part 48: Joy Buolamwini

I won't beat around the bush: I am a fan of this woman, which is why this blog bears her name. Below, it is mostly about a book she wrote: Unmasking AI: My Mission to Protect What Is Human in a World of Machines. I admire her above all because, even though she had every opportunity for a grand scientific career, she continually concerned herself with the victims of facial recognition: the digital technology she researched.

By Roeland Smeets for Netwerk Mediawijsheid on January 30, 2024

Late Night Talks: Students take their university to court over discriminatory AI software

Vrije Universiteit Amsterdam student Robin Pocornie and Naomi Appelman, co-founder of the non-profit Racism and Technology Center, discuss discrimination within artificial intelligence. What are the advantages and disadvantages of artificial intelligence, to what extent do we have a grip on it, and how can we counter discrimination amid the rapid development of technology?

By Charisa Chotoe, Naomi Appelman and Robin Pocornie for YouTube on December 3, 2023

Automating apartheid in the Occupied Palestinian Territories

In this interview, Matt Mahmoudi explains the Amnesty report titled Automating Apartheid, which he contributed to. The report exposes how the Israeli authorities extensively use surveillance tools, facial recognition technologies, and networks of CCTV cameras to support, intensify and entrench their continued domination and oppression of Palestinians in the Occupied Palestinian Territories (OPT), including Hebron and East Jerusalem. Facial recognition software is used by Israeli authorities to consolidate existing practices of discriminatory policing and segregation, violating Palestinians' basic rights.


Judgement of the Dutch Institute for Human Rights shows how difficult it is to legally prove algorithmic discrimination

On October 17th, the Netherlands Institute for Human Rights ruled that the VU did not discriminate against bioinformatics student Robin Pocornie on the basis of race by using anti-cheating software. However, according to the institute, the VU has discriminated on the grounds of race in how they handled her complaint.


Why we should believe Black women more than tech companies

Imagine companies building technology that is fundamentally racist: it is known that this technology fails for Black people almost 30 percent more often than for white people. Then imagine this technology being deployed in a crucial area of your life: your work, education, healthcare. And finally, imagine you are a Black woman and the technology works as expected: not for you. You file a complaint, only to hear from the national human rights institution that, in this case, it probably wasn't racism.

By Nani Jansen Reventlow for Volkskrant on October 22, 2023

Judgement of the Dutch Institute for Human Rights shows how difficult it is to legally prove algorithmic discrimination

Today, the Netherlands Institute for Human Rights ruled that the VU did not discriminate against bioinformatics student Robin Pocornie on the basis of race by using anti-cheating software. However, the VU did discriminate on the grounds of race in how it handled her complaint.


Ruling by the Netherlands Institute for Human Rights shows how difficult it is to legally prove algorithmic discrimination

Today, the Netherlands Institute for Human Rights ruled that the VU did not discriminate against bioinformatics student Robin Pocornie on the basis of race through its use of anti-cheating software. However, the VU did make a prohibited distinction on the grounds of race in handling her complaint.


Black people more often go unrecognised by anti-cheating software Proctorio

Faces of people with dark skin are recognised far less well by the exam software Proctorio, an investigation by RTL Nieuws shows. The software, which is meant to detect fraud, searches for a student's face during online exams. The fact that Black faces are recognised significantly worse leads to discrimination, say experts who reviewed RTL Nieuws' investigation.

By Stan Hulsen for RTL Nieuws on October 7, 2023

Proctoring software uses a fudge factor for dark-skinned students to adjust their suspicion score

Respondus, a vendor of online proctoring software, has been granted a patent for their “systems and methods for assessing data collected by automated proctoring.” The patent shows that their example method for calculating a risk score is adjusted on the basis of people’s skin colour.


Al Jazeera asks: Can AI eliminate human bias or does it perpetuate it?

In its online series of digital dilemmas, Al Jazeera takes a look at AI in relation to social inequities. Loyal readers of this newsletter will recognise many of the examples they touch on, like how Stable Diffusion exacerbates and amplifies racial and gender disparities or the Dutch childcare benefits scandal.


It is mainly women of colour who are calling out the biases of AI

What you put into self-learning AI systems is what you get back. Technology, largely developed by white men, thereby reinforces and conceals these biases. Women (of colour) in particular are sounding the alarm.

By Marieke Rotman, Nani Jansen Reventlow, Oumaima Hajri and Tanya O’Carroll for De Groene Amsterdammer on July 12, 2023

France wants to legalise mass surveillance for the Paris Olympics 2024: “Safety” and “security”, for whom?

Many governments are using mass surveillance to support law enforcement for the purposes of safety and security. In France, the French Parliament (and, before it, the French Senate) approved the use of automated behavioural video surveillance at the 2024 Paris Olympics. Simply put, France wants to legalise mass surveillance at the national level, which can violate many rights, such as freedom of assembly and association, privacy, and non-discrimination.


Your Voice is (Not) Your Passport

In summer 2021, sound artist, engineer, musician, and educator Johann Diedrick convened a panel at the intersection of racial bias, listening, and AI technology at Pioneerworks in Brooklyn, NY.

By Michelle Pfeifer for Sounding Out! on June 12, 2023

Attempts to eliminate bias through diversifying datasets? A distraction from the root of the problem

In this eloquent and haunting piece, Hito Steyerl weaves together the eugenicist history of statistics and its integration into machine learning. She elaborates why attempts to eliminate bias in facial recognition technology by diversifying datasets obscure the root of the problem: machine learning and automation are fundamentally reliant on extracting and exploiting human labour.


Consensus and subjectivity of skin tone annotation for ML fairness

Skin tone is an observable characteristic that is subjective, perceived differently by individuals (e.g., depending on their location or culture) and thus is complicated to annotate. That said, the ability to reliably and accurately annotate skin tone is highly important in computer vision. This became apparent in 2018, when the Gender Shades study highlighted that computer vision systems struggled to detect people with darker skin tones, and performed particularly poorly for women with darker skin tones. The study highlighted the importance for computer vision researchers and practitioners to evaluate their technologies across the full range of skin tones and at intersections of identities. Beyond evaluating model performance on skin tone, skin tone annotations enable researchers to measure diversity and representation in image retrieval systems, dataset collection, and image generation. For all of these applications, a collection of meaningful and inclusive skin tone annotations is key.

By Candice Schumann and Gbolahan O. Olanubi for Google AI Blog on May 15, 2023
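To make the idea of consensus annotation concrete, here is a minimal sketch (the image IDs, ratings, and helper function are hypothetical illustrations, not data or code from the Google study): several annotators rate each image on the 10-point Monk Skin Tone (MST) scale, and a consensus label plus a simple disagreement measure are derived per image.

```python
from statistics import median

# Hypothetical annotations: each image is rated by several annotators
# on the 10-point Monk Skin Tone (MST) scale (1 = lightest, 10 = darkest).
annotations = {
    "img_001": [3, 4, 3],
    "img_002": [8, 9, 7],
    "img_003": [5, 7, 6],
}

def consensus_and_spread(ratings):
    """Return the median rating (consensus) and the max disagreement (spread)."""
    return median(ratings), max(ratings) - min(ratings)

for image_id, ratings in annotations.items():
    mst, spread = consensus_and_spread(ratings)
    print(f"{image_id}: consensus MST {mst}, annotator spread {spread}")
```

The median is one reasonable aggregation because MST values are ordinal; images with a large spread are exactly the ones where the subjectivity the post describes shows up, and would warrant extra annotators or review.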

Mean Images

An artist considers a new form of machinic representation: the statistical rendering of large datasets, indexed to the probable rather than the real of photography; to the uncanny composite rather than the abstraction of the graph.

By Hito Steyerl for New Left Review on April 28, 2023

