Generative AI’s ability to ‘pink-wash’ Black and Queer protests
Using a very clever methodology, this year’s Digital Methods Initiative Summer School participants show how generative AI models like OpenAI’s GPT-4o will “dress up” controversial content, like war, protest, or porn, when you push the model to work with it.
Racist Technology in Action: AI detection of emotion rates Black basketball players as ‘angrier’ than their White counterparts
In 2018, Lauren Rhue showed that two leading emotion detection software products had a racial bias against Black men: Face++ rated them as angrier, and Microsoft AI rated them as more contemptuous.
Events, exhibits and other things to do
Starting July 23rd, 2024.
The datafication of race and ethnicity
The New York Times published a fascinating overview of the American census forms since the late 18th century. It shows how the form keeps trying to ‘capture’ the country’s demographics, “creating and reshaping the ever-changing views of racial and ethnic identity.”
How generative AI tools represent EU politicians: in a biased way
AlgorithmWatch experimented with three major generative AI tools, generating 8,700 images of politicians. They found that all these tools make an active effort to lessen bias, but that the way they attempt to do this is problematic.
Podcast: Art as a prophetic activity for the future of AI
Our own Hans de Zwart was a guest in the ‘Met Nerds om Tafel’ podcast. With Karen Palmer (creator of Consensus Gentium, a film about surveillance that watches you back), they discussed the role of art and storytelling in getting us ready for the future.
Racist Technology in Action: MyLife.com and discriminatory predation
MyLife.com is one of those immoral American companies that collect personal information to sell on as profiles, while at the same time suggesting to the people being profiled that incriminating information about them exists online, which they can have removed by buying a subscription (one that then does nothing and auto-renews in perpetuity).
Events, exhibits and other things to do
Starting June 26th, 2024.
Students with a non-European migration background had a 3.0 times higher chance of receiving an unfounded home visit from the Dutch student grants fraud department
Last year, Investico revealed how DUO, the Dutch organization for administering student grants, was using a racist algorithm to decide which students would get a home visit to check for fraudulent behaviour. The Minister of Education immediately stopped the use of the algorithm.
Dutch Ministry of Foreign Affairs dislikes the conclusions of a solid report that marks their visa process as discriminatory so buys a shoddy report saying the opposite
For more than a year now, the Dutch Ministry of Foreign Affairs has ignored advice from its experts and continued its use of discriminatory risk profiling of visa applicants.
Dutch Institute of Human Rights tells the government: “Test educational tools for possible discriminatory effects”
The Dutch Institute for Human Rights has commissioned research exploring the possible risks for discrimination and exclusion relating to the use of algorithms in education in the Netherlands.
Racist Technology in Action: Autocorrect is Western- and White-focused
The “I am not a typo” campaign is asking the tech giants to update their name dictionaries and stop autocorrecting the 41% of names given to babies in England and Wales.
Events, exhibits and other things to do
Starting May 28th, 2024.
AI detection has no place in education
The ubiquitous availability of AI has made plagiarism detection software utterly useless, argues our Hans de Zwart in the Volkskrant.
White supremacy and Artificial General Intelligence
Many AI bros are feverishly trying to attain what they call “Artificial General Intelligence” or AGI. In a piece on Medium, David Golumbia outlines connections between this pursuit of AGI and white supremacist thinking around “race science”.
Racist Technology in Action: Outsourced labour in Nigeria is shaping AI English
Generative AI uses particular English words far more often than you would expect. Even though it is impossible to know for sure whether a particular text was written by AI (see here), you can say something about this in aggregate.
Events, exhibits and other things to do
Starting April 30th, 2024.
OpenAI’s GPT sorts resumes with a racial bias
Bloomberg ran a clever experiment: they had OpenAI’s GPT rank resumes and found that it shows a gender and racial bias based solely on the candidate’s name.
Events, exhibits and other things to do
Starting April 2nd, 2024.
Google does performative identity politics, nonpologises, pauses their efforts, and will invariably move on to its next shitty moneymaking move
In a shallow attempt to do representation for representation’s sake, Google has managed to draw the ire of the right-wing internet by generating historically inaccurate and overly inclusive portraits of historical figures.
Robin Aisha Pocornie’s TEDx talk: “Error 404: Human Face Not Found”
Robin Aisha Pocornie’s case should by now be familiar to regular readers of our Center’s work. She has now told her story in her own voice at TEDxAmsterdam.
Racist Technology in Action: ChatGPT detectors are biased against non-native English writers
Students are using ChatGPT to write their essays. Anti-plagiarism tools try to detect whether a text was written by AI. It turns out that these types of detectors consistently misclassify the text of non-native speakers as AI-generated.
Events, exhibits and other things to do
Starting March 5th, 2024.
Dutch Higher Education continues to use inequitable proctoring software
In October last year, RTL news showed that Proctorio’s software, used to check whether students are cheating during online exams, works less well for students of colour. Five months later, RTL asked the twelve Dutch educational institutions on Proctorio’s client list whether they were still using the tool. Eight say they still do.
Anti-discrimination agencies have launched a single point of contact for reporting discrimination in the Netherlands
There is now a single point to report cases of discrimination. In a simple web form, you can make a report for yourself or somebody else, and you can do this anonymously if you want to.
Racist Technology in Action: Slower internet service for the same price in U.S. lower income areas with fewer White residents
Investigative reporting by The Markup showed how U.S. internet providers offer wildly different internet speeds for the same monthly fee. The neighbourhoods with the worst deals had lower median incomes and were very often the least White.
Events, exhibits and other things to do
Starting February 6th, 2024.
Dutch Tax Office keeps breaking the law with their risk profiling algorithms
Even though the Dutch tax office (the Belastingdienst) was advised to immediately stop the use of three risk profiling algorithms, the office decided to continue their use, according to this reporting by Follow the Money.
Events, exhibits and other things to do
Starting January 9th, 2024.
Events, exhibits and other things to do
Starting November 27th, 2023.
Judgement of the Dutch Institute for Human Rights shows how difficult it is to legally prove algorithmic discrimination
On October 17th, the Netherlands Institute for Human Rights ruled that the VU did not discriminate against bioinformatics student Robin Pocornie on the basis of race by using anti-cheating software. However, according to the institute, the VU has discriminated on the grounds of race in how they handled her complaint.
Events, exhibits and other things to do
Starting October 28th, 2023.
Proctoring software uses fudge-factor for dark skinned students to adjust their suspicion score
Respondus, a vendor of online proctoring software, has been granted a patent for their “systems and methods for assessing data collected by automated proctoring.” The patent shows that their example method for calculating a risk score is adjusted on the basis of people’s skin colour.
Al Jazeera asks: Can AI eliminate human bias or does it perpetuate it?
In its online series of digital dilemmas, Al Jazeera takes a look at AI in relation to social inequities. Loyal readers of this newsletter will recognise many of the examples they touch on, like how Stable Diffusion exacerbates and amplifies racial and gender disparities or the Dutch childcare benefits scandal.
Events, exhibits and other things to do
Starting September 30th, 2023.
Another false facial recognition match: pregnant woman wrongfully arrested
Police in America use facial recognition software to match security footage of crimes to people. Kashmir Hill describes, for the New York Times, another example of a wrong match leading to a wrongful arrest.
Dutch police used algorithm to predict violent behaviour without any safeguards
For many years, the Dutch police used a risk modelling algorithm to predict the chance that an individual suspect would commit a violent crime. Follow the Money exposed the total lack of any moral, legal, or statistical justification for its use, and the police have now stopped using the system.
Events, exhibits and other things to do
Starting September 2nd, 2023.
Current state of research: Face detection still has problems with darker faces
Scientific research on the quality of face detection systems keeps finding the same result: no matter how, when, or with which system the testing is done, faces of people with a darker skin tone are not detected as well as the faces of people with a lighter skin tone.
Racist Technology in Action: How Pokémon Go inherited existing racial inequities
When Aura Bogado was playing Pokémon Go in a much Whiter neighbourhood than the one where she lived, she noticed how many more PokéStops were suddenly available. She then crowdsourced locations of these stops and found out, with the Urban Institute think tank, that there were on average 55 PokéStops in majority White neighbourhoods and 19 in neighbourhoods that were majority Black.