We at the Racism and Technology Center stand in solidarity with the Palestinian people. We condemn the violence enacted against the innocent people in Palestine and Israel, and mourn alongside all who are dead, injured and still missing. Palestinian communities are being subjected to unlawful collective punishment in Gaza and the West Bank, including the ongoing bombings and the blockade of water, food and energy. We call for an end to the blockade and an immediate ceasefire.
In the Netherlands, algorithmic discrimination is everywhere according to the Dutch Data Protection Authority
In its 2023 annual report, the Autoriteit Persoonsgegevens (the Dutch Data Protection Authority) is dismayed by how much algorithmic discrimination it encounters in the course of its oversight work.
Racist Technology in Action: Michigan car insurers are allowed to charge a higher premium in Black neighbourhoods
An investigation by The Markup and Outlier Media shows how the law in Michigan allows car insurers to take location into account when deciding on a premium, penalizing the state’s Black population.
Generative AI’s ability to ‘pink-wash’ Black and Queer protests
Using a very clever methodology, this year’s Digital Method Initiative Summer School participants show how generative AI models like OpenAI’s GPT-4o will “dress up” controversial content, like war, protest, or porn, when you push the model to work with it.
Racist Technology in Action: AI detection of emotion rates Black basketball players as ‘angrier’ than their White counterparts
In 2018, Lauren Rhue showed that two leading emotion detection software products had a racial bias against Black men: Face++ rated them as angrier, and Microsoft AI as more contemptuous.
The datafication of race and ethnicity
The New York Times published a fascinating overview of the American census forms since the late 18th century. It shows how the form keeps trying to ‘capture’ the country’s demographics, “creating and reshaping the ever-changing views of racial and ethnic identity.”
How generative AI tools represent EU politicians: in a biased way
Algorithm Watch experimented with three major generative AI tools, generating 8,700 images of politicians. They found that all these tools make an active effort to lessen bias, but that the way they attempt to do this is problematic.
Podcast: Art as a prophetic activity for the future of AI
Our own Hans de Zwart was a guest in the ‘Met Nerds om Tafel’ podcast. With Karen Palmer (creator of Consensus Gentium, a film about surveillance that watches you back), they discussed the role of art and storytelling in getting us ready for the future.
Racist Technology in Action: MyLife.com and discriminatory predation
MyLife.com is one of those immoral American companies that collect personal information to sell on as profiles, while at the same time suggesting to the people being profiled that incriminating information about them exists online, which they can have removed by buying a subscription (that then does nothing and auto-renews in perpetuity).
Students with a non-European migration background had a 3.0 times higher chance of receiving an unfounded home visit from the Dutch student grants fraud department
Last year, Investico revealed how DUO, the Dutch organization for administering student grants, was using a racist algorithm to decide which students would get a home visit to check for fraudulent behaviour. The Minister of Education immediately stopped the use of the algorithm.
Dutch Ministry of Foreign Affairs dislikes the conclusions of a solid report that marks their visa process as discriminatory so buys a shoddy report saying the opposite
For more than a year now, the Dutch Ministry of Foreign Affairs has ignored advice from its experts and continued its use of discriminatory risk profiling of visa applicants.
Dutch Institute of Human Rights tells the government: “Test educational tools for possible discriminatory effects”
The Dutch Institute for Human Rights has commissioned research exploring the possible risks for discrimination and exclusion relating to the use of algorithms in education in the Netherlands.
Racist Technology in Action: Autocorrect is Western- and White-focused
The “I am not a typo” campaign is asking the tech giants to update their name dictionaries and stop autocorrecting the 41% of names given to babies in England and Wales.
AI detection has no place in education
The ubiquitous availability of AI has made plagiarism detection software utterly useless, argues our Hans de Zwart in the Volkskrant.
The datasets to train AI models need more checks for harmful and illegal materials
This Atlantic conversation between Matteo Wong and Abeba Birhane touches on some critical issues surrounding the use of large datasets to train AI models.
White supremacy and Artificial General Intelligence
Many AI bros are feverishly trying to attain what they call “Artificial General Intelligence” or AGI. In a piece on Medium, David Golumbia outlines connections between this pursuit of AGI and white supremacist thinking around “race science”.
Racist Technology in Action: Outsourced labour in Nigeria is shaping AI English
Generative AI uses particular English words far more often than you would expect. Even though it is impossible to know for sure that a particular text was written by AI, you can say something about this in aggregate.
Ethnic profiling is a problem in all of the Dutch government
On the International Day against Racism and Discrimination, Amnesty International Netherlands published their new research on the lack of protection by the Dutch government against racial profiling. Amnesty calls for immediate action to address the pervasive issue of ethnic profiling in law enforcement practices.
Tech workers demand Google and Amazon to stop their complicity in Israel’s genocide against the Palestinian people
Since 2021, thousands of Amazon and Google tech workers have been organising against Project Nimbus, Google and Amazon’s shared USD$1.2 billion contract with the Israeli government and military. In all that time, there has been no response from management or executives. The workers’ organising efforts have accelerated since 7 October 2023, with the Israeli state’s ongoing genocide in Gaza and the occupied Palestinian territories.
OpenAI’s GPT sorts resumes with a racial bias
Bloomberg did a clever experiment: they had OpenAI’s GPT rank resumes and found that it shows a gender and racial bias just on the basis of the name of the candidate.
Racist Technology in Action: The UK Home Office’s Sorting Algorithm and the Racist Violence of Borders
In 2020, two NGOs finally forced the UK Home Office’s hand, compelling it to abandon its secretive and racist algorithm for sorting visitor visa applications. Foxglove and the Joint Council for the Welfare of Immigrants (JCWI) had been battling the algorithm for years, arguing that it was a form of institutionalized racism and calling it “speedy boarding for white people.”
The child benefits scandal: no lessons learned
“It could happen again tomorrow” is one of the main devastating conclusions of the parliamentary inquiry following the child benefits scandal.
Google does performative identity politics, nonpologises, pauses their efforts, and will invariably move on to its next shitty moneymaking move
In a shallow attempt to do representation for representation’s sake, Google has managed to draw the ire of the right-wing internet by generating historically inaccurate and overly inclusive portraits of historical figures.
Robin Aisha Pocornie’s TEDx talk: “Error 404: Human Face Not Found”
Robin Aisha Pocornie’s case should by now be familiar to regular readers of our Center’s work. Robin has now told this story in her own voice at TEDxAmsterdam.
Racist Technology in Action: ChatGPT detectors are biased against non-native English writers
Students are using ChatGPT for writing their essays. Antiplagiarism tools are trying to detect whether a text was written by AI. It turns out that these types of detectors consistently misclassify texts by non-native speakers as AI-generated.
Dutch Higher Education continues to use inequitable proctoring software
In October last year, RTL news showed that Proctorio’s software, used to check that students aren’t cheating during online exams, works less well for students of colour. Five months later, RTL asked the twelve Dutch educational institutions on Proctorio’s client list whether they were still using the tool. Eight say they still do.
On “The Palestine Laboratory”
A large part of Israel’s economy and global influence is dependent on its military-technology complex, which not only fuels the ongoing genocide in Gaza but is also exported to facilitate oppression around the world. In this thorough 2023 book, journalist Anthony Loewenstein makes explicit how Israel’s military-industrial complex profits exorbitantly from exporting technologies “battle-tested” on occupied Gaza and the West Bank.
Anti-discrimination agencies have launched a single point of contact for reporting discrimination in the Netherlands
There is now a single point to report cases of discrimination. In a simple web form, you can make a report for yourself or somebody else, and you can do this anonymously if you want to.
Racist Technology in Action: Slower internet service for the same price in U.S. lower income areas with fewer White residents
Investigative reporting by The Markup showed how U.S. internet providers offer wildly different internet speeds for the same monthly fee. The neighbourhoods with the worst deals had lower median incomes and were very often the least White.
Dutch Tax Office keeps breaking the law with their risk profiling algorithms
Even though the Dutch tax office (the Belastingdienst) was advised to immediately stop the use of three risk profiling algorithms, the office decided to continue their use, according to this reporting by Follow the Money.
“Mowing the lawn”: The weaponisation of water and technology in Palestine
In the most recent issue of Logic(s) Magazine, Edward Ongweso Jr. writes about Israel’s strategy towards Gaza called “mowing the lawn”: bursts of horrifying violence – a collective punishment of Palestinian people – followed by “calmer” periods where survivors are left to bury the dead, and rebuild their infrastructure while Israel continues to deepen its occupation.
Racist Technology in Action: Meta systemically censors and silences Palestinian content globally
The censorship and silencing of Palestinian voices, and of those who support Palestine, is not new. However, since the escalation of Israel’s violence on the Gaza Strip on 7 October 2023, the scale of censorship has heightened significantly, particularly on social media platforms such as Instagram and Facebook. In December 2023, Human Rights Watch (HRW) released a 51-page report stating that Meta has engaged in systematic and global censorship of content related to Palestine since October 7th.
Automating apartheid in the Occupied Palestinian Territories
In this interview, Matt Mahmoudi explains the Amnesty report titled Automating Apartheid, which he contributed to. The report exposes how the Israeli authorities extensively use surveillance tools, facial recognition technologies, and networks of CCTV cameras to support, intensify and entrench their continued domination and oppression of Palestinians in the Occupied Palestinian Territories (OPT), including Hebron and East Jerusalem. Israeli authorities use facial recognition software to consolidate existing practices of discriminatory policing and segregation, violating Palestinians’ basic rights.
Racist Technology in Action: Generative/ing AI Bias
By now we know that generative image AI reproduces and amplifies sexism, racism, and other social systems of oppression. The latest example is of AI-generated stickers in WhatsApp that systematically depict Palestinian men and boys with rifles and guns.
Judgement of the Dutch Institute for Human Rights shows how difficult it is to legally prove algorithmic discrimination
On October 17th, the Netherlands Institute for Human Rights ruled that the VU did not discriminate against bioinformatics student Robin Pocornie on the basis of race by using anti-cheating software. However, according to the Institute, the VU did discriminate on the grounds of race in how it handled her complaint.
Proctoring software uses fudge-factor for dark skinned students to adjust their suspicion score
Respondus, a vendor of online proctoring software, has been granted a patent for their “systems and methods for assessing data collected by automated proctoring.” The patent shows that their example method for calculating a risk score is adjusted on the basis of people’s skin colour.
Equal love: Dating App Breeze seeks to address Algorithmic Discrimination
In a world where swiping left or right is the main route to love, whose profiles dating apps show you can change the course of your life.
Use of machine translation tools exposes already vulnerable asylum seekers to even more risks
The use of and reliance on machine translation tools in asylum seeking procedures has become increasingly common amongst government contractors and organisations working with refugees and migrants. This Guardian article highlights many of the issues documented by Respond Crisis Translation, a network of people who provide urgent interpretation services for migrants and refugees. The problems with machine translation tools occur throughout the asylum process, from border stations to detention centers to immigration courts.
Al Jazeera asks: Can AI eliminate human bias or does it perpetuate it?
In its online series of digital dilemmas, Al Jazeera takes a look at AI in relation to social inequities. Loyal readers of this newsletter will recognise many of the examples they touch on, like how Stable Diffusion exacerbates and amplifies racial and gender disparities or the Dutch childcare benefits scandal.
Racist Technology in Action: Flagged as risky simply for requesting social assistance in Veenendaal, The Netherlands
This collaborative investigative effort by Spotlight Bureau, Lighthouse Reports and Follow the Money, dives into the story of a Moroccan-Dutch family in Veenendaal which was targeted for fraud by the Dutch government.