Digital Apartheid in Gaza: Unjust Content Moderation at the Request of Israel’s Cyber Unit

Government involvement in content moderation raises serious human rights concerns in every context. Since October 7, social media platforms have been challenged for unjustified takedowns of pro-Palestinian content—sometimes at the request of the Israeli government—and for a simultaneous failure to remove hate speech towards Palestinians. More specifically, social media platforms have worked with the Israeli Cyber Unit—a government office set up to issue takedown requests to platforms—to remove content considered incitement to violence and terrorism, as well as any promotion of groups widely designated as terrorists.

By Jillian C. York and Paige Collings for Electronic Frontier Foundation (EFF) on July 26, 2024

Racist Technology in Action: MyLife.com and discriminatory predation

MyLife.com is one of those immoral American companies that, on the one hand, collects personal information to sell on as profiles, while at the same time suggesting to the people being profiled that incriminating information about them exists online, which they can have removed by buying a subscription (one that then does nothing and auto-renews in perpetuity).

Continue reading “Racist Technology in Action: MyLife.com and discriminatory predation”

Dismantling the “Black Opticon”: Privacy, Race, Equity, and Online Data-Protection Reform

African Americans online face three distinguishable but related categories of vulnerability to bias and discrimination that I dub the “Black Opticon”: discriminatory oversurveillance, discriminatory exclusion, and discriminatory predation. Escaping the Black Opticon is unlikely without acknowledgement of privacy’s unequal distribution and privacy law’s outmoded and unduly race-neutral façade. African Americans could benefit from race-conscious efforts to shape a more equitable digital public sphere through improved laws and legal institutions. This Essay critically elaborates the Black Opticon triad and considers whether the Virginia Consumer Data Protection Act (2021), the federal Data Protection Act (2021), and new resources for the Federal Trade Commission proposed in 2021 possibly meet imperatives of a race-conscious African American Online Equity Agenda, specifically designed to help dismantle the Black Opticon. The path forward requires jumping those hurdles, regulating platforms, and indeed all of the digital economy, in the interests of nondiscrimination, antiracism, and antisubordination. Toward escaping the Black Opticon’s pernicious gaze, African Americans and their allies will continue the pursuit of viable strategies for justice and equity in the digital economy.

By Anita L. Allen for The Yale Law Journal on February 20, 2022

Google does performative identity politics, nonpologises, pauses their efforts, and will invariably move on to its next shitty moneymaking move

In a shallow attempt to do representation for representation’s sake, Google has managed to draw the ire of the right-wing internet by generating historically inaccurate and overly inclusive portraits of historical figures.

Continue reading “Google does performative identity politics, nonpologises, pauses their efforts, and will invariably move on to its next shitty moneymaking move”

‘Quitting Twitter does nothing against racism’

Many media outlets and journalists who so firmly condemn the hate directed on X at De Slimste Mens contestant Akwasi are speaking with a forked tongue, argues OneWorld editor-in-chief Seada Nourhussen. For years they contributed to that hate, without acknowledging the racism underlying it.

By Seada Nourhussen for OneWorld on January 12, 2024

Racist Technology in Action: Meta systemically censors and silences Palestinian content globally

The censorship and silencing of Palestinian voices, and of those who support Palestine, is not new. However, since the escalation of Israel’s violence in the Gaza Strip after 7 October 2023, the scale of censorship has heightened significantly, particularly on social media platforms such as Instagram and Facebook. In December 2023, Human Rights Watch (HRW) released a 51-page report stating that Meta has engaged in systematic and global censorship of content related to Palestine since October 7th.

Continue reading “Racist Technology in Action: Meta systemically censors and silences Palestinian content globally”

The racial economy of Instagram

This paper explores the mechanisms of white supremacy within digital spaces in relation to the body/embodiment, social justice movements, and the nature and expression of contemporary feminism. New digital political economies work through social media such as Instagram to colonise, disempower and obscure the work of Black feminists in the sphere of fat liberation (re-framed as ‘body positivity’), and in terms of imperatives for self-care, which have been co-opted by an emerging online wellness industry. I call to account the pervasiveness of neoliberal logics which are re-shaping (post)feminism and re-inscribing white supremacy onto bodies online and offline through ‘disciplined whiteness’.

By Sinéad O’Connor for RGS-IBG Publications Hub on September 28, 2023

Standing in solidarity with the Palestinian people

We at the Racism and Technology Center stand in solidarity with the Palestinian people. We condemn the violence enacted against the innocent people in Palestine and Israel, and mourn alongside all who are dead, injured and still missing. Palestinian communities are being subjected to unlawful collective punishment in Gaza and the West Bank, including the ongoing bombings and the blockade of water, food and energy. We call for an end to the blockade and an immediate ceasefire.

Continue reading “Standing in solidarity with the Palestinian people”

The Costs of Connection – How Data is Colonizing Human Life and Appropriating it for Capitalism

A profound exploration of how the ceaseless extraction of information about our intimate lives is remaking both global markets and our very selves. The Costs of Connection represents an enormous step forward in our collective understanding of capitalism’s current stage, a stage in which the final colonial input is the raw data of human life. Challenging, urgent and bracingly original.

By Nick Couldry and Ulises A. Mejias for Colonized by Data

What’s at stake with losing (Black) Twitter and moving to (white) Mastodon?

The imminent demise of Twitter after Elon Musk’s takeover sparked an exodus of people leaving the platform, which is only expected to increase. The significant increase in hate speech and the generally hostile atmosphere created by the erratic decrees of its owner (such as Trump’s reinstatement) made remaining, in the words of New Yorker writer Jelani Cobb, “completely untenable”. This, often vocal, movement of people away from the platform has sparked a debate on what people stand to lose and what the alternative is.

Continue reading “What’s at stake with losing (Black) Twitter and moving to (white) Mastodon?”

Profiting off Black bodies

Tiera Tanksley’s work seeks to better understand how forms of digitally mediated traumas, such as seeing images of Black people dead and dying on social media, are impacting Black girls’ mental and emotional wellness in the U.S. and Canada. Her fears were confirmed in her findings: Black girls report unprecedented levels of fear, depression, anxiety and chronic stress. Viewing Black people being killed by the state was deeply traumatic, with mental, emotional and physiological effects.

Continue reading “Profiting off Black bodies”

I’m @Sinders on Mastodon but I’m not giving up on Twitter, yet

I’m sure you’ve seen the tweets, and the think pieces about how much worse Twitter is gonna get. My friend Justin Hendrix mentioned losing a few hundred followers in the space of a few hours, after Elon brought a sink into Twitter headquarters (which is the lamest bit I’ve ever seen, a massive fail of a dad joke). A huge chunk of people I follow now have their Mastodon handles in their Twitter names. It’s a chunk of the influencers, academics, activists, and civil society folks, the researchers who I follow, who are actively mourning, and hand wringing, about the destruction that is to come, already in the throes of grief for the Twitter that was. But the thing is: all of these folks are white.

By Caroline Sinders for Medium on October 31, 2022

The Whiteness of Mastodon

A conversation with Dr. Johnathan Flowers about Elon Musk’s changes at Twitter and the dynamics on Mastodon, the decentralized alternative.

By Johnathan Flowers and Justin Hendrix for Tech Policy Press on November 23, 2022

AI innovation for whom, and at whose expense?

This fantastic article by Williams, Miceli and Gebru describes how the methodological shift of AI systems to deep-learning-based models has required enormous amounts of “data” for models to learn from. Large volumes of time-consuming work, such as labelling millions of images, can now be broken down into smaller tasks and outsourced to data labourers across the globe. These data labourers are paid terribly low wages and often work in dire conditions.

Continue reading “AI innovation for whom, and at whose expense?”

Algorithmic power and African indigenous languages: search engine autocomplete and the global multilingual Internet

Predictive language technologies – such as Google Search’s Autocomplete – constitute forms of algorithmic power that reflect and compound global power imbalances between Western technology companies and multilingual Internet users in the global South. Increasing attention is being paid to predictive language technologies and their impacts on individual users and public discourse. However, there is a lack of scholarship on how such technologies interact with African languages. Addressing this gap, the article presents data from experimentation with autocomplete predictions/suggestions for gendered or politicised keywords in Amharic, Kiswahili and Somali. It demonstrates that autocomplete functions for these languages and how users may be exposed to harmful content due to an apparent lack of filtering of problematic ‘predictions’. Drawing on debates on algorithmic power and digital colonialism, the article demonstrates that global power imbalances manifest here not through a lack of online African indigenous language content, but rather in regard to the moderation of content across diverse cultural and linguistic contexts. This raises dilemmas for actors invested in the multilingual Internet between risks of digital surveillance and effective platform oversight, which could prevent algorithmic harms to users engaging with platforms in a myriad of languages and diverse socio-cultural and political environments.

By Peter Chonka, Stephanie Diepeveen and Yidnekachew Haile for SAGE Journals on June 22, 2022

Meta forced to change its advertisement algorithm to address algorithmic discrimination

In his New York Times article, Mike Isaac describes how Meta is implementing a new system to automatically check whether the housing, employment and credit ads it hosts are shown to people equally. This move follows a US$115,054 fine issued to Meta by the US Justice Department, after its ad systems were shown to discriminate against users by, amongst other things, excluding Black people from seeing certain housing ads in predominantly white neighbourhoods. This is the outcome of a long process, which we have written about previously.

Continue reading “Meta forced to change its advertisement algorithm to address algorithmic discrimination”
