Low uptake of ‘Smart Pricing’ feature among black hosts increased earnings gap.
By Dave Lee and Madhumita Murgia for Financial Times on May 13, 2021
We have written about the racist cropping algorithm that Twitter uses, and have shared how Twitter tried to fix the symptoms. Twitter also instituted an ‘algorithmic bug bounty’, asking researchers to prove bias in their algorithms.
Continue reading “Proof for Twitter’s bias toward lighter faces”

Racial discrimination in dynamic pricing algorithms is neither surprising nor new. VentureBeat writes about another recent study that supports these findings, in the context of dynamic pricing algorithms used by ride-hailing companies such as Uber, Lyft and other apps. Neighbourhoods that were poorer and had larger non-white populations were significantly associated with higher fare prices. A similar issue was discovered in Airbnb’s ‘Smart Pricing’ feature, which aims to help hosts secure more bookings. It turned out to be detrimental to Black hosts, leading to greater social inequality (even if unintentional).
Continue reading “Uber-racist: Racial discrimination in dynamic pricing algorithms”

Digital photo editing tools on apps like TikTok, Snapchat and Instagram are upholding warped beauty standards—and hurting people of color.
By Tate Ryan-Mosley for MIT Technology Review on August 15, 2021
Time and time again, big tech companies have shown their ability and power to (mis)represent and (re)shape our digital world. From speech, to images, and most recently, to the emojis that we regularly use.
Continue reading “Racist Technology in Action: Apple’s emoji keyboard reinforces Western stereotypes”

Platform rules often subject marginalized communities to heightened scrutiny while providing them with too little protection from harm.
By Laura Hecht-Felella and Ángel Díaz for Brennan Center for Justice on April 8, 2021
Facebook, Twitter, Instagram, YouTube and TikTok failing to act on most reported anti-Jewish posts, says study.
By Maya Wolfe-Robinson for The Guardian on August 1, 2021
The Plug and Fast Company looked at what happened to the 3.8 billion dollars that US-based tech companies committed to diversity, equity, and inclusion as their response to the Black Lives Matter protests.
Continue reading “Tech companies poured 3.8 billion USD into racial justice, but to what avail?”

Some refuse to choreograph Megan Thee Stallion song, highlighting how white users get credit for Black creativity.
By Kari Paul for The Guardian on June 24, 2021
A year ago, as our lives were being upended by the pandemic, Black Americans were simultaneously processing the emotional weight and tragedy of the murders of George Floyd, Breonna Taylor, Ahmaud Arbery, and others whose lives were cut short due to police brutality. The world watched as protest after protest erupted across the country over the summer of 2020. But, unlike previous collective actions, this moment felt different. Big Tech and corporate America—predominantly white environments—broke their silence. Companies started pledging to do things differently, claiming they would doggedly support Black workers, Black organizations, and Black companies via investments, donations, and hiring pledges. At The Plug, a subscription news and insights platform covering the Black innovation economy, we quickly began documenting the commitments made by tech CEOs, cross-referencing them with data points of what Black representation looked like across their workforces and boards. (You can view the original spreadsheet here.) A year later, we’re proud to continue that work, in partnership with Fast Company. Together we set out to try to understand—through data and first-person accounts—if anything really changed. How have the lives of Black tech workers, users, and citizens been altered by the bold commitments these companies made?
From Fast Company on June 16, 2021
Surveillance expert Chris Gilliard reflects on 2020’s racial justice protests, the hypocrisy of tech companies’ commitments, and where we are one year later.
By Chris Gilliard and Katharine Schwab for Fast Company on June 16, 2021
The feature associates “Africa” with the hut emoji and “China” with the dog emoji.
By Andrew Deck for Rest of World on June 15, 2021
Online dating platforms often provide a safe space for racist attitudes.
By Brady Robards, Bronwyn Carlson and Gene Lim for The Conversation on June 7, 2020
Group publishing archival photos claims images showing traditional dress or ceremonies were deleted for allegedly containing nudity.
By Mostafa Rachwani for The Guardian on May 27, 2021
How has activism evolved in our digital society? In this episode of Sudhir Breaks the Internet, Sudhir talks to Jade Magnus Ogunnaike about the intersection of big tech and civil rights. She is a senior campaign director for Color of Change. It’s a racial justice organization that blends traditional organizing efforts with an updated playbook for how to make change.
By Jade Magnus Ogunnaike and Sudhir Venkatesh for Freakonomics on May 17, 2021
In this article for The Markup, Dara Kerr offers an interesting insight into the plight of TikTokers who try to earn a living on the platform. TikTok’s algorithm, or how it decides what content gets a lot of exposure, is notoriously vague. With ever-changing policies and metrics, Kerr recounts how difficult it is to build up and retain a following on the platform. This vagueness not only creates difficulty for creators trying to monetize their content, but also leaves more room for TikTok to suppress or spread content at will.
Continue reading “At the mercy of the TikTok algorithm?”

Color of Change petition calls Google’s block on advertisers searching for social justice content “unacceptable”.
By Leon Yin for The Markup on May 4, 2021
A secretive algorithm that’s constantly being tweaked can turn influencers’ accounts, and their prospects, upside down.
By Dara Kerr for The Markup on April 22, 2021
The company is considering how its use of machine learning may reinforce existing biases.
By Anna Kramer for Protocol on April 14, 2021
In this piece for The Markup, Leon Yin and Aaron Sankin expose how Google bans advertisers from targeting terms such as “Black lives matter”, “antifascist” or “Muslim fashion”. At the same time, keywords such as “White lives matter” or “Christian fashion” are not banned. When they raised this striking discrepancy with Google, its response was to fix the discrepancies between religions and races by blocking all such terms, as well as by blocking even more social justice related keywords such as “I can’t breathe” or “LGBTQ”. Blocking these terms for ad placement can reduce the revenue of YouTubers fighting for these causes. Yin and Sankin place this policy in stark contrast to Google’s support for the Black Lives Matter movement.
Continue reading “Google blocks advertisers from targeting Black Lives Matter”

A recent report by ADL, an anti-hate organisation in the US, has shown that social media platforms have consistently failed to prevent online hate and harassment. Despite the self-regulatory efforts made by social media companies, results from ADL’s annual survey show that the level of online hate and harassment has barely shifted in the past three years. These online experiences disproportionately harm marginalised groups, with LGBTQI+, Asian-American, Jewish and African-American respondents reporting higher rates of various forms of harassment. Many of these problems are intrinsic to the ways in which the business models of social media platforms are optimised for maximum engagement, further exacerbating existing issues in society.
Continue reading “Online hate and harassment continue to proliferate”

Evidence suggests that there has been an uptick in the amount of prejudice against East Asia during the COVID-19 pandemic.
From The Alan Turing Institute on May 8, 2020
The Oversight Board has upheld Facebook’s decision to remove specific content that violated the express prohibition on posting caricatures of Black people in the form of blackface, contained in its Hate Speech Community Standard.
From Oversight Board on April 13, 2021
The banning of images of Zwarte Piet fits within Facebook’s policy of countering racist blackface stereotypes on its platforms. That is the ruling of an external board to which both users and Facebook itself can appeal to test whether content was rightly removed or not.
By Pieter Sabel for Volkskrant on April 13, 2021
Algorithm systematically removes their content or limits how much it can earn from advertising, they allege.
By Reed Albergotti for Washington Post on June 18, 2020
For a Markup feature, Leon Yin and Aaron Sankin compiled a list of “social and racial justice terms” with help from Color of Change, Media Justice, Mijente and Muslim Advocates, then checked if YouTube would let them target those terms for ads.
By Cory Doctorow for Pluralistic on April 10, 2021
“Black power” and “Black Lives Matter” can’t be used to find videos for ads, but “White power” and “White lives matter” were just fine.
By Aaron Sankin and Leon Yin for The Markup on April 9, 2021
Asian-Americans experienced the largest single rise in severe online hate and harassment year-over-year in comparison to other groups, with 17 percent having experienced sexual harassment, stalking, physical threats, swatting, doxing or sustained harassment this year compared to 11 percent last year, according to a survey released by ADL (the Anti-Defamation League). Fully half (50 percent) of Asian-American respondents who were harassed reported that the harassment was because of their race or ethnicity.
From ADL on March 24, 2021
The left must vie for control over the algorithms, data, and infrastructure that shape our lives.
By Meredith Whittaker and Nantina Vgontzas for The Nation on January 29, 2021
Since 2017, Mozilla – the makers of the Firefox browser – have written a yearly report on the health of the internet. This year’s report focuses on labor rights, transparency and racial justice. The piece about racial justice makes an interesting argument about how the sites we see on the first page of a search engine are a reflection of the general popularity of these sites or their ability to pay for a top result. This leads to a ‘mainstream’ bias.
Continue reading “The internet doesn’t have ‘universal’ users”

Emails show that the LAPD repeatedly asked camera owners for footage during the demonstrations, raising First Amendment concerns.
By Sam Biddle for The Intercept on February 16, 2021
Enabling Apple’s “Limit Adult Websites” filter in the iOS Screen Time setting will block users from seeing any Google search results for “Asian” in any browser on their iPhone. That’s not great, folks.
By Victoria Song for Gizmodo on February 4, 2021
Facebook placed a number of leftwing organizers on a restricted list during Biden’s inauguration. It’s part of a much bigger problem.
By Akin Olla for The Guardian on January 29, 2021
Apply to participate in Data & Society’s academic workshop, The Hustle Economy: Race, Gender and Digital Entrepreneurship. This online collaborative program on May 20, 2021 will have space for both deep dives into academic works-in-progress as well as multidisciplinary discussions of alternative practitioner projects that contribute to the understanding of hustle economies and their embodiments. Data & Society’s Director of Research and Associate Professor of Anthropology at the University of Washington Sareeta Amrute, Associate Professor at the University of North Carolina at Chapel Hill School of Information and Library Science Tressie McMillan Cottom, and Assistant Professor of Media Studies at the University of Virginia Lana Swartz invite applications from project leads to workshop their academic papers, podcasts, chapters, data mappings, and so on, and from collaborators to prepare interdisciplinary feedback on the selected works-in-progress. Together, we’ll help develop this emerging field centered on the lived experience, blunders, and promises of the digital economy.
From Data & Society on January 26, 2021
In light of the Black Lives Matter protests in the U.S. and protests against police brutality in Europe, technology companies have been quick to release corporate statements, commitments, campaigns and initiatives to tackle discrimination and racial injustice. Amber Hamilton evaluated 63 public facing documents from major technology companies such as Facebook, Instagram, Twitter, YouTube, Airbnb and TikTok.
Continue reading “Corporatespeak and racial injustice”

A recent, yet already classic, example of racist technology is Twitter’s photo cropping machine learning algorithm. The algorithm was shown to consistently prefer white faces in the cropped previews of pictures.
Continue reading “Racist technology in action: Cropping out the non-white”

Philosopher Dr. Natalie Ashton delves into the epistemic pitfalls of Facebook and the epistemic merits of Twitter.
By Natalie Ashton for Logically on November 26, 2020
The Oxford Internet Institute hosts Lisa Nakamura, Director of the Digital Studies Institute and Gwendolyn Calvert Baker Collegiate Professor in the Department of American Culture, University of Michigan, Ann Arbor. Professor Nakamura is the founding Director of the Digital Studies Institute at the University of Michigan, and a writer focusing on digital media, race, and gender. ‘We are living in an open-ended crisis with two faces: unexpected accelerated digital adoption and an impassioned and invigorated racial justice movement. These two vast and overlapping cultural transitions require new inquiry into the entangled and intensified dialogue between race and digital technology after COVID. My project analyzes digital racial practices on Facebook, Twitter, Zoom, and TikTok while we are in the midst of a technological and racialized cultural breaking point, both to speak from within the crisis and to leave a record for those who come after us. How to Understand Digital Racism After COVID-19 contains three parts: Methods, Objects, and Making, designed to provide humanists and critical social scientists from diverse disciplines or experience levels with pragmatic and easy to use tools and methods for accelerated critical analyses of the digital racial pandemic.’
From YouTube on November 12, 2020
Did a newsletter company create a more equitable media system—or replicate the flaws of the old one?
By Clio Chang for Columbia Journalism Review