Emma Pierson

She employs AI to get to the roots of health disparities across race, gender, and class.

By Neel V. Patel for MIT Technology Review on June 30, 2021

Covid-19 data: making racialised inequality in the Netherlands invisible

The CBS, the Dutch national statistics authority, issued a report in March showing that socioeconomic status is a clear risk factor for dying of Covid-19. In an insightful piece, researchers Linnet Taylor and Tineke Broer criticise this report and show that the way in which the CBS collects and aggregates data on Covid-19 cases and deaths obscures the full extent of racialised or ethnic inequality in the impact of the pandemic.

Continue reading “Covid-19 data: making racialised inequality in the Netherlands invisible”

Now you see it, now you don’t: how the Dutch Covid-19 data gap makes ethnic and racialised inequality invisible

All over the world, in the countries hardest hit by Covid-19, there is clear evidence that marginalised groups are suffering the worst impacts of the disease. This plays out differently in different countries: in the US, for instance, there are substantial differences in mortality rates by race and ethnicity. Israelis have a substantially lower death rate from Covid-19 than Palestinians in the West Bank or Gaza. In Brazil, being of mixed ancestry is the second most important risk factor, after age, for dying of Covid-19. These racial and ethnic (and related) differences also appear to be present in the Netherlands, but they have been rendered effectively politically invisible by the national public health authority’s refusal to report on them.

By Linnet Taylor and Tineke Broer for Global Data Justice on June 17, 2021

Understanding Digital Racism After COVID-19

The Oxford Internet Institute hosts Lisa Nakamura, founding Director of the Digital Studies Institute and Gwendolyn Calvert Baker Collegiate Professor in the Department of American Culture at the University of Michigan, Ann Arbor, and a writer focusing on digital media, race, and gender.

‘We are living in an open-ended crisis with two faces: unexpected accelerated digital adoption and an impassioned and invigorated racial justice movement. These two vast and overlapping cultural transitions require new inquiry into the entangled and intensified dialogue between race and digital technology after COVID. My project analyzes digital racial practices on Facebook, Twitter, Zoom, and TikTok while we are in the midst of a technological and racialized cultural breaking point, both to speak from within the crisis and to leave a record for those who come after us. How to Understand Digital Racism After COVID-19 contains three parts: Methods, Objects, and Making, designed to provide humanists and critical social scientists from diverse disciplines or experience levels with pragmatic and easy-to-use tools and methods for accelerated critical analyses of the digital racial pandemic.’

From YouTube on November 12, 2020

Dissecting racial bias in an algorithm used to manage the health of populations

The U.S. health care system uses commercial algorithms to guide health decisions. Obermeyer et al. find evidence of racial bias in one widely used algorithm, such that Black patients assigned the same level of risk by the algorithm are sicker than White patients (see the Perspective by Benjamin). The authors estimated that this racial bias reduces the number of Black patients identified for extra care by more than half. Bias occurs because the algorithm uses health costs as a proxy for health needs. Less money is spent on Black patients who have the same level of need, and the algorithm thus falsely concludes that Black patients are healthier than equally sick White patients. Reformulating the algorithm so that it no longer uses costs as a proxy for needs eliminates the racial bias in predicting who needs extra care.
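The proxy mechanism described above can be illustrated with a minimal synthetic sketch. This is not the authors’ actual model or data: the patient generator, group labels, the 0.6 spending factor, and the 20% flagging threshold are all illustrative assumptions. It only shows why ranking patients by predicted cost, when equal need generates lower recorded cost for one group, flags far fewer patients from that group than ranking by need itself would.

```python
import random

random.seed(0)

def simulate_patient(group):
    # True health need is identically distributed in both groups.
    need = random.uniform(0, 10)
    # Illustrative assumption: equal need produces lower recorded cost
    # for group "B", mirroring the paper's finding that less money is
    # spent on Black patients with the same level of need.
    spending_factor = 1.0 if group == "A" else 0.6
    cost = need * spending_factor
    return need, cost

# 5,000 synthetic patients per group, stored as (group, need, cost).
patients = [(g, *simulate_patient(g)) for g in ["A", "B"] for _ in range(5000)]

# A cost-proxy "algorithm": rank patients by recorded cost and flag the
# top 20% for extra care.
patients.sort(key=lambda p: p[2], reverse=True)
flagged_by_cost = patients[: len(patients) // 5]
share_b_cost = sum(1 for g, _, _ in flagged_by_cost if g == "B") / len(flagged_by_cost)
print(f"Group B share when ranking by cost: {share_b_cost:.2f}")

# Reformulating the target: rank by true need instead of cost.
patients.sort(key=lambda p: p[1], reverse=True)
flagged_by_need = patients[: len(patients) // 5]
share_b_need = sum(1 for g, _, _ in flagged_by_need if g == "B") / len(flagged_by_need)
print(f"Group B share when ranking by need: {share_b_need:.2f}")
```

Because both groups have the same need distribution, ranking by need flags roughly half of each group, while ranking by cost almost entirely excludes group B, even though no group label ever enters the model.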

By Ziad Obermeyer, Brian Powers, Christine Vogeli and Sendhil Mullainathan for Science on October 25, 2019
