Crowd-Sourced Suspicion Apps Are Out of Control

Technology rarely invents new societal problems. Instead, it digitizes them, supersizes them, and allows them to balloon and duplicate at the speed of light. That’s exactly the problem we’ve seen with location-based, crowd-sourced “public safety” apps like Citizen.

By Matthew Guariglia for Electronic Frontier Foundation (EFF) on October 21, 2021

Racist Technology in Action: Predicting future criminals with a bias against Black people

In 2016, ProPublica investigated the fairness of COMPAS, a system used by courts in the United States to assess the likelihood that a defendant will commit another crime. COMPAS calculates this risk from a defendant’s answers on a risk assessment form, and judges are expected to take the resulting prediction into account when they decide on a sentence.

Continue reading “Racist Technology in Action: Predicting future criminals with a bias against Black people”
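
ProPublica’s central test was to compare error rates across groups: among defendants who did not go on to reoffend, how often was each group nevertheless labelled high risk? Below is a minimal sketch of that kind of check, using synthetic placeholder records rather than the actual COMPAS data:

```python
# A minimal sketch of the fairness check ProPublica applied to COMPAS:
# compare false positive rates (labelled high risk but did not reoffend)
# across two groups. The records below are synthetic placeholders.

def false_positive_rate(records):
    # records: list of (predicted_high_risk: bool, reoffended: bool)
    non_reoffenders = [r for r in records if not r[1]]
    wrongly_flagged = [r for r in non_reoffenders if r[0]]
    return len(wrongly_flagged) / len(non_reoffenders)

group_a = [(True, False), (False, False), (True, True), (True, False), (False, True)]
group_b = [(False, False), (False, False), (True, True), (True, False), (False, True)]

print(f"group A false positive rate: {false_positive_rate(group_a):.0%}")
print(f"group B false positive rate: {false_positive_rate(group_b):.0%}")
# ProPublica found that Black defendants who did not reoffend were roughly
# twice as likely as white defendants to be labelled high risk.
```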

Racist and classist predictive policing exists in Europe too

The enduring idea that technology can solve many of society’s existing problems continues to permeate governments. For EUobserver, Fieke Jansen and Sarah Chander illustrate some of the problematic and harmful uses of ‘predictive’ algorithmic systems by states and public authorities across the UK and Europe.

Continue reading “Racist and classist predictive policing exists in Europe too”

Why EU needs to be wary that AI will increase racial profiling

This week the EU announces new regulations on artificial intelligence. It needs to set clear limits on the most harmful uses of AI, including predictive policing, biometric mass surveillance, and applications that exacerbate historic patterns of racist policing.

By Fieke Jansen and Sarah Chander for EUobserver on April 19, 2021

Rotterdam’s use of algorithms could lead to ethnic profiling

The Rekenkamer Rotterdam (the city’s Court of Audit) looked at how Rotterdam is using predictive algorithms and whether that use could lead to ethical problems. In their report, they describe how the city lacks a proper overview of the algorithms it is using, and how, with no coordination, no one takes responsibility when things go wrong. They also found that while one particular fraud detection algorithm did not use sensitive data (like nationality), so-called proxy variables for ethnicity – like low literacy, which can correlate with ethnicity – were still part of the calculations. According to the Rekenkamer this could lead to unfair treatment, or as we would call it: ethnic profiling.

Continue reading “Rotterdam’s use of algorithms could lead to ethnic profiling”
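
To see why such a proxy variable matters, here is a minimal, hypothetical sketch – not the actual Rotterdam system; all data is synthetic and the feature names are illustrative – of how a model that never sees ethnicity can still skew against an ethnic group through a correlated feature:

```python
# Hypothetical illustration: a "fraud risk" score that only uses a proxy
# feature (a low-literacy flag) can still disproportionately flag a
# minority group if that feature correlates with group membership.
import random

random.seed(42)

def make_person():
    # Synthetic population: group membership is recorded for the audit,
    # but it is never shown to the scoring function below.
    minority = random.random() < 0.3
    # Assumed (for illustration only): the low-literacy flag is more
    # common in the minority group, e.g. through language background.
    low_literacy = random.random() < (0.6 if minority else 0.1)
    return {"minority": minority, "low_literacy": low_literacy}

population = [make_person() for _ in range(10_000)]

def risk_score(person):
    # The model only uses the proxy feature, never ethnicity itself.
    return 1.0 if person["low_literacy"] else 0.2

flagged = [p for p in population if risk_score(p) > 0.5]
share_all = sum(p["minority"] for p in population) / len(population)
share_flagged = sum(p["minority"] for p in flagged) / len(flagged)

print(f"minority share of population: {share_all:.0%}")
print(f"minority share of people flagged for fraud checks: {share_flagged:.0%}")
# The flagged group is heavily skewed toward the minority group even though
# ethnicity was never an input: the proxy variable carried the signal.
```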

Racist Technology in Action: Amazon’s racist facial ‘Rekognition’

An already infamous example of racist technology is Amazon’s facial recognition system ‘Rekognition’, which exhibited enormous racial and gender bias. Researcher and founder of the Algorithmic Justice League Joy Buolamwini (the ‘poet of code‘), together with Deborah Raji, meticulously measured how accurately Rekognition identified different types of faces. Buolamwini and Raji’s study has been extremely consequential in laying bare the racism and sexism in these facial recognition systems, and was featured in the popular Coded Bias documentary.

Continue reading “Racist Technology in Action: Amazon’s racist facial ‘Rekognition’”

This is the EU’s chance to stop racism in artificial intelligence

As the European Commission prepares its legislative proposal on artificial intelligence, human rights groups are watching closely for clear rules to limit discriminatory AI. In practice, this means a ban on biometric mass surveillance practices and red lines (legal limits) to stop harmful uses of AI-powered technologies.

By Sarah Chander for European Digital Rights (EDRi) on March 16, 2021

The Fort Rodman Experiment

In 1965, IBM launched the most ambitious attempt ever to diversify a tech company. The industry still needs to learn the lessons of that failure.

By Charlton McIlwain for Logic on December 20, 2021

The Dutch government’s love affair with ethnic profiling

In his article for One World, Florentijn van Rootselaar shows how the Dutch government uses automated systems to profile certain groups based on their ethnicity. He uses several examples to expose how, even though Western countries are often quick to denounce China’s use of technology to surveil, profile and oppress the Uighurs, the same states themselves use or contribute to the development of similar technologies.

Continue reading “The Dutch government’s love affair with ethnic profiling”

Decode the Default

Technology has never been colorblind. It’s time to abolish notions of “universal” users of software.

From The Internet Health Report 2020 on January 1, 2021

How the Netherlands uses A.I. for ethnic profiling

China using artificial intelligence to oppress the Uyghurs: does that sound far removed from your daily life? The Netherlands, too, follows (and prosecutes) specific population groups with algorithms. As in Roermond, where cameras raise the alarm for cars with an Eastern European license plate.

By Florentijn van Rootselaar for OneWorld on January 14, 2021

Technology Can’t Predict Crime, It Can Only Weaponize Proximity to Policing

In June 2020, Santa Cruz, California became the first city in the United States to ban municipal use of predictive policing, a method of deploying law enforcement resources according to data-driven analytics that supposedly predict the perpetrators, victims, or locations of future crimes. Especially interesting is that Santa Cruz was also one of the first cities in the country to experiment with the technology, piloting and then adopting a predictive policing program in 2011. That program used historic and current crime data to divide parts of the city into 500-foot-by-500-foot blocks in order to pinpoint the locations most likely to be the scene of future crimes. After nine years, however, the city council voted unanimously to ban it over fears of how it perpetuated racial inequality.

By Matthew Guariglia for Electronic Frontier Foundation (EFF) on September 3, 2020
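
The mechanics the article describes are simple to sketch. Below is a minimal, hypothetical illustration – synthetic coordinates, not the Santa Cruz program’s actual code – of grid-based hotspot scoring: bin past incident reports into 500-foot cells and flag the cells with the most reports:

```python
# Hypothetical grid-based "hotspot" prediction: map historic incident
# reports to 500 ft x 500 ft cells and rank cells by past counts.
from collections import Counter

CELL_FT = 500  # side length of a grid cell in feet

def cell_of(x_ft, y_ft):
    # Map a coordinate (in feet from some city origin) to its grid cell.
    return (x_ft // CELL_FT, y_ft // CELL_FT)

# Synthetic "historic crime data": (x, y) locations of past reports.
past_reports = [(120, 80), (130, 90), (450, 470), (510, 60),
                (140, 85), (2100, 900), (135, 95), (2150, 930)]

counts = Counter(cell_of(x, y) for x, y in past_reports)

# "Predict": the cells with the most past reports get extra patrols.
for cell, n in counts.most_common(3):
    print(f"cell {cell}: {n} past reports -> flagged for patrol")

# Note the feedback loop this encodes: more patrols in a cell produce more
# recorded incidents there, which raises that cell's future score regardless
# of the underlying crime rate. That is the "weaponized proximity to
# policing" the headline refers to.
```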
