Police in America are using facial recognition software to match security footage of crimes to people. Kashmir Hill describes for the New York Times another example of a wrong match leading to a wrongful arrest.
Porcha Woodruff was eight months pregnant and clearly in no shape to have committed a robbery and a carjacking, but that didn’t stop the police from taking her into custody on the basis of a match that their facial recognition software had made. This is the sixth known case in the U.S. of a person being wrongfully accused of a crime as a result of the use of facial recognition technology. All six people have been Black.
Our own Naomi Appelman was interviewed about this case for the Dutch nightly radio show Met het Oog op Morgen. You can listen to it here:
Naomi argued that these problems are not just a U.S. phenomenon but happen in the Netherlands too. She explained why we shouldn’t look at this as a purely technological problem (one that could be fixed by simply improving the technology), but should instead see it as a societal problem. This particular case, for example, can’t be seen apart from the broader problem of racism in policing in the United States.
See: Eight Months Pregnant and Arrested After False Facial Recognition Match at the New York Times, and Met het Oog op Morgen: Gezichtsherkenning herkent zwarte vrouw niet at NPO Radio 1.
Photo of Porcha Woodruff by Nic Antaya from the original New York Times article.