Current state of research: Face detection still has problems with darker faces

Scientific research on the quality of face detection systems keeps finding the same result: no matter how, when, or with which system the testing is done, faces of people with a darker skin tone are detected less reliably than faces of people with a lighter skin tone.

Here are three pieces of recent research solidifying this point:

At the 36th Conference on Neural Information Processing Systems (NeurIPS 2022), four researchers presented their paper Robustness Disparities in Face Detection, on how face detection algorithms deal with noise, i.e. conditions that are not ideal. They looked at three best-in-class academic detection algorithms and at the algorithms of the commercial providers Google, Amazon, and Microsoft. They conclude: “Across all the datasets and systems, we generally find that photos of individuals who are masculine presenting, older, of darker skin type, or have dim lighting are more susceptible to errors than their counterparts in other identities,” and “we observe a statistically significant bias against dark skinned individuals across every model.” The researchers also conclude that if the commercial providers invested in reducing this bias in response to the 2018 Gender Shades study, those efforts have unfortunately not succeeded.
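To make the finding of a “statistically significant bias” concrete: a common way to compare failure rates between two demographic groups is a two-proportion z-test. The sketch below illustrates that general idea only; it is not the paper's own methodology, and all counts in it are made up.

```python
# Minimal sketch (not the NeurIPS paper's methodology): a two-proportion z-test
# on detection failure counts for two skin-tone groups. All counts are
# hypothetical placeholders.
from statistics import NormalDist

def failure_rate_z_test(failures_a, total_a, failures_b, total_b):
    """Test whether two groups' detection failure rates differ significantly."""
    rate_a = failures_a / total_a
    rate_b = failures_b / total_b
    # Pooled failure rate under the null hypothesis of "no difference".
    pooled = (failures_a + failures_b) / (total_a + total_b)
    std_err = (pooled * (1 - pooled) * (1 / total_a + 1 / total_b)) ** 0.5
    z = (rate_a - rate_b) / std_err
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided
    return rate_a, rate_b, z, p_value

# Hypothetical: 2,000 photos per group, darker-skin group fails more often.
print(failure_rate_z_test(failures_a=180, total_a=2000, failures_b=90, total_b=2000))
```

A p-value far below 0.05 here would mean the difference in failure rates is very unlikely to be a sampling coincidence, which is what “statistically significant” asserts.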

Another group of researchers presented their paper Enhancing Fairness in Face Detection in Computer Vision Systems by Demographic Bias Mitigation at the AAAI/ACM Conference on AI, Ethics, and Society (AIES 2022). They set out to find ways to reduce bias in face detection, which first required measuring the existing bias. Based on their measurement method, they concluded that every face detection system they tested showed significant demographic bias against faces with a darker skin tone. Their mitigation methods were able to reduce that bias, but not to eliminate it.

Finally, the US Maryland Test Facility organises a “rally” around biometric technology every year. Its testing facilities allow researchers to try to recognize real people in a realistic situation. Unique to their approach is the use of a colorimeter to measure the actual skin tone of the test subjects. In 2021, they were specifically interested in recognizing “diverse individuals” with and without a face mask. The simulated situation was passport scanning at an airport, and thus had ideal lighting conditions. The process was divided into two parts: first face detection (a camera plus an algorithm to find the face), which the researchers call “acquisition”, and then face recognition, called “matching”. They tested fifty combinations of the best available acquisition and matching systems. Face detection of people with a darker skin tone proved to be a problem. The researchers concluded at the International Face Performance Conference (IFPC 2022): “Face recognition system performance varies as a function of skin tone [with] reduced performance for people with darker skin,” and “Failure to Acquire is greater for volunteers with darker skin tone.” This is consistent with their earlier research, which also showed that a darker skin tone has a negative effect on face detection.
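“Failure to Acquire” is a simple rate: the fraction of attempts in which the acquisition stage never produces a usable face image. A minimal sketch of how such a rate could be computed per skin-tone group, with hypothetical records and field names (this is not the test facility's actual code):

```python
# Minimal sketch, not the Maryland Test Facility's pipeline: compute
# Failure-to-Acquire (FTA) rates per skin-tone group. Records are hypothetical.
from collections import defaultdict

# Each attempt: a colorimeter-based skin-tone group and whether acquisition
# (the face detection stage) produced a usable face image.
attempts = [
    {"skin_tone_group": "lighter", "acquired": True},
    {"skin_tone_group": "lighter", "acquired": True},
    {"skin_tone_group": "darker", "acquired": True},
    {"skin_tone_group": "darker", "acquired": False},  # detection failed
]

def fta_by_group(records):
    """Failure-to-Acquire rate = failed acquisitions / total attempts, per group."""
    totals, failures = defaultdict(int), defaultdict(int)
    for record in records:
        group = record["skin_tone_group"]
        totals[group] += 1
        failures[group] += not record["acquired"]
    return {group: failures[group] / totals[group] for group in totals}

print(fta_by_group(attempts))  # e.g. {'lighter': 0.0, 'darker': 0.5}
```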

The current state of affairs should raise a red flag for any organization using or intending to use face detection (and therefore also facial recognition) in its processes: it is a near certainty that this software is discriminatory and therefore unlawful and immoral.

[Image: a Google Street View screenshot showing four portraits of faces in the window of a dentist's practice. The two lighter faces are blurred; the two darker faces are not.]
Of the four portraits in this window, Google Street View only managed to blur the two faces with a lighter skin tone.

Black people are suffering the consequences of the increasing application of these algorithms in our societies. For example, they have trouble getting into online proctored exams, or their faces aren't recognized by the blurring algorithms that are supposed to protect their privacy (see the image above). The solution is not to improve face detection, as there are reasons to assume that the difference in quality is also a result of the simple and inescapable physics of light and skin reflectivity. The solution is not to implement face detection and facial recognition in the first place.
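To see why physics is part of the story: the signal a camera sensor records scales with illumination multiplied by skin reflectance, and photon “shot noise” means the signal-to-noise ratio grows only with the square root of that signal. A back-of-the-envelope sketch, with illustrative reflectance values rather than measured ones:

```python
# Back-of-the-envelope sketch: lower skin reflectance means fewer photons reach
# the sensor, and the shot-noise-limited SNR is the square root of that count.
# The reflectance values below are illustrative, not measurements.
import math

def shot_noise_snr(incident_photons, reflectance):
    signal = incident_photons * reflectance  # photons recorded by the sensor
    return math.sqrt(signal)                 # SNR = signal / sqrt(signal)

for reflectance in (0.6, 0.3, 0.1):  # lighter to darker (illustrative)
    print(f"reflectance {reflectance:.1f}: SNR {shot_noise_snr(10_000, reflectance):.0f}")
```

Less signal means less contrast for a detector to work with at a fixed exposure, which is the sense in which the quality gap has a physical component.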

Header image by Philipp Schmitt & AT&T Laboratories Cambridge via Better Images of AI, licensed under Creative Commons BY 4.0.
