Dutch Ministry of Foreign Affairs dislikes the conclusions of a solid report that marks its visa process as discriminatory, so it buys a shoddy report saying the opposite

For more than a year now, the Dutch Ministry of Foreign Affairs has ignored advice from its experts and continued its use of discriminatory risk profiling of visa applicants.

A report by the Rijks ICT Gilde (a government auditor) concluded in April 2023 that the process is heavily biased against certain nationalities. The Ministry of Foreign Affairs didn’t agree with those results, so it commissioned an algorithm auditor in Canada to do another analysis. It only published the first report once the second one, which unsurprisingly exonerated the Ministry, came in.

According to researcher Cynthia Liem in NRC, the second report doesn’t document its research methods well enough, and some of its conclusions are hard to follow.

Gabriel Geiger, one of the journalists who dug up this story for Lighthouse Reports, is scathing about this turn of events:

The Sigma Red report is unreadable and puzzling. […] Governments trampling over publicly accountable auditors and internal privacy watchdogs to procure the opinions they want is deeply problematic. This situation validates many of the critiques I’ve seen of the growing cottage industry of algorithmic auditing. I worry that algorithmic auditing becomes a sort of inverse kangaroo court for governments. A farcical rubberstamping exercise where the only accepted conclusions are “continue business as usual” or “adjust slightly, but carry on.”

See: LET OP, zegt de computer van Buitenlandse Zaken bij tienduizenden visumaanvragen. Is dat discriminatie? (“WATCH OUT, says the Foreign Affairs computer for tens of thousands of visa applications. Is that discrimination?”) at NRC, and Gabriel Geiger on LinkedIn.

Photo by Ben Koorengevel.
