Writing in WIRED on the use of crime-prediction software in policing, Chris Gilliard argued that data-driven policing systems and programs are fundamentally premised on the assumption that historical crime data determines the future.
He highlighted Wendy Chun’s new book Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition, emphasising that these AI-based methods of statistical correlation and machine learning are not equipped to anticipate the myriad possibilities of the future. Rather, these systems “restrict the future to the past” in order to ‘predict’ it. They are thus not only self-fulfilling prophecies; they actively uphold the status quo, working to “cement existing realities rather than change them”. Chun wrote:
If the captured and curated past is racist and sexist, these algorithms and models will only be verified as correct if they make sexist and racist predictions.
It is telling that these technologies are mainly deployed against certain types of crime (not white-collar crime), thus disproportionately policing Black and brown bodies. No new and purportedly innovative technology will solve racism so long as we remain bound to the same carceral logics that permeate our societies; rather, such technologies constrain other possibilities.
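To make the feedback mechanism concrete, here is a minimal toy sketch of the dynamic Gilliard and Chun describe. It is an illustration under stated assumptions, not any vendor’s actual algorithm: the district names, rates, and starting counts are hypothetical. Two districts have the same true crime rate, but the historical record is biased toward one of them; a “predictive” allocation rule that patrols wherever past records point will then keep confirming its own data.

```python
import random

# A minimal illustrative sketch (not any real predictive-policing system).
# Districts A and B have the SAME true crime rate, but the historical
# record starts out biased toward A. All numbers here are hypothetical.
TRUE_CRIME_RATE = 0.3              # identical in both districts
recorded = {"A": 60, "B": 40}      # biased historical crime counts

random.seed(0)
for day in range(1_000):
    # "Predict" tomorrow from yesterday: patrol the current hot spot,
    # i.e. the district with the most *recorded* crime.
    patrolled = max(recorded, key=recorded.get)
    # Crime occurs at the same rate everywhere, but it is only
    # *recorded* where officers are sent to look for it.
    if random.random() < TRUE_CRIME_RATE:
        recorded[patrolled] += 1

print(recorded)
# District A's lead only widens (roughly {'A': 360, 'B': 40} here),
# so the model's output "verifies" its own prediction: the captured
# past is cemented rather than corrected.
```

The point of the sketch is Chun’s: the model is judged “correct” precisely because the data it generates restricts the future to the biased past it was trained on.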
See: “Crime Prediction Keeps Society Stuck in the Past” in WIRED.