Crime “Prediction”: The Algorithms of Racist Injustice

by BAR executive editor Glen Ford

Mass Black incarceration wound up ensnaring too many white people in the gulags, bringing forth calls from within the establishment for “reform” to spare those undeserving of imprisonment. Digital science came to the rescue. “The U.S. criminal justice system now deploys algorithm-based technology to predict who will be criminalized in the future, and to systematically push whites out of the path of the New Jim Crow juggernaut.”
“The systemic racism that was formerly the realm of police, prosecutors and judges is now codified in the algorithms of machines that have been programmed by racists on a mission.”
Mass incarceration in the United States is emphatically not a crime-fighting strategy, but a tool of racial oppression tailored to contain and control the Black population. Those whites that are also swept up in the prison gulag are collateral damage, the unintended consequence of draconian national policing and imprisonment policies designed to prevent the recurrence of Sixties-era Black radicalism and rebellion. It is, therefore, both logical and inevitable that most of the “reform” schemes that emanate from within the criminal justice establishment result in more lenient treatment and earlier release of white offenders, while further perfecting the technological and legal machinery of mass Black incarceration.
Having spent two generations creating a “New Jim Crow” regime so voracious that one out of every eight prison inmates in the world is an African American, the U.S. criminal justice system now deploys algorithm-based technology to predict who will be criminalized in the future, and to systematically push whites out of the path of the mass incarceration juggernaut. “Risk assessment scoring” uses computer algorithms to allow judges to decide how people charged with crimes will fare at every stage of the criminal justice process: from provision of bail to sentencing. The data and assumptions that underlie the algorithm – that decide who should be diverted to a rehabilitation program, and who should do hard time, for example – are private, proprietary and secret. But they do work, in the sense that they accomplish the criminal justice system’s mission: to spare as many whites as possible from sharing the harsh penalties meted out to Blacks, for whom the system was designed.
“’Risk assessment scoring’ uses computer algorithms to allow judges to decide how people charged with crimes will fare at every stage of the criminal justice process: from provision of bail to sentencing.”
A study by ProPublica, based on “risk scores” obtained on 7,000 people in Broward County, Florida, between 2013 and 2014, shows the assessments to be “remarkably unreliable” in forecasting actual future violent crime, and profoundly biased against Blacks. The formula, marketed by the Northpointe corporation, “was particularly likely to falsely flag black defendants as future criminals, wrongly labeling them this way at almost twice the rate as white defendants.” Accordingly, “white defendants were mislabeled as low risk more often than black defendants.”
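ProPublica’s bias finding rests on a simple comparison: among people who did not go on to reoffend, what fraction of each group was nonetheless flagged “high risk”? A minimal sketch of that false-positive comparison, using invented toy data rather than the actual Broward County records:

```python
# Compare the rate at which each group is wrongly flagged "high risk"
# (a false positive) among people who did NOT go on to reoffend.
# All data below is hypothetical, for illustration only.

def false_positive_rate(records):
    """records: list of (scored_high_risk, actually_reoffended) pairs."""
    non_reoffenders = [r for r in records if not r[1]]
    wrongly_flagged = [r for r in non_reoffenders if r[0]]
    return len(wrongly_flagged) / len(non_reoffenders)

# Toy data: (scored high risk?, actually reoffended?)
group_a = [(True, False), (True, False), (False, False), (True, True), (False, False)]
group_b = [(True, False), (False, False), (False, False), (False, False), (True, True)]

fpr_a = false_positive_rate(group_a)  # 2 of 4 non-reoffenders flagged -> 50%
fpr_b = false_positive_rate(group_b)  # 1 of 4 non-reoffenders flagged -> 25%
print(f"Group A false positive rate: {fpr_a:.0%}")
print(f"Group B false positive rate: {fpr_b:.0%}")
```

In ProPublica’s analysis of the real scores, Black defendants stood on the wrong side of exactly this kind of gap – falsely flagged at almost twice the rate of white defendants.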
The systemic racism that was formerly the realm of police, prosecutors and judges is now codified in the algorithms of machines that have been programmed by racists on a mission. ProPublica reports that Northpointe’s risk assessment “product” is derived from 137 questions, including:
“’Was one of your parents ever sent to jail or prison?’ ‘How many of your friends/acquaintances are taking drugs illegally?’ and ‘How often did you get in fights while at school?’ The questionnaire also asks people to agree or disagree with statements such as ‘A hungry person has a right to steal’ and ‘If people make me angry or lose my temper, I can be dangerous.’”
The social scientists on hire to private corporations like Northpointe are quick to say that race, ethnicity and geography are not factors in their questions, but that is only semantically true. The bias is not only present, it is sculpted into the “product.” Yet, ProPublica reports that defendants’ risk assessment scores are given to judges in Arizona, Colorado, Delaware, Kentucky, Louisiana, Oklahoma, Virginia, Washington and Wisconsin, for use in sentencing.
“The formula was particularly likely to falsely flag black defendants as future criminals, wrongly labeling them this way at almost twice the rate as white defendants.”
Judges in Wisconsin, whose largest city, Milwaukee, has the nation’s highest Black incarceration rate, use Northpointe’s assessments “at each step in the prison system, from sentencing to parole,” according to ProPublica. Theoretically, the judges aren’t supposed to give high scorers longer sentences – but clearly they do. This, despite studies showing that the assessments grossly skew the predicted probabilities of Blacks and whites being rearrested for other crimes – in the whites’ favor.
In a multiracial country where criminal behavior comes in all ethnicities, it is difficult to weed whites out of a system of mass Black incarceration, or to keep up the fiction of a “war on drugs” that is really a war on Blacks. The establishment prison “reform” movement hit its stride in the Nineties as small towns in Middle America (a euphemism for heavily white regions) succumbed to the meth craze. Suddenly, judges and prosecutors sought out ways to get around mandatory minimum sentencing, and to divert young whites from having to serve hard time in the Black-dominated gulag. Prison “reform” came back in vogue in the computer age, providing algorithm-based routes of emancipation for “deserving” defendants – disproportionately white, of course.
In heavily Black big cities, computers serve the opposite purpose, deepening the hyper-surveillance of Blacks and increasing the odds of incarceration. In Chicago, according to the New York Times, the police have for the past three years or so used algorithms to predict who will shoot someone, or will get shot. The cops have assembled a Strategic Subject List of people rated likely to kill or be killed, who are then visited, along with their “families, girlfriends and mothers,” by teams of police officers, social workers and “community leaders” and warned that they are being watched by the police. The at-risk people are also supposed to be offered drug rehabilitation, housing and job training, but there is little funding available.
“Prison ‘reform’ came back in vogue in the computer age, providing algorithm-based routes of emancipation for ‘deserving’ defendants – disproportionately white, of course.”
Who is designated a Strategic Target? “There’s a database of citizens built on unknown factors, and there’s no way for people to challenge being on the list,” said Karen Sheley, director of the Police Practices Project of the American Civil Liberties Union of Illinois. “How do you get on the list in the first place? We think it’s dangerous to single out somebody based on secret police information.”
Dr. Miles Wernick, a professor at the Illinois Institute of Technology, developed the algorithm. He claims to have avoided bias based on race, gender, ethnicity and geography. The police say the questions used to create the list include: “Have you been shot before? Is your ‘trend line’ for crimes increasing or decreasing? Do you have an arrest for weapons?” However, Chicago’s new police superintendent, Eddie Johnson, who is a 28-year veteran of the force and a former chief of patrol, said, “It’s just all about the contacts you have with law enforcement.”
That’s the problem. After two generations of the world’s most intensive, race-based policing and hyper-surveillance, Blacks have far more “contacts with law enforcement” than whites. Therefore, any crime-predicting algorithm based on such criteria will be racially biased – because the criminal justice system is racially biased! And Northpointe’s more class-based assessments of criminal-mindedness are almost twice as likely to label Blacks as future criminals as whites.
Clearly, these algorithms are nothing but the digitally weighted feedback loop of racist criminal justice practices, and racially biased computer programming masquerading as social science.
The New Jim Crow is always developing new apps for its mass Black incarceration mission: to contain, control and terrorize the Black community.
BAR executive editor Glen Ford can be contacted at [email protected].