A professor of criminology at the University of Pennsylvania, Berk is using an algorithm that he says can help notify the justice system if someone on parole or probation has a high probability of committing a serious crime in the near future — or even killing someone.
People commonly murder people who are like themselves, Berk says — whether it’s a dispute over turf or drugs.
Predicting crime is not really like what happens in the Hollywood flick Minority Report. In that film, a “Precrime” office is tasked with arresting people who will — with 100 percent certainty — commit a crime, but haven’t.
In the real world, however, “predicting crime” really means finding patterns in previous behavior to help law enforcement understand where crime is more likely to occur, helping them allocate limited resources more efficiently.
Berk’s algorithm uses a criminal’s background information to instantly provide a forecast, and it must be tailored to each jurisdiction in which it is used. In Los Angeles, for example, the system must account for gangs. In Philadelphia, it homes in on homicide, rape, armed robbery and aggravated assault.
Berk says the software is already in use in Philadelphia and is in the process of being implemented in Baltimore, just 100 miles down Interstate 95.
Baltimore police are interested in using the algorithm to help forecast murders. In Philadelphia, police are interested in expanding its use to predict a range of serious crimes.
“Using our techniques, we find a high risk subset,” Berk said. “Of the people who will shoot or be shot, the algorithm correctly forecasts those outcomes about 75 out of 100 times. That’s a dramatic improvement.”
The current system implemented in Philadelphia offers a three-part forecast: individuals who will commit a very serious crime; individuals who will commit no crime; and individuals who will commit crimes that are not serious.
You might expect that fewer than 5 percent of the people released from prison commit another very serious crime. But within the small high-risk subset that Berk’s algorithm identifies, the probability of committing another serious crime is about 50 percent.
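The base-rate arithmetic here is worth making explicit. The following back-of-envelope sketch uses hypothetical counts — the article gives only the roughly 5 percent base rate and the roughly 50 percent rate within the flagged subset; the release and subset sizes below are invented for illustration.

```python
# Hypothetical numbers illustrating base rate vs. a flagged high-risk subset.
released = 1000
base_rate = 0.05                             # ~5% commit another very serious crime
serious_reoffenders = released * base_rate   # 50 people in this made-up cohort

flagged = 80                                 # assume the algorithm flags a small subset
flagged_rate = 0.50                          # ~50% of the flagged subset reoffend
caught = flagged * flagged_rate              # 40 of the 50 reoffenders

print(caught / serious_reoffenders)          # fraction of reoffenders inside the flagged group: 0.8
```

The point of the sketch: even a subset with only 50 percent accuracy concentrates most of the eventual reoffenders into a group small enough to supervise closely.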
“The real bad guys are followed really closely and get help, the ones forecasted to be no threat are seen superficially, and the ones in the middle get a little of both,” Berk said. In turn, law enforcement resources are reallocated from the no- and low-risk to the high-risk.
The system, of course, is by no means perfect, and fine-tuning its accuracy is an ongoing project, Berk said. But it’s better than no system at all.
“Anytime you forecast something, you are going to be wrong some of the time, just like weather,” Berk said. “The point is, [the justice system] was making mistakes before. We can reduce those mistakes.”
The stakes of accuracy are high even when working from probability alone: police allocate patrol cars based on where they expect crime to occur, and judges make decisions based on potential crime risks.
Often, an officer or judge’s prior experience informs that decision, Berk said. Sometimes, those decisions are accurate.
But Berk’s algorithm takes probability beyond a gut feeling. It follows that the system could make sentencing recommendations to judges based on its data analysis.
Parole forecasts have been in use in the United States since the 1920s, but the fundamental science to support those decisions was always lacking.
Say a bunch of prison officials sit around and see that a prisoner is coming up for parole: “Here’s his prior record and background. What do you think, fellas?”
Over time, psychiatrists and other clinicians were brought in to help inform that decision-making process. Behavior while sentenced and while on parole was also taken into consideration. Statistics slowly made their way into the process.
“In the last five years, we have gotten access to very large databases that we can use to find instructive patterns. It’s a dramatic step forward,” Berk said. “Criminologists have known for a long time that about 50 percent of crimes are committed by 10 percent of criminals. If you can find the 10 percent, you get a lot of return on the criminal justice dollar.”
With Berk’s algorithm, you can comb through tens of thousands of cases and draw associations based on predictors such as age and gender. “It’s an exhaustive search for needles in the haystack,” Berk said.
For example, if a person commits a serious crime like armed robbery at the age of 13, it turns out to be a good predictor of violent behavior later in life. But if a person in their 30s commits the same crime, it’s less clear that the pattern will repeat.
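The article does not describe Berk’s actual statistical machinery, but the kind of pattern-finding it describes — estimating reoffense rates within subgroups defined by predictors such as age at first serious offense, then sorting new cases into the three forecast categories — can be sketched roughly. All records, age bands and thresholds below are invented for illustration.

```python
from collections import defaultdict

# Synthetic historical records: (age_at_first_serious_offense, reoffended_seriously).
# These are made-up data points, not real case records.
history = [
    (13, True), (14, True), (13, True), (15, False), (14, True),
    (22, False), (25, False), (24, True), (23, False), (26, False),
    (34, False), (36, False), (33, False), (35, False), (31, False),
]

def bucket(age):
    """Group ages into coarse bands — a stand-in for richer real predictors."""
    if age < 18:
        return "juvenile"
    if age < 30:
        return "young_adult"
    return "adult"

# Estimate the serious-reoffense rate within each band.
counts = defaultdict(lambda: [0, 0])  # band -> [reoffenses, total]
for age, reoffended in history:
    band = bucket(age)
    counts[band][1] += 1
    if reoffended:
        counts[band][0] += 1

rates = {band: r / n for band, (r, n) in counts.items()}

def forecast(age):
    """Three-part forecast mirroring the Philadelphia system's categories."""
    rate = rates[bucket(age)]
    if rate >= 0.5:
        return "high risk: likely to commit a very serious crime"
    if rate >= 0.2:
        return "middle: may commit crimes that are not serious"
    return "low risk: likely to commit no crime"

print(forecast(13))  # early serious offense -> high observed rate in this toy data
print(forecast(35))
```

In this toy data, an offender whose first serious crime came at 13 lands in the high-risk category, while the same crime committed in the offender’s 30s does not — the pattern the paragraph above describes, just with fabricated numbers.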
“There are increasingly going to be legal and ethical issues on how these kinds of data are used,” Berk said.
To wit, Berk offered this example: “Suppose I tell you that your kid has a probability of committing a very serious crime.” Only eight years old, the child strikes a teacher. According to the algorithm, that child is expected to commit a serious crime by age 18. What do you do?
“I have no idea what you would do,” Berk said.
Despite its shortcomings, statistical crime prevention continues to spread across the nation. Berk says he’s working on a similar project for Washington, D.C.
Others are developing similar initiatives in Boston, New York City, Los Angeles and Chicago.
(Sound familiar? Frequent readers of SmartPlanet may recall that I recently profiled the city of East Orange, New Jersey for a crime prevention project. Read more about it here.)