July 15, 2009

PA - Program helps identify likely violent parolees

As part of an attempt to fight crime, Philadelphia is now the subject of an experiment never tried in another city: A computer is forecasting who among the city's 49,000 parolees is likeliest to rob, assault, or kill someone.

Since March, the city's Adult Probation and Parole Department has been using the system to reshuffle the way it assigns cases. Each time someone new comes through intake, a clerk enters his or her name and the computer takes just seconds to fish through a database for relevant information and deliver a verdict of high, medium, or low risk.

"It's a complete paradigm shift for the department," said chief probation and parole officer Robert Malvestuto. "Science has made this available to us. We'd be foolish not to use it."

Criminologists say the system works - it can identify those most likely to commit violent crimes. But whether Philadelphia can use that to intervene and change people's behavior is still not known. A full evaluation won't be done until the end of the year.

Yet some probation officers say the changes already are making it far harder for them to help those at lower risk to get off drugs and improve their lives.

The controversy over the new system cuts to the heart of a long-standing debate: whether parole agencies should control dangerous people or help them reclaim their lives.

The computer isn't merely crunching data - it is creating its own rules in what is known as "machine learning," a fast-growing technology that enables computers to encroach on the human realms of judgment and decision-making.

The Adult Probation and Parole Department started considering a technological upgrade in 2006, the year the murder rate hit a peak of 27.8 per 100,000 inhabitants, the highest of any of the nation's 10 largest cities.

University of Pennsylvania criminologist Larry Sherman suggested the department go high-tech, with the help of University of California statistician Richard Berk.

At the time, Berk had been doing computer modeling for the California prison system. "We were forecasting what types of inmates are likely to do nasty things in prison," he said.

Later that year, Berk took a job at Penn and started applying his statistical skills to predicting murder. He later added assault and robbery. "The idea was to forecast who the real bad guys were - so you could deliver special services to them and reduce the number of homicides," he said.

Berk is an expert in machine learning. Computers equipped with this capability, which is somewhat different from artificial intelligence, can perform surprising feats - predicting which products consumers will buy or which SAT essays will get top marks from panels of English teachers.

The computer doesn't use a formula, nor does it develop one that anyone could write down. Instead, it learns by itself after being fed reams of "training data," in this case records of past parolees and their subsequent crimes. The system looks for patterns connecting factors in those records - criminal history, age, and the like - with the crimes that followed.
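
For readers who want a concrete picture of that training-then-scoring loop, here is a minimal sketch in Python. It is an illustration only: the article does not name the algorithm the department uses, and the placeholder data, tier cut-offs, and the scikit-learn random forest below are assumptions, not the actual system.

```python
# Hypothetical sketch of the intake scoring loop the article describes:
# fit a classifier to records of past parolees and their subsequent crimes
# (the "training data"), then map each new intake's predicted probability
# of serious reoffending onto a high / medium / low tier.
# All data, cut-offs, and the choice of algorithm are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Placeholder training data: one row per past parolee, with columns standing
# in for the case-file factors described below (criminal history, age, etc.).
X_train = rng.normal(size=(30_000, 5))
y_train = (rng.random(30_000) < 0.01).astype(int)   # ~1% later commit a serious crime

model = RandomForestClassifier(n_estimators=500, random_state=0)
model.fit(X_train, y_train)                          # the machine "learns" the patterns

def risk_tier(intake_features: np.ndarray) -> str:
    """Score one new intake in seconds and return a tier (cut-offs are arbitrary)."""
    p = model.predict_proba(intake_features.reshape(1, -1))[0, 1]
    return "high" if p >= 0.20 else "medium" if p >= 0.05 else "low"

print(risk_tier(rng.normal(size=5)))                 # e.g. "low"
```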

Only recently has computer data-processing power been up to the job of predicting crime. "You couldn't do this five years ago," Berk said.

To "train" the system, Berk fed in data on 30,000 past cases; about 1 percent had committed homicide or attempted homicide within two years of beginning probation or parole.

The data included the number and types of past crimes, sex, race, income, and other factors.

To test its power, he fed in a different set of data on 30,000 other parolees. This time he didn't tell the computer who would go on to kill.

Applying what it had previously learned, the system identified a group of several hundred who were considered especially dangerous. Of those, 45 in 100 did commit a homicide or attempted homicide within two years - much higher than the 1 in 100 among the general population of probationers and parolees.

The predictors that mattered most were age, age at first contact with adult courts, prior crimes involving guns, being male, and past violent crimes.
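
The held-out test Berk describes can be sketched the same way: fit on one batch of cases, score a second batch whose outcomes are hidden from the model, then compare the offense rate among the few hundred scored most dangerous with the overall base rate. The sketch below runs on synthetic data, so its numbers will not reproduce the article's 45-in-100 figure; it only shows the shape of the evaluation.

```python
# Hedged sketch of the train-then-test check described above, on synthetic data.
# The real evaluation used the department's case files; this only illustrates
# the procedure, not the reported results.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

def synthetic_cases(n):
    """Toy cases in which one 'risk' feature drives a rare (~1%) outcome."""
    risk = rng.normal(size=n)
    noise = rng.normal(size=(n, 4))
    outcome = (rng.random(n) < np.clip(0.006 * np.exp(risk), 0, 1)).astype(int)
    return np.column_stack([risk, noise]), outcome

X_train, y_train = synthetic_cases(30_000)   # cases the model learns from
X_test, y_test = synthetic_cases(30_000)     # cases whose outcomes it never sees

model = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_train, y_train)
scores = model.predict_proba(X_test)[:, 1]

flagged = np.argsort(scores)[-300:]          # the few hundred scored most dangerous
print(f"base rate among all test cases: {y_test.mean():.3f}")
print(f"rate among the flagged group:   {y_test[flagged].mean():.3f}")
```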

A typical high-risk case, he said, might be a 22-year-old male convicted of robbery, with seven priors, two involving guns. His first contact with the adult courts happened at 15, and he would return to a high-crime part of the city.

Race mattered only a little - and so Philadelphia decided to leave it out of the equation. Berk said he thinks the model should work fine without it and the decision to ignore race minimizes concerns about racial profiling.

When the Probation and Parole Department began restructuring in March, there was no money to hire more parole officers, researcher Lindsay Ahlman said. So it had to find some way to better use the resources it had. The average parole officer had been handling 150 cases, but as an early test, some were asked to supervise many more - 350 to 400 people flagged by the computer as low risk.

For comparison, the department had other officers take on the usual 150 cases, also from this low-risk group.

The test showed that less supervision did not increase crime among the low-risk parolees and probationers, she said.

Not all of the officers were sold. "They'd say, 'This guy had a knife at school - he's not low risk,' " she said.

She said she tries to explain that the computer can't make exact predictions about individuals, but it's good at predicting the number of crimes likely in a group of 350 to 400 people.

The difference is already becoming apparent, she said. Officers who used to handle 150 cases of all types were getting an arrest alert or two every day, Ahlman said. Now they've got upwards of 350 low-risk cases and are getting alerted to arrests only once a week or so.
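
Ahlman's distinction between individual and group predictions, and the drop in arrest alerts, come down to simple expected-value arithmetic. The figures below are assumptions chosen only to match the rough numbers in the article.

```python
# Back-of-the-envelope illustration of group-level prediction, with assumed numbers.
caseload = 375                # roughly the 350-400 low-risk cases per officer
weekly_arrest_prob = 0.003    # assumed per-person chance of arrest in a given week

expected_arrests = caseload * weekly_arrest_prob
print(f"expected arrest alerts per week: {expected_arrests:.1f}")   # about 1 per week

# No individual can be predicted (each faces only a ~0.3% weekly chance), but the
# total across a group of this size is stable enough to plan caseloads around.
```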

Conversely, some officers are assigned much smaller groups of high-risk cases, typically fewer than 50.

Probation and Parole Department researcher Ellen Kurtz said they can't tell how well the program is working yet. A full evaluation will take at least six months, she said. She declined to say how the system rated any of the people on parole or probation who've been put through it.

But these innovations are straining a system that's already suffering from lack of resources, said Louise Carpino, president of the union that includes probation and parole officers.

Paying more attention to these "high risk" cases comes at the expense of all the others, she said. Officers can no longer help low-risk people get off drugs, go to AA meetings, or get a GED.

"I've seen this change people's lives," she said. "But you've got to have a human connection."

Criminologist Todd Clear of City University of New York said helping rehabilitate criminals was the original mission of parole and probation.

There's some evidence that it works, he said. But starting in the 1970s, the system shifted toward controlling people considered threatening. There's little evidence this does any good, he said.

Clear says he thinks the new machine-learning technology could tip the debate in either direction. But ultimately, he said, it will work only if it can help figure out how to transform "high risk" people into lower-risk ones.

Another hazard is that while the system isn't expected to be right all the time, it influences how people are treated.

"The main ethical concern," said Richard Bonnie, a law professor at the University of Virginia, "is the possible unfairness to the 'selected' offenders."

If the high-risk people do get more supervision, it means they face a greater risk of being caught in a technical violation that will send them back to prison. Should such power be delegated to a computer?

Berk said he's not worried. "This is not like the movie Minority Report . . . as if we are all fated to do one thing or another," he said, referring to the Tom Cruise film in which police make arrests based on psychics who see crimes committed in the future.

The Philadelphia Probation and Parole Department researchers spoke enthusiastically about plans to experiment this fall with special classes for the highest-risk offenders, using "cognitive behavior therapy" to guide parolees toward changing their thinking and, in turn, their actions.

Berk said he's been asked to design similar systems for Washington and other major cities. Whether the approach helps cut crime in Philadelphia will be closely watched.

Source: Faye Flam, Inquirer Staff Writer
