Several California police departments have stopped using “predictive policing” software, but not out of concern for civil liberties – the software simply fails to predict crime, as some of the tool’s pioneers have grudgingly admitted.
“We didn’t get any value out of it,” Palo Alto police spokeswoman Janine De la Vega told the LA Times. The department tried PredPol, the predictive policing tool developed by a University of California at Los Angeles professor and the Los Angeles Police Department, for three years, only to find it served up information street patrols already had. “It didn’t help us solve crime,” De la Vega said.
Palo Alto police weren’t the only ones disappointed. Mountain View, California, spent over $60,000 on the software across five years before dropping the program last June. In Rio Rancho, New Mexico, Police Captain Andrew Rodriguez called it a “disappointment,” adding, “It wasn’t telling us anything we didn’t know.”
Elsewhere, LAPD alumni have taken it upon themselves to spread the PredPol gospel; Birmingham, Alabama, Police Chief Patrick Smith, a former LAPD commander, convinced his department to spend $60,000 on the program. PredPol CEO Brian MacDonald admits the program’s marketing materials tout its use by the LAPD, but insists that aspect has never clinched a sale.
The LAPD itself was forced to admit following an internal audit that after eight years, there was “insufficient data” showing PredPol to be effective in reducing crime, thanks to massive inconsistencies in oversight, criteria, and program implementation. In April, the department shelved another Orwellian program, which was found to be using “inconsistent criteria” to label people as future violent criminals. Last August, after a lawsuit from privacy and civil liberties groups forced the department to cough up its PredPol records, the LAPD discontinued another dystopian part of the program that picked out a list of “chronic offenders” every shift based on alleged gang membership, previous arrests, and one “point” for every “quality police contact.”
PredPol uses 10 years of crime data, fed into the algorithm by type of crime, date, time, and location, in order to predict the next 12 hours, but it’s only as scientific as the officers feeding it information. Civil liberties groups have highlighted the potential for self-fulfilling prophecies – if cops already over-police a given community, PredPol ensures their continued presence and makes it more likely they will continue to view that area and its residents as problems to be solved. Others claim PredPol weaponizes crime data to perpetuate existing racial bias, or that it is being used to give scientific cover to racist policies.
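The feedback loop critics describe can be illustrated with a toy simulation – this is not PredPol’s actual model, and the numbers and function names here are purely illustrative assumptions. Two neighborhoods have identical true crime rates, but patrols are allocated wherever crime was previously recorded, and crime is only recorded where patrols are present:

```python
def simulate(rounds=50, patrols=10.0):
    """Toy model of a predictive-policing feedback loop (illustrative only).

    Patrols are assigned in proportion to past recorded crime, and crime
    is only recorded where patrols go to observe it.
    """
    true_rate = [1.0, 1.0]   # both areas have the same real crime rate
    recorded = [6.0, 4.0]    # area 0 starts out slightly over-policed
    alloc = recorded[:]
    for _ in range(rounds):
        total = sum(recorded)
        # data-driven patrol split: more recorded crime -> more patrols
        alloc = [patrols * r / total for r in recorded]
        # recorded crime mirrors patrol presence, not real differences,
        # because each patrol hour surfaces crime at the same true rate
        recorded = [true_rate[i] * alloc[i] for i in range(2)]
    return alloc

alloc = simulate()
# the initial 60/40 patrol imbalance never corrects itself,
# even though the two areas' true crime rates are identical
```

In this sketch the historical bias in the data becomes a fixed point: the algorithm faithfully reproduces whatever policing pattern generated its inputs, which is the self-fulfilling prophecy the civil liberties groups warn about.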
The complexity of PredPol’s algorithms, and some defenders’ assertion that if it’s truly “successful” the crimes it predicts will not occur, have shielded it from some of the criticism that might otherwise accrue to a program resembling something out of Philip K. Dick’s “Minority Report.” But data or no data, LAPD Chief Michel Moore doesn’t want to let PredPol go, claiming it is more accurate than human analysts at predicting where criminals will strike next. Yet even his defense of the program is a far cry from early publicity materials that trumpeted “‘cliff-like’ drops in crime often within months of deployment” among PredPol’s early adopters.
MacDonald has framed his software’s failure differently. “It’s virtually impossible to pinpoint a decline or rise in crime to one thing. I’d be more surprised and suspicious if the inspector general found PredPol reduced crime,” he told the Times.
PredPol’s effectiveness “is driven by how the agency is run,” MacDonald said pointedly, explaining that it was the department’s responsibility to apply the tools correctly and comparing it to a gym membership. “You have to use it for it to work.”