Guilt by computer

Police predicting crime causes concern among constitutional experts

By André Coleman 05/30/2013


In the 2002 film “Minority Report,” Tom Cruise plays a futuristic police officer using a psychic, known as a precognitive, to predict crime before it happens.

The film’s director, Steven Spielberg, said at the time that the arrest of criminals based on predictive matrices had some real-world basis in what was happening in post-Sept. 11 America, when citizens began losing some of their basic constitutional rights in the interest of national security.

Today, Spielberg’s vision of an American law enforcement community relying on computer predictions to fight crime appears to be coming true, according to one constitutional law expert, who told the Pasadena Weekly he is concerned about the civil rights implications of PredPol, predictive policing software that has been adopted by police in Alhambra, Santa Cruz, Seattle and parts of Los Angeles and Orange counties, as well as in Australia and England.

“It raises ‘Minority Report’ concerns,” ACLU-Southern California Senior Staff Attorney Peter Bibring said of the PredPol program. “From what I understand, it is about [police] looking at overall crime trends so they know where to allocate resources. There is nothing wrong with that. Now, if officers are told to make aggressive stops and the [program’s] instructions lead them to stop people without provocation, that is a problem.”

The software, which costs between $25,000 and $250,000 depending on the size of the city, allows police to predict future crime trends and patterns based on data gathered from previous crimes. Some experts question what effect the software’s use will have on the legal standard of “reasonable suspicion,” which police officers must meet before detaining a person, a standard measured by what a “reasonable person” would conclude from the same circumstances.

“How should we evaluate the weight of a predictive tip by a computer?” asked Andrew G. Ferguson, assistant professor at the University of the District of Columbia’s David A. Clarke School of Law in Washington, DC. “If a cop arrives on the scene and he has been told to be there by a computer and sees a man standing on a corner with a bag, which is usually not reasonable suspicion, but he has been given this computerized tip, is that enough now to go stop the man with the bag? What is interesting is this is being adopted in these communities and no one has thought out the legal implications. Do you have a lesser expectation of privacy if you are in the predictive area?”

Although the Pasadena Police Department does not use predictive software, detectives still comb through incidents looking for trends and patterns. The department will hear a presentation on the software in June.

“At the end of the day, it is like anything else; any program or power can be abusive,” said Pasadena police Chief Philip Sanchez. “The idea is to have the proper checks and balances so it does not become abusive and so that it is used to look at the predatory nature of those people that would commit crimes in the region and the San Gabriel Valley.”

LAPD Lt. Sean Malinowski said in a National Institute of Justice newsletter article that predictive policing does not violate anyone’s civil rights. “Police are not arresting people on the probability that they will commit a crime,” Malinowski said. “Police still must have probable cause.”

In addition, he wrote, “predictive policing methods do not identify specific individuals; instead, they anticipate particular times and locations where crime is likely to occur.”

According to the PredPol Web site, the software, which uses the same technology employed to predict aftershocks following earthquakes, predicted crime trends more accurately than experienced crime analysts (much like those used in Pasadena) over a six-month period.

The predictive software analyzes 500-foot-by-500-foot blocks, or “hot spots,” based on past trends in the area. Earlier this year, PredPol predicted the date and location of a car burglary based on information about similar crimes in the area, according to Alhambra Police Chief Mark Yokoyama. An Alhambra police officer drove to the area and witnessed a suspect burglarizing a car, just as the computer said would happen.
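PredPol’s underlying math is proprietary, but the aftershock comparison points to what statisticians call a self-exciting model: each past incident temporarily raises the predicted risk in nearby grid cells, and that extra risk fades with time. The short Python sketch below is purely illustrative, not PredPol’s actual algorithm; the 500-foot cell size comes from this story, while the decay rate, coordinates and scoring rule are invented for the example.

```python
from collections import defaultdict

# Illustrative only -- NOT PredPol's proprietary algorithm. A minimal
# "self-exciting" hot-spot score: every past crime adds risk to its grid
# cell, and that contribution decays day by day, much as aftershock
# models treat the elevated risk that follows an earthquake.

CELL_FT = 500          # 500-foot grid cells, per the article
DECAY_PER_DAY = 0.85   # assumed decay rate; a real model fits this to data

def cell_of(x_ft, y_ft):
    """Map a coordinate (in feet) to its 500-by-500-foot grid cell."""
    return (int(x_ft // CELL_FT), int(y_ft // CELL_FT))

def hotspot_scores(crimes, today):
    """crimes: list of (x_ft, y_ft, day) tuples for past incidents.
    Returns a dict mapping each grid cell to its decayed risk score."""
    scores = defaultdict(float)
    for x, y, day in crimes:
        age = today - day
        if age >= 0:                       # ignore future-dated records
            scores[cell_of(x, y)] += DECAY_PER_DAY ** age
    return scores

# Example: one old burglary, plus a recent cluster two cells away.
past_crimes = [(120, 480, 1), (1300, 200, 9), (1410, 90, 10)]
ranked = sorted(hotspot_scores(past_crimes, today=11).items(),
                key=lambda kv: -kv[1])
print(ranked)   # the cell with the recent cluster ranks highest
```

In a real system, the decay and triggering parameters would be fit to historical crime data rather than hard-coded, and the top-ranked cells would become the boxes officers are asked to patrol.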

“A year and a half ago, we were at all-time lows in crimes,” Yokoyama told the Weekly. “We knew that crime could only go so low, and the day would come when it would start rising. We also knew that convicted criminals would be released from jail due to the prison realignment, which was coming. We got creative and started thinking about what else was out there that we could do. The prediction paid off.”

In Richmond, Va., police used the software to analyze data on New Year’s Eve gunfire, which had been increasing every year for several years, and were able to anticipate the time, location and nature of future incidents. Richmond police placed officers at those locations, resulting in a 47 percent decrease in gunfire on Jan. 1 and a 246 percent increase in weapons seized. In the process, the department saved $15,000 in personnel costs.

In Arlington, Texas, police in 2011 used data on residential burglaries to identify hot spots and then compared those locations to areas with code violations. After finding a correlation, the department increased patrols in those areas and saw a 60 percent drop in burglaries.

“The information we are given from the analysis has nothing to do with ethnicity, race or sex,” Yokoyama said. “It only looks at crime type and crime date and time. It is no different than what we have been doing for decades, just more precise.”

But that is what worries some, who say that for decades police have been targeting minorities and the neighborhoods they live in.

“They are going to be feeding information into the computers about minority communities,” said Pasadena attorney and activist Philip Koebel. “If your data inputs are racially biased to begin with, and we know law enforcement has been collecting information in a racially biased way for years, the outcome will be racially biased. Minorities are stopped by the police more. A black person is eight times more likely to be arrested than a white person for committing the same offense, so we know what the data will tell them.”

But racial profiling is just one potential problem, according to Ferguson, who said the true concern should be the impact on Fourth Amendment protections against unreasonable searches and seizures, and what happens if even more advanced software being developed by the Department of Homeland Security hits the streets.

Over the past two years, Homeland Security has been working on similar software, in this case designed to aid Transportation Security Administration airport screeners by checking physiological indicators, including a person’s heart rate and the steadiness of his or her gaze.

“The fascinating issue is what happens when you take it out of the airport and use it on the street,” Ferguson said. “Could someone be stopped because their heart rate is racing or they look sick? Maybe the person is late picking their child up from child care or is having an asthma attack.”

Though cautious in his assessment of the program, the ACLU’s Bibring said, “There is nothing wrong with examining and acting on the trends as long as they don’t go further and violate people’s rights.”
