Computer algorithms use data mining to improve people's lives in a number of fields, and crime fighting is one of them. Since 2011, a program dubbed "predictive policing" has been helping law enforcement agencies in Britain and the U.S.
This might remind one of Minority Report, the movie in which a future society uses psychics to arrest criminals before they act. Contemporary predictive policing is a bit like that, except with zero psychics and a lot of statistical data and advanced mathematics.
In Kansas City, the police department implements predictive policing through "call-ins." Algorithms flag individuals suspected of involvement in criminal activity, and those individuals are called in to meet with police officials and "local and federal prosecutors, plus the police chief and the mayor."
This paper [pdf], considered a landmark in crime modeling, shows how breaking the law has a ripple effect, much like an earthquake: significant seismic activity generates smaller quakes nearby. Similarly, places where crime recurs or where serial offenders team up tend to give birth to criminal clusters.
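The earthquake analogy can be made concrete with a self-exciting point process, the kind of model the paper draws on: each crime temporarily raises the expected rate of further crimes nearby, and that boost fades over time. Below is a minimal Python sketch of the idea, assuming an exponential decay kernel; the kernel shape and all parameter values here are illustrative, not taken from the paper.

```python
import math

def hawkes_intensity(t, past_events, mu=0.5, alpha=0.8, beta=1.2):
    """Conditional intensity of a self-exciting (Hawkes) point process.

    mu    -- background rate of incidents (per unit time)
    alpha -- strength of the "aftershock" boost from each past incident
    beta  -- how quickly each boost decays
    (Illustrative values; the paper's actual model may differ.)
    """
    # Each past incident before time t adds a decaying contribution.
    excitation = sum(alpha * math.exp(-beta * (t - s))
                     for s in past_events if s < t)
    return mu + excitation

# A burst of incidents raises the predicted rate shortly afterward,
# then the effect fades back toward the background rate --
# mirroring earthquake aftershocks.
events = [0.0, 0.5, 0.6]                    # times of past incidents
rate_now = hawkes_intensity(1.0, events)    # shortly after the burst
rate_later = hawkes_intensity(5.0, events)  # long after the burst
assert rate_now > rate_later > 0.5
```

In practice, officers would be directed to the places and times where this predicted rate spikes, which is how recurring crime "clusters" emerge from the model.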
Predictive policing was also deemed one of the best innovations of 2011, and reports claim that since the predictive No Violence Alliance (NoVA) program began almost five years ago, homicide rates in Kansas City have dipped slightly. It is not yet clear whether the program directly drove the decline in deadly violence or whether other factors are responsible as well.
Enthusiasm for computer-aided crime detection is high among officials.
New York City Police Commissioner William Bratton called it "the wave of the future." Reported figures suggest that crime forecasting is 5 to 10 percent more effective than standard policing methods.
Predictive policing, however, has met a skeptical welcome from civil-rights NGOs. In the opinion of Hanni Fakhoury, an attorney for the Electronic Frontier Foundation, using algorithms that work on selective data is perilous.
"If the data is biased to begin with and based on human judgment," Fakhoury said, "then the results the algorithm is going to spit out will reflect those biases." It may be just a coincidence that, of the 685,000 police stops made in New York in 2011, 87 percent of those stopped were Latino or Black.
Foster Maer, lawyer for a New York Latino advocacy group, shares the same view as Fakhoury.
"Because the data is racially biased, the names that come out will be racially biased," Maer said.
Despite divided opinion on the effectiveness and moral grounds of predictive policing, the program assists police departments and district attorneys' offices in Kansas City, Miami, Los Angeles, Manhattan, Nashville, and Philadelphia.