Big Data Refines Predictive Policing
In recent years, Big Data has revolutionized policing and police management in the U.S. Even though violent crime rates have been falling since 1992, police departments must still cope with limited budgets. One result is that police departments are looking to Big Data to improve work efficiency and take proactive measures to anticipate and prevent crimes. Successful experiments working with Big Data in the U.S. are likely to accelerate the deployment of data analytics technologies for managing police services worldwide.
CompStat Aids Policing Management
Officer patrols consume the largest share of police resources and are essential for maintaining high-quality interactions between police and communities. The use of Big Data technologies to manage neighborhood patrols is helping police departments sustain the twenty-three-year reduction in violent criminal activity. Typically, police patrols are call-driven, focused on rapid responses to crimes in progress. When public safety officers leave their stations on patrol, the operational details, including tasks and timing, are directed by duty officers who remain at the station.
Beginning in 1994 with the New York City Police Department (NYPD), many police departments across the U.S. have adopted a computer-based policing management method called CompStat. Short for COMPuter STATistics, CompStat combines a management philosophy with organizational management tools designed for accountability. The NYPD set up a data center to track crime reports, arrests, and contact information for individuals, establishing a foundation for data-driven decision making. Today, CompStat is generally recognized as having four components: timely intelligence, rapid response, effective tactics, and complete follow-up.
With CompStat, precincts with the greatest number of incident reports are allocated the most resources. In the case of the NYPD, quantitative indicators were used to assess the performance of each of New York City’s 77 precincts. Though some argue that economic and demographic factors were responsible for the dramatic reduction in crime through the late 1990s, others credit CompStat with making New York one of the safest cities in the U.S. In New Orleans and Minneapolis, police departments saw double-digit reductions in crime rates after adopting CompStat. Overall, the emergence of CompStat has propelled police departments across the U.S. to incorporate data- and intelligence-driven methods for managing patrol activities. CompStat has also helped police departments standardize effective patrol practices and eliminate ineffective tactics.
The Santa Cruz Experiment
Despite these impressive results, CompStat may have weaknesses. In CompStat mode, police departments tend to react to recent trends rather than take preventive measures against emerging ones. As important as it is for police to remove criminals from the streets, some argue that the best option is to prevent crimes from ever occurring. But how?
Research scientists have determined that some types of crime can be predicted using mathematical models. For instance, the likelihood of burglary is largely determined by three location-related variables: ease of access, relative affluence, and security measures. George Mohler, then a mathematician and researcher at the University of California, Los Angeles (UCLA) and now Assistant Professor at Santa Clara University and Director of Data Science at Metromile in San Francisco, proposed that certain crimes could be predicted by adapting an algorithm used to forecast earthquake aftershocks. The result was a far more accurate property crime map than traditional approaches, such as CompStat, had produced. Following the implementation of Mohler’s Big Data predictive policing technique, the police department in Santa Cruz, California saw a significant drop in the number of burglaries over the next six-month period.
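The aftershock analogy works because, like earthquakes, many property crimes cluster: each incident temporarily raises the risk of follow-on incidents nearby, on top of a stable background rate. The sketch below illustrates the general idea with a self-exciting point-process intensity; it is not Mohler's actual model, and all parameter values are hypothetical placeholders (real systems fit them from historical crime data):

```python
import math

# Hypothetical parameters; real models estimate these from crime records.
MU = 0.1        # background rate of crimes per cell per day
THETA = 0.3     # expected number of triggered follow-on crimes per event
OMEGA = 0.1     # temporal decay rate (1/days)
SIGMA = 100.0   # spatial bandwidth in feet

def intensity(t, x, y, past_events):
    """Conditional intensity lambda(t, x, y): predicted crime rate at
    time t and location (x, y), given past crimes.

    past_events: list of (t_i, x_i, y_i) tuples for earlier incidents.
    """
    rate = MU  # time-invariant background risk
    for ti, xi, yi in past_events:
        if ti >= t:
            continue  # only earlier events can raise current risk
        dt = t - ti
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        # Exponential decay in time, Gaussian kernel in space:
        # recent, nearby crimes contribute the most added risk.
        rate += (THETA * OMEGA * math.exp(-OMEGA * dt)
                 * math.exp(-d2 / (2 * SIGMA ** 2))
                 / (2 * math.pi * SIGMA ** 2))
    return rate
```

With no past events the intensity equals the background rate, while a recent burglary nearby pushes it higher; evaluating this function over a map grid yields the kind of risk surface the Santa Cruz experiment used to direct patrols.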
The Santa Cruz experiment set a benchmark for predictive policing and became a model for police departments in Los Angeles, the U.K., and the Netherlands. The experiment predicted the locations of future property crimes by analyzing 5,000 historical crime records. The prediction algorithm compared the time and location of each past crime with real-time inputs as new crimes were reported. Other factors incorporated into the model included Automatic Teller Machine (ATM) locations, bus routes, and local weather conditions.
The predictive policing experiment in Santa Cruz demonstrated a real-world application of Big Data technologies. Initially, the crime maps produced in Santa Cruz looked similar to those generated by CompStat, but soon the maps rendered from Mohler’s model reached a much higher level of accuracy. The Santa Cruz experiment identified the Top 10 potential crime locations at a mapping resolution of 500 square feet. Projections were generated for when and where future crimes were likely to occur within these areas and which of them presented the highest risk. By patrolling these areas more intensively, local police officers found that they were able to deter or prevent potential crimes. Over the course of the six-month experiment, property crimes in Santa Cruz decreased by eleven percent compared with the previous year and by four percent compared with the historical average for the same period.
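The "Top 10 locations" step amounts to scoring every cell in a map grid and ranking the cells by predicted risk. A toy sketch of that ranking follows; the grid size and scores here are invented for illustration:

```python
def top_k_cells(scores, k=10):
    """Rank grid cells by predicted crime intensity, highest first.

    scores: dict mapping (row, col) cell index -> predicted intensity.
    Returns the k highest-risk cell indices.
    """
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Toy example: a 3x3 grid with hypothetical intensities, where risk
# grows toward the bottom-right corner of the map.
scores = {(r, c): 0.01 * (r + 1) * (c + 1)
          for r in range(3) for c in range(3)}
hot = top_k_cells(scores, k=3)  # cells to patrol most intensively
```

In a deployment, the scores would come from the prediction model rather than a formula, and the resulting cell list would be refreshed as new incident reports arrive.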
Policing with Big Data
Big Data technology brings advantages to other police activities as well, including:
- Targeted Enforcement: With Big Data analytics, policing targets can be set to reduce specific types of crime, with clear breakdowns of tactical responsibilities.
- Intelligence Collection: Big Data technologies enable police officers to respond promptly and effectively to incident reports with the accurate collection and transmission of time and location data. When processed using Big Data pattern matching techniques, this dataset forms the basis for analysis, planning, and results assessment.
- Effective Crime Strategies: Predictive policing is a strategic management system that integrates the factors police officers at all levels need to make effective decisions. Predictive policing is not only goal-oriented but also adjustment-oriented. In other words, better decisions are possible when police departments have fast access to deep, timely insights about the quality of crime strategies and crime reduction measures, along with accountability for the resulting decisions. Management personnel are charged with modifying or even abandoning ineffective crime prevention strategies and adopting and promoting effective ones.
- Quick Resource Deployments: In predictive policing modes, tactical units at every level compete for limited resources. Big Data analytics can be used to support public safety strategies and deploy limited resources with maximum flexibility.
- Effective Follow-Up: Predictive policing modes give management effective methods for judging whether a particular crime prevention strategy has succeeded. The results are optimized policies for continuous follow-up and accountability of public service performance at all levels.
The importance of worldwide public security has never been greater, and the application of Big Data technologies brings new opportunities with the advent of predictive analytics for policing management.