
Crime Dashboards Should Be Used in Every Department

So what exactly is a crime dashboard? Is this just another buzz term within law enforcement, or is it truly something that can drive a department’s crime-fighting efforts? To be honest, my first thought at the word is something we’re all familiar with: the dashboard in your cruiser. It’s the central hub of your patrol car that gives you an overview of the overall health of your vehicle – the amount of gas in the tank, engine temperature, oil pressure, speedometer, tachometer, and so on. But that clearly isn’t the same thing.

When discussing dashboards in technology applications, business executives are very familiar with the term. They’ve been using business intelligence dashboards for well over a decade. Their purpose is similar to the car dashboard: to inform the manager of the overall health of the company by measuring key performance indicators, like monthly revenue, number of new customers, number of renewals, and so on.

Executive BI Dashboard

 

Likewise, a crime dashboard’s main objective should be to give you an overview of crime trends in your jurisdiction. I call this the who, what, why, when, and where of crime intelligence. It should be easy to read and even easier to use when making policy decisions that are right for your county, city, or town. There is more than one way to build a crime dashboard, so below I discuss the most important considerations that went into creating my own department’s crime dashboard.

But first we need to ask ourselves: what needs to be included in your crime dashboards – crime type, suspect information, narratives, maps? The answer is all of these and more. I will grant you this: without a specific software program that assists you in creating your crime dashboard, it can be a real chore to piece this information together by manual means, but it can be done. This is where I started before moving to CommandCentral Analytics, which I then used for many years.

Crime Dashboards Provide Agencies an Overview of Crime at a single glance.

A dedicated software platform will certainly make the creation of your crime dashboards a much easier process – essentially a matter of minutes instead of hours or even days. That said, I have found that the best-practice tenets I’m about to outline ring true no matter which method you use to create your dashboards. Across all of the points below, your aim is to have all pertinent information on one screen and the ability to drill down within your dashboard to gain greater insight.

Considerations When Creating a Crime Dashboard

1. Make sure that you can see where your crimes have occurred.

This is generally achieved through a map visualization. I also like to supplement the mapping function with something such as a pie chart or bar chart that breaks down the number of occurrences within a specific beat or zone by crime type.
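For readers piecing this together by manual means, a short script can produce that beat-level breakdown from a plain incident extract. This is only a sketch under assumptions of my own: the file name and the "beat" and "crime_type" columns are placeholders, not fields from any particular RMS or from CommandCentral.

```python
# Minimal sketch: incidents per beat, broken down by crime type.
# "incidents.csv", "beat", and "crime_type" are placeholder names.
import pandas as pd
import matplotlib.pyplot as plt

incidents = pd.read_csv("incidents.csv")

# Count occurrences for each beat/crime-type pair.
by_beat = (incidents
           .groupby(["beat", "crime_type"])
           .size()
           .unstack(fill_value=0))

# Stacked bar chart: one bar per beat, one segment per crime type.
by_beat.plot(kind="bar", stacked=True, figsize=(10, 5))
plt.ylabel("Number of incidents")
plt.title("Incidents by beat and crime type")
plt.tight_layout()
plt.show()
```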

2. Make sure that you can see when your crimes have occurred.

In this case I typically use a Time of Day/Day of Week heat map, which displays when crimes are occurring through a hot/cold style visualization that cross-references time of day with day of week. That said, this information can also be displayed in a number of other ways, such as a combination bar chart showing time of day and day of week. It is very important that both the time of day and the day of week are included; looking at either one on its own leaves too many questions unanswered for your viewer.
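Here is a minimal sketch of that cross-reference, again under placeholder assumptions: an export read from a hypothetical "incidents.csv" with a single "reported_at" timestamp column.

```python
# Minimal sketch: Time of Day / Day of Week heat map from incident timestamps.
# The file name and "reported_at" column are placeholder assumptions.
import pandas as pd
import matplotlib.pyplot as plt

incidents = pd.read_csv("incidents.csv", parse_dates=["reported_at"])

# Cross-tabulate day of week against hour of day.
dow = incidents["reported_at"].dt.day_name()
hour = incidents["reported_at"].dt.hour
order = ["Monday", "Tuesday", "Wednesday", "Thursday",
         "Friday", "Saturday", "Sunday"]
heat = (pd.crosstab(dow, hour)
        .reindex(index=order, columns=range(24), fill_value=0))

# Hot/cold visualization of when crimes occur.
plt.imshow(heat, aspect="auto", cmap="coolwarm")
plt.yticks(range(len(order)), order)
plt.xlabel("Hour of day")
plt.colorbar(label="Incident count")
plt.title("Time of Day / Day of Week heat map")
plt.tight_layout()
plt.show()
```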

3. Make sure that the who and what of your crimes can be easily viewed.

This is undoubtedly the most difficult suggestion I will give you, because the who and what cover the most expansive information, which makes the ability to drill down within a visual on your dashboard invaluable. Without a piece of software such as CommandCentral Analytics, really the only way to do this is to create a secondary list that you attach to your original dashboard; logging in to your RMS to view this information record by record simply takes too much time and negates the dashboard’s purpose. Within CommandCentral Analytics, I used the list function for this visual, which let me see all the specific information about the crimes I had chosen, including the responsible or reporting officer and their entire narrative.
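For those building that secondary list by hand, a filtered detail extract can stand in for the drill-down. The sketch below assumes placeholder columns ("crime_type", "occurred_at", "address", "officer", "narrative"); these are not CommandCentral Analytics fields.

```python
# Minimal sketch: the "who and what" detail list behind a dashboard visual.
# All file and column names are placeholder assumptions.
import pandas as pd

incidents = pd.read_csv("incidents.csv", parse_dates=["occurred_at"])

def detail_list(df, crime_type, start, end):
    """Rows (officer, narrative, etc.) behind one slice of the dashboard."""
    mask = (df["crime_type"] == crime_type) & df["occurred_at"].between(start, end)
    cols = ["occurred_at", "crime_type", "address", "officer", "narrative"]
    return df.loc[mask, cols].sort_values("occurred_at")

# Example: drill into the last 30 days of burglaries in the extract.
end = incidents["occurred_at"].max()
recent = detail_list(incidents, "Burglary", end - pd.Timedelta(days=30), end)
print(recent.to_string(index=False))
```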

4. Make sure your dashboards are set up in an intelligent manner and with the proper mindset for their intent.

The dashboards you create can serve a number of purposes as well as a number of divisions within your department. Ensure that each dashboard makes sense for the application it is being created for. For instance, a tactical dashboard for a specific narcotics case should be as specific to that case as possible, in all of its visuals as well as its time parameters. On the other hand, a dashboard created to follow a strategic plan over a long-term series of crimes should be adjusted over time for location, timeframe, and other factors so that it aids in the long-term planning of that specific crime-fighting effort.

To sum up, your dashboards should not be viewed as cookie cutters for every situation. Although I believe there is certainly a set of best-practice procedures that should be followed to give each of your dashboards maximum effect and usability, I would also direct you to be as individualistic as possible with each dashboard in terms of the specific problem it has been created to address. Every dashboard you create should lead your agency directly to the proper actionable, intelligence-led decisions that will ultimately aid in reducing crime.

Four Steps To Effectively Using Crime Data in Law Enforcement

It’s no secret that law enforcement agencies are consistently being asked to do more with fewer resources. Budget cuts have meant fewer feet on the street and ever-increasing demands on agencies and officers alike. To meet these growing demands, many agencies are increasingly relying on technology to fill the gap.

There are four important steps to make sure that you’re using data to its fullest potential.

1. Collecting the data

The fact of the matter is there is a plethora of data available. Useful data may include department-specific information such as:

  • Recent bookings
  • Various types of crimes that have been committed
  • Calendar of when crimes occurred
  • Maps of where illegal activities took place
  • Written citations

However, according to Doug Wylie, Editor in Chief of PoliceOne, some cities take it a step further, building a more holistic view of the community a department serves that includes everything from utilities and social services records to new building permits.
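One way to keep such varied sources usable side by side is to normalize them into a single record shape before any analysis. The sketch below is only illustrative; the fields are my own placeholders, not an RMS, CAD, or permit-system schema.

```python
# Minimal sketch: one common shape for records collected from many sources.
# Field names are placeholder assumptions, not a real RMS/CAD schema.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class CollectedRecord:
    source: str                  # e.g. "RMS", "CAD", "permits", "utilities"
    record_type: str             # e.g. "booking", "citation", "building_permit"
    occurred_at: datetime
    address: str
    latitude: Optional[float] = None
    longitude: Optional[float] = None
    details: str = ""

# Records from very different systems can then feed the same later steps
# (accuracy checks, interpretation, distribution).
records = [
    CollectedRecord("RMS", "burglary", datetime(2013, 8, 1, 23, 15),
                    "100 Main St", 40.75, -111.89, "rear window forced"),
    CollectedRecord("permits", "building_permit", datetime(2013, 8, 2, 9, 0),
                    "500 Oak Ave", details="new commercial construction"),
]
print(len(records), "records collected")
```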

2. Accurate Data

Recently, one big city police department announced it would no longer be releasing monthly crime reports because the Excel files they used to distribute the information were being corrupted. Someone had been changing the data the public viewed. This follows the accusations a couple of years ago that the New York Police Department had been falsifying data.

Audits by PublicEngines reveal that up to 25% of all law enforcement data housed in RMS or CAD systems is not accurately recorded.

However, there are ways to improve data accuracy. According to a recent report, some of the variables that agencies should consider include:

  • Data that is correctly captured. This is crucial because there is a myriad of codes, statutes, and other minor details that allow for human error. Information can be mislabeled or mapped incorrectly. Regular review and comparison can help catch errors and ensure greater accuracy.
  • Quality report writing that includes correct classifications, a built-in multiple-level review process, and a system that allows reclassifications, supplements, and follow-up reports to be reviewed, approved, and added.
  • Regular audits of reports to verify accuracy (a simple automated check is sketched after this list). This might also include periodic surveys of randomly selected citizens who have reported criminal activity, to verify that your records accurately reflect the facts as they were reported.
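A portion of those audits can be automated. The sketch below flags a few of the error types mentioned above; the column names and statute list are invented for illustration and should be replaced with your own codes.

```python
# Minimal sketch: automated accuracy checks over an incident extract.
# Column names and the statute list are invented placeholders.
import pandas as pd

VALID_STATUTES = {"13-1507", "13-1802", "13-3405"}  # hypothetical codes

def audit(df: pd.DataFrame) -> pd.DataFrame:
    """Flag rows with the kinds of errors a manual review would catch."""
    problems = pd.DataFrame(index=df.index)
    problems["missing_location"] = df["latitude"].isna() | df["longitude"].isna()
    problems["unknown_statute"] = ~df["statute"].isin(VALID_STATUTES)
    problems["future_date"] = df["occurred_at"] > pd.Timestamp.now()
    return df[problems.any(axis=1)]

incidents = pd.read_csv("incidents.csv", parse_dates=["occurred_at"])
flagged = audit(incidents)
print(f"{len(flagged)} of {len(incidents)} records need review "
      f"({len(flagged) / len(incidents):.0%})")
```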

3. Adequately Interpreted Data

Those agencies with analysts rely on these hard-working people to identify crime trends. But they’re stretched thin. The ability for officers to predict crimes not only relieves some of the pressure on analysts, it also helps reduce crime. Access to this information is the key factor.

But with the sheer amount of data now being gathered, is there room to interpret it in a way that predicts even more crime?

Take some of the non-criminal data that agencies are gathering, as mentioned by PoliceOne’s Wylie. An officer knows that construction sites often experience theft of materials, vandalism, and graffiti. If that officer also knows from new building permits that construction is under way on several projects, redirecting patrol to those areas can significantly reduce the potential for those crimes.
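As a rough illustration of folding that non-criminal data into patrol planning, the sketch below ranks beats by active construction permits alongside recent theft and vandalism counts. The files, columns, and crime labels are all assumptions of my own.

```python
# Rough sketch: combine building permits with recent incidents to suggest
# beats worth extra attention. All names here are placeholder assumptions.
import pandas as pd

permits = pd.read_csv("building_permits.csv")
incidents = pd.read_csv("incidents.csv", parse_dates=["occurred_at"])

active = permits[permits["status"] == "active"]
cutoff = incidents["occurred_at"].max() - pd.Timedelta(days=90)
recent = incidents[(incidents["occurred_at"] > cutoff) &
                   (incidents["crime_type"].isin(["Theft", "Vandalism"]))]

# Beats with active construction, ranked alongside recent incident volume.
focus = (active.groupby("beat").size().rename("active_permits").to_frame()
         .join(recent.groupby("beat").size().rename("recent_incidents"))
         .fillna(0)
         .sort_values(["active_permits", "recent_incidents"], ascending=False))
print(focus.head(10))
```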

4. Getting data into the hands of those who take action

As the above example illustrates, when officers on the street have access to data, they can act accordingly. However, that can prove a challenge.

Products like CommandCentral Predictive work to eliminate those challenges. Since it’s cloud-based, it is available literally anywhere it is needed, so long as an Internet-connected device is available. Reports can even be sent directly to officers via email automatically.
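A minimal sketch of that automatic email step is below, using Python’s standard library. The server, addresses, and report file are placeholders, not how CommandCentral Predictive actually delivers its reports.

```python
# Minimal sketch: emailing a daily report to officers automatically.
# Server, addresses, and the report file are placeholders.
import smtplib
from email.message import EmailMessage

def send_daily_report(report_path, recipients):
    msg = EmailMessage()
    msg["Subject"] = "Daily patrol report"
    msg["From"] = "reports@agency.example"
    msg["To"] = ", ".join(recipients)
    msg.set_content("Attached is today's report.")
    with open(report_path, "rb") as f:
        msg.add_attachment(f.read(), maintype="application",
                           subtype="pdf", filename="daily_report.pdf")
    with smtplib.SMTP("smtp.agency.example") as server:
        server.send_message(msg)

send_daily_report("daily_report.pdf", ["officer@agency.example"])
```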

Officers in the field are hardly desk jockeys, which is why allowing them to access the information while in the field via their mobile phone or tablet is so important. It can literally be the difference between a crime being prevented and a crime occurring.

Data is available – maybe even too much data is available – but there are ways to harness that information to help predict and prevent crime. Collecting that data from a wide variety of sources, ensuring its accuracy and interpreting its value are important first steps. However, utilizing technology – getting this information to officers wherever they may be – allows them to predict crime and make the streets safer for everyone.

The Science of Predictive Policing

Author: Praneeth Vepakomma, Data Scientist, Public Engines

The scientific problem of making predictions has drawn the interest of researchers for centuries, and solutions to it have found applications across fields of societal, applied, and theoretical interest. Over this period, the complexity of predictive models has gradually increased, and their accuracy and generalizability have improved with it. From the early work of Gauss in the 1800s to Andrew Ng’s present-day experiments at Stanford with autonomous helicopters that learn to fly, the problem of prediction has truly moved from the simple task of fitting a straight line to learning complex relationships from data.

R&D at PublicEngines

At PublicEngines we have a committed R&D team focused on developing advanced mathematical models for predicting crime that also ‘learn’ from data, and we are pleased to have announced our newest product, CommandCentral Predictive, the culmination of these dedicated research efforts.

Mathematical Modeling of Crime

The mathematical model that I have developed in-house with my team exploits specific behaviors that are markedly inherent to crime-incident data: near-repeat victimization, long-term patterns, transient/short-lived patterns, and interactions within the data. The model measures these dynamic, crime-specific behaviors while separating signal from noise, which raises the statistical confidence around our predictions. Rather than treating each characteristic in isolation, it also accounts for the dependencies between them in order to capture the inherent predictive structure in patterns of crime. We were also cognizant of inconsistencies that crop up in some of the existing academic literature on similar problems, and we improve upon these as well within our patent-pending system.
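The model itself is patent pending and not described in detail here, so the sketch below is only a generic illustration of how the literature often combines a long-term baseline with a near-repeat (recent, nearby) component into a single risk score per grid cell. It is not PublicEngines’ model.

```python
# Generic illustration only: NOT the patent-pending model described above.
# It blends a slowly decaying long-term component with a fast-decaying
# near-repeat component, weighted by distance to each past incident.
import numpy as np

def risk_score(cell_xy, incidents, now,
               long_halflife_days=180.0, short_halflife_days=14.0,
               bandwidth_m=250.0, short_weight=0.6):
    """Score one grid cell from past incidents given as (x_m, y_m, t_days)."""
    score = 0.0
    for (x, y, t) in incidents:
        d2 = (x - cell_xy[0]) ** 2 + (y - cell_xy[1]) ** 2
        space = np.exp(-d2 / (2 * bandwidth_m ** 2))          # distance decay
        age = now - t
        long_term = np.exp(-np.log(2) * age / long_halflife_days)
        near_repeat = np.exp(-np.log(2) * age / short_halflife_days)
        score += space * ((1 - short_weight) * long_term
                          + short_weight * near_repeat)
    return score

# Example: score a cell at (0, 0) m against three past incidents.
incidents = [(50.0, 30.0, 358.0), (400.0, -200.0, 200.0), (10.0, 5.0, 365.0)]
print(risk_score((0.0, 0.0), incidents, now=366.0))
```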

Focus on Models that Learn

The automated learning component of our model is another important aspect: it brings generalizability to our predictive engine and guarantees a level of robustness over new, unseen data when the models are deployed in reality. Building domain-specific models that also have the ability to learn from data is the core of the field called statistical machine learning. The area began to gain momentum in the late 1940s, when scientists started building mathematical models that mimic the neural networks of the human brain, and an early breakthrough came in the late 1980s with handwriting recognition systems. The field has grown by leaps and bounds since then, adding analytical intelligence to many use cases; today’s e-mail spam classification, voice recognition, computer vision, facial recognition, and document classification systems are just a few examples of its successes. The main reason to focus on a learning component is that a model which works well on the data used to build and train it will not always hold up on new instances of that data in real-life deployment, primarily because the model can learn non-generalizable intricacies of the training set that do not translate into practical results on future data. This is why we rigorously train our patent-pending, crime-specific statistical model and include a learning component comprising an ensemble of multiple models, keeping it robust in real-life deployment. This allows our model to separate out the predictive characteristics of future crimes down to a high level of tactical, actionable granularity.
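As a generic illustration of that training-versus-deployment gap (not the ensemble inside CommandCentral Predictive), the sketch below fits two models to the same noisy data and scores them on held-out points; the more flexible model typically fits the training data better while generalizing worse.

```python
# Generic illustration of overfitting: compare training error with error on
# data the model has never seen. Not the CommandCentral Predictive ensemble.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 60)
y = np.sin(x) + rng.normal(0, 0.3, 60)          # noisy underlying signal

train_x, test_x = x[:40], x[40:]
train_y, test_y = y[:40], y[40:]

for degree in (3, 12):                          # modest vs. very flexible fit
    coeffs = np.polyfit(train_x, train_y, degree)
    train_err = np.mean((np.polyval(coeffs, train_x) - train_y) ** 2)
    test_err = np.mean((np.polyval(coeffs, test_x) - test_y) ** 2)
    print(f"degree {degree:2d}: train MSE {train_err:.3f}, "
          f"test MSE {test_err:.3f}")
```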

Advanced Predictive Model vs. Hotspotting

Field-testing & Evaluation Metrics

We have rigorously tested our models against the most traditional methodologies, such as ‘hotspotting’ – heat maps generated from historic crime-incident data. For a scientific evaluation, we quantify the performance of existing hotspotting methodologies by overlaying a grid on the hotspots and measuring the success rate of our predictions in comparison to the hotspot. Reaching a statistically significant comparison requires roughly 80 to 120 days of evaluation, depending on the dataset. We conclusively observe that our patent-pending model performs about 2.7 times better than the widely used hotspot.
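To make the grid-overlay comparison concrete, here is a generic sketch of the hit-rate arithmetic: each method flags the same number of cells per day, and the score is the share of next-day incidents falling inside flagged cells. The numbers below are invented and are not our field-test results.

```python
# Generic sketch of a grid-overlay evaluation: compare the share of incidents
# captured by each method's flagged cells. Inputs are invented placeholders.
def hit_rate(flagged_cells_by_day, incident_cells_by_day):
    """Fraction of incidents that occurred in a cell flagged for that day."""
    hits = total = 0
    for day, incident_cells in incident_cells_by_day.items():
        flagged = flagged_cells_by_day.get(day, set())
        hits += sum(1 for cell in incident_cells if cell in flagged)
        total += len(incident_cells)
    return hits / total if total else 0.0

# Hypothetical 3-day comparison on a small grid (cells are (row, col) tuples).
incidents = {1: [(0, 0), (2, 3)], 2: [(1, 1)], 3: [(0, 0), (4, 4), (2, 3)]}
model_flags = {1: {(0, 0), (2, 3)}, 2: {(1, 1), (0, 0)}, 3: {(0, 0), (2, 3)}}
hotspot_flags = {1: {(0, 0), (1, 0)}, 2: {(3, 3), (0, 0)}, 3: {(1, 0), (0, 1)}}

model_rate = hit_rate(model_flags, incidents)
hotspot_rate = hit_rate(hotspot_flags, incidents)
print(f"model {model_rate:.2f}, hotspot {hotspot_rate:.2f}, "
      f"ratio {model_rate / hotspot_rate:.1f}x")
```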

Positive Impact of Actionable Patrol

This level of focused tactical information helps reinforce directed, actionable patrol plans and increases resource efficiency, so agencies can use it to positively impact their communities through predictive policing. This technology unlocks tactical intelligence from your RMS and closes the analyst/officer gap by improving operational decision-making. From the perspective of the tech industry as well, this is entirely in line with our goal of continuing to reinforce officers with technology that provides tactical, directly actionable information to help them keep doing their best.

 

Predictive Analytics Where it Matters: Preventing Crime


Today there is a lot of buzz about the use of predictive analytics in business. Spurred in part by the best-selling book, Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie or Die, by Eric Siegel, it seems that everyone is talking about ways to predict every major and minor event in our lives.

As with every application of technology, there are implementations that can have a higher or lower impact on society. At PublicEngines, we have long believed in the application of analytics to improving quality of life in general, and to fighting crime in particular. That’s why we were pleased to announce our newest product last week: CommandCentral Predictive.

CommandCentral Predictive is a significant step forward in the use of predictive analytics toward law enforcement’s goal of preventing crime and making our communities safer. From the time we conceived this product through to its final development, two things were most important to us: accuracy and ease of use.

Pre-eminent in our thought process was creating a product that accurately predicts potential crime. The entire basis of this product is to digest data and provide an accurate, unbiased view of the highest probability of crime on a daily basis, so the science behind it had to be sound. That’s why we designed our prediction engine using multiple algorithms to optimize its effectiveness and verified it with extensive field-testing.

Of almost equal importance to us was designing the product for the highest ease of use. It doesn’t matter how good the product is: if the interface isn’t easy to navigate, it won’t be used. So we designed the product from the start with the idea that users need to be able to jump into it right away with little to no training. Then we took that design to officers, analysts, and command staff, asked for their input on how to make it better, and redesigned it based on their feedback (more on this in another post). The result is something functional and highly usable, delivering a daily report in a way that any officer can use to improve the way they police.

With an announcement of a product like this there is no doubt that naysayers will voice their opinions.  In particular, we’ve heard those who will say that advanced crime analytics software can’t replace crime analysts.  And they are right.  That’s not our intent.  However, we know how much time analysts have and where their demands are.  Analysts spend a significant amount of time working with officers and patrol supervisors on their areas of patrol responsibility.  They’ve told us they’re overwhelmed with more work than they can handle.  CommandCentral Predictive is designed to help them take a significant demand and essentially automate the prediction of high-probability events and give them more time for analysis.  For those agencies without an analyst (a majority), this is a significant boost by giving them the tactical, directed analysis their officers need but can’t currently afford.

Others may cite instances where software has not worked well at making predictions, such as the military’s use of prediction software to forecast political unrest. But what we have seen is that, like what you purchase, where you shop, or what you look at online, crime occurs, for the most part, in a repeatable, predictable pattern. Our field-testing has shown that our algorithms are far more accurate at seeing both long- and short-term trends, modeling them, and learning and improving along the way. In fact, this very field-testing has shown us to be, on average, 2.7 times more accurate than traditional hotspot models at determining where the next crimes will occur.

Tim O’Reilly, the well-known media technologist, once said, “We’re entering a new world in which data may be more important than software.” I think this is especially true in law enforcement, where officers have the data to help themselves but have traditionally struggled with poor software tools for analyzing it. So, while we are proud and excited about CommandCentral Predictive, what we are most excited about is this product’s potential to unlock the patterns and intelligence in an agency’s data and to help agencies make better and more effective policing decisions.

Timely Crime Data that is Easy to Interpret Leads to Better Law Enforcement Decisions

Organizations of all types, especially law enforcement agencies, are being buried in Big Data. As defined by Wikipedia, the term Big Data represents data sets so large and complex that they become difficult to process.

The same problem can also be referred to as Information Overload. Aside from the technical challenges of Big Data, too much information can be difficult to access, store, share, and put to use. In today’s more electronic world we aren’t in jeopardy of being buried in paper – rather, the biggest threat of all is that the data goes unused.

Useful information leads to better knowledge, and thus better decision-making.  A typical approach to managing data and information can be:

  1. Collect raw data
  2. Sort into useful information
  3. Analyze information
  4. Share findings
  5. Turn information into useful knowledge

The Information Sharing Environment (ISE) provides vital information about terrorism, weapons of mass destruction, and homeland security to analysts, operators, and investigators in the U.S. government. From ISE’s Website:

A fundamental component of effective enterprise-wide information sharing, for example, is the use of information systems that regularly capture relevant data and make it broadly available to authorized users in a timely and secure manner.

While focused mostly on activities related to terrorism, the ISE acknowledges that its approach to data sharing is extremely effective in local law enforcement as well.

In law enforcement, timely and accurate information is vital, as untimely or inaccurate information simply causes problems. PublicEngines audits show that up to 25% of all law enforcement data housed in RMS or CAD systems at the average agency is not accurate – often mislabeled or improperly mapped. This can lead to poor allocation of resources and confusion.

However, some agencies struggle with sharing relevant and timely information at all. Time spent reporting, analyzing, and distributing information can be tedious and time-consuming. And with up to 59% of agencies stating that they lack staff for crime analysis, it’s no surprise that critical information often goes unshared.

The lessons are clear – sharing data that is relevant, timely, and easy to interpret, is an effective way to become more efficient in law enforcement.

PublicEngines recently announced it has added Email Reports to its popular CommandCentral Analytics solution. You can read the announcement here: PublicEngines Launches Email Reports for CommandCentral

Creating more tools that allow for easier sharing of critical information is a step in the right direction for tackling the challenge of too much information in law enforcement. This is especially true when the information shared is easy to read and interpret. The more people who have access to timely and valuable information, the better law enforcement decisions will be.

So, how is timely and relevant information shared at your agency? Do you have an internal process, meeting, or technology solution that works particularly well for you? Let us know in the comments section.