

Crime Dashboards Should Be Used in Every Department

So what exactly is a crime dashboard? Is this just another buzz term within law enforcement, or is it truly something that can be used to drive a department’s crime-fighting efforts? To be honest, my first thought at the word is something we’re all familiar with: the dashboard in your cruiser. It’s the central hub of your patrol car that gives you an overview of the overall health of your vehicle – amount of gas in the tank, engine temperature, oil pressure, speedometer, tachometer, etc. But that clearly isn’t the same thing.

When discussing dashboards in technology applications, business executives are very familiar with the term. They’ve been using business intelligence dashboards for well over a decade. Their purpose is similar to the car dashboard’s: to inform the manager of the overall health of the company by measuring key performance indicators, like monthly revenue, number of new customers, number of renewals, and so on.

Executive BI Dashboard

 

Likewise, a crime dashboard’s main objective should be to give you an overview of crime trends in your jurisdiction. I call this the who, what, why, when, and where of crime intelligence. It should be easy to read and even easier to use in order to make policy decisions that are right for your county, city, or town. Now, there is more than one way to build a crime dashboard, so below I’m going to discuss the most important considerations that went into creating my own department’s crime dashboard.

But first we need to ask ourselves: what needs to be included in your crime dashboards – crime type, suspect information, narratives, maps? The answer is certainly all of these and more. I will grant you this: without a specific software program that assists you in creating your crime dashboard, it can be a real chore to piece this information together by manual means, but it can be done. This is where I started before moving to CommandCentral Analytics, which I used for many years.

Crime Dashboards Provide Agencies an Overview of Crime at a single glance.


A dedicated software platform will certainly make the creation of your crime dashboards a much easier process – essentially a matter of minutes instead of hours or even days. But I have found that the best-practice tenets I’m about to outline ring true no matter which method you use to create your dashboards. Whichever method that is, your aim should be to have all pertinent information on one screen and the ability to drill down within your dashboard to gain greater insight.

Considerations When Creating a Crime Dashboard

1. Make sure that you can see where your crimes have occurred.

This is generally achieved through a map visualization. I also like to supplement the mapping function with something such as a pie chart or bar chart to break down the number of occurrences within a specific beat or zone by crime type.
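If you are piecing this together by manual means, the zone-by-crime-type breakdown behind such a chart is just a tally. Here is a minimal Python sketch using hypothetical incident records – the `zone` and `type` field names are assumptions, and your RMS export will look different:

```python
from collections import Counter

def crimes_by_zone(incidents):
    """Count occurrences per (zone, crime type) pair -- the data
    behind a per-zone pie or bar chart."""
    return Counter((i["zone"], i["type"]) for i in incidents)

# Hypothetical incident records for illustration only.
incidents = [
    {"zone": "Beat 1", "type": "Burglary"},
    {"zone": "Beat 1", "type": "Burglary"},
    {"zone": "Beat 1", "type": "Theft"},
    {"zone": "Beat 2", "type": "Robbery"},
]

counts = crimes_by_zone(incidents)
print(counts[("Beat 1", "Burglary")])  # -> 2
```

Feeding these counts into any charting tool then gives you the per-zone breakdown at a glance.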

2. Make sure that you can see when your crimes have occurred.

In this case I typically use a Time of Day/Day of Week heat map. This easily displays, through a hot/cold style visualization, when crimes are occurring by cross-referencing time of day and day of week. That said, this information can also be displayed in a number of other ways, such as a combination bar chart showing time of day and day of week. It is very important that time of day and day of week both be included. Looking at either one on its own leaves too many questions unanswered for your viewer.
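The grid behind a Time of Day/Day of Week heat map is a simple cross-tabulation of incident timestamps. A minimal Python sketch on hypothetical timestamps (this is purely illustrative, not the CommandCentral implementation):

```python
from collections import Counter
from datetime import datetime

def tod_dow_matrix(timestamps):
    """Cross-tabulate incidents by day of week (0 = Monday) and hour
    of day -- the 7x24 grid behind the heat map."""
    counts = Counter((t.weekday(), t.hour) for t in timestamps)
    return [[counts[(d, h)] for h in range(24)] for d in range(7)]

# Hypothetical incident times for illustration only.
times = [
    datetime(2015, 3, 6, 23, 15),   # Friday, 23:00 block
    datetime(2015, 3, 13, 23, 40),  # Friday, 23:00 block
    datetime(2015, 3, 9, 8, 5),     # Monday, 08:00 block
]

grid = tod_dow_matrix(times)
print(grid[4][23])  # Fridays at 23:00 -> 2
```

Rendering each cell with a hot/cold color scale turns this grid into the heat map described above.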

3. Make sure that the who and what of your crimes can be easily viewed.

This is undoubtedly the most difficult suggestion I will give you, because it covers the most expansive information, which makes the ability to drill down within a visual on your dashboard invaluable. Really, the only way to do this without software such as CommandCentral Analytics is to create a secondary list that you attach to your original dashboard. Logging in to your RMS to view this information record by record simply takes too much time and negates the dashboard’s purpose. Within CommandCentral Analytics, however, I used the list function for this visual, which let me see all the specific information about the crimes I had chosen, including the responsible or reporting officer and their entire narrative.

4. Make sure your dashboards are set up intelligently and with their intended purpose in mind.

The dashboards you create can serve a number of purposes as well as a number of divisions within your department. Ensure that each dashboard makes sense for the application it is being created for. For instance, a tactical dashboard for a specific narcotics case should be as specific to that case, in all of its visuals and its time parameters, as it can be. On the other hand, a dashboard created to follow a strategic plan over a long-term set of crimes should be modified over time, location, and other factors so as to aid in the long-term planning of the specific crime-fighting series.

To sum up, your dashboards should not be viewed as cookie cutters for every situation. Although I believe there is certainly a set of best-practice procedures that should be followed to give each of your dashboards maximum effect and usability, I would also direct you to be as individualistic as possible with each dashboard in terms of the specific problem it has been created to address. Every dashboard you create should lead your agency directly to the proper actionable, intelligence-led decisions that will ultimately aid in reducing crime.

Crime Data Quality and Validation – A Necessity for Every Agency

Accurate Mapping is The Epicenter for Making Sense of Your Crime Data

Let’s talk about mapping. Very few mapping systems, whether GIS or some other type, are always spot on. The reasons for these inaccuracies vary widely: from inaccurate GIS mapping at the onset, to duplicate addresses in your city that are separated only by a North–South or East–West designation, to simple user data-entry mistakes. Previously, I couldn’t change these map points in my records management system, nor did I have administrative access to the county GIS system that would have allowed me to change them. However, I can now change them with CommandCentral. Recently, PublicEngines released a new feature dubbed the Data Quality and Validation tool, or DQV for short. With just a few steps I am now able to take my map, with an average of 150 inaccuracies a month, and turn it into a completely accurate crime map, with no inaccuracies.

How My Data Accuracy Quest Began

When I began the intelligence unit at my agency in the greater Atlanta area, one of the first things I noticed was how inaccurate our crime mapping system seemed to be. As I began looking into the problem, I found that instead of being the result of a single error, it was actually the result of a myriad of errors. Among them were inaccurate geocoding, areas of my city that had been annexed but not yet geocoded, duplicate addresses within my city, and of course data-entry mistakes.

Now, as you can imagine, as I began to remedy this situation I felt a little like a dog chasing its tail: I was certainly moving, but I wasn’t making any progress. During one staff meeting, it became even more apparent that I needed to do something about the mapping inaccuracies when we began looking at crimes broken down by zone. We looked for crimes that we knew had occurred in a certain zone so that we could discuss them as a command staff and form a tactical action plan. But when we searched for them, we weren’t able to find the specific crimes. As I searched within my RMS to locate these “lost” crimes, I found them all mapped outside of my city boundaries. Many were plotted tens, hundreds, even thousands of miles away from where they should have been. We had a serious problem, to say the least, and unfortunately, no solution.

Fast-forward to my time at PublicEngines. One of the key drivers in developing the DQV tool was the research I conducted while proactively auditing our customers’ databases. I found very quickly that my agency, with 150 mis-maps a month, was far from alone. The vast majority of agencies have mapping problems that they are either not aware of or lack the ability to fix. This is why I am so excited to introduce the DQV tool. Not only will you be able to identify all occurrences mapped outside of your jurisdictional boundaries, you will be able to correct those errors in just a few steps.

A New Solution to An Age-Old Problem

As I’m sure you can attest, accurate data is paramount to conducting crime analysis that leads to actionable intelligence and crime reduction. The DQV tool in CommandCentral ensures that the most common data errors – mis-mapped and mis-classified crimes – are easily correctable so agency personnel can make resource decisions with confidence.

Here are a few highlights of its capabilities.

  • Built-in alert bar notifies CommandCentral Analytics administrators when incidents are geocoded outside of an agency’s jurisdiction.
  • Click-to-correct mis-mapped incidents: inaccurately mapped crimes can be corrected simply by clicking on the CommandCentral map.
  • Create rules so that all future data synced to CommandCentral is mapped accurately.
  • Edit crime incident categorization.
  • Maintain data fidelity: changes are made only in CommandCentral, not in your RMS.

Identifying mis-mapped crimes is as easy as selecting an Out of Area button in the system’s administration section. The tool then generates a list of occurrences that were all mapped outside of your jurisdictional boundaries. You can select any single occurrence listed to see where it’s currently positioned on the map, and then override it. This process is easy: simply select where that occurrence should be mapped. You can change the pin for this specific incident only, or for all incidents previously mis-mapped in the same manner, which is especially important for those addresses that are constantly in error.
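Under the hood, detecting an out-of-area incident amounts to testing each geocoded point against the jurisdiction boundary. The actual DQV implementation is not published, so the following is purely an illustrative Python sketch using a standard ray-casting point-in-polygon test on a made-up square boundary:

```python
def point_in_polygon(lon, lat, polygon):
    """Ray-casting point-in-polygon test. `polygon` is a list of
    (lon, lat) vertices describing the jurisdiction boundary."""
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        # Toggle `inside` each time a ray from the point crosses an edge.
        if (yi > lat) != (yj > lat) and \
           lon < (xj - xi) * (lat - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

# Hypothetical square jurisdiction and two geocoded incidents.
boundary = [(-84.5, 33.6), (-84.3, 33.6), (-84.3, 33.8), (-84.5, 33.8)]
incidents = [("A", -84.4, 33.7), ("B", -80.0, 25.8)]

out_of_area = [cid for cid, lon, lat in incidents
               if not point_in_polygon(lon, lat, boundary)]
print(out_of_area)  # -> ['B']
```

Incident B, geocoded far from the boundary, is exactly the kind of “lost” crime the Out of Area list surfaces for correction.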

Visualizing and analyzing crime data through crime mapping solutions has been an essential tool in every agency’s arsenal since the mid-1800s, with the advent of the pin map. Today, online tools make the task easier than ever. But the question remains: is the data you’re viewing accurate? With the DQV tool you can now be sure of it.

Four Steps To Effectively Using Crime Data in Law Enforcement

It’s no secret that law enforcement agencies are consistently being asked to do more with fewer resources. Budget cuts have meant fewer feet on the street and ever increasing demands on agencies and officers alike. To meet these growing demands, many agencies are increasingly relying on technology to fill that gap.

There are four important steps to make sure that you’re using data to its fullest potential.

1. Collecting the data

The fact of the matter is there is a plethora of data available. Useful data may include department-specific information such as:

  • Recent bookings
  • Various types of crimes that have been committed
  • Calendar of when crimes occurred
  • Maps of where illegal activities took place
  • Written citations

However, according to Doug Wylie, Editor in Chief of PoliceOne, some cities take it a step further, incorporating a more holistic view of the community a department serves, including everything from utilities and social-services records to new building permits.

2. Accurate Data

Recently, one big city police department announced it would no longer be releasing monthly crime reports because the Excel files they used to distribute the information were being corrupted. Someone had been changing the data the public viewed. This follows the accusations a couple of years ago that the New York Police Department had been falsifying data.

Audits by PublicEngines reveal that up to 25% of all law enforcement data housed in RMS or CAD systems is not accurately recorded.

However, there are ways to improve data accuracy. According to a recent report, some of the variables that agencies should consider include:

  • Data that is correctly captured. This is crucial because there are myriad codes, statutes, and other minor details that allow for human error. Information can be mislabeled or mapped incorrectly. Regular review and comparison can help catch errors and ensure greater accuracy.
  • Quality report writing that includes correct classifications, a built-in multiple-level review process, and a system to allow reclassification, supplements, and follow-up reports to be reviewed, approved, and added.
  • Regular audits of reports to verify accuracy. This might also include periodic surveys of randomly selected citizens who have reported criminal activity, to verify your records accurately reflect the facts as they were reported.
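Parts of the first and third checks above can be automated. A minimal Python sketch of a record audit, with a made-up offense-code list and bounding box standing in for your agency’s real reference data:

```python
# Illustrative reference data -- substitute your agency's real code
# table and jurisdiction extent.
VALID_CODES = {"23F", "220", "240"}
LAT_RANGE = (33.6, 33.8)
LON_RANGE = (-84.5, -84.3)

def audit(records):
    """Flag records whose offense code or coordinates fail basic checks."""
    flagged = []
    for r in records:
        problems = []
        if r["code"] not in VALID_CODES:
            problems.append("unknown offense code")
        if not (LAT_RANGE[0] <= r["lat"] <= LAT_RANGE[1]
                and LON_RANGE[0] <= r["lon"] <= LON_RANGE[1]):
            problems.append("mapped outside jurisdiction")
        if problems:
            flagged.append((r["id"], problems))
    return flagged

# Hypothetical records: one clean, one with two problems.
records = [
    {"id": 1, "code": "240", "lat": 33.7, "lon": -84.4},
    {"id": 2, "code": "XXX", "lat": 40.7, "lon": -74.0},
]
flagged_records = audit(records)
print(flagged_records)  # -> [(2, ['unknown offense code', 'mapped outside jurisdiction'])]
```

Running a check like this on every sync turns the periodic audit into a continuous one.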

3. Adequately Interpreted Data

Those agencies with analysts rely on these hard-working people to identify crime trends. But they’re stretched thin. The ability for officers to predict crimes not only relieves some of the pressure on analysts, it also helps reduce crime. Access to this information is the key factor.

But with the sheer amount of data now being gathered, is there room to interpret it in a way that predicts even more crime?

Take some of the non-criminal data that agencies are gathering that was mentioned by PoliceOne’s Wylie. An officer knows that construction sites often experience theft of materials, vandalism, and graffiti. If he also knows from new building permits that construction is under way on several projects, redirecting himself to those areas can significantly reduce the potential for those crimes.

4. Getting data into the hands of those who take action

As the above example illustrates, when officers on the street have access to data, they can act accordingly. However, that can prove a challenge.

Products like CommandCentral Predictive work to eliminate those challenges. Since it’s cloud-based, it is available anywhere it is needed, so long as an Internet-connected device is available. Reports can even be sent directly to officers automatically via email.

Officers in the field are hardly desk jockeys, which is why allowing them to access the information while in the field via their mobile phone or tablet is so important. It can literally be the difference between a crime being prevented and a crime occurring.

Data is available – maybe even too much data is available – but there are ways to harness that information to help predict and prevent crime. Collecting that data from a wide variety of sources, ensuring its accuracy and interpreting its value are important first steps. However, utilizing technology – getting this information to officers wherever they may be – allows them to predict crime and make the streets safer for everyone.

CommandCentral Predictive: Built for Field Use


We have had tremendous feedback since we announced our newest product, CommandCentral Predictive, several weeks back. Since the announcement we have heard from literally hundreds of agencies. Last week I was on the road speaking with agencies, including several of our existing customers and, hopefully, some new ones.

It is hugely rewarding to see such response given the amount of time and effort we have put into creating CommandCentral Predictive.  This launch is the culmination of months of meeting with customers and getting their continuous feedback on the product, and the extraordinary effort by our team to turn that feedback into tangible changes in the product.

During the course of developing this product we spoke with many law enforcement agencies about product features and received dozens of recommendations on how to build a better product. Some we have been able to include in the first generation; others are still to come.

From my meetings over the last two weeks, two things have really stood out in the feedback on CommandCentral Predictive. First, every agency I have spoken to is looking for a better way to direct and enable their patrol officers to increase their effectiveness. It’s simply an issue of how to maximize their resources in the most effective way through direction and information enablement.

The second item goes along with that, and is something we have heard all along: anything we provide needs to be usable in the field, not just another screen on a computer for someone to look at.

That hit home last week when I visited a large agency that is currently testing its own internally developed predictive analytics solution. While they have yet to decide whether to roll it out, one of their concerns was whether officers would actually use it. Their limited trials have had mixed results, and previous experience with another application showed limited success because very few officers got past the first step: logging on.

We realized this early in the development of this product, which led us to two needs: first, to build CommandCentral Predictive with mobile devices in mind, and second, to empower officers with real data to make smarter, more informed decisions.

The mobile part was a no-brainer. If they haven’t started already, almost every agency envisions a time when they will drop their MDTs (Mobile Data Terminals) for tablet devices. So we built this product not only to run on a tablet or large smartphone, but to be optimized for it.

Second, we built functionality into it so that users could access the underlying data driving predictions. It was designed to present this information so that officers can easily access it and use it to make better decisions when they are in the field.

This data accessibility was validated time and again last week as I spoke with agencies, who see the value in empowering their officers with data in the field that helps them make better decisions and take better actions while on patrol. And, most importantly, it does this automatically, improving information flow without burdening an analyst with pulling data for officers (more on this another time).

Anyone who takes CommandCentral Predictive for a test drive will see how easy it is to access information that puts an officer at an advantage. It’s just one of those things we built into CommandCentral Predictive that makes it such an exciting product. We look forward to more feedback as we meet with customers, and to continuing to innovate on CommandCentral Predictive to make it the most effective tool to predict and prevent crime.

The Science of Predictive Policing

Author: Praneeth Vepakomma, Data Scientist, Public Engines

The scientific problem of making predictions has held the interest of researchers for centuries. Solutions to this problem have found applications across fields of societal, applied, and theoretical interest. Over this period, the complexity of predictive models has gradually increased, and their predictive accuracy and generalizability have improved with time. From the early work of Gauss in the early 1800s to Andrew Ng’s present-day experiments at Stanford with autonomous helicopters that learn to fly, the problem of prediction has moved from simply fitting a straight line to learning complex relationships from data.

R&D at PublicEngines

At PublicEngines we have a committed R&D team focused on developing advanced mathematical models for predicting crime that also ‘learn’ from data, and we are pleased to have announced our newest product, CommandCentral Predictive, the culmination of these dedicated research efforts.

Mathematical Modeling of Crime

The mathematical model that my team and I developed in-house exploits specific behaviors that are inherent to crime-incident data. These include near-repeat victimization, long-term patterns, transient or short-lived patterns, and interactions within crime-incident data. Our model measures these dynamic, crime-specific behaviors while separating signal from noise, raising the statistical confidence levels around our predictions; it not only distinguishes one characteristic from another, but also accounts for the dependencies between them to mathematically capture the inherent predictive characteristics of crime patterns. We were also cognizant of inconsistencies that crop up in some of the existing academic literature on similar problems, and we improve upon these as well within our patent-pending system.

Focus on Models that Learn

The automated learning component of our model is another important aspect, one that brings generalizability to our predictive engine and guarantees a level of robustness over new, unseen data when the models are deployed in reality. Building domain-specific models that can also learn from data is the central concern of the field called statistical machine learning. The area began to gain momentum in the late 1940s, when scientists started building mathematical models that mimic the neural networks of the human brain. An early learning-system breakthrough occurred in the late 1980s, when scientists developed handwriting-recognition systems. The field has grown by leaps and bounds since, adding analytical intelligence to many use cases: today’s e-mail spam classification, voice recognition, computer vision, facial recognition, and document classification systems are just a few of the successes of machine learning.

The main reason to focus on a learning component is that, in reality, a model that works great on the data used to build and train it will not always replicate that performance on new instances of the data when deployed in real-life scenarios. This happens primarily because the model can learn non-generalizable intricacies of the training dataset that do not translate into practical results on the future data the model faces after deployment. This is why we rigorously train our patent-pending, crime-specific statistical model and include a learning component comprising an ensemble of multiple models to keep it robust in real-life deployment. This allows our model to isolate the predictive characteristics of future crimes down to a high level of tactical, actionable granularity.
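One common guard against a model merely memorizing its training data is to hold out the most recent incidents and score the model only on those. A minimal Python sketch of such a chronological split, on hypothetical dated records (our actual training procedure is considerably more involved):

```python
def chronological_split(incidents, holdout_frac=0.25):
    """Hold out the most recent incidents so the model is scored on
    data it has never seen -- a basic check on generalizability."""
    ordered = sorted(incidents, key=lambda i: i["date"])
    cut = int(len(ordered) * (1 - holdout_frac))
    return ordered[:cut], ordered[cut:]

# Hypothetical dated incidents for illustration only.
data = [{"date": d} for d in ("2015-01-01", "2015-01-05", "2015-02-01",
                              "2015-02-20", "2015-03-01", "2015-03-10",
                              "2015-03-15", "2015-04-01")]

train, test = chronological_split(data)
print(len(train), len(test))  # -> 6 2
```

A model whose accuracy on the held-out recent incidents is far below its training accuracy has learned those non-generalizable intricacies rather than the underlying pattern.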

Advanced Predictive Model vs. Hotspotting

Field-testing & Evaluation Metrics

We have rigorously tested our models against the most traditional methodologies, like ‘hotspotting’: heat maps generated from historic crime-incident data. For a scientific evaluation, we quantify the performance of existing hotspotting methodologies by overlaying a grid on the hotspots and measuring the success rate of our predictions in comparison to the hotspot. The number of days required for an evaluation ranges from roughly 80 to 120, depending on the dataset, in order to reach a statistically significant comparison. We conclusively observe that our patent-pending model performs about 2.7 times better than the widely used hotspot.
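The grid-based comparison described above can be sketched in a few lines: overlay a grid, take the set of cells each method flags, and measure the fraction of next-period incidents that land in a flagged cell. Everything here is hypothetical, including the cell sets and the resulting 0.75-vs-0.5 numbers; it illustrates the evaluation metric only, not our published results:

```python
def cell(point, size=100):
    """Map an (x, y) location in meters onto a grid-cell id."""
    x, y = point
    return (x // size, y // size)

def hit_rate(predicted_cells, future_incidents):
    """Fraction of future incidents falling inside predicted cells."""
    if not future_incidents:
        return 0.0
    hits = sum(1 for p in future_incidents if cell(p) in predicted_cells)
    return hits / len(future_incidents)

# Hypothetical next-period incidents and flagged cell sets.
future = [(110, 220), (250, 220), (900, 900), (105, 280)]
model_cells = {(1, 2), (2, 2)}   # cells flagged by the predictive model
hotspot_cells = {(1, 2)}         # cells flagged by a historical hotspot

print(hit_rate(model_cells, future))    # -> 0.75
print(hit_rate(hotspot_cells, future))  # -> 0.5
```

Running this comparison daily over the 80-to-120-day window and averaging the two rates is what yields a statistically defensible performance ratio.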

Positive Impact of Actionable Patrol

This level of focused tactical information helps reinforce directed, actionable patrol plans and increases resource efficiency, so that agencies can use it in their law enforcement efforts to positively impact their communities through predictive policing. This technology unlocks tactical intelligence from your RMS and closes the analyst/officer gap by improving operational decision-making. From the perspective of the tech industry as well, this is absolutely in line with our goal of continuing to reinforce officers with technology that provides tactical, directly actionable information to assist them in continuing to do their best.