

Crime Data Quality and Validation – A Necessity for Every Agency

Accurate Mapping is The Epicenter for Making Sense of Your Crime Data

Let’s talk about mapping. Very few mapping systems, whether GIS-based or otherwise, are always spot on. The reasons for these inaccuracies vary widely: inaccurate GIS mapping at the onset, duplicate addresses in your city separated only by a North-South or East-West designation, or a simple user data-entry mistake. Previously, I couldn’t change these map points in my records management system, nor did I have administrative access to the county GIS system that would have let me change them. With CommandCentral, I now can. PublicEngines recently released a new feature dubbed the Data Quality and Validation tool, or DQV for short. In just a few steps, I can now take my map, which averaged 150 inaccuracies a month, and turn it into a completely accurate crime map.

How My Data Accuracy Quest Began

When I began the intelligence unit at my agency in the greater Atlanta area, one of the first things I noticed was how inaccurate our crime mapping system seemed to be. As I looked into the problem, I found that instead of being the result of a single error, it was the result of a myriad of errors: inaccurate geocoding, areas of my city that had been annexed but not yet geocoded, duplicate addresses within my city, and of course data-entry mistakes.

Now, as you can imagine, as I began to remedy this situation I felt a little like a dog chasing its tail: I was certainly moving, but I wasn’t making any progress. The need to do something about the mapping inaccuracies became even more apparent during one staff meeting, when we began looking at crimes broken down by zone. We looked for crimes that we knew had occurred in a certain zone so that we could discuss them as a command staff and form a tactical action plan. But when we searched for them, we couldn’t find the specific crimes. As I dug into my RMS to locate these “lost” crimes, I found them all mapped outside my city boundaries. Many were plotted tens, hundreds, even thousands of miles from where they should have been. We had a serious problem, to say the least, and unfortunately, no solution.

Fast-forward to my time at PublicEngines. One of the key drivers in developing the DQV tool was the research I conducted while proactively auditing our customers’ databases. I quickly found that my agency, with its 150 mis-maps a month, was far from alone. The vast majority of agencies have mapping problems that they are either not aware of or lack the ability to fix. This is why I am so excited to introduce the DQV tool. Not only can you identify every occurrence mapped outside your jurisdictional boundaries, you can also correct those errors in just a few steps.

A New Solution to an Age-Old Problem

As I’m sure you can attest, accurate data is paramount to conducting crime analysis that leads to actionable intelligence and crime reduction. The DQV tool in CommandCentral ensures that the most common data errors, mis-mapped and mis-classified crimes, are easily correctable so agency personnel can make resource decisions with confidence.

Here are a few highlights of its capabilities.

  • Built-in alert bar notifies CommandCentral Analytics administrators when incidents are geocoded outside of an agency’s jurisdiction
  • Click-to-correct mis-mapped incidents: inaccurately mapped crimes can be corrected simply by clicking on the CommandCentral map
  • Create rules so that all future data synced to CommandCentral is mapped accurately (a sketch of this idea follows the list)
  • Edit crime incident categorization
  • Maintain data fidelity: changes are made only in CommandCentral, never in your RMS
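
To make the rules bullet concrete, here is a minimal sketch of how such a correction rule might work: a lookup table mapping a chronically mis-geocoded address to administrator-confirmed coordinates, applied to each incident as it syncs. The rule store, field names, and coordinates are illustrative assumptions, not CommandCentral’s actual internals.

```python
# Minimal sketch of rule-based geocode correction (hypothetical structure).
# Each rule maps a raw address string that keeps mis-geocoding to the
# coordinates an administrator has confirmed as correct.
CORRECTION_RULES = {
    "100 N MAIN ST": (33.7490, -84.3880),   # example confirmed fix
}

def normalize(address):
    """Collapse case and whitespace so near-identical spellings hit the same rule."""
    return " ".join(address.upper().split())

def apply_rules(incident):
    """Override an incident's coordinates when a correction rule exists.

    The change touches only this analytics-side copy, never the RMS record,
    matching the data-fidelity bullet above.
    """
    rule = CORRECTION_RULES.get(normalize(incident["address"]))
    if rule:
        incident = {**incident, "lat": rule[0], "lon": rule[1], "corrected": True}
    return incident

incoming = {"address": "100 n  main st", "lat": 0.0, "lon": 0.0}
print(apply_rules(incoming))   # pin lands at the confirmed coordinates
```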

Identifying mis-mapped crimes is as easy as selecting the Out of Area button in the system’s administration section. The tool then generates a list of occurrences mapped outside your jurisdictional boundaries. You can select any single occurrence to see where it currently sits on the map, and then override it: simply select where that occurrence should be mapped. You can change the pin for that specific incident only, or for all incidents previously mis-mapped in the same manner, which is especially useful for addresses that are constantly in error.
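
The check behind that Out of Area button is, at heart, a point-in-polygon test. Below is a minimal sketch using the open-source shapely library as a stand-in for whatever geometry engine the product actually uses; the boundary and incident coordinates are invented for illustration.

```python
# Flag incidents whose geocoded point falls outside the jurisdiction boundary.
from shapely.geometry import Point, Polygon

# Hypothetical jurisdiction boundary, as (lon, lat) vertices.
jurisdiction = Polygon([(-84.45, 33.70), (-84.30, 33.70),
                        (-84.30, 33.82), (-84.45, 33.82)])

incidents = [
    {"id": 1, "lon": -84.39, "lat": 33.75},   # mapped inside the city
    {"id": 2, "lon": -80.19, "lat": 25.76},   # geocoded hundreds of miles away
]

out_of_area = [i for i in incidents
               if not jurisdiction.contains(Point(i["lon"], i["lat"]))]
print(out_of_area)   # -> only incident 2 is flagged for correction
```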

Visualizing and analyzing crime data through crime mapping solutions has been an essential tool in every agency’s arsenal since the mid-1800s and the advent of the pin map. Today, online tools make the task easier than ever. But the question remains: is the data you’re viewing accurate? With the DQV tool, you can now be sure it is.

Four Steps To Effectively Using Crime Data in Law Enforcement

It’s no secret that law enforcement agencies are consistently being asked to do more with fewer resources. Budget cuts have meant fewer feet on the street and ever-increasing demands on agencies and officers alike. To meet these growing demands, many agencies are increasingly relying on technology to fill that gap.

There are four important steps to make sure that you’re using data to its fullest potential.

1. Collecting the data

The fact of the matter is there is a plethora of data available. Useful data may include department-specific information such as:

  • Recent bookings
  • Various types of crimes that have been committed
  • Calendar of when crimes occurred
  • Maps of where illegal activities took place
  • Written citations

However, according to Doug Wylie, Editor in Chief of PoliceOne, some cities take it a step further, building a more holistic view of the community a department serves that includes everything from utilities and social services records to new building permits.

2. Accurate Data

Recently, one big-city police department announced it would no longer release monthly crime reports because the Excel files it used to distribute the information were being corrupted: someone had been changing the data the public viewed. This follows accusations from a couple of years ago that the New York Police Department had been falsifying data.

Audits by PublicEngines reveal that up to 25% of all law enforcement data housed in RMS or CAD systems is not accurately recorded.

However, there are ways to improve data accuracy. According to a recent report, some of the variables that agencies should consider include:

  • Data that is correctly captured. This is crucial because a myriad of codes, statutes, and other minor details leave room for human error; information can be mislabeled or mapped incorrectly. Regular review and comparison can help catch errors and ensure greater accuracy.
  • Quality report writing that includes correct classifications, a built-in multiple-level review process, and a system that allows reclassifications, supplements, and follow-up reports to be reviewed, approved, and added.
  • Regular audits of reports to verify accuracy. This might also include periodic surveys of randomly selected citizens who have reported criminal activity, to verify that your records accurately reflect the facts as they were reported. (A minimal audit sketch follows this list.)
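
That audit sketch might look something like the following: a record-level check against a known code list and a jurisdiction bounding box, plus a random sample pulled for manual review. Every field name, code, and coordinate here is a hypothetical stand-in for an agency’s real values.

```python
# Minimal record-audit sketch: flag bad codes and out-of-bounds coordinates.
import random

VALID_CODES = {"BURG-RES", "BURG-COM", "THEFT", "ROBBERY"}   # example code list
LAT_RANGE, LON_RANGE = (33.70, 33.82), (-84.45, -84.30)      # rough jurisdiction box

def audit(record):
    """Return a list of problems found in one RMS record."""
    problems = []
    if record.get("code") not in VALID_CODES:
        problems.append("unknown or mislabeled crime code")
    lat, lon = record.get("lat", 0.0), record.get("lon", 0.0)
    if not (LAT_RANGE[0] <= lat <= LAT_RANGE[1] and LON_RANGE[0] <= lon <= LON_RANGE[1]):
        problems.append("mapped outside jurisdiction")
    return problems

records = [{"id": n, "code": "THEFT", "lat": 33.75, "lon": -84.40} for n in range(100)]
records.append({"id": 999, "code": "???", "lat": 0.0, "lon": 0.0})   # one bad record

flagged = {r["id"]: problems for r in records if (problems := audit(r))}
sample = random.sample(records, k=5)   # periodic manual review, per the last bullet
print(flagged)                         # -> {999: [both problems listed]}
```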

3. Adequately Interpreted Data

Those agencies with analysts rely on these hard-working people to identify crime trends, but analysts are stretched thin. Giving officers the ability to predict crimes not only relieves some of the pressure on analysts, it also helps reduce crime. Access to this information is the key factor.

But with the sheer amount of data now being gathered, is there room to interpret it in a way that predicts even more crime?

Take some of the non-criminal data agencies are gathering, such as that mentioned by PoliceOne’s Wylie. An officer knows that construction sites often experience theft of materials, vandalism, and graffiti. If he also knows from new building permits that several construction projects are under way, redirecting himself to those areas can significantly reduce the potential for those crimes.
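
As a rough sketch, that permit example boils down to a simple cross-reference between active permits and an officer’s patrol zone. The data shapes below are assumed for illustration; a real permit feed would obviously be richer.

```python
# Cross-reference active building permits with a patrol zone to build a
# directed-patrol watch list (illustrative data shapes).
active_permits = [
    {"address": "412 OAK ST", "zone": "Z3", "type": "new construction"},
    {"address": "88 PINE AVE", "zone": "Z1", "type": "renovation"},
]

def watch_list(permits, officer_zone):
    """Construction sites in the officer's zone worth a directed-patrol pass."""
    return [p["address"] for p in permits if p["zone"] == officer_zone]

print(watch_list(active_permits, "Z3"))   # -> ['412 OAK ST']
```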

4. Getting data into the hands of those who take action

As the above example illustrates, when officers on the street have access to data, they can act accordingly. However, that can prove a challenge.

Products like CommandCentral Predictive work to eliminate those challenges. Because it’s cloud-based, it is available virtually anywhere it is needed, so long as an Internet-connected device is available. Reports can even be sent directly to officers automatically via email.

Officers in the field are hardly desk jockeys, which is why allowing them to access the information while in the field via their mobile phone or tablet is so important. It can literally be the difference between a crime being prevented and a crime occurring.

Data is available – maybe even too much data is available – but there are ways to harness that information to help predict and prevent crime. Collecting that data from a wide variety of sources, ensuring its accuracy and interpreting its value are important first steps. However, utilizing technology – getting this information to officers wherever they may be – allows them to predict crime and make the streets safer for everyone.

Timely Crime Data that is Easy to Interpret Leads to Better Law Enforcement Decisions

Organizations of all types, especially law enforcement agencies, are being buried in Big Data. As defined by Wikipedia, the term Big Data represents data so large and complex that it becomes difficult to process.

The same problem is also referred to as information overload. Aside from the technical challenges of Big Data, too much information can be difficult to access, store, share, and put to use. In today’s electronic world, we aren’t in jeopardy of being buried in paper; rather, the biggest threat of all is that the data goes unused.

Useful information leads to better knowledge, and thus better decision-making. A typical approach to managing data and information (sketched in code after this list) is:

  1. Collect raw data
  2. Sort it into useful information
  3. Analyze the information
  4. Share the findings
  5. Turn the findings into useful knowledge
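
A skeletal sketch of that five-step flow follows, with each stage stubbed out; an agency’s real collection, analysis, and distribution tools would slot in behind these functions.

```python
# Stubbed five-step data-to-knowledge pipeline (illustrative only).

def collect():                 # 1. Collect raw data (RMS/CAD exports, permits, ...)
    return [{"code": "THEFT", "zone": "Z3"}, {"code": "THEFT", "zone": "Z3"}]

def sort_useful(raw):          # 2. Keep only records complete enough to use
    return [r for r in raw if "code" in r and "zone" in r]

def analyze(info):             # 3. Analyze (here: incident counts per zone)
    counts = {}
    for r in info:
        counts[r["zone"]] = counts.get(r["zone"], 0) + 1
    return counts

def share(findings):           # 4. Share findings (stub for email/report delivery)
    print("Zone summary:", findings)

share(analyze(sort_useful(collect())))   # 5. knowledge lands with decision-makers
```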

The Information Sharing Environment (ISE) provides vital information about terrorism, weapons of mass destruction, and homeland security to analysts, operators, and investigators in the U.S. government. From ISE’s Website:

A fundamental component of effective enterprise-wide information sharing, for example, is the use of information systems that regularly capture relevant data and make it broadly available to authorized users in a timely and secure manner.

While focused mostly on activities related to terrorism, ISE acknowledges that its ideas about data sharing are extremely effective in local law enforcement as well.

In law enforcement, timely and accurate information is vital; untimely or inaccurate information simply causes problems. PublicEngines audits show that up to 25% of all law enforcement data housed in RMS or CAD systems at the average agency is not accurate, often mislabeled or improperly mapped. This can lead to poor allocation of resources and confusion.

However, some agencies struggle to share relevant and timely information at all. Time spent reporting, analyzing, and distributing information can be tedious and time-consuming. And with up to 59% of agencies reporting a lack of staff for crime analysis, it’s no surprise that critical information often goes unshared.

The lesson is clear: sharing data that is relevant, timely, and easy to interpret is an effective way to become more efficient in law enforcement.

PublicEngines recently announced that it has added Email Reports to its popular CommandCentral Analytics solution. You can read the announcement here: PublicEngines Launches Email Reports for CommandCentral

Creating more tools that allow easier sharing of critical information is a step in the right direction for tackling law enforcement’s challenge of too much information, especially when the information shared is easy to read and interpret. The more people who have access to timely and valuable information, the better the law enforcement decisions that get made.

So, how is timely and relevant information shared at your agency? Do you have an internal process, meeting, or technology solution that works particularly well for you? Let us know in the comments section.

Hot Spot Policing Reduces Crime in Real World Experiment

Today there is an abundance of theories about strategies and tactics police departments can implement to reduce crime and save taxpayer money. Unfortunately, like many theories, they can be difficult to measure and to prove or disprove.

I recently came across an article in Dispatch called A Hot Spot Experiment: Sacramento Police Department that took the so-called Koper curve theory of hot spot policing, and put it to a real world test.

The Sacramento Police Department tested the theory, which states that certain neighborhoods or locations will see an unequal share of crime compared with other locations in the same area. The higher-crime areas are called hot spots, and the theory holds that when there is a visible police presence in these hot spots, crime will drop.


The CommandCentral heat map shows the density of crime over time across an agency’s patrol area.

The experiment ranked the hot spots, and two separate groups (Hot Spot Policing and Routine Patrols) were assigned. Hot Spot Policing was defined as having police officers maintain a highly visible presence in the assigned hot spot for 12-16 minutes every two hours.
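
As a back-of-the-envelope sketch, here is how an analyst might rank hot spots from geocoded incidents and lay out that 12-16 minute visit protocol. The incident data, grid size, and schedule format are illustrative assumptions, not the Sacramento study’s actual method.

```python
# Rank grid cells by incident count, then schedule Koper-style visits.
import random
from collections import Counter

# Stand-in geocoded incidents, bucketed into a coarse grid by rounding.
incidents = [(round(random.uniform(0, 1), 1), round(random.uniform(0, 1), 1))
             for _ in range(500)]

cells = Counter(incidents)                        # incident count per grid cell
hot_spots = [cell for cell, _ in cells.most_common(5)]

# Koper-style protocol: a 12-16 minute visible presence every two hours.
for hour in range(0, 6, 2):                       # first three visit blocks
    for spot in hot_spots:
        dwell = random.randint(12, 16)
        print(f"{hour:02d}:00  cell {spot}: {dwell}-minute directed patrol")
```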

The Sacramento Police Department tested the theory over a three-month period. Following are some of the findings of that real-world study:

  • Crimes in areas that used Hot Spot Policing decreased by 25 percent
  • Officer productivity improved due to Hot Spot Policing
  • Hot Spot Policing led to significant cost savings (almost $300,000 over the three-month period)

So while this is only one real-world experiment that showed benefits from implementing the Koper curve theory of hot spot policing, more research can be done. I also found it interesting to see how vitally important accurate crime data and statistics are to implementing a technique such as Hot Spot Policing. Accurate crime data allowed Sacramento PD to identify hot spots and track the impact of its experiment. Ultimately, it seems, the ability to collect, track, and analyze crime data leads to better knowledge, and thus better decision-making.

Congratulations to the Sacramento Police Department for using its data to implement Intelligence Led Policing systems that lower crime and save money. To read more about this experiment, visit http://cops.usdoj.gov/html/dispatch/06-2012/hot-spots-and-sacramento-pd.asp


Intelligence Led Policing Yardsticks: Data Cleansing and Management

So let’s start piling in the data! Right?

Not just yet, cowboy; pull back the reins for a minute. I know you’re anxious to get your Intelligence Led Policing initiative up and running, but don’t skip the most important step: making sure the data you are going to feed into your intelligence system isn’t garbage! We’ve all heard the term “garbage in, garbage out.” The data integration between an RMS/CAD system and an analytics solution is where that term really comes to life. Many agencies I speak to have unwittingly made the mistake of pushing data into an intelligence system without vetting that data first. What makes it worse is that most of these agencies won’t figure out that they pushed bad data into their system until that system starts returning results that don’t make sense. Some of the most common symptoms of bad data are:

  • Maps will be a mess
  • Multiple crime types get lumped together (think murder, rape, and theft all bulked into an “other” category)
  • And generally, the numbers won’t jibe with what the agency knows to be true

So let’s talk about your data; I mean really talk about why it is so important to spend time on your existing data set to make sure it is clean, standardized, and accurate. Don’t assume your data set is correct. It may be difficult, but I want you to assume that your data set is only “generally” correct and needs verification before it can be trusted. As President Reagan said so often, “Trust, but verify.” Now don’t get me wrong: you can certainly use your existing data, as it sits, without verification, for your intelligence initiative. You can pull intelligence from that un-vetted data set and distribute it throughout your department. But without verifying your data, your intelligence will be wrong.

Data verification is the key to good intelligence.

So how do you verify your data? Here are a few areas to check, covering what I consider the most common error categories, along with steps to correct any mistakes you may find.

1. Maps – Most commonly, our mapping data, whether from GIS, Google, lat/long, or street address, is far from perfect. In my experience, maps are one of the most inaccurate, yet most desired, data sets for an agency.


Crime map data comes from a variety of technologies.

Meaning, we all want good maps, but very few of us actually have them.

Why? – There are a multitude of reasons our maps leave a lot to be desired: inaccurate input from our officers and dispatch, duplicate addresses resulting from some unknown oddity in city planning, and GIS, Google, and other mapping systems simply putting the map point in the wrong place. Very few of these, other than officer and dispatch mistakes, are under our control. This is what makes the mapping issue such a big problem; many of us just throw up our hands and exclaim, “It’s out of my control, so we’ll just have to live with it.”

Effect – By leaving bad mapping to its own devices, we allow our data to be tainted. In essence, we are saying that we don’t really care about the crimes that are mapped incorrectly; we will just rely on the crimes that are mapped correctly. Of course, this is false logic because, as we all know, Intelligence Led Policing especially should be an “all crimes” approach. With an “all crimes” approach, we know we are getting the entire picture. Without it, we are only getting a biased view.

Solution – The first step is training: make sure your officers and dispatchers are entering addresses correctly. After that is done, choose a software solution, such as the one I use, CommandCentral by PublicEngines, that allows you to identify incorrect mapping points and move them to their correct location. I have seen many software solutions, records management systems included, that let the user see incorrect mapping points, but very few that let the user move those bad points to where they really need to be. CommandCentral has gone a step further than just letting you identify and move your mapping points; it has streamlined that entire process down to a few clicks of the mouse. I can look at all of my crimes over a particular date range across the entire city, un-click my zone designations, and then click a tab called “outside” that shows all of my crimes mapped outside those zone designations. From there, I use the administration tools to move mis-mapped crimes to their proper locations. A quick solution to a menacing problem.

2. Crime Types – Used widely by agencies to distinguish between similar crimes, such as Burglary Residential and Burglary Commercial, these designations have quickly gotten out of hand in many jurisdictions. By out of hand, I mean that many agencies have accumulated so many crime types, some have 400, 500, or more, that they have a hard time keeping the designations straight.

Why? – Let’s face it: it is often much easier for an officer to pick Theft/Other than to hunt down the exact designation the crime demands, especially when there are so many options. What’s more, in many records management systems, running a report on a parent crime type such as theft means running multiple individual reports on each crime type designation within that parent. And sometimes an officer simply categorizes a crime incorrectly; he may write a burglary report when, in actuality, it should have been a theft report.

Effect – Your data ends up scattered all over your records management system. For instance, in order to run a report of the thefts in your jurisdiction, you have to go to multiple report tables, run multiple reports, and then compile, generally manually, all of those records into one document.

Solution – As with maps, the first step is training. Make sure you train your staff in the proper way to designate each crime and, more than that, make sure they know why proper designation is so important. The second step is to turn back to your software solution to make the whole process easier. I again refer to CommandCentral: it allows me to bulk my various crime types into easy-to-understand quick tabs. All of my various thefts are bulked into one tab simply called “Theft,” while I can still choose a single type of theft and generate a report on it alone if needed. The system also allows me to manage these bulk groupings myself, so if I want to move a specific crime type to another category, I can do so with ease.
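
Conceptually, that quick-tab bulking is a rollup from specific crime types to parent categories. Here is a minimal sketch; the mapping table is a hypothetical example, not CommandCentral’s actual category list.

```python
# Roll specific crime types up into parent "quick tab" categories.
PARENT_CATEGORY = {
    "THEFT/SHOPLIFTING": "Theft",
    "THEFT/FROM VEHICLE": "Theft",
    "THEFT/OTHER": "Theft",
    "BURGLARY/RESIDENTIAL": "Burglary",
    "BURGLARY/COMMERCIAL": "Burglary",
}

def rollup(records):
    """Count incidents by parent category; surface unmapped types for review
    instead of letting them land silently in an 'Other' bucket."""
    counts, unmapped = {}, set()
    for r in records:
        parent = PARENT_CATEGORY.get(r["type"])
        if parent is None:
            unmapped.add(r["type"])
        else:
            counts[parent] = counts.get(parent, 0) + 1
    return counts, unmapped

records = [{"type": "THEFT/SHOPLIFTING"}, {"type": "THEFT/FROM VEHICLE"},
           {"type": "BURGLARY/RESIDENTIAL"}, {"type": "ARSON"}]
print(rollup(records))   # -> ({'Theft': 2, 'Burglary': 1}, {'ARSON'})
```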

3. Numbers – Truly at the heart of the matter, aren’t they? The chief wants them, the city council wants them, even the FBI wants them. But if the data problems we spoke of earlier are not fixed, then your numbers are surely off as well.

Why? – There is a direct correlation between your numbers being off and the data supporting those numbers being off. As go the earlier topics, so go your numbers.

Effect – Too often we are in such a rush to get the numbers out that we forget this correlation. By putting out bad numbers, we look bad to everyone the report was created for. We are working with something like a living organism: all the parts must work together for any of them to work correctly.

Solution – Begin by correcting all of the areas your numbers come from, such as the ones we covered earlier. Please be diligent about this. You must understand that if your data sets are not correct, then there is no way your numbers, which come from those data sets, can be correct.

In closing, I want us all to remember an old saying some of us were taught in mandate school: “the fruit of the poisonous tree.” Use it as a guide for your Intelligence Led Policing. If we use bad data (the poisonous tree) in our intelligence initiative, then the intelligence we get out of that bad data (the fruit) is corrupt, misleading, and all-around garbage. Your Intelligence Led Policing initiative lives and dies on the quality of the data you feed it. Feed it good, accurate data and it will thrive; feed it the fruit of the poisonous tree and it will wither and die.