When people talk about analytics in law enforcement, the bulk of the discussion naturally falls on the ways agencies use predictive tools in a public-facing capacity, especially as they pertain to the prediction and prevention of crime.

However, alongside this increasingly controversial topic, another use of predictive analytics is under consideration by law enforcement agencies and the governments that support them.

Serving as so-called “early warning” systems, these predictive tools are purported to uncover the likelihood of future misconduct that would result in termination or even arrest.

The promise here is appealing. On paper, a tool that lets an agency single out problematic individuals (who may have otherwise passed undetected until it was too late) has the potential to reduce the agency’s risk, increase public safety, and strengthen the bond of trust between law enforcement and the public they serve.

Yet the same concerns that limit the utility of predictive analytics in public-facing applications also raise serious questions about their ability to properly forecast misconduct within the ranks.

Instead, agencies wishing to curb misbehavior before it flares up should consider the value of improved infrastructure and administrative tools, as well as an improved set of evaluation standards, before diving headlong into the “future” of public safety HR.

Predictive analytics in public safety: An overview

On the public-facing side, there are several ways in which predictive tools can be, or may soon be, used to prevent crime and increase overall public safety, such as:

  • Using historical data to identify the geographical areas most likely to be affected by future crimes
  • Identifying risk factors of violence between gangs and other groups
  • Locating at-risk citizens and potential victims

While the goals in this vein are commendable, issues quickly arise when these high-level analytics are applied to low-level processes.

First are the legal concerns, which are numerous and endlessly complex. These issues also frequently intertwine with explosive social contexts surrounding “predictive” techniques, such as profiling. These concerns can quickly become harmful racial and socioeconomic debacles if agencies—even those equipped to handle the legal concerns—don’t manage predictive analytics with an intense level of scrutiny, awareness, and care.

Naturally, the legal and social concerns are fewer when applied to the employer-employee relationship, but even then, tools designed to prevent misbehavior by relying on previously identified red flags or on “unintuitive” connections surfaced through techniques such as machine learning carry unique risks and downsides.

The risks of overreliance on predictive analytics

  • Violation of union rules: Firing, disciplining, or otherwise altering the trajectory of an officer’s career based on the mere chance of misbehavior may violate union bylaws or other legal standards.
  • Morale: Morale problems can quickly spike if HR actions are tied to predictive analytics. Employees who feel they might be fired, demoted, or replaced because a computer system flagged them as a potential candidate for trouble (despite a clean working record) may become fearful or disheartened. Even the best officers may be harmed by the increased stress such an environment produces.
  • Inaccuracy: Predictive analytics are not guarantees, and decisions based on them are not infallible. For example, if a supervisor relies on a prediction that an officer won’t improve as needed and lets the officer go, that same officer may very well become a star employee in the next town over. Inaccurate or outright wrong predictions can throw normal operations off course.
  • Litigation risk: As predictive technology becomes more widespread, so too will lawsuits based on unfair termination and discrimination. Departments may find it difficult to prove that their early warning systems are equitable and reliable.

A better alternative

It is inevitable that predictive analytics will play a larger role in law enforcement—on both sides of the counter—as the technology matures, improves, and continues to make connections that humans might not be able to find on their own.

However, that does not mean the technology in its current state is remotely close to besting the HR capabilities that humans bring to the table.

Agencies that lack a collective view of the behaviors, training, and work history of their personnel are inherently at a disadvantage.

Tools such as the Acadis® Readiness Suite, however, bring that information together at a glance, making it that much easier to get a holistic view of an employee’s time, progress, and tendency toward troublesome behavior, the kinds of things a human eye can catch when it has the whole story.

Agencies may pair such technological upgrades with training that helps supervisors assess their own biases as leaders, such as a tendency to turn a blind eye toward officers who are popular but quietly harmful in their roles. Training in this vein is all the more effective when supervisors can view centralized, whole-life data.

Despite their growing presence in public safety practices, predictive analytics should not be regarded as a cure-all for internal issues.

Human oversight will always be the first line of defense against misconduct, and there is no substitute for adequate leadership.

Those tools that seek to enhance leaders’ natural capabilities for guiding proper behavior and nurturing a healthy culture—rather than replace those capabilities—will always have an advantage.

With Acadis, agencies looking to invest in tools that better address misconduct get that and much more. In addition to tracking positive and negative behavior and outcomes, its modules cover everything from inventory and learning management to internal investigation reviews, all within a centralized system that can be tailored to an individual agency’s needs.

Posted on Apr 1, 2021