Rethinking Response Part Two: AI to Analyze Body-Worn Camera Footage

Introduction 

The 2014 death of Michael Brown sparked a national shift in the discourse around policing. Communities across the country raised concerns about a lack of accountability in law enforcement, leading to widespread adoption of body-worn cameras (BWCs). These programs aimed to improve transparency and accountability in policing by providing, in theory, an incontrovertible account of police interactions. Ultimately, the hope was that BWCs would enhance both trust between police and communities and the standard of policing. Federal support, including directives from the White House and grants from the Department of Justice, helped law enforcement agencies initiate pilot programs. Today, BWCs are commonplace among agencies large and small.

Despite their potential, the expected accountability BWCs promised has failed to materialize fully. While some studies point to fewer complaints against officers and reduced use of force, others show no significant change.

One potential explanation is that BWC programs face a glaring capacity challenge: they record more video than can be reviewed. Axon, the country's leading supplier of BWCs, holds more than 100 petabytes of video in its database today. To put this in perspective, that is more than 5,000 years of continuous footage. Manually sifting through hours of video is a labor-intensive process, and only a limited number of agencies have the resources to analyze it effectively. In many cases, footage is reviewed only when certain incidents occur, like arrests, but these represent only a small fraction of police activities. Random audits of BWC footage by supervisors, where they exist, often cover only a handful of videos each month. As a result, most footage ends up in storage, unreviewed and underutilized.

It has become increasingly clear that the mere presence of cameras is not enough to drive meaningful change. Without the capacity to review and organize the footage, the potential benefits of BWCs are significantly limited. To address this issue, some agencies are turning to AI to help analyze BWC video, identify critical incidents, and provide insights that could improve police behavior and accountability.

What AI Can Offer

Typically, BWC analytics use a form of AI known as “natural language processing” to create speech-to-text transcriptions from audio files. The resulting text is then analyzed for specific keywords, phrases, or sentiments, and events are automatically assigned labels that categorize them (for example, flagging interactions where an officer used profanity or threatened force). Some vendors, such as Polis, also use computer vision to analyze video and detect facial expressions and body movements. After events are tagged, positive and negative interactions are flagged for supervisors.
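As a rough illustration of the transcribe-then-label step described above, a keyword-matching pass over a transcript might look like the sketch below. The categories, the phrase lists, and the assumption that the transcription engine emits an "[audio muted]" marker are all hypothetical; commercial systems are far more sophisticated.

```python
# Illustrative sketch only, not any vendor's actual pipeline.
# Assumes speech-to-text has already produced a list of transcript segments.

FLAG_RULES = {
    "profanity": ["damn", "hell"],                         # placeholder terms
    "threat_of_force": ["i will tase", "get on the ground"],
    "camera_muted": ["[audio muted]"],                     # assumed marker
}

def label_transcript(segments):
    """Return (segment_index, label) pairs for segments matching any rule."""
    labels = []
    for i, text in enumerate(segments):
        lowered = text.lower()
        for label, phrases in FLAG_RULES.items():
            if any(phrase in lowered for phrase in phrases):
                labels.append((i, label))
    return labels

transcript = [
    "Sir, I stopped you because your taillight is out.",
    "Get on the ground or I will tase you.",
    "[audio muted]",
]
print(label_transcript(transcript))  # -> [(1, 'threat_of_force'), (2, 'camera_muted')]
```

Even this simple rule-based pass shows why automated labeling scales where manual review cannot: every segment of every video gets checked against every rule.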

Once footage is analyzed this way, AI could be used to identify when an officer interrupts a person, uses inappropriate language, employs force, or mutes their camera. Flagged incidents can then be tagged for further review by supervisors, who can take action as needed, such as retraining officers on agency policies.

AI BWC analytics can also assist in this training by providing real-world examples of both problematic and positive behaviors. AI can efficiently sort, filter, and flag sections of video that demonstrate each kind of encounter.

For example, some vendors’ software aims to detect when officers explain their actions during encounters and refrain from using force. Supervisors can use these examples to illustrate how an officer should respond in common situations.  

These tools solve one piece of the problem, but there is a broader picture: body-worn camera data could be used by researchers to gain crucial insights about policing, leading to improved practices overall. BWCs generate massive amounts of data, making them the largest single source of policing data in existence. Researchers, with the help of AI, can make sense of this data, paving the way to much-needed reforms on topics such as de-escalation, the reasons officers give for stops, and what is needed to achieve procedural justice.

For instance, Washington State University’s Complex Social Interactions Lab works with police departments by reviewing BWC footage and providing recommendations. With the assistance of AI, researchers analyze video to track factors such as the race of officers and people involved, whether officers explained their actions or commands, and whether force was used. AI then identifies correlations between these factors and the outcomes of each encounter. The Lab shares its findings with police departments, helping to shape training standards and identify areas where policing practices can improve.

The Lab began this work in partnership with the Pullman Police Department, analyzing videos and training officers accordingly, and has since partnered with at least ten other law enforcement agencies at no cost. After analyzing thousands of hours of footage, the Lab found that interactions are more likely to end without violence when officers take the time to explain to community members what is happening.

With thoughtful implementation, AI has the ability to unlock the untapped potential of BWCs and create a more transparent and accountable law enforcement system.
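The kind of association researchers look for can be illustrated with a toy calculation: comparing how often force is used in encounters where the officer explained their actions versus those where they did not. The records below are fabricated for illustration; they are not the Lab's data or its actual methodology.

```python
# Fabricated toy records, each noting whether the officer explained
# their actions and whether the encounter involved force.
encounters = [
    {"explained": True,  "force_used": False},
    {"explained": True,  "force_used": False},
    {"explained": True,  "force_used": True},
    {"explained": False, "force_used": True},
    {"explained": False, "force_used": False},
    {"explained": False, "force_used": True},
]

def force_rate(records, explained):
    """Share of encounters involving force, within one 'explained' group."""
    subset = [r for r in records if r["explained"] == explained]
    return sum(r["force_used"] for r in subset) / len(subset)

print(force_rate(encounters, explained=True))   # 1/3 in this toy data
print(force_rate(encounters, explained=False))  # 2/3 in this toy data
```

Real analyses must of course control for confounders and work at far larger scale, but the basic comparison of conditional rates is the starting point.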

Key Considerations for Policymakers

Restrictions on use of BWC analytics. BWC footage contains vast amounts of information, including people’s movements, conversations, and interactions. Although those who call for police service may anticipate some invasion of privacy, cameras also record bystanders who are not interacting directly with the police and who did not expect to become part of a record. These recordings can be reviewed long after the incident, giving police the time and tools to scrutinize an unprecedented amount of information. Jurisdictions should set clear rules governing how agencies can use BWC analytics and should make clear that these tools, adopted for accountability purposes, cannot be repurposed for surveillance.

Redact Personally Identifiable Information. Data privacy is another critical concern. BWC footage often contains sensitive personal information — for example, an individual’s address or phone number. To protect individual privacy, and because such information is not needed for BWC footage to serve accountability or research ends, any system or program that analyzes BWC footage should ensure personally identifiable information is redacted before it reaches human reviewers.
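As a sketch of what automated redaction before human review could look like, the snippet below masks two simple patterns, US-style phone numbers and street addresses, in transcript text. Real redaction systems must handle far more (names, faces in video, license plates); the patterns here are illustrative assumptions only.

```python
import re

# Hypothetical patterns for a minimal redaction pass over transcript text.
PATTERNS = [
    (re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"), "[PHONE REDACTED]"),
    (re.compile(r"\b\d{1,5}\s+\w+\s+(Street|St|Avenue|Ave|Road|Rd)\b", re.I),
     "[ADDRESS REDACTED]"),
]

def redact(text):
    """Replace each matched pattern with its redaction placeholder."""
    for pattern, replacement in PATTERNS:
        text = pattern.sub(replacement, text)
    return text

print(redact("He lives at 42 Elm Street, call 509-555-0100."))
# -> He lives at [ADDRESS REDACTED], call [PHONE REDACTED].
```

Running redaction before footage or transcripts reach human reviewers keeps sensitive details out of the accountability workflow entirely, rather than relying on reviewers to ignore them.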