The Manuals Initiative: A Guide for Improving Online Accessibility & Transparency

 A civilian-led review group meets with police officers to discuss policy recommendations. 

Here at the Policing Project, we believe that communities should have a voice in how they are policed — which means they need to be informed about what their police departments are doing and the policies that guide these actions. This is why we’re launching the Manuals Initiative.

The Manuals Initiative aims to encourage and help police departments get their policy manuals online in a way that is accessible and understandable to the public. Once the public understands the policies that govern the actions of their officers, they can start to engage on the substance.

Introducing Our Manuals Scorecard

What does an accessible policy manual look like? What characteristics make an online manual easier for the public to read and understand? And why does it matter? 

To answer these questions, we developed a guide to best practices for publishing a department's policy manual online. We began our work by surveying the 100 largest U.S. police departments (based on total number of sworn officers, according to the Uniform Crime Reporting program) to see which ones have posted their manual online and how these manuals compare.

After thinking long and hard about what makes a quality online manual, we developed scoring criteria based on six aspects of accessibility, transparency, and navigability. 

Each department in our survey was evaluated and given a score; the results can be found in the scorecard below. Overall, our evaluation revealed a lot of room for improvement. A shocking 46 departments have not published their manuals online at all, and not a single department received more than 6 of the possible 8 points on our scorecard.

In addition to each department's score, this report outlines the components of our criteria and offers tips for improvement, along with examples of departments that we felt exemplified best practices.

Understanding the Scorecard

We want to stress that much of what we found unsatisfactory is easy to fix. Our goal is for this to be a source of productive feedback, and we hope the metrics outlined below will provide law enforcement with a clear path for improving their online manuals and the overall transparency of their departments.

Easy To Find

Our first category measures how easy it is for a member of the general public to navigate the department’s website to access the manual. After all, a manual is not useful if members of the community cannot easily find it.

We awarded 2 points to a manual if it is posted on the department website, clearly identified, and accessible from the homepage or one level of sub-navigation. Departments were given 1 point if the manual was difficult to find, usually due to being buried within numerous subsections or being given a name that is not intuitive. If the manual was not posted or was nearly impossible for us to find, it received 0 points.

We believe it would be easy for most departments to raise their score in this category. For many, receiving 2 points is simply a matter of moving the manual to a more visible place on their website or giving it a name containing clear identifiers, such as “Manual” or “Policies.”

For a great example, look to how the Minneapolis Police Department links to its policy manual through top-level navigation on the uncluttered sidebar of its home page.

Searchable

Next, we scored departments on whether a user can search for a key phrase within the manual. This is important because the public needs to be able to find everything relevant to a topic they’re interested in, and this information may be spread out over many sections in the manual.

A manual received 1 point if the entire document was searchable either through Ctrl + F or, preferably, through a built-in search function. The Chicago Police Department’s manual provides a great example of this functionality. The public can search throughout the full manual and see their query returned as a highlighted phrase within the policies.

We gave a manual 0 points if it was impossible to search or only possible to search one section at a time. This may be because the manual is broken up into multiple .PDFs with no built-in feature capable of searching across them. It may also be because a web browser cannot recognize the text in the manual's .PDF due to how the file was created, such as when pages are scanned as images without OCR.
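For departments unsure whether their posted .PDF falls into this category, one quick check is whether text can actually be extracted from the file. The sketch below is illustrative only: it assumes the open-source pypdf library and a local copy of the manual saved as manual.pdf, neither of which is part of our scorecard.

```python
# Minimal check that a posted manual contains machine-readable text.
# Assumptions: the pypdf library is installed and the manual has been
# saved locally as "manual.pdf" (both are placeholders for illustration).
from pypdf import PdfReader

reader = PdfReader("manual.pdf")

# Pages scanned as images without OCR typically return no extractable text.
empty_pages = [
    number
    for number, page in enumerate(reader.pages, start=1)
    if not (page.extract_text() or "").strip()
]

if empty_pages:
    print(f"{len(empty_pages)} of {len(reader.pages)} pages have no searchable text.")
else:
    print("Every page has extractable text, so Ctrl + F should work.")
```

If many pages come back empty, re-exporting the manual from the original word-processing file, or running it through OCR before posting, will usually restore searchability.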

A Table of Contents

Our third category scores whether the manual has a table of contents, which is a high-impact feature that helps readers find information quickly. A manual received 2 points if it has a hyperlinked list of its sections, with headings that tell readers where to find each section. San Antonio's manual offers a great example.

A manual received 1 point if its table of contents is not hyperlinked. Manuals that were missing a table of contents or had an incomplete table of contents received 0 points.

A Marked Revision Date

We also scored whether departments mark the revision dates of their policies, which assures the public that they are reading the most up-to-date version of the manual. A manual received 1 point if every policy is marked with a revision date; manuals were awarded 0 points if they contained no revision dates or if revision dates were only available for some policies.

The Wichita Police Department goes above and beyond by including not only a revision date for each section, but also the date of its next scheduled review.

Solicits Online Feedback

Here at the Policing Project, we believe policing practices should be formulated with input from the public. This is why we encourage police departments to give the public an easy way to submit online feedback about specific policies.

This is an area of our scorecard where we hope to see great improvement. Of the 100 departments we surveyed, only Baltimore and St. Paul offer this feature. While other departments offered a way for the public to submit questions about the manual posted online, we did not feel this was in the same spirit as seeking more open-ended feedback on the policies themselves.

Summaries and FAQs

Finally, we scored departments on whether they have FAQs or summaries for key policies. This feature helps ordinary citizens gain an understanding of police practices that might otherwise seem very technical. While not necessary for everything covered in the manual, we believe summaries or FAQs are great for policies the community has a special interest in, such as use of force.

None of the departments we surveyed are currently offering this feature in their online manuals. Interested in adding policy summaries to your manual? We're currently working with the Camden County Police Department on this very thing, and we are open to assisting other departments as well. Contact us for more information.

Suggested Areas For Improvement

For each department, our scorecard also provides Suggested Areas For Improvement, both to clarify the scores we gave and to provide additional guidance.

This section of the scorecard identifies common problem areas revealed in our scoring that require additional review by departments. These areas are defined as:

Title: The manual’s title lacks clear identifiers such as “Manual” or “Policies,” which may make it difficult to find. Examples of titles that may be confusing for the public include “Directives,” “Patrol Guide,” “General Orders” or “Operational Orders.”

Navigation: The manual is not linked from the home page and is buried under several levels of sub-navigation, making it difficult to find.

Page Elements: The manual is linked directly from the homepage but remains difficult to find due to a high number of other elements on the page.

Placement: The manual is posted in a section of the website that is not intuitive, making it difficult to find.


Cross search: It is not possible to search across entire sections of the manual, either because (1) the manual is broken up into multiple documents and the website provides no cross-search feature, or (2) the website has a cross-search feature but it only searches titles and not the full manual text.

Text search: The manual's .PDF does not contain text that a browser or PDF reader can recognize (for example, because pages were scanned as images), making it impossible to search any part of the manual.

Google search: While using Google's site search function is an easy way to provide cross-search capability, we felt this approach is less user friendly than a built-in search feature that can highlight query results.


Limited TOC: While a table of contents is present, it is not at the start of the document, contains no subheadings, or is so non-specific as to be practically useless.

No TOC: The manual is missing a table of contents.

Hyperlinks: The manual has a table of contents but it does not contain hyperlinks to the sections.


File Size: The manual is posted as a single, large .PDF, which may be slow or difficult to load and may even cause a web browser to crash.

Requires download: The public is forced to download the manual in order to view it.


Incomplete: The manual as posted online is missing publicly accessible sections (other than those that would always be redacted). In some cases, manuals were too incomplete to merit scoring.

Not posted: This department has not posted any of its manual online.

Out of date: The manual as posted online has not been updated in a significant amount of time, and its contents or presentation suggest it is not the actual, current version.

Let’s Stay In Touch

We hope this project will provide guidance to departments and their communities on how to publish an effective manual. It’s also our goal to make this a working document. If your department makes any changes based on the recommendations in this guide, let us know and we’ll update the score.

Research for this report was prepared by Policing Project externs Zeinab Hussen and Cynthia Long. Interested in joining our extern team? Apply through NYU Law.