Statement of Resigning Axon AI Ethics Board Members

June 6, 2022

It is with deep regret that we, the undersigned nine members of Axon’s AI Ethics Board, announce our immediate resignation from the Board. We do so in response to Axon’s recent announcement that it intends to develop Taser-equipped drones, pre-position them at potential targets of mass and school shootings, and surround those targets with surveillance cameras capable of real-time streaming.

We wish it had not come to this. Each of us joined this Board in the belief that we could influence the direction of the company in ways that would help to mitigate the harms that policing technology can sow and better capture any benefits. For a time, we saw that influence play out in some of Axon’s decisions. From not equipping any of its products with facial recognition capabilities, to withdrawing a new software tool to collect data from social media websites, to promoting desperately needed legislation to bring the use of license plate readers under control, we observed tangible evidence of the difference we were making. Our insistence to Axon that the community, and not the police, should be the company’s ultimate customers led Axon to establish the Community Advisory Coalition, a group which brings together community leaders to share perspectives on Axon’s products and services.

Being on the Board also afforded us an unparalleled view not only into what the vendors of policing technology are producing, but more importantly, what they are thinking about producing in the future. It gave us the ability to inform the public and policymakers, to speak out about harms, and to propose practical solutions. We felt that providing transparency into what the company was considering was valuable, not only to Axon, but also to its competitors in the policing technology industry who too often themselves do not adhere to ethical standards, to activists and advocates concerned about the technologies, to policing agencies that wish to make informed decisions about whether and how to use technologies, and perhaps most important, to legislators who need to step up to regulate these technologies.

Only a few weeks ago, a majority of this Board—by an 8-4 vote—recommended that Axon not proceed with a narrow pilot study aimed at vetting the company’s concept of Taser-equipped drones. In that limited conception, the Taser-equipped drone was to be used only in situations in which it might avoid a police officer using a firearm, thereby potentially saving a life. We understood the company might proceed despite our recommendation not to, and so we were firm about the sorts of controls that would be needed to conduct a responsible pilot should the company proceed. We were just beginning to produce a public report on Axon’s proposal and our deliberations.

None of us expected the announcement from Axon last Thursday, June 2 regarding a very different use case. That announcement—that the company’s goal is to entrench countless pre-positioned, Taser-equipped drones in a variety of schools and public places, to be activated in response to AI-powered persistent surveillance—leads us to conclude that after several years of work, the company has fundamentally failed to embrace the values that we have tried to instill.

For example, for years the Board has warned Axon against the use of real-time, persistent surveillance in its products. Yet Axon has now proposed surveillance of sweeping scope. This type of surveillance will undoubtedly harm communities of color and others who are overpoliced, and likely many beyond that. The Taser-equipped drone also has no realistic chance of solving the mass shooting problem Axon is now prescribing it for; it will only distract society from real solutions to a tragic problem.

We all feel the desperate need to do something to address our epidemic of mass shootings. But Axon’s proposal to elevate a tech-and-policing response when far less harmful alternatives exist is not the solution.

Before Axon’s announcement, we pleaded with the company to pull back. But the company charged ahead in a way that struck many of us as trading on the tragedy of the Uvalde and Buffalo shootings. Significantly for us, it bypassed Axon’s commitment to consult with the company’s own AI Ethics Board.

Although we all joined this Board understanding that our role is advisory only—and we have seen Axon reject our advice on some prior occasions—rushing ahead to embrace the use of surveillance-enabled, Taser-equipped drones, especially while its Board was urging against unnecessarily precipitate action, is more than any of us can abide. We have lost faith in Axon’s ability to be a responsible partner.

Barry Friedman

Wael Abd-Almageed

Miles Brundage

Ryan Calo

Danielle Citron

Rebekah Delsol

Chris Harris

Jennifer Lynch

Mecole McBride