The Los Angeles Police Department will start using artificial intelligence software to categorize video footage captured by police body cameras, the LAPD said in a statement. Axon, the maker of the Taser electric weapon as well as a line of police body cameras and cloud storage services for law enforcement, will provide the software.

According to David Luan, Axon's director of artificial intelligence, the software is meant to reduce the time it takes officers to review, analyze, and categorize body camera footage. Luan says most body camera videos run about an hour, but only about 15 minutes of that footage shows an incident officers need to review. The software will take over a task officers now do by hand: watching footage and flagging incidents for police reports. It does not include facial recognition technology, and it will not make decisions about police interactions, crimes, or other subjective matters.

Luan says the software will be able to make video "discoverable," meaning it will label the chunks of footage officers need to review and mark other chunks as unnecessary to watch. (Footage cannot be edited, Luan says.) The software could tell an agency, for instance, that in a particular video the officer is driving to the scene from minute one through minute seven, and that minutes eight through ten show a traffic stop that turns into a foot pursuit and an arrest. The remaining 30 minutes show the officer typing up the report and interviewing witnesses.
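
For illustration only, here is a minimal sketch of how such time-stamped segment labels might be represented and filtered. The data structure, the labels, and the segments_for_review function are assumptions for the example, not Axon's actual format.

```python
from dataclasses import dataclass

@dataclass
class Segment:
    """A labeled span of body camera footage (times in minutes)."""
    start: float
    end: float
    label: str
    needs_review: bool  # flagged for an officer to watch

# Hypothetical labels for the hour-long video described above.
segments = [
    Segment(1, 7, "driving to scene", needs_review=False),
    Segment(8, 10, "traffic stop, foot pursuit, arrest", needs_review=True),
    Segment(10, 40, "report writing and witness interviews", needs_review=False),
]

def segments_for_review(segments):
    """Return only the spans an officer should watch."""
    return [s for s in segments if s.needs_review]

for s in segments_for_review(segments):
    print(f"Review minutes {s.start:.0f}-{s.end:.0f}: {s.label}")
```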

"The AI helps to redirect officer's time and help them not waste hours watching inconsequential parts of the video," Luan says. Axon acquired Luan's company Dextro to build an artificial intelligence lab, which Luan now heads. 

According to a statement from LAPD Sergeant Daniel Gomez, the police department has a mountain of video footage to deal with.

"In the past year alone, we have accumulated more than 33 years' worth of video data to analyze," said LAPD Sergeant Gomez. "Reducing the time it takes for our staff to review footage is a priority for us so we can invest more time and energy in the field."

Axon was selected after a 14-month competitive trial that the LAPD hosted in partnership with Justice and Securities Inc., the University of California, Los Angeles, and the Los Angeles Police Foundation.

But Jay Stanley, a senior policy analyst with the American Civil Liberties Union, says AI software doing police work could have social implications that need to be considered. Stanley sees two main concerns about artificial intelligence doing back-office police work: algorithms can carry inherent bias, and they can raise privacy concerns.

"If algorithms are going to play some trusted role in analyzing anything that has taken place, then the code should be public," says Stanley. "Algorithms are not objective, just like video and humans are not objective."

The long-term goal is for the software to auto-populate police reports with simple, objective facts, says Luan. One day, it could start a police report with basic details, such as the make and color of a car involved in a robbery, so the officer can focus on more complex aspects of a crime. The software could also help police departments review footage of a serious crime and release a redacted version to the public faster. It will not make decisions about what is, or isn't, a crime, Luan says.
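
As a rough illustration of that long-term goal, the sketch below shows how detected, objective attributes might pre-fill a report draft that an officer then completes. The field names and the draft_report function are hypothetical, not part of Axon's product.

```python
def draft_report(detections):
    """Pre-fill a report with simple, objective facts; the officer writes the rest."""
    return {
        "vehicle_make": detections.get("vehicle_make", ""),    # e.g. "Honda"
        "vehicle_color": detections.get("vehicle_color", ""),  # e.g. "blue"
        "incident_type": detections.get("incident_type", ""),
        "narrative": "",  # left to the officer: context, judgment, charges
    }

print(draft_report({"vehicle_make": "Honda", "vehicle_color": "blue",
                    "incident_type": "robbery"}))
```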

"It's not about deciphering what is good or bad, legal, or illegal," says Luan. It's about what parts of the video is worth an officer's time to review, he explains. 

Stanley says another concern is where corporate trade secrets and intellectual property collide with freedom and justice. 

"If any evidence generated by the algorithms is being used in court, trade secrets should not trump the ability of defendants to cross-examine the evidence used against them, including looking at the nuts and bolts of how that evidence is generated," says Stanley. 

Published on: Oct 18, 2017