The Capitol Hill riot shows the power and peril of facial recognition in law enforcement
Published Date: 2/13/2021
Source: axios.com

The arrests and charges in the aftermath of the Jan. 6 Capitol Hill insurrection made clear the power of facial recognition, even as efforts to restrict the technology are growing.

Why it matters: With dozens of companies selling the ability to identify people from pictures of their faces — and no clear federal regulation governing the process — facial recognition is spreading across everyday life in the U.S., raising major questions about its ethics and effectiveness.


Driving the news: The Minneapolis City Council voted on Friday to bar its police department from using facial recognition technology, Axios Twin Cities' Nick Halter reports.

The big picture: Even as efforts to restrict facial recognition at the local level are gathering momentum, the technology is being used across U.S. society, a trend accelerated by efforts to identify those involved in the Capitol Hill insurrection.

  • Clearview AI, one of the leading firms selling facial recognition to police, reported a 26% jump in usage from law enforcement agencies the day after the riot.
  • Cybersecurity researchers employed facial recognition to identify a retired Air Force officer recorded in the Capitol that day, and after the attack Instagram accounts popped up purporting to name trespassers.

By the numbers: A report by the Government Accountability Office found that between 2011 and 2019, law enforcement agencies performed 390,186 searches to find facial matches for images or video of more than 150,000 people.

  • The Black Lives Matter protests in the summer of 2020 also led to a spike in the use of facial recognition among law enforcement agencies, according to Chad Steelberg, the CEO of Veritone, an AI company. "We consistently signed an agency a week, every single week."
  • U.S. Customs and Border Protection used facial recognition on more than 23 million travelers in 2020, up from 19 million in 2019, according to data released on Thursday.

How it works: In Veritone's facial recognition system, crime scene footage is uploaded and compared to faces in a known-offenders database — and as agencies begin to share information across jurisdictions, that database has been growing.

  • Veritone's system returns possible matches with a confidence score that police can use — together with other data, like whether someone has a violent record — to identify possible suspects (a minimal sketch of that matching step follows below).
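How that matching step works varies by vendor, but a minimal sketch of a typical embedding-based face matcher is below. The function names, 128-dimensional embeddings, and toy gallery are illustrative assumptions, not Veritone's actual code or API.

# Minimal sketch of an embedding-based face-matching step of the kind described
# above. All names, dimensions, and data are hypothetical illustrations, not
# Veritone's actual implementation.
import numpy as np

def rank_candidates(probe_embedding, gallery, top_k=5):
    """Compare one face embedding extracted from crime-scene footage against a
    gallery of known-offender embeddings and return the top-k scored matches."""
    probe = probe_embedding / np.linalg.norm(probe_embedding)
    scores = []
    for person_id, embedding in gallery.items():
        embedding = embedding / np.linalg.norm(embedding)
        # Cosine similarity in [-1, 1]; higher means a closer facial match.
        scores.append((person_id, float(probe @ embedding)))
    # Candidates are ranked by similarity; analysts weigh these scores alongside
    # other evidence rather than treating them as positive identifications.
    return sorted(scores, key=lambda s: s[1], reverse=True)[:top_k]

# Toy usage: random 128-dimensional vectors stand in for real face embeddings.
rng = np.random.default_rng(0)
gallery = {f"offender_{i}": rng.normal(size=128) for i in range(1000)}
probe = rng.normal(size=128)
print(rank_candidates(probe, gallery, top_k=3))

In real deployments the gallery can span millions of faces pooled across jurisdictions, which is why match thresholds and the accuracy disparities discussed below matter so much.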

The big questions: Does it work? And should it work?

  • Facial recognition is notoriously less accurate on non-white faces, and a 2019 federal study found Asian and Black people were up to 100 times more likely to be misidentified than white men, depending on the individual system.
  • There have been two known cases so far of wrongful arrest based on mistaken facial recognition matches.

What they're saying: "Today's facial recognition technology is fundamentally flawed and reinforces harmful biases," FTC Commissioner Rohit Chopra said last month, following a settlement with a photo-storage company that used millions of users' images to create facial recognition technology it marketed to the security and air travel industries.

The other side: Facial recognition companies counter that humans on their own are notoriously biased and prone to error — a 2014 study found that 1 in 25 defendants sentenced to death in the U.S. is later shown to be innocent — and that the models are improving over time.

  • "There's nothing inherently evil about the models and the bias," says Steelberg. "You just have to surface that information so the end user is aware of it."

Be smart: At its most basic level, the underlying technology isn't that sophisticated, which makes it difficult to control.

  • Big tech companies like Microsoft can decide not to sell facial recognition software to police departments, but there are plenty of startups to take their place.
  • And as Jan. 6 showed, even individuals can tap facial recognition with ease to become cyber-sleuths — or cyber-vigilantes.
"The core technology isn't limiting. It's really more of a legal jurisdiction question, which is where the rubber will meet the road."
Chad Steelberg, Veritone