Media Contact

Molly Rivera, ACLU of North Carolina, 919-438-0492 or [email protected]

February 28, 2020

RALEIGH — Officers with the Raleigh Police Department have used Clearview AI to run North Carolina residents’ faces against the company’s database of billions of photos, according to a company client list reviewed by BuzzFeed News. The list was reportedly obtained through a security flaw in Clearview’s platform. While the Raleigh Police Department has responded to concerns about its use of Clearview AI, it has not altered its current policy, which allows officers to use facial surveillance technology.

Below is a comment from Ann Webb, Policy Counsel with the ACLU of North Carolina, in response to the news:

“Raleigh police officers should not be secretly running community members’ faces against a shadily assembled database of billions of our photos without democratic oversight, without transparency, and without safeguards against abuse. It is alarming that, despite the growing opposition to face surveillance, our police department has used an error-prone and privacy-invading technology peddled by a company that can’t even keep basic client information secure. The Raleigh City Council should prohibit the department from using Clearview’s dangerous system, and state lawmakers must immediately halt law enforcement use of face recognition technology as communities nationwide have done.”

There is widespread evidence that face recognition technology is error-prone and biased, with higher error rates for women and people with darker skin. Recognizing these harms, states and cities across the nation have reined in law enforcement’s use of face recognition, and several have banned the government’s use of the technology in their communities entirely.