Dallas Police Used Face Recognition Software Without Authorization, Installed on Personal Phones
Dallas police officers used unauthorized facial recognition software to conduct between 500 and 1,000 searches in attempts to identify people based on photographs.
A Dallas Police spokesperson says the searches were never authorized by the department, and that in some cases, officers had installed facial recognition software on their personal phones.
The face recognition app, known as Clearview AI, was never approved, she said, “for use by any member of the department.”
Clearview AI did not respond Wednesday when asked if it had revoked access for officers whose departments say their use is unauthorized.
The Dallas Police Department says it has never entered into a contract with Clearview AI.
According to BuzzFeed, officers who signed up for a free trial were not required at the time to prove they were authorized to use the software.
During an internal review, Dallas officers told superiors they had learned about Clearview through word of mouth from other officers.
The Dallas Police Department is just one of 34 agencies that have acknowledged employees used the software without approval.
Facial recognition software also has a documented record of racial bias: a 2019 study of 189 facial recognition systems conducted by the National Institute of Standards and Technology, a branch of the U.S. Commerce Department, found that people of African and Asian descent were misidentified at rates up to 100 times higher than white individuals.
Clearview AI, which is known to have scraped billions of images of people off social media without their consent or the consent of platforms, has consistently claimed its software is bias-free and, in fact, helps to “prevent the wrongful identification of people of color.”