Victim identification officer says Clearview facial recognition biggest breakthrough, Chicago to review CPD use
Police departments across the U.S. and Canada are using biometric technology from Clearview AI to identify children who are victims of sexual abuse, according to a New York Times report.
The facial recognition software was used by police in Indiana to identify 14 of 21 victims found in images from an offender in the state, allowing police to contact them and ask if they wanted to make victim statements.
The technology was characterized by a Canadian victim identification officer who was not authorized to publicly discuss investigations as “the biggest breakthrough in the last decade” for investigators.
Reports about Clearview’s business practices have led to a storm of controversy around the company, including a ban on New Jersey law enforcement using it.
Clearview’s app transmits facial data rather than entire images. Law enforcement agencies declined to say whether a technical audit of the company’s technology had been performed, but a Department of Homeland Security Child Exploitation Investigations Unit spokesperson told the Times that the unit’s “victim-centered” approach precludes sharing illegal imagery.
Law enforcement officials were reluctant to give details of how Clearview is used, lest criminals discover a way to defeat the technology.
Some companies providing tools for such investigations, such as CameraForensics, have decided against incorporating facial recognition into their offerings or receiving sensitive images from law enforcement, though CameraForensics founder Matt Burns said he understands why investigators would want biometric capabilities.
“While this type of technology has existed for quite some time, we believe we have created something that enables law enforcement to solve previously unsolvable crimes and, most importantly, protect vulnerable children,” Clearview founder and CEO Hoan Ton-That wrote in an email to the Times.
Venmo, meanwhile, has joined the chorus of online platforms demanding Clearview stop scraping images from its user accounts, the Associated Press reports.
“Scraping Venmo is a violation of our terms of service and we actively work to limit and block activity that violates these policies,” said company spokesman Justin Higgs. Venmo is reportedly in the process of sending a letter to Clearview demanding it stop.
Clearview attorney Tor Ekeland echoed the firm’s founder when he compared its treatment of data to Google’s in a statement.
The use of Clearview’s facial recognition technology by Chicago Police is detailed by the local CBS affiliate.
The 24-month pilot CPD signed with Clearview on January 1 costs the department just under $50,000, according to the report. Only 30 people have access to the software, and while police would not provide an example of how it is used, they say it is not used for “live” or “real-time” identification.
“We strictly use these processes to solve crimes that have been reported, where we have evidence that gives us a person’s face, but not identity,” Interim Police Superintendent Charlie Beck said, according to CBS.
Chicago Mayor Lori Lightfoot will formalize her administration’s engagement with stakeholders to conduct a review on the use of facial biometrics by the City with a new working group that includes public safety leaders, privacy advocates and others, a statement from her office said.
“The CPD uses a facial matching tool to sort through its mugshot database and public source information in the course of an investigation triggered by an incident or crime, allowing the Department to speed up the sorting of thousands of mugshot photos when there is a possible comparator image. Live FRT is very different from the facial matching tool used by the CPD,” the force said in a statement.
“CPD is committed to ensuring the safety of all of our communities in a constitutional way that respects privacy. Access to CPD’s facial matching tools is not shared with any other department, City agency or federal law enforcement agency, and it is reserved only in key situations, for properly trained and authorized personnel, once photo evidence of a crime has emerged.
“As new technology emerges, we will continue to engage with internal and external stakeholders to ensure that we only use tools that meet constitutional and privacy norms.”
The ACLU demanded the department stop using the technology immediately, alleging it is inaccurate, racially discriminatory, and threatens privacy and First Amendment rights.