Ohio case could redefine legal standards for police use of facial recognition

In a case with potentially far-reaching implications for law enforcement and civil liberties, the American Civil Liberties Union (ACLU), the ACLU of Ohio, and the National Association of Criminal Defense Lawyers (NACDL) have filed an amicus curiae brief with Ohio’s 8th District Court of Appeals in the case of State v. Tolbert.
The case has become a flashpoint in the ongoing national debate over the use of facial recognition in criminal investigations. At its core is a legal and ethical confrontation between technological innovation and constitutional protections, particularly the Fourth Amendment’s safeguard against unreasonable searches and seizures.
It began with the February 2024 killing of a man on a street in Cleveland. In the hours after the homicide, Cleveland police obtained surveillance footage from a nearby convenience store, which they gave to a state law enforcement agency to process using Clearview AI’s facial recognition software.
The software returned several possible matches, including a 23-year-old Black man named Qeyeon Tolbert, who became a person of interest. Police eventually secured a warrant to search Tolbert’s home, and inside found a firearm they claimed was the murder weapon, which led to Tolbert’s arrest and formal charges. What followed, though, turned the case into a historic legal battle.
When applying for the warrant, the police failed to disclose their use of facial recognition to the judge. It was omitted from their affidavit in support of the warrant. A trial judge later ruled that the evidence obtained during the search was inadmissible because the warrant was improperly secured due to this nondisclosure. Prosecutors appealed the decision, which set the stage for a ruling that could define how the use of facial recognition is treated by the criminal justice system.
Central to the amicus brief filed by the ACLU and NACDL is a challenge to the legitimacy of facial recognition as a foundation for establishing probable cause. The brief argues that the use of Clearview AI’s software, as well as similar tools, is fundamentally flawed and unreliable. It points out that the technology is known to produce false positives, particularly when applied to people of color, and lacks consistent accuracy across demographic groups. These limitations, combined with a lack of judicial oversight, pose serious threats to constitutional protections, the brief argues.
For its part, Clearview AI’s platform includes disclaimers that search results are indicative rather than definitive and should only serve as investigative leads. Despite this, Cleveland police allegedly used the result from Clearview as the primary basis for their investigation into Tolbert, without informing the judge of the software’s involvement or the potential flaws associated with it.
“In an age where artificial intelligence and facial recognition technology has become so pervasive and widespread, with limited oversight and regulation, it is important for courts to recognize the dangers and unreliability of this evidence in issuance of a search warrant,” said Amy Gilbert, Senior Staff Attorney for the ACLU of Ohio. “Face recognition technology grants police unprecedented and dangerous power because it doesn’t require the knowledge, consent, or participation of the individual and is often used in secretive ways without oversight.”
Nathan Wessler, deputy director of ACLU’s Speech, Privacy, and Technology Project, emphasized the gravity of the risks, saying that “face recognition technology often gets it wrong and is particularly error-prone when used to try to identify people of color. The appeals court has an important opportunity to affirm that when the government breaks the rules by keeping its use of this flawed technology secret it should be held to account.”
(Biometric testing by the National Institute of Standards and Technology (NIST) found in 2019 that the majority of facial recognition algorithms were more likely to misidentify people with darker skin, women, and the elderly, though the most accurate algorithms showed very low differentials in the Institute’s latest testing.)
Defense attorneys likened the unregulated use of facial recognition to “giving hand grenades to children without any instructions or supervision.”
Compounding these concerns is the fact that the Cleveland Police Department lacks a formal policy governing the use of facial recognition technology. There are no standardized procedures for how the software should be used, no mandatory training requirements for officers, and no clear rules for disclosing its application in court filings. This absence of regulation has drawn criticism from both legal experts and community watchdogs.
At a March 2025 meeting of the Cleveland Community Police Commission, co-chair John Adams acknowledged the urgency of the situation. “They shouldn’t be doing things in an official capacity without a policy in place,” he said, referring to the department’s use of AI tools.
The commission, which already operates under a federal consent decree with the U.S. Department of Justice, has expressed concern about the growing role of AI in local law enforcement and the lack of corresponding oversight. Civil liberties groups argue that this lack of transparency creates a “gray area” in which powerful surveillance tools are used without accountability.
“This case exposes how law enforcement conceals dangerous and unreliable surveillance tools from public view,” said Sidney Thaxter of NACDL’s Fourth Amendment Center. “This technology has no place in the justice system or an open and free society.”
Clearview AI has become one of the most polarizing companies in the AI space. It boasts a massive database of 60 billion facial images scraped from public websites, which it uses to power its software. While the company claims its technology is accurate and intended for responsible use, it has faced numerous legal challenges for privacy violations. In March the company settled multi-district litigation over alleged biometric data privacy violations. Despite its disclaimers noting that its facial recognition reports are not admissible in court, Clearview’s software is increasingly being used by law enforcement agencies across the country – often without public knowledge or court disclosure – and that trend has intensified calls for national standards and restrictions on how such tools are deployed.
As the Tolbert case moves through the appeals process, legal scholars and policymakers are watching it closely, as it presents an opportunity for the judiciary to address whether law enforcement may use facial recognition as the basis for obtaining a search warrant without disclosing its use to a judge.
Led by Ohio Attorney General Dave Yost’s office, prosecutors argue that the AI match was just one component of a larger investigation. They claim that traditional detective work reinforced the findings from Clearview, thereby making the search warrant valid regardless of the AI lead.
“The question for this court is whether the state can tell the jury the truth that the police found the gun used to shoot Story and other incriminating evidence in Tolbert’s apartment,” Yost’s team wrote. “The answer should be, ‘yes.’”
This legal stance relies on the “independent source doctrine,” which allows evidence to be admitted if it could have been lawfully obtained through separate, independent means. But critics argue that if the AI match was the initial catalyst for the investigation, and its use was concealed, then the warrant’s integrity is fundamentally compromised.
Courts across the U.S. are grappling with the growing use of facial recognition and its compatibility with constitutional rights. Yet few rulings so far have directly addressed the admissibility of evidence obtained through undisclosed AI tools.
The Tolbert decision could be precedent-setting. If the appeals court upholds the trial judge’s suppression of evidence, it may signal a judicial demand for greater transparency and restraint in the use of surveillance technology. If, on the other hand, the prosecution prevails, it could embolden law enforcement agencies to continue using facial recognition with little or no disclosure.