Judge throws out Clearview facial recognition match in US murder case

A judge in Cleveland, Ohio, has thrown out a facial recognition match in a murder case after determining that police used Clearview AI's facial recognition software, which explicitly states its results are inadmissible in court, to secure a search warrant.
Judge Richard McMonagle threw out evidence in the murder case of Blake Story after siding with defense attorneys for the suspect, Qeyeon Tolbert. During a hearing on January 9th, the judge noted that the facial recognition match of Tolbert was comparable to a tip from an anonymous informant, which alone is insufficient to establish probable cause. He concluded that the search warrant affidavit was misleading and relied on inadmissible evidence.
The case has highlighted the lack of training for law enforcement in using facial recognition, as well as shortcomings in oversight and regulation, according to news outlet Cleveland.com. It also suggests a widespread problem with American police failing to understand probable cause, even though it is one of the fundamental concepts in the country's legal system.
It has also put the spotlight on other law enforcement agencies across the U.S. that are using Clearview's technology despite ongoing controversy. According to figures from June last year, the number of facial recognition searches performed with Clearview by law enforcement officials has doubled over the past year, reaching 2 million.
Blake Story was shot and robbed on the streets of Cleveland in February last year. Although police managed to obtain footage of the incident, the killer’s face was not visible and could not be identified with Clearview’s facial recognition.
Several days later, however, officers reviewing real-time CCTV feeds came across footage of a person with a similar build, hairstyle, clothing and gait. The footage was captured six days after the murder and at a different location from the incident.
The police sent the footage to the inter-agency data-sharing group Northeast Ohio Regional Fusion Center, which ran it through Clearview's facial recognition and returned eight photos of possible matches, two of which depicted Tolbert. Tolbert then became the main suspect in Story's murder, and police obtained a warrant to search his premises based on the biometric match, finding a gun along with other evidence.
Once the case reached court, however, it began to fall apart. In their search warrant affidavit, detectives did not disclose that police had used facial recognition to help identify the suspect, nor did they include Clearview's disclaimer noting that its facial recognition reports are not admissible in court. The affidavit also failed to mention that the software identified people other than Tolbert.
Tolbert's attorney argues that the disclaimer indicates Clearview lacks faith in the accuracy of its own product. Facial recognition is considered probabilistic rather than deterministic, and match results alone are generally not accepted as evidence in courts in the U.S. and elsewhere.
Although the prosecution argued that police did not rely solely on AI for evidence, the defense attorney highlighted that police found no DNA evidence, witnesses, or gunshot residue, nor did the suspect's phone ping in the area of the crime, The Register reports.
Prosecutors are currently appealing Judge McMonagle's ruling. The case, however, has drawn attention to police use of facial recognition technology (FRT), which is being deployed in other investigations in Ohio and beyond. Media reports have found that Cleveland police lack policies governing the use of facial recognition in criminal investigations.
In March last year, Clearview’s facial recognition was added to a directory of AI and digital technologies that can be considered for American defense contracts.
Article Topics
biometric identification | biometric matching | biometrics | Clearview AI | criminal ID | facial recognition | law enforcement | police