Corsight’s Porter calls out ‘clickbait’ biometric bias claims
A Wired article blames a facial recognition mismatch for a wrongful arrest in Maryland, and former UK Surveillance Camera Commissioner Tony Porter has called the piece out as an example of reporting that ignores important distinctions in order to serve as clickbait.
Alonzo Sawyer was arrested in spring 2022 and accused of assaulting a bus driver and stealing his phone in the Baltimore area, despite being taller than the suspect and differing from him in several other features. A facial recognition system returned Sawyer as a candidate match for the suspect, and his former parole officer said the suspect looked like him.
Sawyer’s wife testified to state lawmakers about the case in support of a proposed bill to limit law enforcement use of facial recognition in the state. She says a clause prohibiting the use of facial recognition “as the sole basis for positive identification” could have prevented her husband’s arrest.
Facial recognition is widely considered appropriate as a source of investigative leads, but not as probable cause. Police misconduct has been raised in past instances of wrongful arrests involving facial recognition, but it is infrequently mentioned in consumer media coverage.
The article says that facial recognition has a “history of misidentifying people with darker skin,” referencing coverage of NIST’s 2019 evaluation of demographic differentials without mentioning that the evaluation has since been updated.
“Demonising the software when it is clearly the operating agency does not serve society well,” says Porter, now chief privacy officer at Corsight, in a LinkedIn post. “Technology provides an indication only – a similarity. Humans have the job of thereafter making a proportionate assessment, apply safeguards, then act in a balanced and lawful way. This clickbait fails to point out that the best FRT developers are getting 99.8 percent accuracy in national testing. As an analogy, it’s a little like blaming an umbrella for getting your hair wet in a downpour because you didn’t act reasonably!”
A report from the New York Civil Liberties Union (NYCLU) follows Wired in referring to the 2019 version of the NIST study, while ignoring the updated version.
That link is the sole support for three separate claims in the report that the technology is racially biased and two that it is “error-prone.”
The NYCLU argues that a facial recognition system used to screen visitors to the state’s prisons has wrongly matched at least one individual with a banned person, preventing the individual from visiting an incarcerated relative.
The group also states that the corrections system is generally biased against Black and Latino people.
Bringing accountability to inequitable systems may require stronger arguments that refer to the most recent evidence.
Article Topics
accuracy | biometric bias | biometrics | Corsight | demographic fairness | Face Recognition Vendor Test (FRVT) | false arrest | NIST