NY Police Chief argues facial recognition improves safety without infringing rights
Policing has changed significantly over the past few decades, and biometric facial recognition has become a highly valuable tool for generating leads, clearing the wrongfully accused, and identifying missing people, according to an opinion piece written by New York Police Commissioner James O’Neill for The New York Times’ Privacy Project.
O’Neill argues that facial recognition has proved its worth without infringing the public’s right to privacy, so it would be an injustice not to use it. He also acknowledges the privacy concerns the technology has raised, and says the public should know how it is used.
Potential matches generated for New York police by facial recognition software are assessed by a human investigator before a top match is chosen, and that decision is reviewed by “detectives and seasoned supervisors,” who affirm or dispute the match, before the original officer continues to investigate, according to O’Neill. Matches are drawn exclusively from arrest photos, not social media or open-source images, though these may be consulted after a possible match is found, such as to try to identify the clothing worn by a suspect during a crime. The department does not use sketches, he says, but does use editing software to substitute a generic feature or mirror part of the face in certain situations.
The Georgetown Center on Privacy & Technology’s Garbage In, Garbage Out report criticizes the department’s practices, and compares the use of edited images to filling in a partial fingerprint, but O’Neill dismisses this comparison as absurd. The report also refers to the use of a celebrity’s image in place of an image of the suspect, which O’Neill does not address.
New York police made 7,024 requests for facial biometrics from the department’s Facial Identification Section in 2018. Possible matches were returned in 1,851 cases (26 percent), and 998 of those have led to arrests so far. O’Neill mentions violent crimes that have been solved with the technology, and cases of missing or unidentified people whom police were able to successfully match.
The Innocence Project says that 71 percent of wrongful convictions it has documented involved mistaken witness identifications, and these mistakes are less likely when facial recognition is appropriately used, O’Neill writes.
“Facial recognition technology can provide a uniquely powerful tool in our most challenging investigations: when a stranger suddenly commits a violent act on the street. In the days of fingerprint cards and Polaroid mug shots, these crimes defined New York City, for visitors and residents alike.”
New positions for Duke and AWS
Amazon Web Services (AWS) CEO Andy Jassy told an audience at the Code Conference this week that the technology should be subject to government regulation, but not bans, CNET reports.
Jassy compared facial biometrics to knives, which can be used for harm, but normally aren’t.
Duke, meanwhile, does not plan to reopen the site hosting the DukeMTMC dataset for facial recognition training, and acknowledged that its previous availability violated University policy, according to the Duke Chronicle. Duke Vice President for Public Affairs and Government Relations Michael Schoenfeld wrote in an email to the Chronicle that media reports about the dataset’s use by industry, and the lack of consent from those pictured, led to an investigation by the school’s Institutional Review Board.
The Board found that the dataset was “neither collected nor made available to the public consistent with the terms of the study that had been approved by the Institutional Review Board.”
The dataset consists of more than two million images of two thousand students, collected in 85 minutes of video from eight cameras placed around the campus in 2014. The use of American resources to train technology by Chinese companies has recently come under scrutiny, and the Chronicle reports that just under half of the worldwide verified citations of the DukeMTMC dataset are from Chinese researchers.