Police facial recognition on images from US protest, social media in Canada questioned
A coalition of groups seeking transparency has written to the administrators of a facial recognition system that was used to identify a person accused of committing a crime when police forcibly cleared protestors from Lafayette Square in 2020, The Washington Post reports.
More than two dozen groups, including the Electronic Privacy Information Center (EPIC) and the National Association of Criminal Defense Lawyers, have written to the Metropolitan Washington Council of Governments (MWCOG), which operates the forensic biometric system known as the National Capital Region Facial Recognition Investigative Leads System (NCRFRILS).
The letter cites research suggesting error rates as high as 35 percent for facial recognition systems identifying Black women, and argues that the technology chills freedom of speech through real-time and forensic “abilities nearly unique to facial recognition.” The groups ask MWCOG to publish documentation of NCRFRILS’ creation, funding and performance, along with any analysis of the system.
NCRFRILS was used to help identify a man who allegedly assaulted Capitol Police officers ahead of a Presidential photo op last June.
The NCRFRILS biometric database holds 1.4 million photos, according to the Post, and the system is accessible to more than a dozen law enforcement agencies.
A law recently passed in Virginia could spell the end of NCRFRILS, MWCOG spokesperson Steve Kania told the Post.
“The pilot program is in the process of being re-evaluated,” Kania wrote in an email. “I understand that our committees will be discussing the issue in the coming weeks.”
Canadian federal police used facial recognition from another social media scraper
Meanwhile in Canada, The Tyee reports that the Royal Canadian Mounted Police (RCMP), the country’s federal police force, was the first facial recognition customer of U.S.-based IntelCenter, citing internal emails from the law enforcement agency.
IntelCenter says it can perform biometric matches against images of more than 715,000 terrorists’ faces. The software was created with facial recognition technology from Idemia predecessor Morpho. The company’s database appears to be at least partially made up of images scraped from social media platforms, according to Kate Robertson of the University of Toronto’s Citizen Lab. In addition to recalling Clearview AI, which was declared illegal in Canada earlier this year, this method provides images without assurance of their quality, which could undermine the accuracy of biometric algorithms.
British Columbia RCMP units contracted the service in 2016, according to the report. While the number of searches performed with the software is redacted, the force paid US$20,000 to the company.
The Tyee claims the emails also show the RCMP broke its own rules by not alerting superior officials to the sole-sourced contract, as is required for software purchases over $500 or for any expenditure above $10,000. It also avoided labelling the software with terms that might trigger more oversight, such as ‘facial recognition’ or ‘biometric.’ The publication provides details on the history of the contract, including an angry email sent by an RCMP procurement official, and the response from an Operations Support Group administrator that disclosure had “totally slipped (his) mind.”
It was subsequently recommended that RCMP officials take “procurement 101” training.
The contract was not renewed in 2019, but civil liberties and privacy advocates remain unimpressed.