Clearview AI facial recognition fallout continues in Canada and Australia, online privacy aid developed
A link is now available on Clearview’s website allowing Canadians to request removal from its search results.
The company announced earlier this month that it would stop operating its biometrics service in Canada, though it remains under investigation by several privacy watchdogs in the country. The National Child Exploitation Crime Centre, operated by the RCMP, was Clearview’s last client in Canada; the centre announced last week that it is no longer using the company’s technology.
A photographer in Quebec, meanwhile, has launched a proposed class-action lawsuit against the company, seeking unspecified damages and the deletion of all images of Canadians in the company’s possession, The Canadian Press reports. Ha Vi Doan filed the suit in Canadian Federal Court.
Canada’s Personal Information Protection and Electronic Documents Act (PIPEDA) makes it illegal for companies to collect the biometric data of Canadians for commercial purposes without consent.
Details of the Australian Federal Police’s use of Clearview’s facial recognition technology have been revealed in documents obtained by the ABC.
The technology was tested by an officer who submitted photos of themselves, and an AFP spokesperson has confirmed that a “limited pilot of the system” was conducted. The AFP had initially denied using the system.
Searches for five persons of interest were carried out by the Australian Centre to Counter Child Exploitation (ACCCE).
The documents show the AFP first used Clearview in November 2019, though how many searches were carried out is not known, as the agency’s access to Clearview has since been restricted. An executive briefing on the system says it was used once to try to locate a victim of sexual assault, but also that no personal information had been successfully retrieved with the technology to date. Another email suggests that a mugshot was used to find a suspect’s Instagram account.
The documents also show AFP officers questioning whether the app had been officially approved.
The Australian Information Commissioner announced an investigation into Clearview’s practices last week.
New facial recognition blocking technology
Startup Alethea AI has developed a technique for wrapping digital images of faces in masks or “skins” using AI algorithms.
Coindesk reports that the company offers a range of synthetic media avatars generated with technology similar to that used in deepfakes, most priced at $99.
Alethea AI has also partnered with Oasis Labs to provide verification that the content is synthetic but approved by the person being represented.
According to the report, Alethea AI intends to eventually provide facial anonymization for real-time video, when the requisite computing power is available.