Ethics panel sets criteria for London police use of live facial recognition as Dubai city plan criticized
The London Policing Ethics Panel examining the use of biometric real-time facial recognition has concluded that if police can prove the technology will not introduce gender or racial bias into their operations, they should be allowed to use it, The Guardian reports.
While the panel recognized a need to address “important ethical issues,” it found that those issues do not necessarily make live facial recognition (LFR) too dangerous to use. The Metropolitan Police have carried out 10 trials using a watchlist of people wanted for violence-related crimes, and have said positive identifications have led to a number of arrests, according to the Guardian. The panel previously called for the Metropolitan Police to clarify the legal basis for the technology’s use before continuing with the trials.
The panel reviewed the Met’s deployments in a report, and said that LFR should only be used if the overall public safety benefit outweighs the potential public distrust created. It also said the trials should be used as a source of information about bias, and how it would or would not affect police actions.
“We argue it is in the public interest to publish the trial data and evaluations, to address these concerns,” the panel concludes. “Additionally, because the actions of human operators affect the technology’s functioning in the field and therefore the public’s experience of automated recognition, appropriate LFR operating procedures and practices need to be developed.”
The panel also surveyed more than a thousand London residents on the issue, finding that 57 percent consider police use of LFR acceptable, and 83 percent accept its use to identify serious offenders. Half of respondents think the technology would make them feel safer, but only 56 percent believe police would use their personal data legally. More than a third are concerned about privacy and about police collecting data on innocent people, and just under half say LFR would lead to more personal information being collected from certain groups. Younger people and those with darker skin were less accepting of the technology than others, according to The Guardian.
At the trial brought by a Cardiff man who says the system violated his privacy, a representative of the UK Information Commissioner recently reiterated to the court concerns about the lack of a legal framework for police use of LFR.
“We want the public to have trust and confidence in the way we operate as a police service and we take the report’s findings seriously,” DCS Ivan Balhatchet responded on behalf of the London force. “The MPS will carefully consider the contents of the report before coming to any decision on the future use of this technology.”
Crime prevention or social control?
The deployment of artificial intelligence and facial recognition to a city surveillance network in Dubai is raising concern that the technology is being weaponized against freedom of expression in the name of public safety, according to BuzzFeed News.
Dubai’s plan, which was announced in early 2017, has drawn interest from technology providers including IBM, Hikvision and Huawei; the latter two companies are facing potential sanctions from the U.S. government. BuzzFeed reports that dissidents and journalists, as well as criminals, in the UAE have been subject to extensive surveillance such as cellphone hacking.
“This is good technology if it’s used in the right way,” computer engineer, prisoners’ rights campaigner, and former detainee in the UAE Khaled Ahmad told BuzzFeed. “But with the UAE government, they arrest people just for tweeting, for publishing on Facebook, just for speaking about freedom and human rights. It’s good technology if it’s used against criminals, but not if you use it to cover people’s mouths.”