AI policy advisor says facial recognition bans could spread to states and federal government
The trend of bans on biometric facial recognition in U.S. cities will soon spread to states, and from there to the federal government, Harvard fellow and AI policy advisor Mutale Nkonde told an audience at the EmTech conference, according to MIT Technology Review.
Portland is considering a ban that addresses the technology’s use in the private sector, while previous bans in San Francisco, Oakland, and Somerville have focused on police and government agency use. Which uses will face future bans is uncertain, speakers at the MIT Technology Review-hosted event said, but they are more confident that new rules will eventually be established.
“There will be legal challenges, and there will eventually be regulation,” predicts University of Essex human rights lawyer Daragh Murray.
Nkonde agrees, suggesting that facial recognition could “flip” the constitutionally protected presumption of innocence. As an example, she cited the recent campaign by residents of an apartment building in New York, many of whom are working-class women with dark skin. They engaged human rights lawyers, and more affluent groups allied with them, according to the report. Nkonde said that after facial recognition is used to marginalize minority groups, it will be used to target groups with more power, leading to bans.
Chinese Academy of Sciences Head of AI Ethics and Safety Yi Zeng said that 83 percent of Chinese people and 80 percent of Americans support the “proper use” of the technology by governments, indicating a significant divergence of opinion about which uses are acceptable, if not between groups of people.
The proposed ban in Portland was examined in a work session held by city commissioners this week, and reported by Oregon Public Broadcasting.
Commissioner Jo Ann Hardesty said she wants to make exceptions to the ban as narrow as possible, while Portland Police Bureau Assistant Chief of Investigation Andrew Shearer told the Council that he sees potential value for facial recognition software in limited instances, and that the issue “warrants further investigation.”
Mayor Ted Wheeler and a representative of Portland’s Office of Equity and Human Rights emphasized the importance of differentiating consensual and involuntary biometrics collection.
Council members requested another work session before November to get further input.
Call for halt to UK live facial recognition trials
A large coalition consisting of more than a dozen members of parliament, including Liberal Democrat party leader Jo Swinson, as well as 25 rights advocacy groups, technology experts and academics, and a handful of barristers have co-signed a call for UK police and private companies to immediately halt the use of “live facial recognition for public surveillance.”
“We hold differing views about live facial recognition surveillance, ranging from serious concerns about its incompatibility with human rights, to the potential for discriminatory impact, the lack of safeguards, the lack of an evidence base, an unproven case of necessity or proportionality, the lack of a sufficient legal basis, the lack of parliamentary consideration, and the lack of a democratic mandate,” the letter says.
Groups co-signing the call include Big Brother Watch, Amnesty International, and the Ada Lovelace Institute, while MPs representing the Labour and Conservative Parties also signed.
“What we’re doing is putting this to government to say: ‘Please can we open this debate and have this conversation,’” Big Brother Watch Director Silkie Carlo told the BBC’s Victoria Derbyshire program. “‘But for goodness sake, while it is going on, there is now a surveillance crisis on our hands that needs to be stopped urgently.’”
Digital Barriers CEO Zak Doffman gave two examples to illustrate the lack of clarity and consensus around different uses of facial recognition. In one, a group of known individuals plans public harm on a massive scale; in the other, an individual who is kicked out of a pub faces a biometric blacklist. While neither use is regulated, Doffman suggests the former would be broadly supported by the public, while the latter would at best meet divided public opinion.
“There should be a standard around its siting, efficiency and effectiveness,” says UK Surveillance Camera Commissioner Tony Porter. “I suppose you might say, ‘What is an appropriate force hit-rate that is tolerable against the totality?’ There needs to be a lot more assurance to the public that any notion of bias through ethnic background is eradicated.”
Microsoft President and Chief Legal Officer Brad Smith, meanwhile, told Reuters that the company “won’t sell facial recognition services for the purposes of mass surveillance anywhere in the world,” while calling for increased regulation.
That view is far from unanimous in the industry, however. Human Recognition Systems (HRS) CEO Neil Norman tells Business Live that “there are very strict restrictions in play about what we can do. It’s not different to any other sector or any other product in that way.”
Norman says that 99 percent of facial recognition applications are benign, and the other 1 percent are subject to GDPR. Further, using the example of his recently-hacked Netflix account, Norman says biometrics are safer than passwords, “so the benefits of biometrics really do outstrip the negatives.”
artificial intelligence | biometrics | cctv | data protection | Digital Barriers | facial recognition | Human Recognition Systems | law enforcement | legislation | Microsoft | privacy | regulation | UK | United States | video surveillance