It’s all about trust when the public discusses biometric surveillance, UK project says
A special UK citizens’ biometrics council has concluded that neither government nor industry has made the case that it can be trusted to surveil the public with facial recognition and other digital tools.
Leaders of the Ada Lovelace Institute, an independent British research organization, created the council last year, after calling in 2019 for a moratorium on the use of biometrics.
The council, composed of “informed members of the public,” was designed to be a barometer for researchers. Its detailed final report mentions the word “trust” 23 times.
Notably, participants in the year-long endeavor said oversight of the technology needs to go further than its use in law enforcement. They rejected a siloed, agency-by-agency view in favor of one that spans all government bodies and relevant businesses.
Council recommendations fell under three headings. Members want comprehensive laws and regulations governing biometric technology. They want an independent and authoritative oversight body. And they want design and deployment standards implemented and enforced.
The council voiced “a clear expectation” that legislation and rules will keep pace with technology development to protect rights and avoid societal harms. Regulations must be developed with the broadest range of stakeholders, particularly those in marginalized communities.
A new oversight authority, again framed as an “expectation,” must be independent and fully comprehensive in monitoring government and commercial use of biometric systems. Today’s overlapping mandates, according to the council, are wasteful, and the gaps between them are dangerous.
The participants had no patience for a body that would amount to a suggestion box. Its leaders must be able to investigate complaints and hold those using biometric surveillance accountable for actions that impair citizens’ rights.
Last, those deploying biometric tools have to meet standards for responsible, trustworthy and proportionate use.
Going deeper, the council wants software and hardware to be as free as possible of bias, discrimination and errors before deployment. Members are dissatisfied with the common practice among technology companies of getting a product out the door and then fixing problems that could have been caught before use in the field.
A separate group convened by the Institute plans to present an independent legal review of biometric data governance in the UK this spring.