Governments and tech companies are ‘gaming the rules’ on biometrics, need hard lines

Greater legislative protections on the use of people’s biometrics are needed, and technology companies should be told to desist from some areas of development, according to a senior QC (Queen’s Counsel) speaking at the CogX Global Leadership Summit and Festival of AI and Transformational Technology in London.

“The idea that you can judge the safety or the appropriateness or the ethical use of biometric data by how well the public understands it or by how transparent, how much you tell people, is the exact trap we’ve fallen into with the data aggregators that have become these huge companies dominating world economies,” said Matthew Ryder, senior QC at Matrix Chambers and lead of an independent review of the governance of biometric data in the UK.

Ryder states that the “regulatory system isn’t really fit for purpose” and that public understanding of a technology or its application should not be a basis for allowing it to exist.

“You have to ensure that you don’t open the door to exceptions to the normal case use, that essentially undermined the entire base upon which you’re allowing that more limited use to take place in the first place,” said Ryder, who believes that governments and tech firms alike are “adept at gaming the rules” to get what they want out of using a certain technology or the data it gathers.

“The solution to that can be much harder-edged rules than we would normally expect to see in a regulatory environment.”

The QC recommends erring on the side of hard regulations, which could later be softened. “We need regulators to say we’re not always going to be defaulting to try and find a useful way tech companies can work and promote tech companies. We have to be comfortable saying this is a hard line, you can’t cross it. Legislators have to be comfortable with that.”

Speaking in the same session, Julie Dawson, Director of Regulatory and Policy at Yoti, countered that a greater understanding of the nuances in biometric technologies is needed.

Dawson gave the example of age estimation as an application of facial biometrics: “We’ve got over a billion people on the planet that do not have a root identification document. But a facial biometric could be used to assess that this is a real human and to assess age. In that instance it’s not reviewing anything about the identity of the individual.”

Dawson recommends sandboxing as a way to safely allow biometric innovation as bad actors will not stop innovating in areas of criminality: “What you don’t want is companies coming up with technology, spending five years innovating and then finding there’s no way they can actually use that. You’re not incentivising that research and development to meet the issues.”

Matthew Ryder hopes that regulators of the biometric sector can avoid falling into the same trap that data aggregators helped create. “We’re trying to avoid the normalization of the harvesting of biometric data. We’re trying to avoid false distinction in the categorisation of biometric data,” he said, warning that there can be major infringements of privacy via categorization of people, not just from identification.

On the issue of bias that has emerged in biometric systems, particularly with facial recognition, Ryder criticized the private sector’s promises to be more transparent: “It isn’t resolved simply by people saying they’re trying to do the best they can, and it isn’t resolved by people saying until you can prove there’s bias and discrimination, we should be able to use it”.

Digital health passes in the workplace

In a separate session, panellists discussed the practicalities and ethics around digital vaccine or health passes for returning to the workplace or as a requirement for new recruits.

Testing, rather than health passes, could be a way around people’s concerns about the use of their health data, according to Dr. James Wilson, Professor of Philosophy and co-director of the UCL Health Humanities Centre.

“Less building of permanent data structures, easier to dismantle, less violating of individual privacy so it could be a better way to go than building a system of certification merely for the purposes of employment,” said Wilson on testing.

Wilson suggested that digital health passes might not be an easy sell for governments if they try to introduce them after a population has had access to vaccines and is expecting greater freedoms.

Fellow panellist Dr. Linnet Taylor, Associate Professor at the Tilburg Institute for Law, Technology and Society, believes states need to do much more in tackling COVID-19 in terms of technology, regulation and foreign policy.

In national responses so far, “all the discussion has been about the tech, which is absolutely insane. So if we de-center the technology a little bit, we might find that there are more lessons to follow than we think,” said Taylor, who believes governments are “getting the social bit wrong.” They implement technologies such as contact-tracing apps, but do not follow up with users who have been alerted to a possible infection.

“The main weapon that we have is state regulation of not just the tech, but the private sector – not treating them as if they were completely independent agents when they’re dealing with the majority of the population as employees,” said Taylor of the technologies involved in people returning to work during the pandemic.

She recommends sunset clauses in any contracts with technology providers as a way to check whether the technology is proving beneficial.

Ultimately, the pandemic has to be treated by governments as a global issue. “Unless we’re thinking of this as a planetary problem, and as a foreign policy problem as well as a domestic policy problem, the domestic answers that we come up with are not going to be meaningful,” warned Taylor.
