UK biometrics watchdog raises surveillance risks; Yoti outlines alternative immunity passport
A pair of UK authorities have staked out positions on the increasing use of biometrics, with one making recommendations for police use of facial recognition, while the other has expressed concern that tracking measures for COVID-19 containment could lead to an unintended kind of society. In this context, an alternative immunity passport proposal may be a way to put the brakes on the expansion of surveillance.
The Home Office’s Biometrics and Forensics Ethics Group’s (BFEG’s) Annual Report for 2018 has been published, and makes several recommendations for making the use of biometrics by UK law enforcement more ethical.
The Group released its first report just under a year ago, more than two years after it was originally expected to be published, and identified several problems with the Home Office Biometrics (HOB) program.
The new report reviews the Ethical Principles established by the group in 2016 and other work. The group is set to publish the DNA Leaflet it was asked in 2016 to produce, which will be distributed to police forces, and the BFEG’s work in assessing the ethical implications of live facial recognition (LFR) usage by police is detailed.
It was in 2018 that a pair of legal challenges to the use of facial recognition by UK police were filed.
The BFEG recommends that the Metropolitan Police Service (MPS) encourage more public scrutiny of its LFR trial deployments, that the trials should comply with the usual standards for experimental trials pending a legislative framework, and that Data Protection Impact Assessments (DPIAs) required by GDPR should be broad enough in scope to examine human rights and social impacts, in addition to data protection.
Finally, the group recommends the National DNA Database (NDNAD) be used for familial searches before any consideration of searches for relatives through genetic genealogy or other alternative methods.
The 31-page report also deals with data ethics and the proceedings of the Forensic Information Database Strategy Board, and covers progress on past recommendations and future plans.
UK Commissioner for the Retention and Use of Biometric Material Paul Wiles spoke to the online Westminster eForum conference on digital identity, saying that the warning he gave in a January report on police use of facial recognition has become even more pressing.
“I concluded my discussion of the technologies with a point that, at the time, I was cautious to make: that decisions about the use by governments of artificial intelligence and biometrics involve a politically strategic choice about what kind of future social and political world we want to create,” Wiles said. “It was clear to me that these new technologies are going to lead to a new social and political framing of the world we live in. And they may even be the basis for rebuilding our economy, if it emerges badly damaged from the pandemic.”
However, Wiles has also warned that simply denying the UK is like China, without setting out a positive vision of what its society should look like, risks the nation unintentionally stumbling into the use of social controls that are incompatible with its self-perception. Wiles had referred to China’s now-mandatory social credit system somewhat cautiously, but says the pandemic has underlined the risks of something similar being adopted in the UK.
Wiles also discussed the use of new technologies during the emergency, and how to wind them down when it abates, a question that has been widely debated. The pandemic has functioned much like Brexit, however, drawing policy-makers’ attention away from the issue even as it unfolds.
The danger is that technologies promising easy solutions to tough problems will be settled upon, even in the absence of evidence of their need, Wiles says.
“Public trials methodology is well embedded into science and its governance, but not in many other areas. Each other area of application – for example, policing – needs a standard trials methodology. Without that, we run the risk of deploying technologies that have unforeseen or harmful effects, or we fail to develop the necessary decision-making framework,” he explains. “We have to address claims made by technology developers in good faith. The point is not hostility to developers or to dampen technical development, but to extend the development process into real-world applications with the same rigor.”
Without such evaluations, such concepts as proportionality cannot be practically applied.
Wiles’ term as Commissioner is set to end in June.
Against the backdrop of this intensifying debate, Yoti has published its Code of Practice and developed a privacy-focused approach to sharing personal health data, according to a company announcement.
The five pillars of Yoti’s code, which was developed in consultation with health and privacy experts, include trusted identity verification of individuals, trusted and transparent medical testing of individuals by medical authorities, trusted storage of the individual’s credentials and medical data, trusted presentation and transfer of medical test credentials, and privacy requirements.
The 25-page document describing the code also explains how the pillars can be upheld, and outlines the genesis of some of the measures. One example of a system used to support trusted and transparent testing is the Standard for Safe Sport, developed by a consultant to North American professional sports teams.
Yoti has also unveiled a proposed alternative to immunity passports. The company was reported to be among those making proposals to NHSX last week.
CEO Robin Tombs is skeptical of the immunity passport concept, Business Insider reports, and has said that technological tools should focus on proving recent test results for COVID-19.
Yoti warns that the concept of immunity passports is vague and lacks granularity and adaptability. Instead, the company proposes an app which includes details of what kind of test was performed, and when it was carried out. Other information could include the presence of antibodies detected by the test, and what medical authorities have approved the specific test.
The information is linked to individual identities through ID documents and facial recognition. Health organizations would be registered to the app, and verify patients’ test results with special access permissions. A negative test could be shown with a QR code shared with registered verifiers at airports, workplaces, or other venues with limited access, Business Insider reports.
“Yoti are not health experts. We have listened carefully to immunologists as well as privacy experts and are publishing our draft Code of Practice for wider feedback and to raise awareness of the important health data issues and digital identities,” Tombs states.
“Transparency underpins trust and encourages fair and effective scrutiny.”