Takeaways from the ICO’s draft guidance on biometric data
By David J. Oberly, Biometric Privacy & Data Privacy Attorney
The UK Information Commissioner’s Office (“ICO”) recently published draft guidance on biometric data (“Draft Guidance”), which explains how the UK General Data Protection Regulation (“UK GDPR”) applies when companies use biometric recognition systems. The Draft Guidance is intended both for organizations that use or are considering the use of biometric recognition systems and for the vendors of those systems, and discusses key issues such as when biometric data is considered “special category data,” its use in biometric recognition systems, and applicable compliance obligations under the UK GDPR.
As the final version of the ICO’s guidance is likely to remain substantially similar to the Draft Guidance, organizations that utilize biometrics in their operations should familiarize themselves with the legal principles and practical compliance strategies offered by the ICO. At the same time, organizations should audit their current biometrics practices to assess their level of compliance with the Draft Guidance and implement any modifications needed to align with the compliance roadmap offered by the ICO.
“Biometric Data” & “Special Category Biometric Data”
Under the UK GDPR, biometric data is defined as “personal data resulting from specific technical processing relating to the physical, physiological, or behavioral characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic [fingerprint] data.” The Draft Guidance clarifies that personal data only constitutes “biometric data” where it satisfies the following three conditions:
- relates to a person’s behavior, appearance, or observable characteristics;
- has been extracted or further analyzed using technology (e.g., an audio recording of someone talking is analyzed with specific software to detect things like tone, pitch, and inflections); and
- is capable of uniquely identifying (recognizing) the person it relates to.
The Draft Guidance further explains that biometric data constitutes “special category biometric data” under the UK GDPR only when that data is used for the purpose of uniquely identifying (recognizing) a person.
These classifications are important because the applicable compliance obligations differ based on whether biometric data constitutes “special category” data.
To lawfully process special category biometric data, controllers and processors must possess both a lawful basis under Article 6 and a separate condition for processing special category data under Article 9 (although the two do not have to be linked). Article 9 offers ten conditions for processing special category data. Biometric data that is not special category data (i.e., not used for the purpose of uniquely identifying a person) requires only a lawful basis under Article 6.
The Draft Guidance provides that under most circumstances, explicit consent will be the only valid Article 9 condition for processing special category biometric data. The ICO further cautions that where an imbalance of power exists between an organization and data subjects, the organization should carefully consider whether relying on explicit consent is appropriate. This is because, to be valid, consent must be “freely given,” which means giving people genuine choice and control over how their data is used. As the ICO has highlighted in prior guidance, organizations in a position of power—and, in particular, employers—may find it more difficult to show that consent was freely given.
Additional compliance obligations applicable to biometric data
The Draft Guidance also provides an important refresher on other UK GDPR requirements that must be satisfied when using any type of biometric data.
First, as part of the “accountability” data protection principle, organizations must demonstrate how they have complied with applicable data protection principles. In particular, appropriate measures and records must be maintained for purposes of demonstrating compliance with the UK GDPR.
Second, organizations must adopt a data protection by design approach. This requires companies to consider both privacy and data protection issues first at the design stage, and thereafter throughout the lifecycle of biometric data. Specifically, organizations must:
- evaluate any reasonably anticipated risks that may arise from the use of biometric data;
- protect biometric data in any system used by the organization; and
- only use processors who are able to offer sufficient guarantees regarding the measures they have in place for data protection by design.
Third, organizations must complete a data protection impact assessment (“DPIA”) for any processing activity that is likely to result in a high risk to the rights and freedoms of data subjects. The Draft Guidance explains that the use of biometric recognition systems will, under almost all circumstances, require the completion of a mandatory DPIA. This is because DPIAs are required for both the processing of special category data and the systematic monitoring of publicly accessible areas on a large scale—and most uses of biometric recognition systems involve at least one of these two criteria.
Fourth, organizations must provide clarity and transparency as to when they are considered a controller and whether they might be considered a joint controller with another organization. In joint controller scenarios, both companies must ensure that data subjects are able to exercise their rights, and also that they understand what is being done with their biometric data.
Fifth, organizations must ensure they have express provisions in their contracts with processors limiting the use of biometric data by the processor to the purposes for which it has been instructed by the controller.
Sixth, organizations must maintain effective data protection measures to safeguard biometric data in their possession from unauthorized access and acquisition. In particular, the ICO identifies three specific measures that must be implemented to comply with the UK GDPR when using biometric data:
- risk assessments;
- regular testing and review of security measures; and
- encryption of biometric data.
AI solutions providers
In addition to biometric recognition systems, the Draft Guidance also discusses the use of AI tools, particularly in the context of utilizing personal data to train and develop AI models and algorithms.
The Draft Guidance explains that when an organization partners with an AI solutions provider, it must:
- establish whether the provider seeks to utilize data generated by the organization’s biometric system for internal AI tool improvement purposes;
- confirm that the provider would be acting as a controller for any such use of data; and
- determine how the organization or the provider will inform data subjects about the provider’s use of their data for this purpose.
The ICO further notes that companies may need to amend their contracts with AI solutions providers to satisfy these particular compliance obligations.
What to do now: Practical compliance tips
At this time, organizations should work closely with experienced biometric privacy counsel to conduct a thorough audit of their current compliance practices to identify and remediate any gaps that exist vis-à-vis the ICO’s new biometric data guidance. In particular, organizations should assess their current compliance programs to ensure they encompass the following practices:
- complete DPIAs prior to implementing any new biometrics-powered system or making any material change to an existing biometrics tool or solution;
- obtain explicit consent in connection with special category biometric data processing activity;
- offer suitable alternatives for individuals who decline to consent to the processing of their biometric data, which must be no less favorable than using the comparable biometric system or tool;
- maintain security measures highlighted by the ICO as key for safeguarding biometric data; namely, risk assessments, encryption, and regular testing and review;
- address biometrics-related discrimination risks—i.e., where data subjects or groups are treated unjustly based on protected characteristics—by completing pre-deployment assessments to evaluate whether biometric systems are likely to have a discriminatory impact on data subjects; and
- employ a data protection by design approach when using biometric data.
About the author
David J. Oberly is Of Counsel in the Washington, D.C. office of Baker Donelson, and a member of the firm’s Biometric Privacy, Artificial Intelligence, and Data Protection, Privacy & Cybersecurity practices. Recognized as “one of the nation’s foremost thought leaders in the biometric privacy space” by LexisNexis, David’s practice focuses on counseling and advising clients on a wide range of biometric privacy, artificial intelligence, and data privacy/security compliance and risk management matters. In addition, David has deep experience in litigating bet-the-company BIPA class action disputes. He is also the author of Biometric Data Privacy Compliance & Best Practices—the first and only full-length treatise of its kind to provide a comprehensive compendium of biometric privacy law. He can be reached at email@example.com. You can follow David on X at @DavidJOberly.
DISCLAIMER: Biometric Update’s Industry Insights are submitted content. The views expressed in this post are that of the author, and don’t necessarily reflect the views of Biometric Update.