Biometric Processing Privacy Code for New Zealand comes into effect

New Zealand’s new Biometric Processing Privacy Code is now live, having taken effect on November 3.
The code applies to the collection and processing of biometric information – such as facial images, fingerprints, voiceprints, iris scans, palm prints and behavioral traits like gait or typing speed – through automated algorithmic systems.
A government overview says “businesses of all sizes and across all industries, whether using biometrics for security, customer engagement or operational efficiency, will need to understand and prepare for these new requirements.” However, entities already using biometrics have until August 3, 2026, to bring their existing systems into compliance.
There is an exemption for health agencies, which are governed by the Health Information Privacy Code, and for select national security agencies. Biometrics for personal consumer devices, like a phone or smartwatch, are also generally outside the scope of the law.
Instead, the code focuses on third-party uses for authentication, access control, customer analytics, biometric attendance systems, and other instances in which biometrics are recorded and stored by an external entity, such as a store or app developer. In short, it covers any instance of “biometric processing,” which the code defines as “the use of technologies, like facial recognition technology, to collect and process people’s biometric information to identify them or learn more about them.”
That means any use for 1:1 identity verification, 1:n facial matching, or any kind of biometric categorization, including facial age estimation systems and other face- or voice-based inference. Behavioral biometrics are in scope, whereas genetic material such as DNA, and information about a person’s brain activity or nervous system, are not.
13 rules for biometrics collection lean into common sense
The code is built on 13 “stringent rules.” The office of New Zealand’s privacy commissioner, Michael Webster, summarizes the rules as follows:
“Only collect biometric information if it’s necessary, effective and proportionate and with the right safeguards in place. Get it straight from the people concerned where possible. Tell them why you’re collecting biometric information, and if there’s an alternative option. Be fair when you’re getting it. Take care of it once you’ve got it.”
“People can ask if you have their biometric information and see their biometric information if they want to. They can correct it if it’s wrong. Make sure biometric information is correct before you use it. Get rid of it once you’re done with it. Use it for the purpose you got it and don’t categorise people unless there is a good reason. Only disclose it if you have a good reason. Make sure that biometric information sent overseas is adequately protected. Only assign unique identifiers if permitted.”
In policy-speak, that translates to requirements for necessity and proportionality, privacy by design, transparency and consent. It puts limitations on use, prohibits using biometric data to infer sensitive information like gender or ethnicity, and adds disclosure controls. It demands accuracy and data quality, security safeguards including encryption and audit logs, and strict protocols for retention and disposal. It discourages unique identifiers generally, sets out rules for trialing systems, and requires organizations to “consider the cultural impacts of biometric processing, particularly for Māori and other communities.”
The rules generally correspond with the Information Privacy Principles under the Privacy Act 2020, although new language has been added following public consultations. The commissioner’s office says “most of the changes are minor or drafting improvements” and “many of the rules have stayed the same.”
New rules regulate estimation, inference technologies
In an interview with The New Zealand Herald, Webster says the update was prompted by the growing number of businesses and organizations beginning to use biometric technology, whether to improve customer service or safety. “Inevitably,” he says, “we think there’s going to be a greater use of biometric technologies by organizations out there.”
As such, the updates “clarify and strengthen some of the requirements for organizations thinking about using this technology.” This includes “a deliberate process” of evaluating privacy safeguards, data retention practices and proportionality. It mandates clear signage to indicate any instance in which biometrics are being collected.
In terms of inferential biometrics that claim to read emotions, Webster gives the example of a business using tech to try to sense whether someone is more excited when they walk by a certain product. Under the new code, that’s more or less prohibited.
Webster says the code factors in issues of potential bias in biometric tech, notably in a New Zealand and Māori context, by including strict criteria on use. It’s clear on how to file a complaint, and it enables individuals to ask organizations for any biometric information held about them.
Webster says New Zealand is still catching up to much of the world with its rules and regulations. He cites Australia, the UK and Canada as nations whose approaches align with what New Zealand is doing.
How to ensure compliance with New Zealand’s biometric privacy code
A blog post from Source Services offers a compliance checklist for organizations preparing for the new code. It urges them to identify any biometric systems in use, conduct a Privacy Impact Assessment (PIA) for each one, weigh alternatives and provide documentation, update privacy policies, implement security and retention controls, review vendor contracts and staff training, update procedures for access and correction requests, and “plan for secure deletion of biometric data if not already in place.”
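As a purely illustrative sketch of the retention-and-deletion item on that checklist – not anything prescribed by the code or by Source Services – a scheduled job might flag biometric records whose documented retention period has lapsed. All names, fields and retention periods below are hypothetical and would need to match an organization’s own PIA.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical inventory entry for a stored biometric template.
# Field names are illustrative, not drawn from the code or any vendor system.
@dataclass
class BiometricRecord:
    subject_id: str
    purpose: str              # purpose the data was collected for
    collected_at: datetime    # collection timestamp (timezone-aware)
    retention_days: int       # retention period documented in the PIA

def records_due_for_deletion(records, now=None):
    """Return records whose documented retention period has lapsed."""
    now = now or datetime.now(timezone.utc)
    return [
        r for r in records
        if now - r.collected_at > timedelta(days=r.retention_days)
    ]

if __name__ == "__main__":
    inventory = [
        BiometricRecord("u-001", "site access control",
                        datetime(2024, 1, 10, tzinfo=timezone.utc), 365),
        BiometricRecord("u-002", "site access control",
                        datetime(2025, 9, 1, tzinfo=timezone.utc), 365),
    ]
    for record in records_due_for_deletion(inventory):
        # In a real system this would trigger secure deletion
        # and write an audit log entry rather than just print.
        print(f"Delete biometric data for {record.subject_id} ({record.purpose})")
```

The point of the sketch is simply that retention periods should be recorded alongside the data and checked automatically, so that disposal does not depend on someone remembering to do it.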