New facial recognition guidance for Australia addresses privacy in retail environments

Proportionality, accuracy, consent and governance serve as pillars

As Australia grapples with the use of facial recognition technology (FRT) in retail environments, the Office of the Australian Information Commissioner (OAIC) has published a guide to assessing the privacy risks of facial recognition in commercial settings.

The new OAIC guidance differentiates between facial verification (1:1 FRT) and facial identification (1:n FRT) and lists the key principles to support “the appropriate use of sensitive information when using FRT” in the context of the Privacy Act and the Australian Privacy Principles (APP).

These include necessity and proportionality; consent and transparency; accuracy, bias and discrimination; and governance and ongoing assurance. Put simply, organizations collecting personal data (including biometrics) must have a reason, tell people they’re doing it, do everything they can to prevent bias, and have clear policies in place to govern it.

The guidance is blunt in its assessment of how facial recognition impacts privacy. “FRT significantly interferes with the privacy of individuals, and live FRT in particular is highly intrusive to an individual’s privacy,” it says. Organizations that use it are strongly encouraged to embrace privacy by design, informed by a Privacy Impact Assessment (PIA).

“When assessing whether the use of FRT is proportionate,” says the document, “organisations should carefully consider whether the benefits clearly outweigh the risks posed to individual’s privacy and other human rights. For example, where an organisation is using FRT to lessen or prevent serious threats to the health, safety and security of customers in a commercial or retail setting, it must be able to demonstrate how its use is proportionate to the risks identified.”

New rules underscore decision on Bunnings FRT use

This note reverberates more loudly in the wake of the recent decision by the Privacy Commissioner, which found that major retail chain Bunnings breached citizens’ privacy by collecting personal and sensitive information through a facial recognition system.

Bunnings, which sells household hardware and garden supplies, claims it used facial recognition to protect itself against “violent and organized crime.”

In her decision, Privacy Commissioner Carly Kind wrote that “facial recognition technology may have been an efficient and cost effective option available to Bunnings at the time in its well-intentioned efforts to address unlawful activity. However, just because a technology may be helpful or convenient, does not mean its use is justifiable.”

Which is to say, the mafia is surely not interested enough in Bunnings’ cordless lawn mowers to warrant the collection of face biometrics from the store’s customers.

The OAIC has published a handy one-pager that covers the key points of the guidance.

The decision, and its accompanying guidance, sets up a teetering scale on which the benefits of facial recognition technology in stores and other commercial environments are weighed against the potential harms. The OAIC’s document lists a number of considerations related to public interest, alternative methods and storage of biometric data.

Likewise with consent, which is addressed at four levels: informed consent (“I know what’s happening”), voluntary consent (“I agree to it”), current and specific consent (“You can do these things with my data”), and consent with capacity (organizations should assume an individual has the capacity to consent unless a factor such as language or disability casts doubt on it, in which case capacity must be tested).

Accuracy, bias and accountability also get sections, much of which amounts to advice to use tested and trusted systems, do due diligence on privacy, and make sure you’re up to date on – and following – the relevant laws and regulations.
