New facial recognition guidance for Australia addresses privacy in retail environments
As Australia grapples with the use of facial recognition technology (FRT) in retail environments, the Office of the Australian Information Commissioner (OAIC) has published a guide to assessing the privacy risks of facial recognition in commercial settings.
The new OAIC guidance differentiates between facial verification (1:1 FRT) and facial identification (1:n FRT) and lists the key principles to support “the appropriate use of sensitive information when using FRT” in the context of the Privacy Act and the Australian Privacy Principles (APP).
These include necessity and proportionality; consent and transparency; accuracy, bias and discrimination; and governance and ongoing assurance. Put simply, organizations collecting personal data (including biometrics) must have a reason, tell people they’re doing it, do everything they can to prevent bias, and have clear policies in place to govern it.
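The 1:1 versus 1:n distinction matters in practice: verification compares a face against a single enrolled template, while identification searches it against an entire gallery of faces. A minimal sketch below illustrates the difference, assuming faces have already been reduced to numerical embeddings; the threshold, function names and data structures are illustrative assumptions, not anything specified in the OAIC guidance.

```python
import numpy as np

# Illustrative sketch only: threshold and structure are assumptions,
# not drawn from the OAIC guidance.

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (higher means more alike)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, enrolled: np.ndarray, threshold: float = 0.6) -> bool:
    """1:1 facial verification: compare one probe against one claimed identity."""
    return cosine_similarity(probe, enrolled) >= threshold

def identify(probe: np.ndarray, gallery: dict, threshold: float = 0.6):
    """1:n facial identification: search one probe against every enrolled face.

    This is the mode the guidance treats as most intrusive, since it means
    collecting and comparing biometrics from everyone the camera captures.
    """
    best_id, best_score = None, threshold
    for person_id, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id
```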
The guidance is blunt in its assessment of how facial recognition impacts privacy. “FRT significantly interferes with the privacy of individuals, and live FRT in particular is highly intrusive to an individual’s privacy,” it says. Organizations that use it are strongly encouraged to embrace privacy by design, informed by a Privacy Impact Assessment (PIA).
“When assessing whether the use of FRT is proportionate,” says the document, “organisations should carefully consider whether the benefits clearly outweigh the risks posed to individual’s privacy and other human rights. For example, where an organisation is using FRT to lessen or prevent serious threats to the health, safety and security of customers in a commercial or retail setting, it must be able to demonstrate how its use is proportionate to the risks identified.”
New rules underscore decision on Bunnings FRT use
This note reverberates more loudly in the wake of the recent decision by the Privacy Commissioner, which found that major retail chain Bunnings breached citizens’ privacy by collecting personal and sensitive information through a facial recognition system.
Bunnings, which sells household hardware and garden supplies, claims it used facial recognition to protect itself against “violent and organized crime.”
In her decision, Privacy Commissioner Carly Kind wrote that “facial recognition technology may have been an efficient and cost effective option available to Bunnings at the time in its well-intentioned efforts to address unlawful activity. However, just because a technology may be helpful or convenient, does not mean its use is justifiable.”
Which is to say, the mafia is surely not interested enough in Bunnings’ cordless lawn mowers to warrant the collection of face biometrics from the store’s customers.
The OAIC has published a handy one-pager that covers the key points of the guidance.
The decision and the accompanying guidance set up a teetering scale on which the benefits of facial recognition technology in stores and other commercial environments are weighed against the potential harms. The OAIC’s document lists a number of considerations related to public interest, alternative methods and storage of biometric data.
Likewise with consent, which is addressed at four levels: informed consent (“I know what’s happening”), voluntary consent (“I agree to it”), current and specific consent (“You can do these things with my data”), and consent with capacity (“Assume that an individual has capacity to consent unless there is a factor that casts doubt on their capacity, like language or disability, which must be tested.”)
Accuracy, bias and accountability also get their own sections, which largely amount to advice to use tested and trusted systems, do due diligence on privacy, and make sure you’re up to date on – and following – the relevant laws and regulations.