Asda supermarket chain faces legal complaint over facial recognition


Asda is facing a legal complaint over its use of live facial recognition (LFR) cameras in its supermarkets.

The legal filing comes from privacy rights group Big Brother Watch, which argues that Asda’s use of LFR cameras is “unlawful.” Filed with the Information Commissioner, the complaint claims Asda is “infringing the data rights” of shoppers and urges the supermarket chain to end its use of the biometric technology.

The biometric surveillance technology was acquired from UK start-up FaiceTech, with Asda installing the LFR cameras in five shops in the Greater Manchester area as part of a trial. Asda is the first national supermarket chain to install LFR.

The trial is designed to increase security amid a rise in retail crime. The retailer says it recorded approximately 1,400 assaults on its staff last year. The system scans shoppers’ images against a watchlist of previous offenders compiled by Asda. If there is no match, the images and the biometric templates collected are deleted immediately and permanently.

“The rise in shoplifting and threats and violence against shopworkers in recent years is unacceptable and as a responsible retailer we have to look at all options to reduce the number of offenses committed in our stores and protect our colleagues,” Liz Evans, chief commercial officer for Non-food and Retail at Asda, said when the trial kicked off in March.

But Big Brother Watch argues the system processes “data with a high degree of risk to data subjects’ rights for private benefit” and poses “significant” risks to shoppers’ rights and freedoms. The complaint states that Asda’s facial recognition system would have a “profound impact on the data rights of tens of millions of people” if implemented across Asda’s stores across the UK.

Madeleine Stone, senior advocacy officer at Big Brother Watch, commented: “Facial recognition surveillance turns shoppers into suspects by subjecting customers browsing the supermarket aisles to a series of invasive biometric identity checks.”

Stone called Asda’s trial “deeply disproportionate” and “chilling,” adding that use of the technology is “dangerously out of control in the UK,” citing its growing use by police and the private sector. “Asda should abandon this trial and the government must urgently step in to prevent the unchecked spread of this invasive technology,” she said.

Police Scotland are currently consulting on the use of LFR by Scottish police, with Big Brother Watch among the 26 organizations involved in the consultation. The privacy rights group is against the use of the technology.

US consumers skeptical of facial recognition use, surveys say

A consumer research report from Scayle has revealed that 71 percent of U.S. consumers are uncomfortable with some of the AI tools retailers are using, including facial recognition.

The survey of more than 1,500 U.S. shoppers found that nearly one-third of consumers (32 percent) are uneasy with retailers’ use of facial recognition, while 30 percent are uncomfortable with AI-created product images and models, and the same share with AI-powered customer service chatbots.

A new report from the Identity Theft Resource Center (ITRC) also explored consumer attitudes toward biometrics, throwing light on the growing reliance on biometric technologies and the skepticism many consumers still hold toward them.

While biometric technology cannot eliminate identity fraud entirely, the report says, its appropriate and ethical use can significantly reduce the risks posed by static, compromised data.

ITRC surveyed 1,177 U.S. adults last August and found that 87 percent had been asked to provide biometric information during an identity verification process within the past year. Biometrics mentioned in the survey included selfie photos, live video, fingerprints, voice, and eye scans.

Despite serious privacy concerns reported by 63 percent of the respondents, a striking 91 percent continued with their transaction when prompted to submit biometric data.

“This research highlights a critical need for those of us working to prevent identity crimes to do a better job explaining both the benefits and risks of emerging identity technologies – especially biometrics,” ITRC CEO Eva Velasquez said in a statement.
