
Facial recognition lessons for the private sector from the South Wales police case

 


Following the successful appeal against the ruling that South Wales Police’s use of facial biometrics for automated public identification was lawful, the Data Protection Report from law firm Norton Rose Fulbright LLP has analyzed the main lessons the private sector should draw from the decision.

Businesses should implement internal rules, carefully review DPIAs (Data Protection Impact Assessments), assess proportionality objectively, and always check for potential bias in training data, according to the report.

The High Court had dismissed the case in September last year, on the grounds that the use was necessary and there was no clear evidence of discrimination.

By implementing internal rules for the use of automated facial recognition technology (referred to in the report as AFT), private-sector organizations can confirm that a “checks and balances” system, including “regular reviews, oversight mechanisms and clear (and narrow) criteria,” is applied to ensure the technology is used appropriately. At any given moment, companies will need to be able to explain why the biometric technology was used instead of less intrusive methods, Norton Rose Fulbright Partner Lara White and Senior Knowledge Lawyer Janine Regan write.

Given that the court scrutinized DPIAs in reaching its decision, the Data Protection Report recommends that companies closely analyze these documents, “consider how a DPIA may be viewed by a court, regulator or complainant” and “be precise in the responses.”

Objectivity is essential when evaluating proportionality, to demonstrate that deployment was a rational decision and that less privacy-intrusive methods would not have sufficed. As per GDPR requirements, “the use of AFT must be targeted and proportionate.”

The Court found that South Wales Police was not aware which database had been used to train the system, so there was no way of knowing if the technology was in any way biased. “Those using AFT in the private sector should therefore ensure that they make appropriate enquiries with developers of this technology about potential bias in the software,” the recommendation reads. “Similarly, if organizations are training software using their own training data they should also take steps to reduce the risk of bias.”

The UK’s ICO (Information Commissioner’s Office) is looking into the use of AFT by a real estate developer at Kings Cross station. According to the Centre for Data Ethics and Innovation (CDEI), the private sector has shown increased interest in the technology, which has already been deployed for marketing and to identify shoplifters and people exhibiting antisocial behavior in stores.
