Facial recognition lessons for the private sector from the South Wales police case
Following the successful appeal challenging the legality of South Wales Police’s use of facial biometrics for automated public identification, the Data Protection Report from law firm Norton Rose Fulbright LLP has analyzed the main lessons the private sector should take from the ruling.
Businesses should implement internal rules, carefully review their DPIAs (Data Protection Impact Assessments), assess proportionality objectively, and always check for potential bias in training data, according to the report.
The High Court had dismissed the case in September last year, on the grounds that the use of the technology was necessary and there was no clear evidence it was discriminatory.
By implementing internal rules for the use of automated facial recognition technology (referred to in the report as AFT), private-sector organizations can demonstrate that a system of “checks and balances,” including “regular reviews, oversight mechanisms and clear (and narrow) criteria,” is in place to ensure the technology is used appropriately. At any given moment, companies will need to be able to explain why the biometric technology was used instead of less intrusive methods, Norton Rose Fulbright Partner Lara White and Senior Knowledge Lawyer Janine Regan write.
Given that the court scrutinized DPIAs in reaching its decision, the Data Protection Report recommends that companies analyze these documents closely, “consider how a DPIA may be viewed by a court, regulator or complainant” and “be precise in the responses.”
Objectivity is essential when evaluating proportionality, to ensure that deployment was a rational decision and that less privacy-intrusive methods could not have been used instead. As per GDPR requirements, “the use of AFT must be targeted and proportionate.”
The Court found that South Wales Police did not know which database had been used to train the system, so there was no way of knowing whether the technology was biased. “Those using AFT in the private sector should therefore ensure that they make appropriate enquiries with developers of this technology about potential bias in the software,” the recommendation reads. “Similarly, if organizations are training software using their own training data they should also take steps to reduce the risk of bias.”
The UK’s ICO (Information Commissioner’s Office) is looking into the use of AFT by a real estate developer at Kings Cross station. According to the Centre for Data Ethics and Innovation (CDEI), the private sector has shown increasing interest in the technology, which has already been deployed for marketing and to identify shoplifters and people exhibiting antisocial behavior in stores.