UK introduces CCTV data protection impact assessment guidelines and warns of biometrics risks
The UK government has released a set of guidelines and a template for organizations to assess the data protection impact of surveillance cameras and surveillance camera systems, with particular guidance for those implementing biometrics capabilities. The guidance was developed by the Surveillance Camera Commissioner (SCC) and the Information Commissioner’s Office (ICO), following requests for advice from operators of surveillance camera systems on how to process personal data.
The guidelines were released for entities in England and Wales that have to comply with the Surveillance Camera Code of Practice under Section 33(5) of the Protection of Freedoms Act 2012, but they can also be leveraged by public authorities and any other bodies that operate surveillance cameras in the UK.
According to the document, assessments should be performed when cameras are added to or removed from systems, when cameras are moved or change position, when whole systems or parts of systems are upgraded, when new systems are installed, and where biometrics capabilities such as automatic facial recognition are in use.
When installing a surveillance camera system, organizations have to consider the impact it has on individuals and their privacy. Under data protection law, such processing is seen as ‘likely to result in high risk to rights and freedoms,’ especially if biometric data processing is involved. Examples of high-risk surveillance cameras include body worn video with audio recording functions and surveillance systems with automatic facial recognition capabilities.
The document warns that technology such as biometric facial recognition, automatic number plate recognition (ANPR), audio recording, body worn cameras, unmanned aerial vehicles (drones), and megapixel or multi-sensor very high-resolution cameras could raise concerns about individuals’ rights and freedoms.
The DPIA template provides a number of questions organizations can follow to determine the problem they are trying to solve, along with the associated risks and impact.
A few months ago, the SCC pointed to the lack of legal guidance on facial recognition in CCTV networks as a pressing problem. The Equality and Human Rights Commission has asked that law enforcement agencies in the country suspend the use of automated facial recognition until more is understood about the technology and its impact.