NEC tells UK court facial biometrics not scraped from internet, but declines training dataset details
Biometric templates used by NEC for facial recognition are unique to the company and cannot be shared with or used by other vendors, a company representative told the Court of Appeal in a case involving South Wales Police, according to The Register.
The statement from NEC Global subsidiary Northgate Public Services was provided to the court through counsel for South Wales Police, and also notes that the company does not scrape facial images from the internet for inclusion in its database.
The company declined to provide the court with details about the origins or contents of its training dataset.
Northgate Public Services Head of Global Facial Recognition Paul Roberts denied a suggestion from expert witness Dr. Anil Jain, writing on behalf of the plaintiff, that Neoface Watch, or “AFR Locate” as South Wales Police calls it, may use machine learning to improve its performance after deployment.
Jain writes that automated facial recognition systems “(t)ypically” use a deep learning network for continual improvement. Roberts responded that Jain’s claim is based on an older version of the Neoface algorithm SDK, which was used for internal testing, and that Neoface Watch is not the product used in the U.S., on which Jain has consulted.
Jain replied that the plaintiff’s claims of discriminatory impact cannot be evaluated without analyzing the training dataset, to which neither he nor South Wales Police has been given access. Roberts says the company cannot disclose training dataset information because it is “commercially sensitive”; Jain counters that the company has not provided even summary statistics or any empirical evaluation of the dataset.
Jain asserts that the system runs on Bosch Mic Starlight 7000 HD cameras with 1080 by 1920-pixel resolution.
A ruling that the police force’s use of the system is legal despite interfering with privacy rights was appealed in late 2019. Plaintiff Ed Bridges, with support from rights advocacy group Liberty, is requesting that the system be shut down for breaching human rights laws. The UK Information Commissioner’s Office (ICO) has submitted that the legal framework on which the AFR system is based does not meet legal requirements.
The Court of Appeal is expected to hand down its decision later in the year.