Advocates worried about global face biometrics systems find little transparency
Transparency remains something world governments consistently fail to demonstrate when it comes to biometric systems aimed at their citizens.
Privacy advocates are calling for more visibility into how governments consider, test and operate biometric databases and facial verification algorithms, including projects in which agencies collaborate with businesses.
Programs in the United Kingdom and Australia are being questioned now, and a more-exotic system deployed in Abu Dhabi began raising transparency concerns in July.
Britain’s National Health Service, one of the few government bodies that residents of all political stripes appreciate, has not been transparent about a deal it struck with authentication vendor iProov to build an administrative app used by 10 million people.
The London-based company provides the technology enabling the app to collect and store face biometrics for ID verification from video, according to The Guardian. People use the app for a number of bureaucratic tasks, including to access medical records and get a COVID vaccine travel certificate.
Although a contract was signed in 2019, according to the publication, the NHS had not acknowledged it on its site as of last week, citing security reasons. Security has also been cited as the reason the public cannot see a data protection impact assessment.
It would be unwise, for example, to disclose how long the data is stored, according to the article.
Worse, in the eyes of some, a spokesperson for the department said law-enforcement agencies can ask to see — though not demand — data collected by the system.
In Australia, the government has drawn fire for silently deploying voluntary facial verification systems in the states of New South Wales and Victoria.
According to reporting by Reuters, the product is an app that lets the government verify that someone under quarantine orders is, in fact, isolating.
Australian software maker Genvis had already worked with Western Australia police to deploy it in that state in November 2020. Expansion to the nation’s southeastern states had not been announced.
Government leaders in Western Australia reportedly have banned using the app in any roles not related to COVID.
That might be a hard sell for some Australians, given examples elsewhere of mission creep.
European news publisher Euractiv this month found that the role of a facial recognition system created by Austria’s federal government to fight crime has quietly been expanded.
Part of a 2020 law, the system was sold as a way to search for people suspected in serious crimes. Faces recorded by surveillance cameras are compared to a growing database, now with nearly 640,000 entries.
According to Euractiv, the government belatedly said the system has since been used to investigate demonstrators.
Earlier this year, investors also called on face biometrics providers to increase their transparency in order to improve public trust.