Can businesses be trusted with their hands in the public biometrics cookie jar?
Is it ethical for police to work with commercial entities to create live facial recognition systems? It is a question that a biometrics-ethics advisory board in London is trying to answer.
Live facial recognition is an attractive technology for governments, so much so that government agencies on both sides of the Atlantic are rushing to sign contracts with private sector entities, sometimes under new operational models, with few if any safeguards on captured data.
Politicians see two campaign promises in one tech package: lower spending and lower crime. In theory, at least, biometrics can be a police-force multiplier, putting more eyes on the streets without hiring many new officers. And there is a chorus of voices delivering that message to politicians.
In some cases, private businesses are offering to shoulder some of the cost burden for this additional security capacity, in return for access to government-operated databases that provide identity information for people who allegedly pose potential threats.
In London, the advisory Biometrics and Forensics Ethics Group is investigating this public-private intersection to see whether it harms personal privacy, constitutes an unwarranted government intrusion on individuals, or both.
The ethics group has invited public and private organizations to demonstrate how live face-biometric systems are beneficial or harmful at a hearing on April 30. Groups invited to testify are limited to makers of the systems; public- and private-sector users of the technology; and security forces, local governments and landowners. A previous hearing involved government regulators and civil-liberties advocates.
Central to the group’s investigation is a live-facial-recognition project at King’s Cross, a 67-acre mixed-use development in London owned by King’s Cross Central L.P.
A Financial Times article reports that the company can track tens of thousands of people using its own facial-recognition and other tracking methods. The problem is that London’s Metropolitan Police Service gave King’s Cross Central access to at least some of the photos in the police service’s own database.
Feeding into general dissatisfaction with facial recognition was the disclosure that police officials had initially been mistaken or dishonest when they said the service (and the British Transport Police) was not involved with King’s Cross surveillance systems.
In the United States, executives of the social media-monitoring firm Geofeedia have said that the Baltimore police department hired the company during racially charged protests in the city in 2016 to scrape photos from social-media accounts. Those images were compared with department databases to find people with outstanding warrants, who were then arrested in the crowd, leading to accusations that the government had used “preferred access to social media speech.”
In Toronto, a proposed public-private project wrapped in urban redevelopment has been scaled back after public opposition. Would-be smart-city builder Sidewalk Labs, owned by Alphabet Inc., proposed a 190-acre planned development.
Some of the grumbling was about how much land would be sold to Sidewalk Labs, but the loudest outcry was over how much data would be collected on people living in, or even walking through, the area in question. A public hearing on the idea is imminent.