Different responses worldwide to facial recognition in criminal justice, report suggests

A new international survey published in PLOS ONE has revealed differing attitudes toward the use of automatic facial recognition technology (AFR) in criminal justice systems around the world.
The two-part study shows that although participants were broadly aligned in their attitudes toward and reasoning about AFR, there were some key differences across countries.
The first part of the survey (called Study 1) analyzed results from focus groups in the UK, Australia, and China, while the second (Study 2) collected data from over 3,000 participants in the UK, Australia, and the U.S.
The studies showed that U.S. respondents were more accepting of governments and private companies tracking citizens with facial recognition, but less trusting of police use of the technology, compared with UK and Australian residents.
Among the factors negatively affecting perceptions of AFR, algorithm accuracy was the most significant, with many respondents, particularly in the UK, still doubting the effectiveness of face biometrics.
At the same time, the results also highlighted that there is some confusion among the public about the accuracy of AFR.
In terms of legal applications, the survey suggested U.S. participants were more willing to hypothetically accept facial recognition as courtroom evidence to secure a conviction without other evidence. This is despite the fact that facial recognition technology does not have forensic status in U.S. courts (as in most countries) and therefore cannot be used as evidence.
Demographic and gender biases are also a cause for concern, according to the report, together with extensive surveillance tactics.
Furthermore, the study noted that the type of facial recognition system and the purpose it is used for greatly influenced individuals’ perceptions of the technology.
To reduce the stigma around the technology and support its adoption worldwide, the researchers, from the Universities of Lincoln and Exeter in the UK, Beijing Normal University, and the University of New South Wales, concluded the study by calling on companies and governments to communicate more about facial recognition technologies and how they work, including details on accuracy and biometric data protection.
“Based on our data, we recommend that developers, system designers, vendors, and users of AFR do more to publicize the use, data privacy, and accuracy of AFR,” the report reads.
The researchers also recommended that governments set legal boundaries around the use of AFR in investigative and criminal justice settings, writing that “it is important for users of AFR to justify their use case and know the capacity of their system, and that governments should provide clear legislation for the use of AFR in criminal justice systems around the world.”