Facial recognition providers called on to improve transparency, preserve human rights
Investment company Candriam has published a call from 50 global institutional investors urging the biometrics industry to address the risks that facial recognition poses to both human rights and investor returns.
The investor statement, signed by institutions including pension boards and sustainable investment funds, identifies observed racial and gender biases, “the questionable accuracy and lack of public testing of most systems in use,” possible violations of law and personal privacy in database image sourcing, and misuse by governments, law enforcement agencies and others as controversies representing risk. These risks are reputational and operational, as well as financial and human rights-related.
The investors collectively have $4.5 trillion in assets under management. They include Canada’s BMO Global Asset Management, UK-based Royal London Asset Management, U.S.-based Summit Global Investments, and Sweden’s Ohman.
The co-signing organizations point out that unlike most biometric modalities, facial recognition does not require the intentional participation of the end user, although voice recognition is also included on the list, and the threat of surveillance is subsequently invoked. Characterizing facial recognition technology as “easily accessible, automatic, seamless and cost effective,” the investors note its rapid adoption by law enforcement agencies and businesses.
The investors commit to applying the United Nations Guiding Principles on Business and Human Rights (UNGPs) to their analysis of companies involved with facial recognition.
“We will urge companies to take reasonable and pro-active steps to anticipate possible impacts of FRT, focusing on the most serious and severe potential harm, as well as on communication with their stakeholders,” they write.
Specifically, the signatories say they will ask companies to disclose the accuracy of their technology as measured by a “recognized and relevant” tester, disclose the source of images in their databases, demonstrate due diligence in evaluating clients before offering them facial recognition capabilities, and demonstrate effective mechanisms to address grievances.
Candriam published a white paper on the same topic earlier this year, noting that “without transparency, we cannot assess these controversies.”
“The increasing deployment and use of facial recognition technologies have human rights implications which are not fully being considered by companies,” Aviva Investors Senior ESG Analyst Louise Piffaut told Reuters.
The investors have called for others to join them in their commitments.