Facial recognition critics consider the options in OBVIA webinar

Academics, Ada Lovelace Institute Director discuss different biometrics uses and safeguards
The current legal landscape throughout North America and Europe is inadequate to deal with the risks to public rights created by increasing facial recognition use, according to panelists in a webinar hosted by Laval University’s International Observatory on the Societal Impacts of AI and Digital Technology (OBVIA).

The webinar accompanies a report by University of Ottawa Professor Céline Castets-Renard, ‘Use of facial recognition by police forces in the public space in Quebec and Canada,’ which explores the legal frameworks that guide the use of facial recognition and the technology’s implementations by police forces in Quebec and elsewhere in Canada. The report is written in French, but an English summary is available. It recommends establishing a cost-benefit balance between freedom and security interests, strengthening provincial and federal privacy laws, and adopting specific legislation that restricts any law enforcement use of the technology to conditional authorization.

Castets-Renard was joined on the panel by Rashida Richardson, visiting scholar at Rutgers Law School and its Institute for Information Policy and Law, Caroline Lequesne Roth, Assistant Professor of Public Law at the Université Côte d’Azur, and Ada Lovelace Institute Director Carly Kind.

Panelists asserted several times, without qualification, that facial recognition is inaccurate and biased, but otherwise demonstrated a nuanced understanding of several related areas.

Castets-Renard is concerned about a lack of transparency around the technology’s use. She cited as examples the fact that the use of Clearview AI by police, and of facial recognition in malls by a private company, was disclosed by media rather than by public authorities.

The potential for human rights violations, including gender and racial bias, and ineffective data protection laws are Castets-Renard’s other concerns with facial recognition in law enforcement and elsewhere.

Castets-Renard also reviewed the legal landscape. Three Canadian provinces have laws that apply to facial recognition beyond PIPEDA, and the country’s federal government has introduced new data protection legislation.

Richardson reviewed the policy landscape in the U.S., including partial bans and moratoriums in a growing number of local jurisdictions, emerging state legislation and the effect of Illinois’ Biometric Information Privacy Act (BIPA), and the lack of federal regulation. She also reviewed use cases in the country, including in schools and housing.

Richardson also explored the policy approaches taken to limit the use of face biometrics in the U.S., including targeted application-specific bans and rules around the information and databases involved, such as image-quality standards and data protection laws. Other proposals include oversight mechanisms to improve transparency and requirements governing use.

Kind reviewed the relatively extensive use of facial recognition by law enforcement and the private sector in the UK, including legal pushback. Several authorities have found the current legal framework inadequate to the state of facial recognition technology and its use, she noted.

Research from the Ada Lovelace Institute indicates that the UK public has high awareness of facial recognition but little detailed knowledge about it. The public is relatively comfortable with the use of facial recognition in opt-in or consent-based applications, though as Kind noted, those concepts do not apply to law enforcement, at least under one interpretation of UK law.

The UK public generally supports the use of facial recognition in forensic investigations, in airports and on smartphones, but not in commercial settings like retailers or in schools.

Every country in Europe has experimented with facial recognition, Roth said, though with significantly more success in application areas like public transport than in schools. She also pointed out the biometrics industry’s rapid adaptation to pandemic-related applications.

The thorny question of how the technology impacts European commitments to human rights and freedoms has come up in the form of several controversies, and authorities on the continent are split, with some data protection authorities supporting a mooted moratorium on the technology’s use in public spaces.

Roth also emphasized the importance of technical safeguards, and the growing role of China in bodies like the International Telecommunication Union (ITU).

The tension between innovation policies and economic interests on one side, and rights advocates on the other, was noted.

During a lively question and answer session, the panelists were asked about the effect of fragmented policy landscapes, and about how events like terrorist attacks and the COVID-19 pandemic affect public attitudes. Kind expressed scepticism about the stated benefits of facial recognition, suggesting that claims such as its high value in finding missing children have not been backed up by systematic analysis.

Panelists were also asked to compare face biometrics to other technologies that can be used in surveillance, and Richardson noted that there is a much lower level of awareness, and corresponding outrage and legal action, around location tracking.

Faces carry their own characteristics, however, which make facial recognition different from other biometric modalities and identification technologies, the panelists broadly agreed.
