‘Consent is not almighty’: tracking facial recognition in Europe and why it gets blocked

How facial recognition is used for authorization purposes is the topic of the third report in MAPFRE’s six-part ‘Mapping the Use of Facial Recognition in Public Spaces in Europe’ investigation into public facial recognition in both the EU and UK. The authors believe it to be the only study of its kind in Europe.

The report, ‘Facial Recognition for Authorisation Purposes,’ gathers the many situations where facial recognition is used publicly, such as paying fares on buses in Madrid and automated check-out in supermarkets, and examines how those deployments are presented, whether to “accelerate people flows,” “improve the customer experience” or “speed up operations,” and how that messaging changed with COVID-19.

The study found seven “emblematic” uses of facial recognition technology (FRT) in public places, sought out the relevant documentation from data controllers as well as EU and country laws, plus reactions from civil society, Data Protection Authority (DPA) opinions and court decisions. It also considers the different interpretations of IATA’s One ID concept for air travel.

The authors examine issues of necessity and proportionality, the differences between identification and verification of individuals, and the legal difference between consent and public interest to make a series of recommendations to data controllers, DPAs, the European Data Protection Board (EDPB) and policymakers.

Can non-biometric alternatives be used? Are lessons being learnt from the data protection impact assessments (DPIAs)?

As the series is published it is becoming an increasingly valuable resource for understanding every aspect of FRT. Here we attempt to raise awareness of the team’s work with a very brief overview.

Seven ‘emblematic’ uses of FRT

Two use cases in France – one with a biometric token for the Parafe passenger authentication scheme in airports and train stations, one with an ID token for access to schools in the PACA region – explore verification of individuals.

For individual identification, the team considers payments in Scottish school canteens (no token used); season ticket holder access to Belgium’s Molenbeek Stadium (no token); the Mona app trial at France’s Lyon airport, which lets passengers pass all checkpoints via FRT matched against a per-flight database (no biometric token); the similar Aena/Iberia system covering three airlines at three airports in Spain (no token); and the Star Alliance scheme across three airlines and three airports in Germany and Austria, also similar to Mona but implemented differently (no token used).
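These cases turn on the distinction the report draws between verification, a 1:1 comparison of a face against a single reference (for instance one carried on a biometric token), and identification, a 1:N search of a face against a gallery such as a per-flight database. The following is a minimal sketch of that difference only; the face embeddings, similarity measure and threshold are hypothetical illustrations, not drawn from any system described in the report.

```python
from __future__ import annotations

import numpy as np

THRESHOLD = 0.8  # hypothetical similarity threshold; real systems tune this per deployment


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def verify(probe: np.ndarray, enrolled: np.ndarray) -> bool:
    """Verification (1:1): does the probe face match the single enrolled reference?"""
    return cosine_similarity(probe, enrolled) >= THRESHOLD


def identify(probe: np.ndarray, gallery: dict[str, np.ndarray]) -> str | None:
    """Identification (1:N): which gallery entry, if any, best matches the probe face?"""
    best_id, best_score = None, THRESHOLD
    for subject_id, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_id, best_score = subject_id, score
    return best_id


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    enrolled = rng.normal(size=128)                      # reference embedding (e.g. from a token)
    probe = enrolled + 0.05 * rng.normal(size=128)       # new capture of the same face
    print(verify(probe, enrolled))                       # 1:1 check against one reference
    print(identify(probe, {"passenger-A": enrolled}))    # 1:N search against a gallery
```

The practical difference matters for the report's analysis: a 1:1 check can keep the reference on a token held by the individual, whereas a 1:N search requires the data controller to hold a database of enrolled faces.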

Consent, public interest and voluntary use

The study finds that consent is problematic when an FRT scheme involves minors, as there is an imbalance of power between them and the institution implementing the technology. The example is the use of FRT at two high schools in the PACA region of France, which was never actually implemented “since both the French DPA and, later, a French Court considered that such an experiment would have been unlawful as regards the GDPR provisions. One of the main arguments was that pupils could not freely provide consent since the High Schools Board of directors have authority over the pupils.”

The system for payment in school canteens in North Ayrshire, Scotland, meanwhile, was not challenged over consent, as those over the age of 13 are deemed able to give consent in the UK. Instead, it was challenged by the UK’s DPA – the ICO – over proportionality.

The report acknowledges that in every case examined there was an alternative to the biometric approach.

“Substantial public interest” may be used as grounds for an FRT project, although this is rare. The report finds that this exception requires a law to be in place permitting the processing of such data; Belgium realized it did not have one.

Parafe, the scheme in French airports and stations, did use the exception of substantial public interest, complicated further by being voluntary. Being voluntary but not consent-based, Parafe does not require data controllers to obtain “explicit consent” from passengers: “In other words, the data controller is under no obligation to demonstrate that the choice of a passenger to use the Parafe authentication system is based on a ‘freely given’, ‘specific’, ‘informed’, ‘explicit’ and ‘unambiguous’ consent,” referring to the GDPR’s definition of consent.

The study finds that European DPAs disagree as to whether facial recognition is a strong means of authentication, causing a fundamental issue in weighing necessity against proportionality.

Recommendations

The authors make a series of recommendations for different audiences. For data controllers, the recommendation is that they should understand “that they have the burden to prove that they meet all GDPR requirements.”

All schemes in Europe have been based on the cooperative use of FRT, either by consent or by being voluntary. But consent is “not almighty,” the authors state, for instance when a scheme involves minors. Data controllers should understand the limits of cooperative usage.

DPAs and the EDPB should ensure harmonization on the centralization of databases and principles of data processing. They should also provide guidance on running DPIAs and evaluations of FRT authorization use.

Policymakers, meanwhile, are urged to gain an understanding of the different use cases of facial recognition for authorization in light of the AI Act and calls for bans, which so far target only remote biometric identification.

“If, on the other hand, other proposals, calling for a broad ban on ‘biometric recognition in public spaces’ are ultimately successful, they are likely to result in all of the ways in which FRT is used for authorisation purposes being prohibited. Policy-makers should take this into consideration, and make sure that this is their intention, before they make such proposals.”

“MAPping the use of Facial Recognition in public spaces in Europe” (MAPFRE) is a project under the AI-Regulation Chair of the Multidisciplinary Institute in Artificial Intelligence (MIAI) at France’s Université Grenoble Alpes.

The first report, ‘A Quest for Clarity: Unpicking the “Catch-All” Term’ examined the issue of definitions and the positions adopted by the Members of the European Council and Parliament amid the drafting of the AI Act. Part 2 looked at how facial recognition works and classification of the technologies.
