Is the EU AI Act leaving a backdoor for emotion recognition?


As the European Union prepares to adopt the Artificial Intelligence Act, the landmark legislation is still attracting criticism, this time for allowing the use of emotion recognition. An opinion piece published in the EU Observer argues that the AI Act still leaves the door open for the technology’s use in law enforcement and migration control, which could lead to rights abuses.

The anonymous writer, who describes themselves as an EU civil servant, claims this gap leaves space for the technology’s use in law enforcement and migration control, pointing in particular to the controversial EU-funded iBorderCtrl project, an AI lie detector that applies facial and emotion recognition technologies during migrant interviews. In 2018, the bloc announced it would fund the project and conduct pilots in Hungary, Greece and Latvia.

iBorderCtrl has been a target of heavy scrutiny among rights groups and lawmakers. Despite efforts to shed light on its use, the Court of Justice of the European Union (CJEU) ruled in September 2023 to deny access to documentation related to the project, citing the protection of commercial interests.

Facial emotion recognition is the process of identifying human sentiment using face biometrics, a relatively new field of AI that has been met with mixed results. The AI Act itself highlights the technology’s shortcomings and its potential for “discriminatory outcomes” and intrusiveness to rights and freedoms.

“There are serious concerns about the scientific basis of AI systems aiming to identify or infer emotions, particularly as expression of emotions vary considerably across cultures and situations, and even within a single individual,” the document states.

Despite these concerns, emotion recognition is not included in Article 5 of the AI Act, which defines prohibited artificial intelligence practices. Instead, emotion recognition is classified as a high-risk AI system, and its use is prohibited in the workplace and in educational institutions, with exceptions for safety and medical reasons. Deployers of emotion recognition systems are not required to inform people about the operation of such systems if the system is used to “detect, prevent and investigate criminal offenses,” according to the legislation.

The language of the AI Act adds to the confusion on the legality of emotion recognition: The law states that the fact that an AI system is classified as a high-risk AI system “should not be interpreted as indicating that the use of the system is lawful under other acts of Union law or under national law compatible with Union law.”

At the beginning of February, European lawmakers reached an agreement on the technical details of the AI Act, opening the path to European Parliament committees’ approval in April. Although significant opposition is not expected, lawmakers may still introduce changes that could slow down the timeline of its implementation.
