EPIC: Emotion recognition tech violates EU fundamental rights

The Electronic Privacy Information Center (EPIC) has urged the Dutch Data Protection Authority to protect students and employees from the harms of emotion recognition.

The EU AI Act prohibits the placing on the market, putting into service, and use of emotion recognition systems in workplaces and educational institutions, with limited exceptions for certain medical and safety reasons. The Dutch data protection authority, Autoriteit Persoonsgegevens (AP), has opened a consultation requesting feedback on the implementation of this prohibition.

The Washington DC-based EPIC has urged the AP to define emotion recognition systems broadly and either to allow no exemptions for their use or to construe the medical and safety exemption narrowly. EPIC bases its recommendation on the "complete lack of scientific evidence that these systems work," the organization writes, and on its view that they "violate" various protections enshrined in the EU Charter of Fundamental Rights and other EU regulations.

EPIC regularly advocates for the protection of civil liberties and privacy rights, with a focus on biometric surveillance. It has previously complained to the FTC about a job application screening tool that used emotion recognition, advised the United States Department of Education on the harms of emotion recognition, and warned the United States Department of Justice about the invasive nature of emotion recognition technologies.

AI-based emotion recognition systems make predictions about an individual’s emotional state based on biometric data such as heart rate, skin moisture, voice tone, gestures or facial expressions. However, the science behind “emotion recognition” can barely be construed as science, for the simple reason that inner emotional states are very hard to measure objectively from a person’s external features.

For example, a skilled movie actor can be read as sad, anguished or extremely happy without genuinely experiencing those emotions. Researchers have found that facial expressions can convey varying emotional states, and that these expressions vary substantially across different cultures, situations, and even across people within a single situation.

The notion of “objectively” assessing emotions is therefore misleading. Furthermore, such technologies can discriminate on the basis of race, gender and disability. Australian researcher and lawyer Natalie Shard recently explained in a piece for The Conversation why she believes the Australian government should adopt specific regulations governing the use of such technologies.
