
EPIC: Emotion recognition tech violates EU fundamental rights

The Electronic Privacy Information Center (EPIC) has urged the Dutch Data Protection Authority to protect students and employees from the harms of emotion recognition.

The EU AI Act prohibits the placing on the market, putting into service, and use of emotion recognition systems in workplaces and educational institutions, with limited exceptions for certain medical and safety reasons. The Dutch data protection authority, the Autoriteit Persoonsgegevens (AP), has opened a consultation requesting feedback on the implementation of this prohibition.

The Washington DC-based EPIC has urged the AP to define emotion recognition systems broadly and either to allow no exemptions for their use or to construe the medical and safety exemption narrowly. EPIC’s recommendation is based on the “complete lack of scientific evidence that these systems work,” the organization writes, and on its view that they “violate” various protections enshrined in the EU Charter of Fundamental Rights and other EU regulations.

EPIC regularly advocates for the protection of civil liberties and privacy rights, with a focus on biometric surveillance. It has previously filed a complaint with the FTC about a job application screening tool that used emotion recognition, advised the United States Department of Education on the harms of emotion recognition, and warned the United States Department of Justice about the invasive nature of emotion recognition technologies.

AI-based emotion recognition systems make predictions about an individual’s emotional state from biometric data such as heart rate, skin moisture, voice tone, gestures or facial expressions. However, the science behind “emotion recognition” can barely be construed as science, for the simple reason that inner emotional states are hard to measure objectively from a person’s external features.

For example, a skilled movie actor can be read as sad, anguished or extremely happy, but that does not mean they are genuinely experiencing those emotions. Researchers have found that facial expressions can convey varying emotional states, and that these vary substantially across cultures, situations, and even across people within a single situation.

“Objectively” assessing emotions is therefore a misnomer. Furthermore, such technologies can discriminate on the basis of race, gender and disability. Australian researcher and lawyer Natalie Shard recently explained in a piece for The Conversation why she believes the Australian government should adopt specific regulations on the use of such technologies.
