
EPIC: Emotion recognition tech violates EU fundamental rights

The Electronic Privacy Information Center (EPIC) has urged the Dutch Data Protection Authority to protect students and employees from the harms of emotion recognition.

The EU AI Act prohibits the development, deployment, and placement of emotion recognition systems in the EU market intended for workplaces and educational institutions, with limited exceptions for certain medical and safety reasons. However, the Dutch data protection agency Autoriteit Persoonsgegevens (AP) opened a consultation requesting feedback on the implementation of this prohibition.

The Washington, DC-based EPIC has urged the AP to define emotion recognition systems broadly and either to allow no exemptions for their use or to construe the medical and safety exemption narrowly. EPIC’s recommendation is based on the “complete lack of scientific evidence that these systems work,” the organization writes, and on its view that they “violate” various protections enshrined in the EU Charter of Fundamental Rights and other EU regulations.

EPIC regularly advocates for the protection of civil liberties and privacy rights, with a focus on biometric surveillance. It has previously complained to the FTC about a job application screening tool that used emotion recognition, advised the United States Department of Education on the harms of emotion recognition, and warned the United States Department of Justice about the invasive nature of emotion recognition technologies.

AI-based emotion recognition systems make predictions about an individual’s emotional state based on biometric data such as heart rate, skin moisture, voice tone, gestures or facial expressions. However, the science behind “emotion recognition” can barely be construed as science, for the simple reason that inner emotions are very hard to measure objectively from a person’s external features.

For example, a skilled movie actor can read as sad, anguished or extremely happy without genuinely experiencing those emotions. Researchers have found that facial expressions can convey varying emotional states, and that these vary substantially across cultures, situations, and even across people within a single situation.

Claims of “objectively” assessing emotions are therefore misleading. Furthermore, such technologies can discriminate based on race, gender and disability. Australian researcher and lawyer Natalie Sheard recently explained in a piece for The Conversation why she believes the Australian government should adopt specific regulations for the use of such technologies.
