
Consultants see small role now for emotion recognition, but skeptics remain

Divisive as it is, emotion recognition continues to find adherents in business, particularly in the automotive sector, but also in the legal profession and in retail.

The CEO Magazine, in fact, has looked into emotion recognition trends in retail, talking to a trio of top executives working at two global business consultancies: Ernst & Young and Gartner.

EY’s global AI leader, Rodrigo Madanes, says emotion-based artificial intelligence might soon become a well-used instrument in a retailer’s tool kit. That is despite its privacy implications and the need to account for cultural differences before deploying emotion recognition tools.

Gartner analyst Annette Zimmermann says that more questions need to be answered. There is as yet no clear-cut way to warn customers that their emotions are being detected, analyzed, and potentially stored.

Zimmermann says the industry is on more solid footing right now when deploying emotion recognition for market research like profiling customers and assessing reactions to products or services.

According to Gartner analyst Robert Hetu, the inability to push the technology further is also connected to most retailers’ failure to convert customer feedback into product strategies.

Hetu suggested that retailers should take advantage of computer vision or voice analysis to add more context to customer interactions. For instance, computer vision at self-checkout stations could cut theft while also measuring customer feelings.

Skeptics want emotion recognition banned in AI Act

Emotion recognition is also at the center of a recent warning published by civil rights organizations Article 19 and European Digital Rights (EDRi).

Article 19 senior program officer Vidushi Marda and Ella Jakubowska, EDRi policy advisor on fundamental rights, write that emotion recognition is junk science and must be outlawed. They want it added to the list of prohibited AI practices in the proposed AI Act being debated in the European Union.

The pair write that emotion recognition is currently classified as mostly low or minimal risk in the AI Act.

“Civil society has made important progress in influencing the European Parliament to support amendments banning public facial recognition, and other mass surveillance uses,” reads the article. “But an incredibly dangerous aspect remains largely unaddressed – putting a stop to Europe’s burgeoning emotion recognition market.”

“Developers’ only requirement is to tell people when they are interacting with an emotion recognition system. In reality, the risks are anything but ‘low’ or ‘minimal,’ ” Marda and Jakubowska wrote.

They also refer to studies suggesting that emotion recognition algorithms are not accurate enough to be trusted. Further, Marda and Jakubowska argue that even if the algorithms become more accurate in the future, they should still be banned.

“The technology’s assumptions about human beings and their character endanger our rights to privacy, freedom of expression and the right against self-incrimination,” reads the post. “The capacity for discrimination is immense.”

The warning comes amidst final discussions in the EU Parliament concerning the AI Act, which the EU aims to finalize this year.
