Consultants see small role for emotion recognition now, but skeptics remain
Divisive as it is, emotion recognition continues to find adherents in business, particularly in the automotive sector, but also in the legal profession and in retail.
The CEO Magazine, in fact, has looked into emotion recognition trends in retail, talking to a trio of top executives working at two global business consultancies: Ernst & Young and Gartner.
EY’s global AI leader, Rodrigo Madanes, says emotion-based artificial intelligence might soon become a well-used instrument in a retailer’s tool kit, despite its privacy implications and the need to account for cultural differences before deployment.
Gartner analyst Annette Zimmermann says that more questions need to be answered. There is as yet no clear-cut way to warn customers that their emotions are being detected, analyzed, and potentially stored.
Zimmermann says the industry is on more solid footing right now when deploying emotion recognition for market research, such as profiling customers and assessing reactions to products or services.
According to Gartner analyst Robert Hetu, the inability to push the technology further is also connected to most retailers’ failure to convert customer feedback into product strategies.
Hetu suggests that retailers take advantage of computer vision or voice analysis to add more context to customer interactions. Computer vision at self-checkout stations, for instance, could cut theft while also measuring customer feelings.
Skeptics want emotion recognition banned in AI Act
Emotion recognition is also at the center of a recent warning published by civil rights organizations Article 19 and European Digital Rights (EDRi).
Article 19 senior program officer Vidushi Marda and Ella Jakubowska, EDRi policy advisor on fundamental rights, write that emotion recognition is junk science and must be outlawed. They want it added to the list of practices prohibited under the proposed AI Act being debated in the European Union.
The pair write that emotion recognition is currently classified mainly as low or minimal risk under the AI Act.
“Civil society has made important progress in influencing the European Parliament to support amendments banning public facial recognition, and other mass surveillance uses,” reads the article. “But an incredibly dangerous aspect remains largely unaddressed – putting a stop to Europe’s burgeoning emotion recognition market.”
“Developers’ only requirement is to tell people when they are interacting with an emotion recognition system. In reality, the risks are anything but ‘low’ or ‘minimal,’” Marda and Jakubowska wrote.
They also cite studies suggesting that emotion recognition algorithms are not accurate enough to be trusted. Further, Marda and Jakubowska argue that even if the algorithms become more accurate in the future, they should still be banned.
“The technology’s assumptions about human beings and their character endanger our rights to privacy, freedom of expression and the right against self-incrimination,” reads the post. “The capacity for discrimination is immense.”
The warning comes amid final discussions in the European Parliament concerning the AI Act, which the EU aims to finalize this year.