Emotion AI excites some businesses, but the legal landscape feels fickle

Businesses deploying emotion recognition algorithms need to be aware of the legal risks that they could incur in a regulatory environment that is inconsistent across the United States, and becoming more so as states pass data privacy laws governing biometrics and other personal information. This is the view advanced by Lena Kempe of LK Lawfirm in an article in the American Bar Association’s Business Law Today.

Emotion AI comes with some of the same concerns as biometric technologies, such as risks to data privacy and bias. It also introduces the possibility that individuals whose emotions are understood by automated systems could also be manipulated by them.

Kempe suggests that the market is growing. The article cites a forecast from market analyst Valuates, which sets revenues in the field at $1.8 billion in 2022, and predicts rapid growth to $13.8 billion by 2032 as businesses attempt to improve online user experiences and organizations address mental health and wellbeing.

Kempe also notes that Affectiva was performing advertising research for a quarter of the companies in the Fortune 500 as of 2019. A year and a half later, the company said it was up to 28 percent, and today it is 26 percent, somewhat undercutting the claim of rapid growth.

Emotion AI uses data such as the text and emojis contained in social media posts, facial expressions, body language and eye movements captured by cameras, and the tone, pitch and pace of voices captured by microphones and shared over the internet. Biometric data such as heart rate can also be used to detect and identify emotions, as can behavioral data like gestures.

If this data or its output can directly identify a person, or if it can be reasonably linked to an individual, it falls under the category of personal information. This, in turn, brings it into the scope of the European Union’s General Data Protection Regulation and a raft of diverse U.S. state data privacy laws. In some cases outlined by Kempe, the information can qualify as sensitive personal data, triggering further restrictions under GDPR and state law.

The frequent use of biometric data for emotion AI also introduces regulatory risk from Illinois’ Biometric Information Privacy Act (BIPA) and similar laws being passed or considered elsewhere around the country.

Kempe advises businesses to include any emotion data in comprehensive privacy notices; to minimize the data they collect and store, anonymizing it where possible; and to review and update policies so that data handling is limited to the specific purpose for which the data is used. They should also implement opt-in measures when sensitive personal data is involved, along with robust security measures.

She also sets out legal strategies for avoiding bias and manipulation, which are largely related to transparency and risk management.

The unsettled regulatory environment and market for emotion AI and affective computing force companies that are using the technologies to keep abreast of ongoing changes, Kempe says, lest their excitement for a deeper understanding of their users lead to feelings of violation or betrayal, and lawsuits.
