
Another wet blanket on heated claims about emotion recognition

Claims that facial recognition algorithms can discern emotions are harder to believe in the wake of new research.

In fact, not even the human mind can read emotions from facial expressions without knowing the context in which an expression occurs, according to a paper by a pair of researchers from the Massachusetts Institute of Technology and Katholieke Universiteit Leuven.

The scientists began with a common industry assumption: that a frown, for example, means only sadness. At least some previous studies of how people recognize emotions photographed models, not professional actors, who were simply asked to express a generic emotion with their faces.

AI has been trained this way since the 1990s, when an MIT Media Lab professor published a report called Affective Computing. Retail analysts have also called the category sentiment analysis.

Such software looks for knit brows, grimaces and wide eyes, for example.
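The context-free approach the researchers criticize can be caricatured in a few lines of code. The feature names and emotion labels below are illustrative assumptions, not any vendor's actual model; the point is that each detected feature maps to exactly one label, with no way to represent context:

```python
# A caricature of context-free emotion recognition: every facial
# feature maps to exactly one emotion, no matter who is making the
# expression or why. Feature names and labels are illustrative only.
EXPRESSION_TO_EMOTION = {
    "knit_brows": "anger",
    "grimace": "disgust",
    "wide_eyes": "surprise",
    "smile": "happiness",
    "frown": "sadness",
}

def classify(features):
    """Return the emotion for the first recognized feature, else 'neutral'."""
    for feature in features:
        if feature in EXPRESSION_TO_EMOTION:
            return EXPRESSION_TO_EMOTION[feature]
    return "neutral"

# A skeptical spouse's raised eyebrow and a startled bystander's wide
# eyes get the same label either way, which is the problem the
# researchers flag: the mapping has no slot for what the subject is
# actually thinking.
print(classify(["wide_eyes"]))  # surprise
print(classify(["frown"]))      # sadness
```

A lookup table this rigid obviously cannot encode context, but as the researchers argue, adding more features does not fix the underlying design: the output space is still a single label per expression.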

But as any couple that has survived many years together can attest, a raised eyebrow is not just a raised eyebrow. A viewer cannot accurately infer an expression's meaning without knowing what the subject is thinking.

That is context, and the only reliable way to learn another person's thoughts is to ask. Even then, the information volunteered is filtered at best.

Someone laughing in surprise after seeing an auto accident at a street corner might somehow be responsible for causing the mishap. Or that person could be realizing that a coincidence had prevented her from being in the street at the wrong time.

Cultural context is also mixed in with physical responses to emotion. Some cultures bear up under sorrow with stoicism, while in others grief is released in frenetic, infectious wailing.

It is hard, maybe impossible, to code for these variables.

And then there are biases. A 2019 article in the Harvard Business Review cited a study in which algorithms assigned more negative emotions to some ethnicities than to others.

The researchers suggest that future developers abandon still images in favor of dynamic stimuli and seek out a greater range of cultural contexts.

As it stands, governments and businesses could be relying on emotion recognition to make faulty or even disastrous decisions, such as whether to search the man in line whose gritted teeth raise algorithmic suspicion when he is actually fighting a losing battle with food poisoning.

A company called ParallelDots makes a consumer-analysis system called ShelfWatch to note expressions in stores.

Another firm, Oxagile, says its algorithms boost buyers’ “operational efficiency.”


In 2018, industry analyst firm Gartner predicted that 10 percent of personal devices would be capable of recognizing emotions. The algorithms would be used to help diagnose mental and emotional ailments and to customize schooling for children.

Today, one of the selling points automakers cite for in-cabin facial recognition is spotting a driver's expressions of rage, which could stem from upsetting world news or a missed goal in the playoffs.

Even some industry insiders see such claims as optimistic, both in terms of timing and technical capability.

Microsoft researcher Kate Crawford, promoting her book Atlas of AI, said flawed thinking underlies marketing claims that machine learning will soon be able to suss out intentions, urges, plans and the like from facial recognition. The task is simply too complex, she said. Microsoft itself is involved in the field.
