Affective biometrics: Too new, too inaccurate for police work, says think tank

Most people knowledgeable about face biometrics have at least some misgivings about how it is used. It is safe to say that most people who know about affective biometrics have misgivings about it being used at all.
Affective biometric systems try to discern moods, future actions and deception by analyzing expressions and body language with algorithms. The market for these systems is growing.
Many people feel they might eventually trust facial recognition with their liberty and life because they believe it will be a simple yes or no transaction: I was not at the bank when it was robbed, and computer vision will prove that.
But who trusts their face to be blank, particularly during unguarded moments, of which everyday life is full?
Could software, and eventually a hiring manager or a judge, read my sideways glance as deciding evidence of larcenous greed or malicious lust?
The Brookings Institution, a center-left think tank, sees affective biometrics as far too experimental for law enforcement to use in ascertaining guilt, much less in predicting criminal actions, a capability some boosters claim when promoting the technique.
Brookings, in fact, has called for a ban on federal use of affective biometrics. This is hardly new turf for the think tank. It continues to offer guidance on biometric regulations in the United States and beyond.
Genuflecting at the altar of possible virtuous outcomes, researchers note in a post that audio and video systems can pick out some military veterans who might try to kill themselves. Video systems can diagnose fatigue in drivers, too.
But tasks like spotting lies or criminal intent, in the words of Brookings, “are clearly beyond the capacity of affective computing.”
Canada tried and abandoned Avatar, its experimental algorithmic interviewing project, according to the post, which lists other black eyes for the technique.
Two years ago, the AI Now Institute at New York University urged a ban on both government and private-sector use. Institute leaders found that the science lags far behind the marketing of, for example, apps sold by HireVue.