Take the shine and target off face biometrics, say researchers
Of the uncounted words, both alarmed and alarming, devoted to law enforcement's use of facial recognition, there is one sentence that could temper the rhetoric on both sides.

A subtitle in a scholarly journal reads: ‘Facial Recognition is Just Another Forensic Feature Comparison Tool.’

If held to the same professional standards as every other biometric identification tool, use of face biometrics can add to the number of solved crimes with minimal collateral harms (like false positives).

That is the conclusion of an article in the fall issue of Translational Criminology by National Academy of Sciences-affiliated professors Cynthia Rudin and Shawn Bushway. It is a welcome bit of sobriety in a debate dominated by vendors’ exaggerated claims and emotional privacy advocates.

The writers draw a strict distinction between forensic and surveillance roles.

“The concerns about mass surveillance associated with facial recognition software are serious,” the authors write. This is not the focus of their article.

Instead, they make the case that treating face biometrics as just another forensic tool has the highest chance of success. That means recognizing that it is fallible but can meet acceptable minimum performance standards.

It also means accepting that some algorithms exhibit minimal racial and other biases. The answer is imposing rigorous ethical standards for how algorithms and human operators are trained and evaluated.

Every biometric tool ever devised has recognized and measurable limitations, the authors point out. Fingerprints can be smudged and DNA can be contaminated. (DNA also suffers the myth that it is indisputable.)

The pair include advice that is disturbing in that it needs to be offered: avoid low-resolution images captured by old or inexpensive CCTV cameras. Often, faces captured with those systems are no more useful than smudged fingerprints.

They could have gone a step further, advising caution when approached by software developers promising they can clean up images. Altered images have had information either added or subtracted, which can produce false matches.

Indeed, AI software must be trained to throw up its hands when the probability of a mistaken identification rises beyond acceptable parameters.
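The abstention the authors call for can be expressed as a simple reject option: the system returns no identity unless the best match clears a minimum confidence threshold. A minimal sketch in Python, where the score scale, the 0.9 cutoff, and all names are illustrative assumptions rather than any real system's behavior:

```python
# Hypothetical reject-option wrapper around a face matcher.
# Scores are assumed to be similarities in [0, 1]; the threshold
# is an illustrative placeholder, not a recommended value.
from typing import Optional

ACCEPT_THRESHOLD = 0.9  # minimum score for a usable match (assumed)

def identify(candidate_scores: dict[str, float]) -> Optional[str]:
    """Return the best-matching identity, or None (the system
    'throws up its hands') when no candidate clears the threshold."""
    if not candidate_scores:
        return None
    best_id, best_score = max(candidate_scores.items(), key=lambda kv: kv[1])
    if best_score < ACCEPT_THRESHOLD:
        return None  # abstain rather than risk a false positive
    return best_id

print(identify({"candidate_a": 0.95, "candidate_b": 0.60}))  # candidate_a
print(identify({"candidate_a": 0.70}))                       # None
```

The design choice mirrors the article's point: a tool that can say "no answer" produces fewer false positives than one forced to name its closest match.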
