It’s disgusting what can fool facial recognition
A lot of the operational successes attributed to facial recognition happen under optimal conditions, but the real world still offers plenty of challenges for biometric systems. Something as random as the look of disgust prompted by a mashed bug on the sidewalk has the potential to defeat an algorithm.
A research paper out of Saudi Arabia looked at how well AI systems deal with emotions playing across people’s faces compared to a neutral visage.
The researchers found that a look of disgust can contort one’s face enough to fool biometric systems. (Sadness, they write, was very similar to the kind of neutral face typically found on government identification.)
The point of the research was not to identify emotional camouflage or to encourage wanted criminals to walk around with revulsion on their faces. It was to see which of six common expressions present the most risk for still-fallible biometric systems. A database of subjects reacting with sadness, disgust, happiness, anger, surprise and fear was used in experiments.
Each emotion was given a similarity score measuring how closely it matches a neutral expression. Sadness ranked highest, at 93.93 percent; disgust differed the most, with the lowest score, 92.01 percent.
But more specifically, the researchers wanted to find the facial features that are least changed regardless of emotion. This way, algorithms can be written to focus mostly on those areas for best results in less-optimal conditions.
Twenty-two features were identified in the research. Least changed were the positions of the right eye, chin, mouth, left eye, and forehead.
The features most “deformed” by emotion were the distances between the left eye and mouth, the right eye and mouth, the right eye and nose, the nose and forehead, the right ear and mouth, the left ear and mouth, the right eye and eyebrow, and the left eye and eyebrow.
As might be guessed, the widths of the mouth and nose also change notably with each emotion.
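The idea of ranking landmark pairs by how much emotion deforms them can be sketched simply. The following assumes made-up 2D landmark coordinates for one subject (not the paper's data); a matcher could then weight the most stable pairs more heavily:

```python
import math

# Hypothetical landmark coordinates: neutral vs. a disgusted expression.
neutral = {"left_eye": (30, 40), "right_eye": (70, 40),
           "nose": (50, 60), "mouth": (50, 80)}
disgust = {"left_eye": (31, 42), "right_eye": (69, 42),
           "nose": (50, 58), "mouth": (50, 72)}  # mouth pulled upward

PAIRS = [("left_eye", "mouth"), ("right_eye", "mouth"),
         ("right_eye", "nose"), ("left_eye", "right_eye")]

def deformation(pair):
    """Relative change in the distance between two landmarks."""
    a, b = pair
    d_neutral = math.dist(neutral[a], neutral[b])
    d_emotion = math.dist(disgust[a], disgust[b])
    return abs(d_emotion - d_neutral) / d_neutral

# Rank pairs from most to least deformed.
ranked = sorted(PAIRS, key=deformation, reverse=True)
for a, b in ranked:
    print(f"{a}-{b}: {deformation((a, b)):.3f}")
```

With these invented coordinates, the eye-to-mouth distances change the most and the eye-to-eye distance the least, mirroring the article's point: an algorithm that leans on the stable features should hold up better when the subject is emoting.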