November 8, 2013 -
A new study published in the journal Psychological Science by researchers at the University of Texas at Dallas and the National Institute of Standards and Technology (NIST) could turn our understanding of facial recognition on its head.
The research describes a series of experiments demonstrating that images of people may contain more information for biometrics-based identity recognition than the face alone. Specifically, the recognition study used three types of images for comparison: a subject’s face and upper body (left); only the face (center); and the upper body with the face masked (right). Results suggest that automatic recognition systems could be improved by adding body information beyond the face.
“For twenty years, the assumption in the automatic face recognition community has been that all important identity information is in the face,” said Jonathon Phillips, an electronics engineer at NIST and a co-author of the study. “These results should point us toward exploring new ways to improve automatic recognition systems by incorporating information about the body beyond the face.”
In a series of experiments, the researchers showed study participants pairs of images of either the same or different people and asked them to determine whether the photos showed the same person. The images were drawn from a database used in the NIST Face Recognition Vendor Test 2006, an international competition of face recognition systems conducted by NIST. The study team selected a subset of image pairs that automated face recognition systems had failed to match.
The images included a subject’s face and upper body. The study found that human identification accuracy was essentially at chance when participants viewed only the face. Study participants were presented with two additional cases: in the first, they compared the original images containing the face and upper body; in the second, they compared images of the upper body with the face masked. In both cases, human accuracy was the same and above chance. These results indicated that participants based their decisions primarily on the upper body.
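The paper reports human results rather than an algorithm, but one common way an automatic system could incorporate body information alongside the face, as the researchers suggest, is score-level fusion: computing separate similarity scores from face and body features and combining them with a weight. The sketch below is purely illustrative; the function names, toy feature vectors, and 50/50 weighting are assumptions, not the study's method.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def fused_match_score(face_a, face_b, body_a, body_b, w_face=0.5):
    """Hypothetical score-level fusion: weighted sum of the face
    similarity and the body similarity for an image pair."""
    s_face = cosine_similarity(face_a, face_b)
    s_body = cosine_similarity(body_a, body_b)
    return w_face * s_face + (1 - w_face) * s_body

# Toy vectors standing in for face/body embeddings of two images
# of the same person (second image = first plus small noise).
rng = np.random.default_rng(0)
face1 = rng.normal(size=16)
face2 = face1 + rng.normal(scale=0.1, size=16)
body1 = rng.normal(size=16)
body2 = body1 + rng.normal(scale=0.1, size=16)

score = fused_match_score(face1, face2, body1, body2, w_face=0.5)
```

In a real system the weight would be tuned on validation data, and the fused score compared against a decision threshold to declare a match or non-match.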
Participants also reported using facial cues to make identifications. Eye-movement tracking of the study participants, however, told a different story.
“Eye movements revealed a highly efficient and adaptive strategy for finding the most useful identity information in any given image of a person,” said the study’s lead author, Alice O’Toole of the University of Texas at Dallas.
According to NIST, the eye-movement results suggest that the participants were unaware of how important the body was in their decisions.