A win for spotting synthetic faces, but deepfake blue videos get easier to make

For everyone worried about detecting AI-generated faces, it is reassuring to know that generative adversarial networks still have a problem with biometric symmetry.
A team of U.S. researchers say they have written code capable of examining the pupils in digital images, one of the last reliable biometric tip-offs to fake face photos. GANs do not render pupils as smooth, symmetrical circles or ellipses.
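The researchers' exact algorithm is not reproduced here, but the underlying intuition, that real pupils trace smooth, near-circular outlines while GAN-generated pupils are irregular, can be sketched with a simple boundary-roundness score. Everything below (the function name, the score, the thresholds) is illustrative and assumed, not taken from the paper:

```python
import numpy as np

def pupil_irregularity(mask: np.ndarray) -> float:
    """Score how far a binary pupil mask deviates from a smooth circle.

    Illustrative heuristic only (not the researchers' method): collect the
    boundary pixels of the mask, measure their distance from the region's
    centroid, and return the coefficient of variation of those radii.
    A smooth circular pupil scores near 0; a jagged one scores higher.
    """
    m = mask.astype(bool)
    ys, xs = np.nonzero(m)
    cy, cx = ys.mean(), xs.mean()
    # Boundary pixels: inside the mask, with at least one off-mask 4-neighbour.
    p = np.pad(m, 1)
    interior = p[1:-1, :-2] & p[1:-1, 2:] & p[:-2, 1:-1] & p[2:, 1:-1]
    by, bx = np.nonzero(m & ~interior)
    radii = np.hypot(by - cy, bx - cx)
    return float(radii.std() / radii.mean())

# Synthetic demo: a clean circular pupil versus an angularly perturbed one.
yy, xx = np.mgrid[:101, :101]
r = np.hypot(yy - 50, xx - 50)
theta = np.arctan2(yy - 50, xx - 50)
clean = r <= 30
jagged = r <= 30 + 6 * np.sin(7 * theta)

print(pupil_irregularity(clean))   # close to 0
print(pupil_irregularity(jagged))  # noticeably larger
```

In practice a detector would first have to segment the pupil from a face photo, which is a harder problem than the scoring step sketched above; the paper's phrase about physiological constraints suggests the authors compare the pupil region against an explicit geometric model rather than a one-number heuristic.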
It is encouraging to hear of methods to reveal fakes, but countermeasures are inevitable, and programmers are making it startlingly easy to simulate reality in harmful ways.
For a time, many manufactured digital faces could be spotted by their ears. Earlier efforts produced ears that were unnaturally symmetrical, while real ears differ visibly from left to right.
As algorithms were trained to avoid that symmetry, they ended up creating odd, sometimes ridiculous ears, including pairs with mismatched jewelry.
These clues were visible in many of the thousands of convincing simulated faces published by The New York Times last November in a unique scrolling format showing how realistic fakes can be. Pupils are not clearly visible in every image, but where they are, the telltale differences can be discerned.
Five researchers from the State University of New York system and Keya Medical in Seattle found that the problem with pupils occurs because of a “lack of physiological constraints in the GAN models.”
The most effective makers of simulated digital facial photos are already addressing ear asymmetry.
As the paper was getting picked up by publishers, Technology Review was reporting on an online tool capable of splicing an uploaded photo into a previously recorded sex act with a single click.
A publication of the Massachusetts Institute of Technology, the magazine withheld all identifying information about the site hosting the app rather than give its owner new customers.
The editors were unable to contact anyone associated with the app's development, but they noted that after their outreach, a notice appeared on the site saying it was unavailable.
Technology Review said this is the first time someone has posted a one-click deepfake tool, something that would without doubt attract criminal elements, not to mention any random spurned lover or angry ex-spouse.
In this case, it appears that a little sunlight was antiseptic enough to make deepfake porn at least temporarily more difficult to create. Regrettably, not all bugs will run from light.