Free face swap tool goes to number one on GitHub
If the chorus of voices noting the danger of freely available biometric deepfake tools seems alarmist, witness the newest entry: Deep-Live-Cam is face swapping software that has gone viral on social media for its ability to perform real-time digital video mapping of one face over another using just a single source photo. But ID R&D is among those developing defenses.
Tool layers 3D faces of Clooney, Vance, Zuck over user’s
A report from Ars Technica says the software is available for free on GitHub, where it briefly went to number one on the platform’s trending repositories list. It “wraps together several existing software packages under a new interface” in order to detect faces in both the source and target images, such as a frame of live video.
The swap technology runs on an AI model called “inswapper,” developed by a somewhat mysterious startup called InsightFace, based in Hong Kong and incorporated in April 2024. (A Reddit forum from late 2023 includes allegations that InsightFace has run copyright striking campaigns on YouTube videos showcasing tools from competitors such as Roop and Reactor; meanwhile, Ars Technica refers to Deep-Live-Cam as “an earlier fork” of Roop.)
Another AI model in the mix, GFPGAN, refines video quality in Deep-Live-Cam by enhancing details and removing artifacts.
Deep-Live-Cam’s AI works because it was trained on a dataset of millions of facial images of thousands of individuals making different facial expressions under different lighting conditions, captured from various angles. This allows the neural network to develop an algorithmic “understanding” of facial structures and their dynamics under different conditions, and to infer a 3D model from a 2D image.
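The loop described above — detect faces in the source photo and each target frame, then transfer the source identity with “inswapper” — can be sketched with InsightFace’s Python package. This is an illustrative outline based on InsightFace’s published examples, not Deep-Live-Cam’s actual code; the model names and parameters here are assumptions.

```python
def build_pipeline():
    """Load a face detector and the 'inswapper' model.

    Note: model weights are downloaded on first use, so this is a sketch
    of the wiring, not something to run offline.
    """
    import insightface
    from insightface.app import FaceAnalysis

    detector = FaceAnalysis(name="buffalo_l")      # bundled detection models
    detector.prepare(ctx_id=0, det_size=(640, 640))
    swapper = insightface.model_zoo.get_model("inswapper_128.onnx")
    return detector, swapper


def swap_frame(detector, swapper, source_img, frame):
    """Map the single source face onto every face found in one video frame."""
    source_faces = detector.get(source_img)
    if not source_faces:
        return frame                               # no identity to transfer
    src = source_faces[0]
    for face in detector.get(frame):               # each detected target face
        frame = swapper.get(frame, face, src, paste_back=True)
    return frame
```

In Deep-Live-Cam this runs once per webcam frame, with GFPGAN applied afterward as an optional enhancement pass to sharpen details and remove blending artifacts.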
Coverage includes a video clip in which the faces of U.S. vice presidential candidate J.D. Vance, Hollywood stars Hugh Grant and George Clooney, and Facebook exec Mark Zuckerberg are mapped over the face of a social media user. The clip demonstrates how Deep-Live-Cam matches the pose, lighting and expressions of the superimposed face with the real face underneath. There are some problems with neck size, but the results are remarkably convincing.
This week, a new report from Veriff found that face swaps are increasingly common, evolving rapidly, and often used to create pornographic deepfakes.
ID R&D to the rescue with face swap detection tool
On LinkedIn, ID R&D VP of Growth Peter Martis throws down the gauntlet for the company’s deepfake detection software. “While most startups invest heavily in aggressive marketing campaigns to pitch their ‘we’ll help you fight deepfakes’ narrative, we at ID R&D create functional prototypes before making revolutionary claims,” he says.
Calling the newly available first version of ID R&D’s deepfake detection engine “shockingly accurate,” Martis says “telepresence” is the company’s next big thing, citing critical use cases in employment onboarding, telemedicine, mobile legal services and remote meetings (bringing to mind, once again, the poor Arup employee in Hong Kong who transferred $25 million to false bank accounts because a deepfake of his boss asked him to).
ID R&D Chief Scientist Konstantin Simonchik took DeepFaceLive on a run through the company’s IDLiveFace Plus tool, as demonstrated in his own LinkedIn post. The tool required two days of training on fully realistic face masks, but was ultimately successful in detecting deepfakes from both DeepFaceLive and an even simpler face swap app, SwapFace.
Webinar on threat of AI injection attacks underlines need for liveness
The team also demonstrated the capabilities of face swapping for fraud, and how their detection system works, in a recent webinar from Biometric Update, which is available on demand. In the video, ID R&D President Alexey Khitrov touches on something that Simonchik also mentions in his post: the importance of detecting injection attacks in a live video feed as a complement to biometric presentation attack detection.
He notes that combinations of face swapping, face morphing and voice cloning can create deepfakes that nail details like the movement of a person’s hair. “From the biometric perspective,” Khitrov says, “it’s important to understand that without this additional layer of protection for the injection tech detection, the biometrics in and of themselves are very exposed. From a purely security perspective, liveness and injection tech is the key element of securing biometric transactions and biometric identity verification.”
Article Topics
biometric liveness detection | biometrics | deepfake detection | deepfakes | face biometrics | face swap | ID R&D | injection attacks | InsightFace