iProov shows how face-swapping app can fool liveness detection on financial apps


iProov has demonstrated that a readily available generative AI face-swapping tool could be used to evade biometric liveness detection software and perform injection attacks, endangering the remote identity verification and Know Your Customer (KYC) processes used by financial, banking and cryptocurrency mobile apps. The “critical, high-risk” vulnerability could potentially expose users across the world, according to the biometric identity verification company.

Injection attack scenarios are playing out in the real world, with iProov’s Identity Verification Threat Report released in February showing a 300 percent increase in face swap attacks since 2023 on the face biometrics systems used for KYC.

A case study on the attack scenario was published by the firm’s in-house Red Team in the Mitre ATLAS (Adversarial Threat Landscape for Artificial-Intelligence Systems), a knowledge base of adversary tactics and techniques against AI-enabled systems maintained by U.S. not-for-profit organization Mitre Corporation.

“We’ve seen an explosion in attack vectors relating to identity verification over the last 12 months, largely driven by advances in generative AI and the wide availability of low-cost tools,” says iProov Chief Scientific Officer Andrew Newell. “The publication of this latest Mitre ATLAS case study is part of the vital process of identifying and documenting such methodologies.”

UK-based iProov tested a desktop app named Faceswap that uses generative AI to swap faces in a video in real time. The team obtained user identity information and high-definition facial images from online sources and used the software to create live deepfake videos.

During the identity verification stage on a financial services application, the team streamed the deepfake video using Open Broadcaster Software (OBS) and an Android app called Virtual Camera: Live Assist, which allows users to replace the device’s default camera feed with a video stream. The team successfully evaded the liveness detection system.

iProov’s conclusion is that active liveness detection solutions, which rely on analyzing image artifacts and user movements, can be defeated by sophisticated AI-generated deepfakes that replicate those cues.

“Substituting a mobile device’s camera with a virtual camera application allows attackers to bypass device-level security controls,” says the company.

To avoid such scenarios, organizations should seek vendors that have been tested against the recent European standard CEN 18099, which provides testing protocols against injection attacks, iProov adds. iProov’s injection attack detection (IAD) software was confirmed compliant with CEN 18099 “High” in a Level 2 evaluation performed by Ingenium and announced in November.

The firm recently deployed its own liveness detection solution in Thirdfort’s client due diligence platform to help prevent identity fraud in legal and property transactions and in Hypr’s Affirm platform to combat workforce identity fraud.
