Firms join the deepfake dance, offering responses to new AI threats

Prove and Pindrop are among those touting software to reduce widespread risk

A rash of deepfake attacks has brought a note of panic to biometrics and other industries, which are waking up to just how far down the AI rabbit hole fraudsters have already gone. From major corporate theft to voice scams targeting individuals, the prevalence and scope of the problem confirm the urgent need for technological safeguards that can keep pace with rapidly evolving criminal threats.

Digital ID firms are exploring different systems to thwart deepfakes and keep the line between reality and a generative AI-fueled dystopia as firm as possible. The market potential is huge.

Prove creates an identity chain from data points tied to phone number

For Prove, the solution is in your pocket – specifically, tied to your mobile device and phone number. Prove’s system comes at identity from a different angle than many digital ID firms, in that it doesn’t rely on face biometrics or other biological features. Instead, Prove taps the cryptographic exchange that happens between your device and the cell tower to verify the number. It establishes trust thresholds based on a variety of data points that constitute the so-called reputation of the device. And it checks the authenticated number against a variety of bank-grade data sources to confirm that the phone number is associated with the claimed identity.
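Prove has not published its scoring model, but the general pattern it describes – folding number and device signals into a score that is compared against a trust threshold – can be sketched as follows. This is a minimal illustration, not Prove’s implementation: every signal name, weight, and threshold here is invented.

```python
from dataclasses import dataclass

@dataclass
class PhoneSignals:
    carrier_verified: bool   # number confirmed via the carrier-level cryptographic check
    tenure_days: int         # how long the number has belonged to this subscriber
    recent_sim_swap: bool    # SIM swapped recently (classic account-takeover signal)
    recent_port: bool        # number recently ported between carriers

def trust_score(s: PhoneSignals) -> float:
    """Combine number/device signals into a 0-1 'reputation' score."""
    score = 0.0
    if s.carrier_verified:
        score += 0.5
    # Long tenure builds reputation, capped at one year of credit.
    score += min(s.tenure_days, 365) / 365 * 0.3
    # Risk events subtract from the score.
    if s.recent_sim_swap:
        score -= 0.4
    if s.recent_port:
        score -= 0.2
    return max(0.0, min(1.0, score))

def meets_threshold(s: PhoneSignals, threshold: float = 0.6) -> bool:
    return trust_score(s) >= threshold
```

In a real deployment the inputs would come from carrier data and the bank-grade identity sources mentioned above; the sketch only shows the aggregation pattern of turning many weak signals into one pass/fail trust decision.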

Tim Brown, global identity officer for Prove, says the resulting chain of proven identity is less susceptible to attacks than biometric systems that use selfie-to-document matching.

“If you’re not a company that is actually involved in the creation of credentials in some form or fashion and you’re basically relying on your own type of AI modeling to build your document libraries and models and look for variances, then it’s actually really scary that something like this is happening right now,” says Brown, pointing to recent news about video deepfakes used to commit major financial fraud, and to a 404 Media article about a website that is allegedly using neural networks to generate cheap images of fake IDs.

Brown is speaking from a FIDO Alliance member conference in Madrid, where Prove has just been named to the FIDO Alliance board. He notes the role that standards can play in building a strong foundation of security. “Certainly, standards like FIDO and passkeys have a strong role to play in this world,” says Brown, “making sure that you’re protecting your accounts from phishing. Getting rid of passwords altogether will go a long way to helping individuals at least know that they’re interacting with a legitimate website.”

Coming from Idemia, Brown says he initially had quiet reservations about a system like Prove’s, which constructs and verifies identity in a more abstract way than an ID document or a selfie. But he soon had to admit he couldn’t argue with the numbers: Prove’s success rate was demonstrating the effectiveness of its system for onboarding and other use cases.

“I think one of the cool things about Prove is that we’ve found a way of doing identity verification that takes advantage of some of the signals that people haven’t thought about before,” says Brown. “Or they think about them but not necessarily in the creative way we put them together to address the identity verification question.”

Pindrop hears what your mouth, tongue, tonsils are saying

For Pindrop, the secret lies in the unique qualities of the human voice. The Atlanta-headquartered firm aims its voice biometric engine at artifacts and traces which betray that speech has been generated by AI – in other words, that it is a deepfake.

Pindrop recently made headlines for determining which text-to-speech (TTS) algorithm was used to create fake audio of U.S. President Joe Biden that was circulating as robocalls, in which Biden discouraged voting. By accounting for degradation of the audio through transmission over multiple channels, as well as artifacts in Biden’s voice, Pindrop’s software concluded that ElevenLabs had been the original source of the fake audio.

Vijay Balasubramaniyan, CEO of Pindrop, says that “what we’re seeing is that each deepfake engine is inhuman in a very specific way.” In the case of Biden, the deepfake fricatives – sounds produced by forcing air through a narrow constriction in the vocal tract, as in F or S – were enough like electronic noise that they were flagged.

“Sometimes what happens is the machine, the codec, is saying: oh, that’s noise. So let me replace it with bits that represent a noise block, as opposed to the subtle sound in front of the noise. We do this because we’re human and we developed an overbite when 10,000 years of farming developed soft foods. But the systems are machines. They don’t know any of this stuff.”

Analyzing temporal changes in speech – the actual physical positioning of the speech organs – is an even better way to detect telltale artifacts. By running the Biden audio through its system, which includes a data set of about 20 million samples, Pindrop was able to match the “fakeprint” created by the combined anomalies against its record of TTS engines, and ultimately land on ElevenLabs.
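Pindrop’s detectors are proprietary, but the “fricative that sounds like electronic noise” cue can be illustrated with a toy spectral-flatness check: a genuine fricative is noisy yet still shaped by the vocal tract, while a frame whose spectrum is as flat as raw synthetic noise is suspect. This is only a sketch of the intuition, not Pindrop’s method; the frame length and threshold are arbitrary illustration values.

```python
import numpy as np

def spectral_flatness(frame: np.ndarray) -> float:
    """Geometric mean / arithmetic mean of the power spectrum.
    Close to 1.0 for flat, noise-like frames; close to 0.0 for tonal ones."""
    power = np.abs(np.fft.rfft(frame)) ** 2 + 1e-12  # floor avoids log(0)
    geometric_mean = np.exp(np.mean(np.log(power)))
    return float(geometric_mean / np.mean(power))

def flag_suspicious_fricative(frame: np.ndarray, threshold: float = 0.4) -> bool:
    # Flag frames whose spectrum is implausibly flat, i.e. more like a
    # codec's generic "noise block" than a vocal-tract-shaped fricative.
    return spectral_flatness(frame) > threshold
```

A production system would learn such anomalies jointly over millions of samples, as described above, rather than thresholding a single hand-picked feature.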

“The final cherry on top was ElevenLabs finding the account that created the audio and suspending it,” says Balasubramaniyan.

Despite their differing approaches, both Balasubramaniyan and Tim Brown agree that fraudsters are getting more and more sophisticated with deepfake tools, and that circles of trust must grow with the threat – and get stronger. “Which means you need sophisticated detection,” says Balasubramaniyan. “You can’t just say ‘oh, it’s a deepfake.’ You need to be able to understand where they come from, and what they are going to do next.”
