Voice clone detection software from ID R&D tackles potent new AI-assisted threat
Biometrics developer ID R&D has outfitted its IDLive voice liveness detection software with the ability to detect voice cloning and audio deepfakes. A press release says the new feature protects against identity fraud and unlawful impersonation by using AI to process recorded speech and categorize it as genuine or cloned.
“Just as deepfakes have made it harder to distinguish between fact and fiction in the digital world, voice clones make it hard to believe what we hear said by people we think we know and recognize,” says Alexey Khitrov, CEO of ID R&D and general manager for its parent company, Mitek. “In a world where deepfake impersonations are proliferating so rapidly, voice clone detection plays an essential role in preserving trust between people and technology, securing the voice interface from fraud.”
Delivered as an SDK, the IDLive Voice clone detection product works on mobile devices, either standalone or in concert with ID R&D’s suite of voice biometrics and liveness tools. No special recording equipment or hardware is required, and the API needs only three seconds of speech to assign a clone detection score.
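To make the workflow concrete, here is a minimal sketch of what a three-second clone-detection call could look like. The function names, score scale, and decision threshold are all assumptions for illustration, not ID R&D's actual API; `score_clip` is a placeholder for the vendor's scorer.

```python
# Hypothetical sketch of a clone-detection flow; names and threshold are
# assumptions, not ID R&D's real SDK interface.

MIN_SECONDS = 3.0      # minimum speech the product reportedly needs
CLONE_THRESHOLD = 0.5  # assumed decision boundary on a 0..1 score

def score_clip(samples: list[float], sample_rate: int) -> float:
    """Placeholder for the SDK's clone-detection scorer.

    Returns a value in [0, 1]; higher means more likely cloned.
    A dummy constant keeps the sketch runnable; a real model
    would analyze the audio.
    """
    return 0.12

def detect_clone(samples: list[float], sample_rate: int) -> str:
    """Reject clips shorter than three seconds, then classify."""
    duration = len(samples) / sample_rate
    if duration < MIN_SECONDS:
        raise ValueError(
            f"need at least {MIN_SECONDS}s of speech, got {duration:.1f}s"
        )
    score = score_clip(samples, sample_rate)
    return "cloned" if score >= CLONE_THRESHOLD else "genuine"

# Four seconds of audio at a 16 kHz sample rate clears the minimum.
print(detect_clone([0.0] * (16_000 * 4), 16_000))  # → genuine
```

The design point the article highlights is the short input requirement: because only a few seconds of speech are scored, a check like this can run inline during a call or voice-assistant interaction rather than as a separate enrollment step.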
Voice clone detection is yet another instance of using AI to fight AI. Generative AI models have enhanced the capability for fraudsters to create real-sounding voice clones that can be trained on short snippets of recorded speech to carry out presentation attacks capable of defeating voice biometric systems. Fraud risks include account takeover, identity theft, misinformation, extortion, defamation, and appropriation of identity.
In an article published by Politico in November 2023, following U.S. President Joe Biden’s Executive Order establishing standards for AI safety and security, Bruce Reed, the White House official spearheading the administration’s AI strategy, named voice cloning as the AI-related worry that costs him the most sleep.
“That technology is still new, but it’s frighteningly good,” says Reed. “It hasn’t dawned on society yet how much the notion of perfect voice fakes could upend our lives. No one will answer the phone if they can’t be sure whether the voice on the other side is real or fake.”