$35M stolen in voice cloning fraud as deepfake tech availability increases
Forbes has revealed a previously unreported 2020 incident in which fraudsters used voice cloning technology to trick a banker in the UAE into transferring $35 million into accounts they controlled. The criminals used the cloned voice to impersonate a company director who supposedly needed the money transferred for an acquisition. Tools and techniques for carrying out such attacks are being shared on the dark web, suggesting an increasing threat, says a biometrics firm.
The fraudsters appear to have mounted an elaborate attack involving emails, but it was the voice deepfake that may have ultimately convinced the banker to make the transfer. It is the second known attack of its kind, and for a far greater sum of money, after $243,000 was defrauded in the UK in March 2019. Details of the case only emerged after Forbes discovered court documents generated when the UAE sought help from American investigators to trace $400,000 of the stolen funds sent to U.S. bank accounts held at Centennial Bank.
“The absence of face-to-face contact under lockdown has made it easier than ever for fraudsters to get past standard identity checks. We’re now seeing an uptick in deepfake tech and service offerings across the dark web, where users are sharing illicit software, best practices and how-to guides,” Stephen Ritter, CTO at Mitek and an expert on the technologies behind and challenges posed by deepfakes, told Biometric Update.
“All of this demonstrates a concerted effort across the cybercrime sphere to sharpen deepfake tools, which in turn points towards the first signs of a new wave of impending fraud.”
Experts have been aware of this increasing activity around voice cloning and deepfakes for some time, but an attack on the scale of the UAE heist came earlier than many predicted. Tools to combat such fraud are also under constant development by firms such as Mitek.
“Voice offers a powerful and convenient form of biometrics that will have a critical role to play in improving anti-fraud defenses. Where one form of biometrics, such as facial or thumbprint readers, presents a solid defense against would-be hackers, two offers a lot more protection, which leads to lower fraud rates,” said Ritter, whose company recently acquired voice and face biometrics tech provider ID R&D to bolster its capabilities.
The ease with which staff biometrics can be collected can also strengthen a firm’s defenses, Ritter explains: “In our experience, the combination of both voice and face biometrics makes the verification process almost impenetrable by fraudsters, offering four layers of protection – liveness and recognition of both face and voice. Not only that, but voice biometrics can be collected from our devices easily and passively – meaning it isn’t hard to develop a robust database of employee or consumer voice biometrics to better prevent fraud.”
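The layered approach Ritter describes can be illustrated with a minimal sketch: hypothetical liveness and match scores for both face and voice, each of which must clear a threshold before a high-risk action such as a wire transfer is approved. The names, score ranges, thresholds and the simple all-must-pass rule below are assumptions for illustration only, not Mitek’s or ID R&D’s actual implementation.

```python
# Minimal sketch of the "four layers" idea: liveness and recognition checks
# for both face and voice, combined into one verification decision.
# All scores, thresholds and names are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class BiometricResult:
    liveness_score: float  # 0.0-1.0, confidence the sample comes from a live person
    match_score: float     # 0.0-1.0, similarity to the enrolled template


def verify(face: BiometricResult, voice: BiometricResult,
           liveness_threshold: float = 0.9, match_threshold: float = 0.8) -> bool:
    """Approve only if every layer passes: face liveness, face match,
    voice liveness and voice match."""
    checks = [
        face.liveness_score >= liveness_threshold,
        face.match_score >= match_threshold,
        voice.liveness_score >= liveness_threshold,
        voice.match_score >= match_threshold,
    ]
    return all(checks)


if __name__ == "__main__":
    # A cloned voice may match the enrolled speaker well yet fail liveness.
    face = BiometricResult(liveness_score=0.97, match_score=0.93)
    spoofed_voice = BiometricResult(liveness_score=0.42, match_score=0.95)
    print(verify(face, spoofed_voice))  # False: the voice liveness layer blocks approval
```

The point of the sketch is that a convincing voice clone may score well on recognition while still failing the liveness layer, which is why combining modalities and checks is harder to defeat than any single factor.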
Article Topics
biometric liveness detection | biometrics | deepfakes | fraud prevention | identity verification | Mitek | spoof detection | synthetic voice | voice biometrics | voiceprints