Voice deepfakes on the rise; biometrics can help
A new type of deepfake is spreading, one based on synthesized voice recordings. Voice cloning algorithms continue to improve, and threat actors are using them for fraud, identity theft and other illicit activities.
A recent Vice article showed that several members of the 4chan message board used ElevenLabs’ beta software to generate voices that sound like notables including Joe Rogan, Ben Shapiro and Emma Watson, making them appear to mouth racist or abusive remarks.
ElevenLabs provides “speech synthesis” and “voice cloning” services, with the stated aim of exploring new frontiers of voice AI and helping “creators and publishers seeking the ultimate tools for storytelling.”
Waiting in the wings is Microsoft’s VALL-E. According to TechCrunch, the model has advanced substantially in recent months and can now generate convincing voice deepfakes.
Need more? Enter My Own Voice, AI-powered “voice banking” software from Acapela Group, a Belgium-based voice technology company. Presented at CES 2023 and spotted by DigitalTrends, My Own Voice is designed to help people who are losing their ability to speak recreate their voices.
The software can reportedly create a convincing voice using only three minutes of recorded audio.
How to tackle voice deepfakes with biometrics
Anti-spoofing measures are also being developed, however.
According to voice recognition engineers at Pindrop, call centers can take steps to mitigate the harm of voice deepfakes.
Companies can educate workers about the danger.
Callback procedures can end suspicious calls and trigger an outbound call to the account owner’s registered number for direct confirmation.
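To make the callback idea concrete, here is a minimal Python sketch of how such a flow might be wired into a call center workflow. Every name in it (the Call record, end_call, place_outbound_call, the risk threshold) is a hypothetical assumption for illustration, not part of Pindrop’s or any vendor’s API.

```python
# Hypothetical callback-verification flow for a call center.
# All names and thresholds are illustrative assumptions, not a real vendor API.

from dataclasses import dataclass


@dataclass
class Call:
    caller_id: str
    claimed_account: str
    risk_score: float  # 0.0 (trusted) to 1.0 (likely spoofed)


def end_call(call: Call) -> None:
    print(f"Ended call from {call.caller_id}")


def place_outbound_call(number: str) -> None:
    print(f"Dialing registered number {number} for confirmation")


def handle_suspicious_call(call: Call, accounts: dict[str, str],
                           risk_threshold: float = 0.8) -> str:
    """End a high-risk call and confirm directly with the account owner."""
    if call.risk_score < risk_threshold:
        return "proceed"  # low risk: continue normal handling

    end_call(call)  # terminate the suspicious inbound call
    registered_number = accounts[call.claimed_account]
    place_outbound_call(registered_number)  # call the number on file
    return "callback_verification_started"


# Example usage: a high-risk call claiming to be account "acct-123"
accounts = {"acct-123": "+1-555-0100"}
print(handle_suspicious_call(Call("unknown", "acct-123", 0.93), accounts))
```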
Finally, multifactor authentication (MFA) and anti-fraud solutions can reduce deepfake risks. Pindrop points to factors such as call metadata analysis for identity verification, digital tone analysis and key-press analysis for behavioral biometrics.
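As a rough illustration of how such factors might be combined, the sketch below mixes a simple call-metadata check with a crude key-press timing feature into one risk score. The field names, weights and thresholds are assumptions made up for this example; they do not describe Pindrop’s algorithms.

```python
# Hypothetical multi-factor risk scoring for an inbound call.
# Field names, weights and thresholds are illustrative assumptions only.

from statistics import pstdev


def metadata_risk(carrier_matches_history: bool, country_matches_account: bool) -> float:
    """Simple call-metadata check: mismatches with the account history raise the risk."""
    risk = 0.0
    if not carrier_matches_history:
        risk += 0.4
    if not country_matches_account:
        risk += 0.4
    return min(risk, 1.0)


def keypress_risk(intervals_ms: list[float]) -> float:
    """Crude behavioral signal: unnaturally uniform key-press timing is suspicious."""
    if len(intervals_ms) < 3:
        return 0.5  # not enough data to judge
    variability = pstdev(intervals_ms)
    return 0.9 if variability < 10 else 0.1  # bots/replays tend to be very regular


def combined_risk(meta: float, keys: float,
                  w_meta: float = 0.6, w_keys: float = 0.4) -> float:
    """Weighted combination of the two factors into a single score."""
    return w_meta * meta + w_keys * keys


# Example: mismatched carrier, but human-like typing rhythm
score = combined_risk(metadata_risk(False, True),
                      keypress_risk([120, 95, 180, 140, 110]))
print(f"combined risk: {score:.2f}")
```

In a real deployment the score would feed the callback logic above, with thresholds tuned against labeled fraud data rather than fixed by hand.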
Even China is working on deepfake regulation. As reported by the New York Times, the country unveiled stringent rules requiring manipulated material to have the subject’s consent and bear digital signatures or watermarks.
Whether such regulations can effectively curb deepfakes remains to be seen, and rights advocates warn they could further curtail speech in China.