Deepfakes raise concerns ahead of 2024 US elections
Deepfake technology has emerged as a pressing concern in the lead-up to the 2024 U.S. elections, with the potential to disrupt not only the political landscape but also the financial sector. Recent research from Regula highlights the rise in deepfake-related fraud, estimating that it costs companies in the financial industry an average of $600,000 each.
Additionally, 33 percent of U.S. survey respondents believe the media is most vulnerable to deepfakes, expressing concerns that falsified news reports and interviews could mislead the public.
Deepfakes pose challenges for election integrity and trust in digital communications: Regula’s survey indicates that 28 percent of American and 34 percent of German respondents believe deepfakes could influence election outcomes. As social media use increases, so does the likelihood of manipulated content being deployed in misinformation campaigns, particularly in the lead-up to elections.
The financial sector is not immune to the threats posed by deepfake technology. According to Regula’s findings, the average financial institution incurs substantial losses due to fraudulent activities involving deepfake media. The survey’s results suggest that as deepfake technology becomes more sophisticated and accessible, companies must bolster their defenses against potential threats.
“The significant gap between confidence in detecting deepfakes and the reality of financial losses, particularly in Financial Services, shows that many organizations are underprepared for the sophistication of these attacks,” says Ihar Kliashchou, chief technology officer at Regula.
“As the threat evolves, it’s crucial for companies to switch to a liveness-centric approach, which we adhere to at Regula. This approach is focused on dealing with physical objects only — both faces and documents, as well as their dynamic parameters — in real time, which can significantly decrease the chances of falling victim to a deepfake attack. Additionally, it’s advisable to use multiple layers of identity verification and choose highly reliable technologies, like secure server-side reprocessing of all document and biometric checks.”
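The layered approach Kliashchou describes can be pictured as a pipeline in which a document check, a liveness check, and a face match must all pass on the server before an identity is accepted. The Python sketch below is a hypothetical illustration of that idea only; the check functions, scores, and threshold are assumptions for demonstration and do not represent Regula’s actual products or API.

```python
from dataclasses import dataclass

# All names below are hypothetical stand-ins, not a vendor API.

@dataclass
class CheckResult:
    name: str
    passed: bool
    score: float  # confidence in [0.0, 1.0]

def check_document(document_image: bytes) -> CheckResult:
    # Placeholder: a real check would inspect security features,
    # MRZ consistency, and signs of screen or print re-capture.
    return CheckResult("document", passed=True, score=0.97)

def check_liveness(selfie_frames: list) -> CheckResult:
    # Placeholder: a real check would analyze dynamic cues
    # (motion, depth, texture) across live video frames.
    return CheckResult("liveness", passed=True, score=0.95)

def check_face_match(document_image: bytes, selfie_frames: list) -> CheckResult:
    # Placeholder: a real check would compare the document portrait
    # with the live selfie using a face-recognition model.
    return CheckResult("face_match", passed=True, score=0.93)

def verify_identity(document_image: bytes, selfie_frames: list,
                    threshold: float = 0.9) -> bool:
    """Server-side decision: every layer must pass and clear the threshold."""
    for result in (check_document(document_image),
                   check_liveness(selfie_frames),
                   check_face_match(document_image, selfie_frames)):
        if not result.passed or result.score < threshold:
            print(f"Rejected at layer: {result.name} (score {result.score:.2f})")
            return False
    return True

if __name__ == "__main__":
    # Dummy inputs; a real deployment would receive uploaded media.
    print(verify_identity(b"document-bytes", [b"frame-1", b"frame-2"]))
```

The point of the sketch is the decision logic: no single check is trusted on its own, and everything is evaluated server-side rather than on the user’s device.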
Au10tix sees increase in social media attacks
A separate report by Au10tix underscores the growing incidence of identity fraud, with a noted spike in social media attacks leading up to the U.S. presidential election. The Q3 2024 Global Identity Fraud report indicates a trend of identity theft tactics evolving in response to the increasing digitalization of identity verification processes. As election campaigns ramp up, fraudsters are leveraging these vulnerabilities, posing risks not only to individuals but also to organizations and their reputations.
Experts warn that the implications of deepfake technology extend beyond mere financial losses. The erosion of trust in legitimate media sources could lead to widespread disillusionment among voters, undermining democratic processes.
“Fraudsters are evolving faster than ever, leveraging AI to scale and execute their attacks, especially in the social media and payments sectors,” says Dan Yerushalmi, CEO of Au10tix.
“While companies are using AI to bolster security, criminals are weaponizing the same technology to create synthetic selfies and fake documents, making detection almost impossible. The only way to detect this type of fraud is by analyzing behavior at the traffic level, as Au10tix does with our Serial Fraud Monitor.
“We are committed to continually advancing our detection methods to protect customers against this rapid evolution of fraud tactics, using a combination of advanced AI, biometric verification, and deepfake detection.”
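Traffic-level analysis of the kind Yerushalmi mentions generally looks for repetition across many sessions that would not occur with genuine users, for example the same face template or document signature reappearing in hundreds of supposedly independent onboarding attempts. The sketch below is a hypothetical illustration of that idea and does not reflect how Au10tix’s Serial Fraud Monitor is implemented; the template hash and repeat threshold are assumptions.

```python
from collections import defaultdict

def flag_serial_fraud(sessions, min_repeats=5):
    """Flag identity templates reused across many onboarding sessions.

    sessions: iterable of (session_id, template_hash) pairs, where
    template_hash stands in for a perceptual hash or embedding of the
    submitted selfie/document. Identical hashes across unrelated
    sessions suggest a reused synthetic identity.
    """
    seen = defaultdict(list)
    for session_id, template_hash in sessions:
        seen[template_hash].append(session_id)

    # Keep only templates that recur suspiciously often.
    return {h: ids for h, ids in seen.items() if len(ids) >= min_repeats}

if __name__ == "__main__":
    # Dummy traffic: one template reused across six sessions.
    traffic = [(f"s{i}", "hash_A") for i in range(6)] + [("s100", "hash_B")]
    print(flag_serial_fraud(traffic))
    # -> {'hash_A': ['s0', 's1', 's2', 's3', 's4', 's5']}
```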
To combat these emerging threats, Regula and Au10tix advocate implementing advanced identity verification technologies that leverage AI and machine learning to better detect manipulated content.