
Romance scams empty the bank account – and rip out the heart

Deepfake media has amplified fraud tied to dating apps, social media romances 

It’s almost Valentine’s Day. For the lucky ones, that means Cupid is afoot. But in the age of generative AI, many are liable to find Cupid shot dead with his own arrows and replaced by a deepfake version that is trying to sell them on a hot new cryptocurrency.

Romance scams are big business, as fake profiles proliferate. Mixed with a squeeze of social engineering, biometric deepfakes fashioned into potential dates are a potent recipe for fraud.

New research from global identity technology firm GBG shows that 61 percent of people who have used dating apps or websites in the UK have matched with a profile they later discovered (or strongly suspected) was a bot, scammer or catfish.

“I’m in the army,” say the scammers. “I’m living or working abroad.” Sometimes they will claim to be a celebrity; just ask Becky Holmes, the author of Keanu Reeves Is Not In Love With You, who tells Feedzai how she fended off messages from “handsome soldiers” and Keanu Reeves impersonators, some of which escalated into harassment. Stringing along victims for months at a time, these hijackers of the heart create elaborate reasons for needing more and more money. Then, they break it off – leaving victims not just poorer, but emotionally and psychologically scarred.

Writing in The Conversation, Tony Jan, professor of information technology and director of Artificial Intelligence Research and Optimization (AIRO) Centre at Torrens University Australia, says “romance scams are among the most emotionally damaging forms of cyber crime because they combine carefully manufactured intimacy with financial theft – the scammers go after your heart, and then your wallet.”

50 ways (or more) to cheat your lover

Deepfakes are making romance more complicated than ever. Forty-two percent of those on dating apps struggle to tell the difference between a real person and a fake profile – and the number rises to 52 percent among 18-24-year-olds.

In the words of Gus Tomlinson, chief technology and product officer at GBG, “the data suggests that identity verification is moving from an optional perk to a regulatory and public necessity. Singles wanting to make a connection want to do so with confidence that the person or people they engage with are genuine individuals.”

“As AI tools become more accessible and it becomes increasingly difficult to distinguish what is real from what is fake, it’s in the interest of platforms to create a baseline for online safety that reassures customers that individuals on the app are there for the same reason.”

There are plenty of companies willing to step up. GBG is one. Feedzai, FaceTec, Veriff and National Hunter have also developed solutions tailored to the market.

In a recent Biometric Update piece, Dave Rossi, managing director at National Hunter, says that every year in the UK, there are more than 9,400 reports of romance fraud. “This equates to a loss of more than £106 million ($144 million), or around £11,200 (about $15,265) per victim on average.”

According to a 2025 report from UK Finance, £20.5 million (about $28 million) was lost to romance scams in the first six months of 2025, with almost 3,000 cases reported. A report from TSB says social media platforms accounted for 58 percent of romance fraud cases, and dating sites 42 percent. Facebook had the highest number of cases of any platform, accounting for 30 percent. Demographics-wise, 65-74-year-olds accounted for the most romance fraud cases at 23 percent.

Scripted schemes superpowered by generative AI

The numbers are grim. Even so, they do not adequately convey how damaging romance scams can be. According to Tony Jan, “romance scams rely on a small number of psychological levers, applied repeatedly. Finding their victims online through various platforms, romance scammers accelerate intimacy, often expressing strong feelings unusually early. Then, they isolate their target. Often, the entire romance scam quite literally follows a script.”

Bringing the conversation off the app leads to the financial request, which could be an outright ask or some form of investment scheme. But it also gives the impression of greater intimacy – which digs the hooks in deeper, and makes them all the more painful when they’re ripped out.

Jan says deepfakes and synthetic media have accelerated the problem and introduced the need for a higher level of caution. Similar alarm about technological sophistication underpins a piece published by the University of New South Wales, Sydney, which says “romance scammers are no longer working with stolen photographs and clumsy scripts. They’re deploying artificial intelligence that operates around the clock, generating deepfake videos that smile on command while chatbots maintain dozens of personalized conversations simultaneously – each one calibrated to exploit the specific vulnerabilities of lonely hearts.”

The piece quotes Dr. Lesley Land, a senior lecturer in the School of Information Systems and Technology Management at UNSW Business School. “Deepfakes enable romance fraud by easing the ability of perpetrators to generate fraudulent profiles quickly. Additionally, agentic AI is revolutionising romance fraud by enabling scammers to move beyond simple phishing to creating highly sophisticated, automated, and emotionally manipulative, long-term deceptions.”

Shot through the heart, by the blockchain

To keep safe and avoid getting hosed, Tony Jan lists a few tips: go slow, stay on the platform, don’t share intimate pictures with unverified contacts, and watch out for “bright red flags.”

“If someone you have never met in person begins steering you toward cryptocurrency, trading platforms or guaranteed returns, disengage.”

The Canadian Anti-Fraud Centre also offers advice on its web page, and a guide to variations on the theme. “The CAFC is noticing an increase in a combination of romance and investment frauds which are often referred to as ‘pig butchering.’ These often involve pumping money into a fraudulent cryptocurrency.”

Also a problem is “approval phishing.” Per the document, “fraudsters deceive you into granting access to your cryptocurrency accounts by those posing as trusted services. After they coach you on how to acquire cryptocurrency like Ethereum or Tron, they send you a fake request that appears to come from the crypto service, asking you to ‘approve’ access to your wallet. By clicking ‘approve,’ you unknowingly give control of your funds to the fraudster.”
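To see why a single “approve” click is so dangerous, it helps to look at what the victim actually signs. In token standards like ERC-20, an approve(spender, amount) transaction grants a third party the right to move your tokens later, without further confirmation. The sketch below (a minimal, illustrative Python construction of the approval calldata, with a hypothetical attacker address; the 4-byte selector is a fixed constant of the ERC-20 standard) shows how one signed message can hand over an effectively unlimited allowance:

```python
# Minimal sketch of the calldata behind an ERC-20 "approve" transaction.
# The selector 0x095ea7b3 is the first 4 bytes of keccak256("approve(address,uint256)"),
# fixed by the ERC-20 standard. The spender address below is purely illustrative.

APPROVE_SELECTOR = "095ea7b3"   # approve(address,uint256)
MAX_UINT256 = 2**256 - 1        # the "unlimited" allowance scammers often request

def encode_approve(spender_address: str, amount: int) -> str:
    """Return hex calldata granting `spender_address` the right to spend `amount` tokens."""
    spender = spender_address.lower().removeprefix("0x")
    # ABI encoding: each argument is left-padded with zeros to 32 bytes (64 hex chars)
    return APPROVE_SELECTOR + spender.rjust(64, "0") + format(amount, "064x")

# A hypothetical attacker-controlled address (illustration only)
calldata = encode_approve("0x" + "ab" * 20, MAX_UINT256)
print(calldata[:8])   # the approve() selector the wallet is asked to sign
print(len(calldata))  # 8 + 64 + 64 hex characters
```

Once such a transaction is signed under the guise of “verifying” or “unlocking” a wallet, the fraudster holds a standing allowance they can drain at any later time – which is why the advice to revoke unfamiliar token approvals appears so often in anti-fraud guidance.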

For Dr. Land, extensive education across channels is key. “Mitigation of all scams requires awareness and education at personal, societal and governmental levels.”

Biometrics finding love on major dating platforms

Big platforms are getting the message. In recent months, Match Group, which owns many of the largest online dating apps, has strengthened its requirements for biometric verification, following a damaging investigation into how the company allegedly turns a blind eye to abusive users.

In June 2025, Match Group made it mandatory for new Tinder users in California to provide face biometrics through a video selfie when setting up an account, with biometrics provided by FaceTec. Its Hinge platform now requires new and existing users in the UK and Australia to undergo facial age estimation (FAE) to comply with the UK’s Online Safety Act and Australia’s Social Media Minimum Age Act, and biometric liveness detection from FaceTec for all users around the world.

FaceTec has also hooked up with Grindr to provide age assurance. And popular dating app Bumble has deployed optional biometric ID verification, provided by Veriff.

Online dating is now not just a dating game: it’s a mind game, as fraudsters work to exploit weaknesses and leverage the power of generative AI. In an age of know-your-customer, know-your-business and know-your-agent, it has become equally important to know-your-match, in order to avoid the sting of Fake Cupid’s cruel arrow.
