What are deepfakes? Understanding modern scams


Editorial

This content has been selected, created and edited by the Finextra editorial team based upon its relevance and interest to our community.

While AI and generative AI have made leaps in streamlining financial services over the last decade, fraudsters have made similar strides in turning the technology to their advantage. Most nefariously, deepfakes are becoming increasingly hard to distinguish from reality. So what is deepfake technology? And how is it affecting financial services?

A portmanteau of ‘deep learning’ and ‘fake’, a deepfake is synthetic media produced by generative AI that mimics a person’s likeness or voice. In banking, deepfakes can enable sophisticated social engineering scams.

Research published in 2025 finds that deepfake fraud attempts across Europe have increased by a staggering 2,137% over the past three years. In the UK, Ofcom reported that two in five people say they have seen at least one deepfake in the last six months and, crucially, only one in ten are confident in their ability to spot one.

Deepfake threats in banking

There are multiple ways in which deepfake fraud affects financial services.

1. New account fraud

Fraudsters can use deepfakes to pose as a legitimate person and open new bank accounts. These accounts can then be used for malicious purposes, for example as money mule accounts. Synthetic identities are especially hard for banks to spot because new account fraud sits at the very beginning of the customer journey, before any transaction or behavioural history exists.

2. Account takeover

Fraudsters can use deepfakes to impersonate an existing account holder and gain access to their account. For example, a Wall Street Journal reporter used AI to create a synthetic version of herself to test whether it could get her into her own bank account. It did.

3. Phishing scams

Deepfakes make phishing scams even harder to spot. In romance scams in particular, deepfakes are used to create audio and video content that enhances the deception. In 2024, a woman in Scotland was scammed out of £17,000 in a romance scam that relied on deepfakes. Similarly, fraudsters can use AI-generated voices to impersonate a victim’s friends and family and trick them into sending money. Concerningly, a McAfee survey found that 70% of people weren’t confident they could tell the difference between a real voice and a cloned one.

4. C-suite impersonation

Deepfakes can also make CEO fraud far harder to spot. In 2024, a finance clerk in Hong Kong was tricked into joining a video call with what appeared to be several colleagues, including the company’s CFO. In reality, every other participant was a deepfake. After the call, he transferred a total of HK$200 million (about US$25.6 million) as agreed on the call, only to realise later that it had been a scam.

What is a bank’s role in preventing deepfake fraud?

Deloitte finds that “banks have been at the forefront of using innovative technologies to fight fraud for decades. However, a US Treasury report found ‘existing risk management frameworks may not be adequate to cover emerging AI technologies.’”

Advanced technology and AI are only one piece of the puzzle in tackling deepfake fraud. Another crucial aspect is cooperation and data sharing between organisations: no single organisation will be able to address fraud effectively by itself, because criminals are moving too fast.
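To make the technology piece of the puzzle more concrete, the short sketch below shows one way a bank might layer several independent signals rather than trusting a single biometric match. It is purely illustrative: the signal names, weights and thresholds are invented for this example and do not describe any specific bank's or vendor's system.

# Hypothetical illustration only: all signal names, weights and thresholds are invented.
from dataclasses import dataclass

@dataclass
class SessionSignals:
    voice_match: float        # 0..1 similarity to the enrolled voiceprint
    liveness: float           # 0..1 confidence the audio/video is live, not synthetic or replayed
    device_known: bool        # has this device been seen on the account before?
    behaviour_anomaly: float  # 0..1 deviation from the customer's usual behaviour

def deepfake_risk(s: SessionSignals) -> float:
    """Combine independent signals into a 0..1 risk score (higher = riskier)."""
    risk = 0.35 * (1.0 - s.voice_match)              # weak biometric match raises risk
    risk += 0.35 * (1.0 - s.liveness)                # low liveness confidence raises risk
    risk += 0.15 * (0.0 if s.device_known else 1.0)  # unfamiliar device raises risk
    risk += 0.15 * s.behaviour_anomaly               # unusual behaviour raises risk
    return risk

# A cloned voice can score highly on similarity yet still fail liveness and device checks.
cloned = SessionSignals(voice_match=0.92, liveness=0.20, device_known=False, behaviour_anomaly=0.7)
print(f"risk score: {deepfake_risk(cloned):.2f}")    # flagged despite the convincing voice match

The point of the sketch is the design choice rather than the numbers: because a convincing clone can defeat any single check, combining several uncorrelated signals makes the fraudster's job considerably harder.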

Lastly, and importantly, financial institutions need to increase education for both employees and customers. Former Finextra reporter Niamh Curran aptly writes that, over lunch with her father, she “explained to him that there were new abilities for scammers to clone my voice, call him up and have it sound like me asking for money. [...] He didn’t even know that was a possibility. More banks need to be educating their customers. There are warnings when making payments and boxes which pop-up, but there is scope for more to be done.”

The bottom line is that there will never be a silver-bullet solution to fraud. Financial institutions need to keep learning and adapting as quickly as fraudsters do, and to invest continuously in education and awareness as the threat of deepfake scams grows.
