
Starling warns of rise in voice cloning scams

Voice cloning scams – where fraudsters use AI technology to replicate the voice of a friend or family member – could be set to catch millions out, according to new research from Starling Bank.


Editorial

This content has been selected, created and edited by the Finextra editorial team based upon its relevance and interest to our community.

The study found that over a quarter (28%) of UK adults say they have been targeted by an AI voice cloning scam at least once in the past year.

Starling says fraudsters can now use voice cloning technology to replicate a person’s voice from as little as three seconds of audio, which can easily be captured from a video someone has uploaded online or to social media.

Scam artists can then identify that person’s family members and use the cloned voice to stage a phone call, voice message or voicemail to them, asking for money that is needed urgently. In the survey, nearly 1 in 10 respondents say they would send whatever was asked for in this situation, even if they thought the call seemed strange.

Despite the prevalence of this attempted fraud tactic, just 30% say they would confidently know what to look out for if they were being targeted with a voice cloning scam.

To help combat the fraudsters, Starling Bank has launched the Safe Phrases campaign, in support of the government’s Stop! Think Fraud campaign, encouraging the public to agree a ‘Safe Phrase’ with their close friends and family that no one else knows, to allow them to verify that they are really speaking to them.

Lisa Grahame, chief information security officer at Starling Bank, comments: “People regularly post content online which has recordings of their voice, without ever imagining it’s making them more vulnerable to fraudsters. Simply having a Safe Phrase in place with trusted friends and family - which you never share digitally - is a quick and easy way to ensure you can verify who is on the other end of the phone.”

To launch the campaign, Starling Bank has recruited leading actor James Nesbitt to have his own voice cloned by AI technology, demonstrating just how easy it is for anyone to be scammed.

Commenting on the campaign, Nesbitt says: “I think I have a pretty distinctive voice, and it’s core to my career. So to hear it cloned so accurately was a shock. You hear a lot about AI, but this experience has really opened my eyes to how advanced the technology has become, and how easy it is to be used for criminal activity if it falls into the wrong hands. I have children myself, and the thought of them being scammed in this way is really scary. I’ll definitely be setting up a Safe Phrase with my own family and friends.”

Discover new challenges and opportunities artificial intelligence brings to the banking sector at Finextra's first NextGenAI conference on November 26 2024.


Comments: (1)

John Davies, CTO at Incept5

I think many of the banks have their heads in the sand. Barclays, for example: "If you find it difficult to remember password information, we can confirm your identity using voice recognition technology when you call us, so you don’t have to go through security questions". With today's voice-cloning technology it's pretty easy to get past this step backward in security. The voice can be cloned from a short recording, a family video for example, and voilà: you know the family security questions, you have the voice, and you're in.
Frankly, the only way we're really going to protect against most of this is multi-factor authentication and digital signing, things we've already used and are now trying to replace with easier options. Easier means less secure and more vulnerable; security is NOT easy.
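The point about replacing voice recognition with something cryptographic can be illustrated with a minimal challenge-response sketch. This is not Starling's or any bank's implementation — all function names here are hypothetical, and it assumes a key was shared with the customer's device in advance. The verifier issues a fresh random challenge; only a party holding the pre-shared key can compute the correct answer, so a cloned voice alone gets an attacker nothing:

```python
import hashlib
import hmac
import secrets

def issue_challenge() -> bytes:
    """Fresh random challenge, so a recorded answer can't be replayed."""
    return secrets.token_bytes(32)

def respond(key: bytes, challenge: bytes) -> bytes:
    """Caller's device answers with an HMAC over the challenge."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def verify(key: bytes, challenge: bytes, response: bytes) -> bool:
    """Constant-time check of the caller's answer against the expected HMAC."""
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

# A legitimate caller with the shared key passes; anyone else fails,
# no matter how convincing their voice is.
key = secrets.token_bytes(32)
challenge = issue_challenge()
assert verify(key, challenge, respond(key, challenge))
assert not verify(key, challenge, respond(secrets.token_bytes(32), challenge))
```

An agreed "Safe Phrase" is the human-friendly version of the same idea: a pre-shared secret that never travels over the channel the attacker controls.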
