Five behaviours that indicate a social engineering scam

Níamh Curran

Senior Reporter, Finextra

Social engineering and authorised push payment (APP) scams have been on the rise. For banks this creates a litany of problems, including eroded customer trust, difficulty navigating repayment schemes, and the constantly changing nature of these attacks.

The number of people falling victim to these scams rose during the pandemic, with UK Finance finding that APP scam cases increased by 22% in 2020.

“Social engineering scams are probably the biggest threat we’re facing,” says Martin Salter, senior fraud manager at Nationwide. “Ten years ago, the problem was account takeover. Somebody rings up, pretends to be you, and tries to get into your account. We can train our staff not to do that because we've got access to them all the time. We can put prompts up on the screen. But the customer I can't train.”

Some banks are pushing for customer education, but this hasn’t solved the problem, as scammers convince victims to dismiss the very warning pop-ups banks have designed.

Financial services and social engineering scams

There are three main types of social engineering scams: information harvesting, Remote Access Tool (RAT) scams, and real-time payment scams. Information harvesting is one of the oldest forms of social engineering scams, but is still very common, often taking the form of a phishing or vishing attack.

RAT scams can use impersonation schemes to encourage a victim to download software that allows a cybercriminal to take over their device and initiate a payment.

Real-time payment or APP scams often involve impersonation of a representative at a trusted organisation such as a bank or a government agency. Scams of this nature often take advantage of people at vulnerable moments. As noted in the journal Frontiers in Psychology, “Social engineering cyberattacks are a kind of psychological attack that exploit weaknesses in human cognitive functions.”

Information harvesting often plays a role in real-time payment and APP scams, as the more a criminal knows, the easier it is to coerce a victim. The problem facing banks is that the victims themselves are often the ones who authorise the payment. Because the payment is authorised by the victim, banks are under no legal obligation to repay customers, as this is not technically fraud.

Ayelet Biger-Levin, senior VP of market strategy at BioCatch, comments: “I think that financial institutions are trying their best to do what’s right by their customer. But social engineering scams create a grey area. The issue is in many of these cases, the legitimate customer is performing the transaction under the guidance of a cybercriminal or unwittingly providing their credentials to give the cybercriminal access to their account. So it’s a really tough call for banks to make.”

Are banks tackling these scams effectively?

To aid their customers, many banks have signed up to the Contingent Reimbursement Model (CRM). However, the CRM has proven imperfect, as Biger-Levin illustrates: “While there has been a push by consumer advocates to mandate it, the industry is pushing back on refunding in every case as there are cases of people falsely claiming to have been scammed and failures in not refunding people who had legitimately been scammed. The challenge is distinguishing between false and legitimate claims.”

Banks lament that, while they are managing these scams as best they can, their existing systems are not equipped to deal with them. Salter comments: “I've never been better equipped to stop fraud. I've got tools that tell me whether it's your device, whether it’s you, even whether you're pressing buttons like you normally press buttons. The problem is, if someone can dupe you into making the transaction, you'll pass every test that I've got because it's you. All my tests are designed to find whether it's you or not.”

However, a study conducted by BioCatch suggests that certain behaviours may help banks stop a scam in progress. The report identifies five behaviours that can be used to detect this kind of scam while it is underway: even if authentication shows that the correct person is making the payment, certain actions and patterns may reveal that a customer is being guided or coerced.

Biger-Levin comments: “There are very clear behaviour patterns associated with genuine and fraudulent activity within an online session. When a customer operates in an online account under the guidance of a cybercriminal, behavioural signals such as duress and distraction are presented. BioCatch has studied this at length and identified several behaviour patterns that are indicative of criminal activity.”

While no single pattern on its own necessarily implies that a scam is in progress, when combined with hundreds of other data points and compared against the norms of the genuine user population, these insights can be used to build risk models that accurately detect advanced social engineering.

1. Typing Patterns

One might think that the way a person types reveals little; in fact, typing patterns can provide a great deal of insight. For example, fast typing indicates to a bank that a user is engaged, while typing errors may imply that a customer is frustrated.

Typing patterns can even indicate whether a customer is drawing on long-term or short-term memory when inputting information. Importantly for social engineering scams, they can also indicate whether a user is receiving instructions from a cybercriminal.

Segmented typing is the main indicator that a user is taking dictation. Normally, a consumer has all of their details to hand; if they are talking to a scammer, however, they may be receiving instructions piece by piece, such as the number of the account to which they are being told to transfer money. The patterns seen when a user is being directed by a scammer can then be compared to the account holder's usual typing rhythm when making a payment.

According to research conducted by BioCatch, segmented typing patterns are present in one out of every 20 impersonation scams, compared with just one in every 500 genuine sessions.
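
To make the idea concrete, here is a minimal sketch, in Python, of how “segmented” typing might be flagged from keystroke timestamps. This is not BioCatch's model; the function name and the threshold values are illustrative assumptions.

```python
from typing import List

def is_segmented_typing(keystroke_times: List[float],
                        pause_threshold: float = 2.0,
                        min_bursts: int = 3) -> bool:
    """Flag typing as 'segmented' when a field was filled in several short
    bursts separated by long pauses, which may suggest the user is taking
    dictation rather than entering details they already have to hand.

    keystroke_times: timestamps (seconds) of each keypress in one field.
    pause_threshold and min_bursts are illustrative values, not tuned ones.
    """
    if len(keystroke_times) < 2:
        return False
    # Count gaps between consecutive keystrokes that exceed the pause threshold.
    long_pauses = sum(
        1 for a, b in zip(keystroke_times, keystroke_times[1:])
        if b - a >= pause_threshold
    )
    # N long pauses split the input into N + 1 bursts.
    return long_pauses + 1 >= min_bursts


# Example: an account number typed in three bursts with ~4-second pauses.
timestamps = [0.0, 0.2, 0.4, 4.5, 4.7, 4.9, 9.0, 9.2, 9.4]
print(is_segmented_typing(timestamps))  # True
```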

2. Mouse Doodling

A user’s mouse movements can indicate a great deal about their mental state. An engaged user makes fast, direct and smooth mouse movements; a confused user may repeatedly revisit and click the same location; and a hesitant user makes much smaller mouse strokes.

Excessive mouse doodling is a major indicator that a social engineering scam is in progress. On average, confirmed impersonation scams show six doodles per session. In ordinary banking sessions, only 1% of the population exhibits six or more doodles; in cases of fraud, this figure rises to 38%.

Biger-Levin notes, “This behaviour is logical given the long waits, pauses and dead time caused by a cybercriminal explaining or dictating instructions to a victim.”
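
As a rough sketch of how “doodling” might be approximated from raw cursor data, one option is to count strokes where the pointer travels a long path but ends up roughly where it started. The thresholds below are illustrative assumptions, not calibrated values.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]

def count_doodles(strokes: List[List[Point]],
                  min_path_length: float = 400.0,
                  max_net_ratio: float = 0.2) -> int:
    """Rough proxy for 'mouse doodling': count strokes (continuous mouse
    movements between pauses or clicks) where the cursor covers a long
    path but finishes close to where it started."""
    doodles = 0
    for stroke in strokes:
        if len(stroke) < 2:
            continue
        path = sum(math.dist(a, b) for a, b in zip(stroke, stroke[1:]))
        net = math.dist(stroke[0], stroke[-1])
        if path >= min_path_length and net / path <= max_net_ratio:
            doodles += 1
    return doodles


# A back-and-forth stroke with little net movement counts as one doodle.
wiggle = [(float(x), 0.0) for x in list(range(0, 300, 10)) + list(range(300, 0, -10))]
print(count_doodles([wiggle]))  # 1
```

A session showing six or more such doodles would stand out sharply against the roughly 1% of genuine sessions cited above.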

3. Session Length

An ongoing social engineering scam can significantly lengthen a user’s session: 10% of sessions involving an impersonation scam last longer than 30 minutes, compared with only 1% of genuine sessions.

The percentage increases for social engineering scams that involve a Remote Access Tool (RAT) taking over the victim’s computer: where a RAT is detected, 12% of sessions last more than 30 minutes, which likely reflects the time it takes the victim to download the software.

4. Payment Context

There are numerous markers throughout the customer payment journey which might indicate a social engineering scam – from the navigation flow to the time it takes to initiate the payment.

Many banks are already on the lookout for unusual payees or references. Banks can see the account’s previous activity, whether the payee matches, the amount being paid, and the IP address of the customer’s device. Salter comments that banks are “looking at values, we're looking at payees, we're looking at the type of activity that you do in the account.”

However, the timing of how this information is entered should also raise red flags for banks. For example, the majority of genuine users initiate the ‘Add Payee’ process within five minutes of a session starting, indicating a conscious decision to make a payment, and the action is completed almost immediately. In contrast, 42% of sessions involving impersonation scams take over 30 minutes to complete the ‘Add Payee’ process.
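
As a simple illustration, the timing of the ‘Add Payee’ step could be turned into a binary signal using the 30-minute figure above. The function name and the use of raw timestamps are assumptions made for the sake of the example.

```python
def slow_add_payee(session_start_ts: float,
                   add_payee_done_ts: float,
                   threshold_minutes: float = 30.0) -> bool:
    """Flag sessions where completing 'Add Payee' takes unusually long.
    Most genuine users start adding a payee within about five minutes of a
    session beginning and finish almost immediately, whereas 42% of
    impersonation-scam sessions take over 30 minutes.
    Timestamps are in seconds (e.g. epoch time)."""
    elapsed_minutes = (add_payee_done_ts - session_start_ts) / 60.0
    return elapsed_minutes > threshold_minutes


print(slow_add_payee(0.0, 40 * 60))  # True: 'Add Payee' completed after 40 minutes
```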

5. Active Call

A user being on an active call while navigating a session in their mobile banking app is a major indicator that a social engineering scam is underway. In more than a quarter of impersonation scams, the victim was on an active phone call during their mobile banking session. This is a major warning signal to banks, as it occurs in less than 1% of genuine banking sessions.

Beating scammers at their own game

Presently, social engineering scams are a top concern for some of the biggest banks in the world.

“We're all very worried about social engineering. It's very hard to stop because it is your customer, and we're in the business of allowing our customers to do transactions,” states Salter.

“We've got lots of tools that are very good at preventing unauthorised transactions, but when it's an authorised transaction, we've got to try and outfox the criminal. But when we've got limited information, we can't always rely on what the customer tells us, particularly given the possibility that they've been told to tell us something that's not true.”

With the additional ability to track the patterns described above, banks may be able to stop scams while they are in progress. On its own, no single pattern necessarily signifies a scam; observed together, however, multiple indicators can help banks put the pieces of a scam together. Biger-Levin concludes: “Overall, an aggregation of hundreds of such indicators will provide a strong indication to drive distinction between genuine and coerced transactions.”
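
To make the aggregation idea concrete, here is a deliberately simplified sketch that combines the five indicators discussed above, including the session-length and active-call signals, into a single score. The weights and cut-offs are illustrative assumptions; a production system would weigh hundreds of signals against the norms of the genuine user population rather than a handful of hand-picked values.

```python
from dataclasses import dataclass

@dataclass
class SessionSignals:
    segmented_typing: bool    # 1. dictation-style typing bursts
    doodle_count: int         # 2. aimless mouse movements in the session
    session_minutes: float    # 3. total session duration
    add_payee_minutes: float  # 4. time from session start to completing 'Add Payee'
    on_active_call: bool      # 5. phone call active during a mobile session

def scam_risk_score(s: SessionSignals) -> float:
    """Toy risk score over the five behavioural indicators from the article.
    Weights and cut-offs are illustrative only."""
    score = 0.0
    if s.segmented_typing:
        score += 0.25
    if s.doodle_count >= 6:       # six or more doodles is rare in genuine sessions
        score += 0.2
    if s.session_minutes > 30:    # long sessions are far more common in scams
        score += 0.15
    if s.add_payee_minutes > 30:  # slow 'Add Payee' completion
        score += 0.2
    if s.on_active_call:          # active call during a mobile session
        score += 0.2
    return score


# Example: a session exhibiting all five indicators at once.
print(round(scam_risk_score(SessionSignals(True, 7, 42.0, 35.0, True)), 2))  # 1.0
```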

Download your copy of the Finextra & BioCatch report: Stemming the tide of Social Engineering Scams with Behavioural Insights