
NextGen Nordics 2024: Can behavioural biometrics fix fraud?

Moderating her second panel of the day, ‘Fraud prevention and AML: the need for behavioural biometrics in this instant age’, Finextra’s senior reporter Niamh Curran spoke to Megan Heald, senior project manager at NICE Actimize, John Sam-Kubam, senior vice president at Crown Agents Bank, Beju Shah, head of the Nordic Centre at the Bank for International Settlements, and Robert Woods, director of international market planning and financial services SME at LexisNexis® Risk Solutions.



Outlining the challenges banks and FIs face in the current fraud environment, Sam-Kubam highlighted that the move to faster payments has not necessarily increased the risk of fraud itself; rather, it has made recovering the funds harder. While modern authentication methods are increasingly capable of detecting and mitigating fraud, the real challenge lies in social engineering.

“Banks historically took the view that that’s a problem for the individual, right?” Sam-Kubam commented. “If you've been somehow coerced into making payments, it's your problem. Regulators take a different view. The view is that if you provide banking services, you have a duty of care to the people who are using your services. The area of risk today for most banks and fintechs is where individuals who are authorised to make payments and are properly authenticated have been compromised. And it's very difficult to detect that.”

Woods agreed, stating: “If you operate in a Faster Payments environment, you need to have strategies and tools to look at the mules and how you can manage the scammers. The UK is the scam capital of the world. There are loads of controls coming in: confirmation of payee, the contingent reimbursement model, SCA - all it's done is push fraud into scams.”

When asked which technologies can tackle this problem, Shah commented that there are many options - AI, ML, privacy technologies - but that the key might lie in collaboration models. He talked about the Bank for International Settlements’ Project Aurora, which explored new, data-driven ways of combating money laundering. Within the project, the team tested a combination of different AI and privacy technologies together, and then simulated them in siloed, nationally collaborative and globally collaborative environments. “We were able to show that we could detect three times more money laundering activities, but also reduce false positives by 80%, using these technologies, and there was a vast difference between a siloed approach and a collaborative approach.”

Opinions diverged on the effectiveness of industry collaboration. While the idea is widely hailed, in practice banks and FIs are sanctioned as individual entities, meaning that when it comes to managing financial risk, companies are more inclined to think about what they have to do as a business rather than as an industry.

This is where the conversation switched to biometrics. While the panellists agreed that behavioural biometrics are a force for good, there is still much that needs to be considered.

“Rather than call it behavioural biometrics, I’d rather call it behavioural analytics or behavioural intelligence,” Woods stated. “Because it's how you interact with your device, whether it be a laptop or a mobile, it's behaviour that's unique to you, but it's not like a fingerprint or a face. It's unique as to how you are engaging with that interface.”

And this information is crucial to security vendors. Heald gave examples of how biometric intelligence can help improve confidence in risk scoring: “It’s a force for good, and especially if we're talking about changes in behaviour. Usually, you open your phone with a few confident clicks and your task is done. But if you're being coached, if someone is on the phone telling you you need to make this payment, you're slower. And if that can be picked up by behavioural analytics, that helps us, which helps our customers.”
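To make this concrete, here is a minimal, purely illustrative sketch of how such timing signals might feed a risk score. It is not NICE Actimize's model; the field names, thresholds and weights are hypothetical, and the only idea taken from the panel is comparing a session's rhythm against the user's own baseline.

```python
# Illustrative only: a toy risk-score boost based on interaction timing.
# All field names and thresholds are hypothetical, not any vendor's model.
from dataclasses import dataclass


@dataclass
class SessionTiming:
    """Timing signals captured during a payment journey (hypothetical fields)."""
    median_gap_ms: float      # typical pause between keystrokes this session
    hesitation_count: int     # pauses longer than a few seconds
    baseline_gap_ms: float    # the user's historical median pause


def coaching_risk_boost(t: SessionTiming) -> float:
    """Return an additive risk boost in [0, 1].

    The panel's point: a coached user (someone on the phone telling them to
    make a payment) tends to act more slowly and pause more than they
    normally do, so the session is compared against the user's own baseline.
    """
    boost = 0.0
    if t.baseline_gap_ms > 0 and t.median_gap_ms > 2.0 * t.baseline_gap_ms:
        boost += 0.4  # far slower than this user's usual rhythm
    if t.hesitation_count >= 3:
        boost += 0.3  # repeated long pauses mid-journey
    return min(boost, 1.0)


session = SessionTiming(median_gap_ms=620, hesitation_count=4, baseline_gap_ms=240)
print(coaching_risk_boost(session))  # 0.7, fed into the wider risk score
```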

Heald additionally gave an example of call centre agents recognising fraudsters’ voices, yet, as Shah pointed out, even voice detection is becoming increasingly difficult now that voices can be easily spoofed. AI is reshaping pattern detection, and even the largest deepfake databases are struggling to keep up.

On this point, Curran brought up privacy and ethical considerations around behavioural biometrics.

Sam-Kubam stated that firms are protected and have a right to certain information, as long as compliance teams have solid policies in place and receive the right training; strict guidelines around the collection, retention and deletion of data are necessary to protect user privacy. Shah added the principles of data minimisation and collecting data on a strict purpose basis. Woods made the crucial point that, when collecting data, it is not about recording passwords but rather about the cadence between keystrokes and how a user interacts with the interface, while Heald emphasised the importance of having the flexibility to put in controls so that customers are not always challenged on hard rules.
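A minimal sketch of Woods’ cadence point, under the assumption that only keystroke timestamps are captured: the keys themselves (which could include passwords) are never stored, so the derived features also respect the data-minimisation principle Shah described. Function and field names are illustrative, not any vendor’s API.

```python
# Timing-only keystroke features: timestamps in, summary statistics out.
# No key values are ever recorded, so no password content can be retained.
from statistics import mean, pstdev
from typing import Iterable


def cadence_features(key_down_times_ms: Iterable[float]) -> dict:
    """Derive cadence features from keystroke timestamps (milliseconds)."""
    times = sorted(key_down_times_ms)
    gaps = [later - earlier for earlier, later in zip(times, times[1:])]
    if not gaps:
        return {"keystrokes": len(times), "mean_gap_ms": None, "gap_stddev_ms": None}
    return {
        "keystrokes": len(times),
        "mean_gap_ms": round(mean(gaps), 1),
        "gap_stddev_ms": round(pstdev(gaps), 1),
    }


# Example: timestamps from a payment form session; what was typed is unknown.
print(cadence_features([0, 180, 350, 900, 1040, 1230]))
```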

Lastly, on maintaining user experience, Sam-Kubam commented: “It’s interesting. I was at a conference talking about fraud risk last year, and one of the fintechs mentioned that the largest complaint they get is that an account has been blocked. The second largest complaint they get is that an account wasn’t blocked in time. The reality is, with the data we have at the moment, it’s very difficult to get it right. Hopefully AI will be able to address these issues in the future, but at the moment, it is a major challenge.”
