AI tools have spread rapidly across financial services. The days of purely manual workflows are long gone. A joint Bank of England and FCA survey in 2024 found that 85% of the financial services firms surveyed were already using or planning to use AI - and just one year on, that figure is almost certainly higher.
Yet widespread availability does not automatically translate into meaningful capability. AI systems may be everywhere, but the skills required to use them effectively and safely are lagging behind.
AI use cases now extend across fraud detection, transaction monitoring, onboarding, credit decisions, customer service, risk scoring and regulatory compliance. In many workflows, the first draft or first analysis is now produced by AI - delivering major efficiency gains to those who can harness it effectively.
Consequently, human roles are shifting away from manual execution and towards higher-value supervision and judgment. This transition requires new capabilities in oversight, interpretation and challenge.
Despite this rapid shift, a survey of financial services employees found that 55% have never received any official training on AI tools. Workforce capabilities are failing to keep pace with rapid AI deployment.
Across institutions, four capability gaps stand out:
The UK government’s latest workforce report highlights that, paradoxically, the very people tasked with overseeing AI deployment - compliance, risk and legal - often receive the least training. Yet these are the teams responsible for protecting customers and demonstrating accountability to regulators.
If employees can't take full advantage of AI technologies, it's not just individuals who lose out - companies and the wider economy fall short of their full potential. And without strong governance, ethical literacy or the ability to interrogate AI decisions, firms cannot effectively supervise these systems, challenge them when things go wrong, or explain their AI use to regulators.
The absence of these skills amplifies the risks of irresponsible AI adoption. Customer harm becomes more likely and vulnerabilities can emerge across the financial system - infrastructure that is critical for everyday life.
The EU AI Act makes AI literacy a legal obligation for in-scope organisations and Member States, requiring training to ensure that people understand how to use AI safely.
This is a positive step, and meeting this duty will require:
AI’s potential in financial services is transformative, but only if the workforce is equipped to use it and govern it effectively.
Today, the AI skills gap stands as both a major barrier to unlocking AI’s full potential and a significant risk that cannot be overlooked.