The AI skills gap: adoption in finance is outpacing human capability

AI tools have spread rapidly across financial services. Long gone are the days of purely manual workflows. A joint Bank of England and FCA survey in 2024 found that 85% of financial services firms surveyed were already using or planning to use AI - and just one year on, this figure is almost certainly higher.

Source: Bank of England and FCA, Artificial Intelligence in UK financial services (2024)

 

Yet widespread availability does not automatically translate into meaningful capability. AI systems may be everywhere, but the skills required to use them effectively and safely are lagging behind.

AI is reshaping how work is done

AI use cases now extend across fraud detection, transaction monitoring, onboarding, credit decisions, customer service, risk scoring and regulatory compliance. In many workflows, the first draft or first analysis is now produced by AI - delivering major efficiency gains to those who can harness it effectively. 

Consequently, human roles are shifting away from manual execution and towards higher-value supervision and judgment. This transition requires new capabilities in oversight, interpretation and challenge.

The skills gap

Despite this rapid shift, a survey of financial services employees found that 55% have never received any official training on AI tools. Workforce capability is simply not keeping pace with the speed of deployment.

Across institutions, four capability gaps stand out:

  1. Practical use of AI systems: Training provision has not kept pace with demand, meaning many employees simply don’t know how to use AI tools confidently or safely. The Financial Services Skills Commission reports a 35% gap between demand and availability of AI skills. In practice, this means staff may not know how to write effective prompts, validate outputs, or adjust inputs to reduce hallucinations. 
  2. Governance: Many lack the skills to oversee, manage, and document AI systems in a way that meets regulatory expectations. The UK government’s AI Skills for the Workforce report identifies governance as a major gap, with many employees unsure how to assess whether an AI model is high-risk or low-risk, or how to demonstrate that an AI system aligns with internal policies.
  3. Ethics: A recent systematic review of AI ethics in banking highlights that fairness, bias mitigation, explainability, and customer-impact considerations remain underdeveloped. Many staff lack frameworks to identify ethical risks or understand when a human-in-the-loop review is required.
  4. Critical interpretation: Teams often struggle to interpret AI outputs and communicate them clearly. One cross-sector study found that 58% of workers have relied on an AI output at work without checking its accuracy. This leads to over-reliance, limited challenge when systems behave unexpectedly, and difficulty explaining automated decisions to customers or regulators; a simple sketch of what such a check might look like follows this list.
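
To make the first and fourth gaps concrete, the sketch below shows one way a basic "check before you rely on it" step could work. It is purely illustrative: it assumes a hypothetical workflow in which an AI tool returns a JSON draft with a self-reported confidence score, and the field names, threshold and checks are assumptions rather than a prescribed control.

    # Purely illustrative sketch: a lightweight gate that checks an AI-generated
    # draft before anyone relies on it. It assumes a hypothetical workflow in
    # which the tool returns JSON with a self-reported confidence score; the
    # field names, threshold and checks are assumptions for illustration only.
    import json

    REQUIRED_FIELDS = {"summary", "figures_cited", "confidence"}
    CONFIDENCE_FLOOR = 0.75  # assumed threshold below which a human must review

    def validate_ai_output(raw_output: str, source_figures: dict) -> dict:
        """Parse an AI draft and list the reasons (if any) it needs human review."""
        reasons = []
        try:
            parsed = json.loads(raw_output)
        except json.JSONDecodeError:
            return {"parsed": None, "needs_human_review": True,
                    "reasons": ["output was not valid JSON"]}

        # Structural check: did the model return everything that was asked for?
        missing = REQUIRED_FIELDS - parsed.keys()
        if missing:
            reasons.append(f"missing fields: {sorted(missing)}")

        # Grounding check: do the figures the model quotes match the source data?
        for name, value in parsed.get("figures_cited", {}).items():
            if name not in source_figures:
                reasons.append(f"figure '{name}' not found in source data")
            elif source_figures[name] != value:
                reasons.append(f"figure '{name}' ({value}) does not match "
                               f"source ({source_figures[name]})")

        # Confidence check: route low-confidence drafts to a human reviewer.
        if parsed.get("confidence", 0.0) < CONFIDENCE_FLOOR:
            reasons.append("model confidence below review threshold")

        return {"parsed": parsed,
                "needs_human_review": bool(reasons),
                "reasons": reasons}

    # Example: a draft that quotes a figure which does not match the source data
    draft = ('{"summary": "Flagged 3 transactions", '
             '"figures_cited": {"flagged": 4}, "confidence": 0.9}')
    result = validate_ai_output(draft, source_figures={"flagged": 3})
    print(result["needs_human_review"], result["reasons"])

In a real deployment, the specific checks, thresholds and escalation routes would come from the firm's own model-risk and governance policies rather than being hard-coded as they are here.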

 

These gaps are most acute in compliance and legal teams 

The UK government’s latest workforce report highlights that, paradoxically, the very people tasked with overseeing AI deployment - compliance, risk and legal - often receive the least training. Yet these are the teams responsible for protecting customers and demonstrating accountability to regulators.

Source: Skills England / UK Government, AI skills for the UK workforce (2025)

 

If employees can't take full advantage of AI technologies, it's not just individuals who lose out - companies and the wider economy fall short of their full potential. Nor is the cost only missed opportunity: without strong governance, ethical literacy, or the ability to interrogate AI decisions, firms cannot effectively supervise these systems, challenge them when things go wrong, or explain their AI use to regulators.

The absence of these skills amplifies the risks of irresponsible AI adoption. Customer harm becomes more likely and vulnerabilities can emerge across the financial system - infrastructure that is critical for everyday life.

Closing the gap

The EU AI Act makes AI literacy a legal obligation for in-scope organisations, requiring them to ensure that the people who use and oversee AI systems understand how to do so safely.

Source: The EU AI Act, Article 4 (AI literacy)

 

This is a positive step, and meeting this duty will require:

  • Practical training on when and how to use AI tools day-to-day;
  • Governance and ethics training for compliance, risk and legal teams;
  • Senior-level capability in oversight, accountability and responsible deployment;
  • Cross-functional training rather than siloed expertise;
  • Learning designed around real workflows, not theoretical deep dives.

AI’s potential in financial services is transformative, but only if the workforce is equipped to use it and govern it effectively.

Today, the AI skills gap stands as both a major barrier to unlocking AI’s full potential and a significant risk that cannot be overlooked.
