At this year’s Credit Scoring and Credit Control Conference in Edinburgh, Steve Finlay examined whether the industry’s reliance on economic response models (ERMs) remains the most effective way to forecast losses, and how affordability-focused methods could provide a stronger alternative.
ERMs have long been the standard for linking macroeconomic conditions to default rates. They remain widely used, but their shortcomings are increasingly evident. Often they depend on a narrow set of variables and produce results shaped more by historic assumptions than by current or future conditions.
Steve highlighted an alternative: a bottom-up, affordability-centred approach. This combines long-run macroeconomic data with account-level insights to build a more realistic view of how households respond to shocks.
Looking more closely at ERMs, several weaknesses stand out:
Narrow indicators. Reliance on GDP or unemployment risks missing other key drivers of behaviour.
Outdated assumptions. Relationships estimated on past data embed older economic patterns and can produce counter-intuitive outputs when conditions change.
Adoption by convention. ERMs remain standard largely through habit and familiarity.
Lack of granularity. Aggregate-level modelling overlooks the differences between households who are resilient and those more exposed.
So, if ERMs are the accepted approach, but not the most effective, how do we move towards something more robust? Here’s where affordability comes in.
An affordability-based framework links external shocks to individual households’ ability to manage their finances.
The approach works in two stages:
Use macroeconomic data to forecast changes in factors such as wages, pensions, housing and energy costs.
Apply those changes at customer level to identify how households are affected.
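The two stages above can be sketched in a few lines of code. This is a minimal illustration, not the model from the talk: the field names, scenario percentages, and the assumed cost weightings are all made up for the example.

```python
from dataclasses import dataclass

@dataclass
class Household:
    income: float           # monthly net income
    essential_costs: float  # housing, energy, food, etc.
    debt_service: float     # contractual monthly repayments

# Stage 1: a macro forecast expressed as percentage changes (illustrative values).
scenario = {"wage_growth": 0.02, "energy_cost_change": 0.30, "housing_cost_change": 0.05}

def affordability(h: Household, s: dict) -> float:
    """Stage 2: apply the macro scenario at account level and
    return the household's monthly surplus after debt service."""
    income = h.income * (1 + s["wage_growth"])
    # Assume (purely for illustration) energy is ~15% and housing ~40% of essential costs.
    costs = h.essential_costs * (1
                                 + 0.15 * s["energy_cost_change"]
                                 + 0.40 * s["housing_cost_change"])
    return income - costs - h.debt_service

h = Household(income=2500.0, essential_costs=1800.0, debt_service=400.0)
print(round(affordability(h, scenario), 2))
```

The surplus (or deficit) this produces is the quantity the rest of the approach builds on: it is recomputed per account, per scenario.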
To support this, two complementary data sources are essential: recent account-level transactional data, and many years of publicly available time-series data. Used together, they provide both the granularity and the historical perspective the approach requires.
The central hypothesis is that the customer’s ability to pay is the factor most sensitive to changes in the wider economy. By modelling how macroeconomic shifts feed through into affordability, it becomes possible to produce default forecasts that better reflect real household behaviour.
This approach recognises the uneven impact of economic change. Some households are resilient, while others are highly vulnerable — and capturing that difference is essential for accurate forecasting.
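One simple way to operationalise this hypothesis is to link affordability surplus to probability of default with a logistic curve, so that PD rises smoothly as headroom shrinks. This is an illustrative sketch, not the model presented in the session; the `midpoint` and `scale` parameters are invented, not calibrated values.

```python
import math

def pd_from_surplus(surplus: float, midpoint: float = 0.0, scale: float = 150.0) -> float:
    """Illustrative logistic link: PD rises as monthly surplus falls.
    midpoint/scale are hypothetical parameters for the sketch."""
    return 1.0 / (1.0 + math.exp((surplus - midpoint) / scale))

# A household whose surplus falls from +300 to -100 under a shock:
print(round(pd_from_surplus(300.0), 3))   # low PD while surplus is healthy
print(round(pd_from_surplus(-100.0), 3))  # materially higher PD in deficit
```

In practice the shape and parameters of such a link function would be estimated from default history, and could differ by product or customer segment.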
In his presentation, Steve also identified multiple areas where affordability-based modelling could be applied:
One of the first areas where this approach can add value is IFRS 9. A bottom-up view of customer finances can enrich forward-looking probability of default estimates, providing a more grounded basis for scenario design and outcome measurement.
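Under IFRS 9, expected credit loss is a probability-weighted average across scenarios; affordability-adjusted PDs would simply feed into each scenario leg. The sketch below shows the standard 12-month ECL arithmetic with illustrative numbers; the scenario weights, PDs, LGD and EAD are assumptions, not figures from the talk.

```python
# Probability-weighted 12-month ECL with scenario-specific PDs.
# The PDs are assumed to come from the affordability model,
# rising as household surplus shrinks under each scenario.
scenarios = [
    # (name, weight, PD under scenario) - illustrative values
    ("base",     0.50, 0.020),
    ("upside",   0.20, 0.012),
    ("downside", 0.30, 0.055),
]
LGD, EAD = 0.45, 10_000.0  # loss given default, exposure at default

weighted_pd = sum(w * pd for _, w, pd in scenarios)
ecl = weighted_pd * LGD * EAD
print(round(ecl, 2))
```

The affordability layer changes where the scenario PDs come from, not the ECL arithmetic itself, which is why it can slot into an existing IFRS 9 framework.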
By linking macroeconomic shocks to affordability, firms can build stress testing scenarios that reflect the real pressures households are likely to face. This can lead to more accurate assessments of portfolio resilience and sharper insight into capital requirements under adverse conditions.
The same modelling can also be applied at an individual level. It can help identify which customers are financially resilient and which are more exposed to changes in the economy. That understanding creates opportunities to intervene early — whether through tailored repayment plans, additional support for vulnerable groups, or proactive communication about upcoming pressures.
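Customer-level segmentation of this kind can be as simple as thresholding stressed affordability headroom. The thresholds, segment names and sample figures below are illustrative assumptions, not a calibrated policy.

```python
def segment(stressed_surplus: float) -> str:
    """Classify a customer by monthly surplus under a stressed scenario.
    Thresholds are hypothetical, not calibrated values."""
    if stressed_surplus > 200:
        return "resilient"
    if stressed_surplus > 0:
        return "watch"        # candidate for proactive communication
    return "vulnerable"       # candidate for tailored repayment support

# Stressed surpluses per customer (illustrative figures).
customers = {"A": 450.0, "B": 120.0, "C": -80.0}
actions = {cid: segment(s) for cid, s in customers.items()}
print(actions)
```

In a real deployment the cut-offs would be set from observed default rates by headroom band, and refreshed as the affordability model is recalibrated.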
Perhaps most importantly, this approach shifts the perspective from broad economic indicators to detailed household behaviour. It recognises that not all customers are affected equally, and it enables lenders to design responses that reflect those differences.
Deployment will require proof-of-concept work with lenders holding large retail portfolios. Portfolios such as credit cards, personal loans and mortgages are strong candidates, provided there is sufficient default history to validate outputs.
Each component of the model can be validated in the same way as existing risk models, with monitoring and recalibration built in. This ensures regulatory defensibility alongside practical insight.
Steve said he expects regulators to encourage more nuanced approaches to loss forecasting over the next three to five years. The direction of travel is already clear. Today, the focus is firmly on climate risk. Tomorrow, it may be conflict risk, or even the need to plan for another pandemic.
Affordability-based approaches provide a flexible framework to adapt to new risk categories. By linking macroeconomic shifts directly to household-level impact, they offer a path to more relevant and practical forecasting.
Loss forecasting will never be an exact science, but it can be made more reflective of the pressures customers face. That was the central point of Steve’s session: the opportunity to evolve methods so they better capture real household dynamics.
This content is provided by an external author without editing by Finextra. It expresses the views and opinions of the author.