Blog article

Basel IV: No time for banks to take foot off the pedal

Regulation of the financial markets is a never-ending process of modification, improvement and extension. Nowhere is this clearer than in the case of the Basel Accords.

It is now five years since the Basel Committee on Banking Supervision developed Basel III in response to the global financial crisis. In the intervening years, the Basel Committee has decided the requirements of Basel III leave room for improvement and has set about updating them. The modifications the committee is developing are so significant they have been dubbed (you guessed it) 'Basel IV'.

So what is changing? The answer can be broken down into four topics - operational risk, market risk, credit risk and large exposures. In each case, the Basel Committee wants to enhance the methods banks must use to calculate their risk.

So far the committee has:

  • published a new framework for calculating operational risk
  • initiated a "fundamental review of the trading book", which includes plans to make the standardized model for calculating market risk closer to the internal model
  • created a new standardized approach to counterparty credit risk (SA-CCR) and a new framework for the credit risk calculation itself
  • published a consultation paper on revising the way large exposures are calculated. 

The 'Basel IV' reforms are scheduled to come into force from 2017 onwards. This may seem like a long way off - and banks certainly have enough work to be getting on with in the meantime. However, the changes that will be introduced by 'Basel IV' are so fundamental they require close attention now.

The consultation papers that have been published make it clear that 'Basel IV' will have an impact not only on the way banks set up and run their risk calculations, but also on the data they use and the data they report.

The 'Basel IV' risk calculations are complex. For example, the plans to make the standardized model for market risk more similar to the internal model will increase – not reduce – the complexity of the calculations. As the calculations become more complicated, they will place greater demand on the performance of banks' compliance software.

In order to run the new calculations, banks will need new types of data. 'Basel IV' introduces new data definitions and requires more granular data and higher data volumes than Basel III. Banks need to consider whether their data management infrastructure offers the flexibility and scalability to accommodate these changes.

Finally, new calculations will mean new outputs. As a result, banks will need to ensure they can easily implement new reporting templates and quickly respond to regulatory demands for new information about their trading books, exposures and own funds.

When it comes to the Basel Accords, it is clear nobody can afford to rest on their laurels for long. 


Comments: (1)

Boyke Baboelal - Asset Control - Amsterdam, 09 February 2015, 09:26

It is vital that the increasing volume and complexity of data, its processing, and its connectivity to various sources and downstream users are managed effectively and efficiently. This ensures not only that service levels are met and compliance with internal and regulatory requirements is achieved, but also that costs are kept under control. The regulatory burden is already significant - not only from a capital and margins perspective, but also in respect of the resources required to produce quality risk reports.

The most effective and proactive organizations will be those that have placed a dynamic data management platform at the center of the complex multi-source, multi-system distribution process – taking inputs from vendor feeds and departmental sources, testing them for quality, enriching them, and routing them through the platform to downstream systems and users. The ability to handle large sets of reference and historical data will be key, as well as the ability to efficiently create, visualize, check and use more complex data structures.
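The multi-source, multi-system distribution process described above - taking inputs, testing them for quality, enriching them, and routing them to downstream systems - can be sketched as a minimal pipeline. All names and rules here (`Record`, `quality_check`, `enrich`, `route`, the price-band rule) are illustrative assumptions, not any vendor's actual API:

```python
from dataclasses import dataclass, field

@dataclass
class Record:
    source: str            # e.g. a vendor feed or a departmental system
    instrument: str        # instrument identifier
    price: float
    enriched: dict = field(default_factory=dict)

def quality_check(rec: Record) -> bool:
    """Reject records with obviously bad values (hypothetical rule)."""
    return rec.price > 0 and bool(rec.instrument)

def enrich(rec: Record) -> Record:
    """Attach a derived attribute before distribution (illustrative only)."""
    rec.enriched["price_band"] = "high" if rec.price >= 100 else "normal"
    return rec

def route(records, subscribers):
    """Validate and enrich each record, then fan it out to all
    downstream consumers; return the records that failed the check."""
    rejected = []
    for rec in records:
        if not quality_check(rec):
            rejected.append(rec)
            continue
        rec = enrich(rec)
        for deliver in subscribers:
            deliver(rec)
    return rejected

# Usage: two downstream "systems" each receive every clean record.
risk_system, reporting_system = [], []
feed = [
    Record("vendor_feed", "DE0001102345", 101.25),
    Record("desk_upload", "", -1.0),   # fails the quality check
]
rejected = route(feed, [risk_system.append, reporting_system.append])
```

In a production platform the quality and enrichment steps would be configurable rules rather than hard-coded functions, but the separation of validation, enrichment, and routing shown here is the core of the design.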