Blog article

BCBS 239: How to simplify your data architecture

The crux of BCBS 239 is to lay down standards for risk data aggregation and reporting. Data is at its heart, and the regulation is an opportunity to examine the cost and synergies of data and to improve decision making. Regulatory fatigue aside, the January 2016 deadline is tight. And while the principles are not yet mandatory for all banks, there are indications that they are regarded as best practice across the industry. Broadly speaking, there are four key areas: risk data aggregation, risk reporting, supervisory review and governance. In a four-part series of blogs we will review each in turn, starting with risk data aggregation.

BCBS 239 dictates that data should be aggregated on a largely automated basis. With so many data sources, legal entities and geographies, this means simplifying current architectures into a single hub. The aim is to enable banks to make much better use of all the data they gather. A central platform will also provide a comprehensive assessment of risk exposures at a global consolidated level.
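The idea of consolidating many source feeds into one aggregated view can be sketched in a few lines. This is a minimal illustration, not a real bank integration: the feed names, field names and figures below are all hypothetical.

```python
from collections import defaultdict

# Hypothetical exposure records from three source systems; in practice these
# would come from the bank's trading, lending and treasury platforms across
# legal entities and geographies.
trading_feed = [{"entity": "UK", "counterparty": "ACME", "exposure": 1_200_000.0}]
lending_feed = [{"entity": "US", "counterparty": "ACME", "exposure": 800_000.0}]
treasury_feed = [{"entity": "UK", "counterparty": "GLOBEX", "exposure": 500_000.0}]

def aggregate_exposures(*feeds):
    """Consolidate exposures from every source feed by counterparty."""
    totals = defaultdict(float)
    for feed in feeds:
        for record in feed:
            totals[record["counterparty"]] += record["exposure"]
    return dict(totals)

consolidated = aggregate_exposures(trading_feed, lending_feed, treasury_feed)
# Only the consolidated view reveals the full exposure to a counterparty:
print(consolidated)  # {'ACME': 2000000.0, 'GLOBEX': 500000.0}
```

The point of the central hub is exactly what the last line shows: no single source system sees the full exposure to ACME on its own.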

The quality of data is also under the spotlight as controls around data accuracy are tightened. Managing data quality is harder because data is a shared resource; empowering data creators and users enables them to take ownership of it. The same standards and rules need to be applied across the enterprise so that data does not have to be cleaned multiple times. The risk function should work from the same data as the back office, finance, operations and legal.
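A single shared rule set is one way to make "clean once, use everywhere" concrete: every consumer validates against the same rules rather than maintaining its own. The sketch below assumes illustrative rule and field names; it is not a BCBS 239 schema.

```python
# A minimal sketch of an enterprise-wide rule set. Risk, finance and
# operations all validate records against the same dictionary of rules,
# so data is cleaned once rather than once per department.
RULES = {
    "counterparty_present": lambda r: bool(r.get("counterparty")),
    "exposure_non_negative": lambda r: r.get("exposure", -1.0) >= 0,
}

def validate(record):
    """Return the names of rules the record fails; an empty list means clean."""
    return [name for name, check in RULES.items() if not check(record)]

clean = {"counterparty": "ACME", "exposure": 100.0}
dirty = {"counterparty": "", "exposure": -5.0}
print(validate(clean))  # []
print(validate(dirty))  # ['counterparty_present', 'exposure_non_negative']
```

Because the rules live in one place, tightening a control (say, adding a currency check) changes behaviour for every department at once.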

The system needs to be both flexible and scalable. This is the only way accurate risk data can be produced on an ad hoc basis, or during times of stress or crisis, for all critical risks, a key requirement of BCBS 239. There also needs to be sufficient depth and breadth of data to satisfy the reporting needs of different parts of the business. Different parts of the business need information presented to them in different ways, and the system must be able to handle that diversity.
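The flexibility requirement amounts to being able to roll the same underlying data up along whatever dimension a given audience asks for. A minimal sketch, assuming illustrative records and field names:

```python
from collections import defaultdict

# Illustrative records; the field names are assumptions, not a BCBS 239 schema.
records = [
    {"entity": "UK", "desk": "rates", "risk_type": "credit", "exposure": 10.0},
    {"entity": "US", "desk": "fx", "risk_type": "market", "exposure": 7.0},
    {"entity": "UK", "desk": "fx", "risk_type": "market", "exposure": 3.0},
]

def aggregate_by(records, dimension):
    """Ad hoc roll-up of exposure along any dimension the business asks for."""
    totals = defaultdict(float)
    for r in records:
        totals[r[dimension]] += r["exposure"]
    return dict(totals)

# The same data answers different questions for different audiences:
print(aggregate_by(records, "entity"))     # {'UK': 13.0, 'US': 7.0}
print(aggregate_by(records, "risk_type"))  # {'credit': 10.0, 'market': 10.0}
```

An ad hoc or stress-time request is then just another call with a different dimension, rather than a new report to build.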

While all of these capabilities may already exist in pockets throughout the bank, best practice now needs to be executed on an enterprise-wide basis. Sounds like a challenge? Perhaps, but an enterprise-wide risk data management platform is achievable.

Comments: (1)

A Finextra member, 12 July 2014, 06:43

This fractured data is the single largest risk for a reporting institution. In our experience as risk practitioners, we have seen mission-critical data on Excel sheets. The decision makers are taken unawares. Data aggregation is primary to accurate reporting. Big data is all fine; what I am saying is complete data.

From that perspective, BCBS 239 is timely and an aid in determining the LCR.