
Four Components of Risk Management and Data Excellence

In my previous blog, I addressed the requirements of the Basel Committee’s new directive on risk data aggregation and risk reporting. In many of the banks I meet, the risk function still tends to be seen as a compliance function rather than an active part of the portfolio management of the bank (this is certainly true in the developing countries of Asia).

Considering the investment in technology, and the ever more exhaustive requirements of regulators across the world, one could be tempted to look for synergies between the regulatory ‘have to’ and the initiatives that can actually make a difference to the running of the organization. In particular, many of these initiatives are driving a closer integration of the Risk and Finance functions.

For example:

  • Stress Testing regulations require the restatement of pro-forma income statements and balance sheets under stressed assumptions. What was once a pure risk exercise is now a Risk, Finance and Capital Planning initiative.
  • IFRS 9, which requires the calculation of impairment as well as macro and general hedge accounting.
  • Performance Measurement initiatives, such as the implementation of Risk-Adjusted Performance Measures (RAPM) like RAROC, which bring together the results of cost-of-funds calculations, risk and profitability (a simplified worked example follows this list).
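
As a simplified illustration, here is a minimal sketch in Python of one common RAROC variant. The formula choice, figures and function name are illustrative assumptions rather than any bank’s actual methodology.

    # Minimal sketch of one common simplified RAROC variant. All figures
    # are illustrative; in practice the funding cost would come from the
    # FTP engine and economic capital from the bank's risk models.
    def raroc(revenue, funding_cost, operating_cost,
              expected_loss, economic_capital,
              capital_benefit_rate=0.0):
        """(revenue - funding cost - operating cost - expected loss
            + return earned on economic capital) / economic capital"""
        risk_adjusted_return = (revenue - funding_cost - operating_cost
                                - expected_loss
                                + capital_benefit_rate * economic_capital)
        return risk_adjusted_return / economic_capital

    # Illustrative portfolio, amounts in millions: 12.0 revenue, 6.5 funding
    # cost, 2.0 operating cost, 1.5 expected loss, 20.0 economic capital
    # earning a 3% capital benefit.
    print(f"RAROC: {raroc(12.0, 6.5, 2.0, 1.5, 20.0, 0.03):.1%}")  # 13.0%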

There are several additional internal drivers that are likely to complicate any of these initiatives, specifically the requirement to produce both statutory reporting and management reporting in finance that satisfy internal as well as regulatory requirements.

Considering the investment required, let’s talk about the technology components needed to roll out initiatives that overlap Risk and Finance.

  1. A unified data model: the foundation of such a platform should be a single Financial Services data model and data management framework that is both complete and ready to use. The physical data model should cover all areas of a bank’s risk management needs, truly enabling the bank to establish a “single source of truth” for risk data. It would cover the critical risks identified by the Basel Committee, such as credit exposures (including derivatives), trading exposures and positions, liquidity risk and operational risk factors. It would allow institutions to capture data at the most granular level and would provide detailed definitions for a comprehensive set of banking book and trading book instruments. Combined with customisation options, this helps ensure compliance with the regulators’ adaptability requirements. The model would contain all the elements required for Risk and Finance applications: ALM, FTP, transfer pricing, Balance Sheet Planning, market and credit risk, loan loss forecasting and provisioning…
  2. A Data Quality & GL Reconciliation Framework: the unified platform would offer two major components to ensure the accuracy and integrity of Risk and Finance data. First, a data quality framework enables the bank to ensure the quality of the input data in a systematic manner by putting in place the robust controls requested by the regulators. Second, a dedicated GL reconciliation module ensures that risk data is “reconciled with bank’s sources, including accounting data...” (a minimal sketch of components 1 and 2 follows this list).
  3. An Analytical Workspace Area: the platform should not only provide a single source of truth for Risk and Finance data but also a single environment in which calculations are executed. This is usually not the case today; even where multiple engines perform the calculations, having them share a single view of the data model, with results ultimately brought back together, would make a lot of sense. It would also prevent the proliferation of data marts feeding calculation engines scattered across departments.
  4. A Results Area: having fully integrated its data model, both logical and physical, and its calculation layer, the unified platform should finally provide a single, unified results area and reporting technology, so that departmental or tool-specific reporting solutions are standardized.
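
To make components 1 and 2 more concrete, the sketch below shows, in Python, a hypothetical granular exposure record carrying a link back to the general ledger, a small data quality rule set, and a tolerance-based GL reconciliation check. The attribute names, rules and tolerance are assumptions for illustration only, not a description of any particular platform.

    from dataclasses import dataclass
    from typing import Callable, Dict, List

    @dataclass
    class Exposure:
        """A few of the granular attributes a unified data model might carry."""
        account_id: str
        product_type: str   # e.g. "term_loan", "irs", "deposit"
        gl_account: str     # the link back to the general ledger
        balance: float
        currency: str

    # Data quality framework: named, systematic controls applied per record.
    QUALITY_RULES: Dict[str, Callable[[Exposure], bool]] = {
        "known_currency": lambda e: e.currency in {"USD", "EUR", "GBP", "SGD"},
        "gl_account_present": lambda e: bool(e.gl_account),
        "product_type_present": lambda e: bool(e.product_type),
    }

    def run_quality_checks(exposures: List[Exposure]) -> Dict[str, List[str]]:
        """Return the failing account ids per rule, for exception reporting."""
        failures = {name: [] for name in QUALITY_RULES}
        for e in exposures:
            for name, rule in QUALITY_RULES.items():
                if not rule(e):
                    failures[name].append(e.account_id)
        return failures

    # GL reconciliation: aggregate the risk data and compare it, account by
    # account, with the ledger balances, reporting breaks above a tolerance.
    def reconcile_to_gl(exposures: List[Exposure],
                        gl_balances: Dict[str, float],
                        tolerance: float = 0.01) -> List[str]:
        totals: Dict[str, float] = {}
        for e in exposures:
            totals[e.gl_account] = totals.get(e.gl_account, 0.0) + e.balance
        return [acct for acct, ledger in gl_balances.items()
                if abs(totals.get(acct, 0.0) - ledger) > tolerance]

In a real platform the rule set would run to thousands of controls and the reconciliation would run per ledger period and per currency, but the pattern is the same: granular records, named controls and a tolerance-based break report.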

In the next blog, I will talk in greater detail about this common computational layer, which has truly been the neglected child in these transformational opportunities.


Comments: (1)

A Finextra member | 14 August, 2013, 10:36

John lays out a clear "nirvana" position. My view, based on 30 years' experience in precisely this area, is that the legacy nature of institutions' systems and the proliferation of systems as a result of acquisitions and mergers make this nigh on impossible to achieve. In fact, I would go as far as to say that I do not know of any financial institution of significant size and complexity globally that has been able to achieve this.

Some have tried and many have failed.  Basel II was seen as the ideal driver to make this happen, and now Basel III is being touted as the next opportunity to get it right.

The sensible institutions adopt a compromise approach - they realise that they can never create a truly unified data model unless it is at an aggregated level. They understand that there will always have to be a "datamart" structure below this - generally hubbed by geography or product group. And they realise that this only makes sense if reconciliation is managed at each level.

There are several high-profile transformational programmes underway in the City right now. All seek to solve the problems John highlights, and all focus on the needs of Treasury, Finance and Risk. All have found issues with data acquisition and data quality, and realise that these are the key to being able to deliver something that satisfies numbers 3 & 4 on John's list.

 
