Internal data models could face greater scrutiny as regulators become more familiar with trade reporting regimes and with managing the growing volume of data now held inside trade repositories.
Regulators face a difficult challenge in quickly establishing how they will analyse and interpret this information, and in deciding how they intend to use it. There is a distinct possibility that once regulators begin to make sense of the data given to them, they will see things they don't like and discover that the data models inside many banks are actually quite poor.
The data models inside investment banks have evolved in a siloed way, with different businesses applying different standards. Data is held all over the place and is sometimes duplicated, which partly explains why trade reporting has proved so costly for banks. The necessary response is a massive overhaul.
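This kind of siloed duplication can be made concrete with a small sketch. The records, field names and values below are all hypothetical, not drawn from any real system; the point is that the same trade, captured twice under different local schemas, only reveals itself as a duplicate once both records are mapped onto a shared canonical model:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Trade:
    """A minimal canonical trade record (illustrative fields only)."""
    counterparty: str
    notional: float
    currency: str
    trade_date: str  # ISO 8601

def normalise(record: dict, field_map: dict) -> Trade:
    """Map a silo's local field names and formats onto the shared model."""
    return Trade(
        counterparty=record[field_map["counterparty"]].strip().upper(),
        notional=float(record[field_map["notional"]]),
        currency=record[field_map["currency"]].upper(),
        trade_date=record[field_map["trade_date"]],
    )

# Two desks record the same trade under different local schemas.
rates_desk = {"cpty": "acme corp ", "amt": "1000000",
              "ccy": "usd", "dt": "2024-03-01"}
credit_desk = {"counterparty_name": "ACME CORP", "notional_amount": "1000000.00",
               "currency_code": "USD", "trade_date": "2024-03-01"}

a = normalise(rates_desk, {"counterparty": "cpty", "notional": "amt",
                           "currency": "ccy", "trade_date": "dt"})
b = normalise(credit_desk, {"counterparty": "counterparty_name",
                            "notional": "notional_amount",
                            "currency": "currency_code",
                            "trade_date": "trade_date"})
assert a == b  # the duplicate only surfaces after normalisation
```

Without an agreed canonical model, each reporting project ends up rebuilding mappings like these from scratch, which is one reason trade reporting has been so expensive.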
What do the regulators want?
The risk facing banks is that regulators may want to understand more about these internal models and processes, much as we have seen with the Basel Committee on Banking Supervision's Principles for Effective Risk Data Aggregation and Risk Reporting (BCBS 239). The growing volume of data submitted to regulators will prove a challenge for them to process as they aim to identify systemic risk and provide greater transparency in the market. Achieving these goals will require a lot of work, particularly in terms of the technology needed.
Focus on data standards is certain to increase, driven both by regulators and by senior management inside organisations. Internal drivers will emphasise the need to have reports available for intra-day decision making around asset management, financing and liquidity.
In the future, banks will have to manage their balance sheets and risk in a more controlled and stringent way. They will need to be able to make decisions around asset optimisation, liquidity and funding on an intra-day basis. In most cases their current data models cannot support these requirements. The question we need to ask is whether regulators will acknowledge that this is a huge challenge for banks, and whether they will give them sufficient time to implement any new data standards that emerge.
How can banks improve their data models?
To meet this increased scrutiny, banks will need to improve their data models. To begin with, the value of data inside an organisation needs to be recognised and promoted, and data handlers need to be incentivised to value the data they produce or touch. A culture of producing and maintaining high-quality data is rare in financial services organisations – something which will need to change.
Banks have to improve their data governance, and they should begin by establishing a 'data organisation' approach that is supported and empowered by senior managers. From an organisational point of view, there needs to be a structure that overcomes traditional boundaries and enables a consistent data operating model to take hold.
Banks need to agree on a comprehensive data operating model, along with equally comprehensive data standards. The most important objective of an enterprise data operating model is to elevate the value of data within the enterprise. Data communication and incentives that promote positive cultural change within the organisation should be encouraged, whilst unstructured data manipulation should be discouraged.
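As a rough illustration of what agreed data standards can mean in practice, the sketch below validates trade records against a set of field-level rules before they enter an enterprise data model. Every field name and rule here is hypothetical, not taken from any actual standard:

```python
import datetime

def _is_iso_date(value):
    """True if value parses as an ISO 8601 calendar date."""
    try:
        datetime.date.fromisoformat(value)
        return True
    except (TypeError, ValueError):
        return False

# A toy data standard: one validity check per field of a trade record.
RULES = {
    "counterparty": lambda v: isinstance(v, str) and v.strip() != "",
    "notional": lambda v: isinstance(v, (int, float)) and v > 0,
    "currency": lambda v: isinstance(v, str) and len(v) == 3 and v.isupper(),
    "trade_date": _is_iso_date,
}

def validate(record: dict) -> list:
    """Return a list of violations; an empty list means the record conforms."""
    violations = []
    for field, check in RULES.items():
        if field not in record:
            violations.append(f"missing field: {field}")
        elif not check(record[field]):
            violations.append(f"invalid value for {field}: {record[field]!r}")
    return violations

good = {"counterparty": "ACME CORP", "notional": 1_000_000.0,
        "currency": "USD", "trade_date": "2024-03-01"}
bad = {"counterparty": "", "notional": -5.0, "currency": "usd"}

assert validate(good) == []
assert len(validate(bad)) == 4  # empty name, bad notional, lowercase ccy, no date
```

Enforcing checks like these at the point of capture, rather than reconciling bad records downstream, is the practical difference a shared standard makes.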
Ultimately, regulators want to understand data model taxonomies and the processes used to aggregate data. Do these internal processes, and the data itself, support and lead to better data aggregation and risk reporting practices? If these models are not up to the expected standard, regulators will want to know why, and what steps and strategies firms intend to take to improve their models and processes.
This is an opportunity for banks to start investing in cleaning up and simplifying their data models. With BCBS 239 potentially being extended, the focus on data standards will increase, forcing banks to invest more resources in cleaning up their data models. The message is clear: banks should start this process sooner rather than later.