
What comes before regulatory compliance?

It's old news: a lot of effort is being expended on the landslide of regulatory change.

But before you can even start to consider complying with new (or, for that matter, more established) regulations, surely consideration needs to be given to the operational and technical processes in place today?

I don't necessarily mean basic operational processes - you would fully expect a clear understanding of those - but rather those that need to be enhanced in order to generate information for reporting and other regulatory requirements.

Much has been written about ‘big data’ in the recent past, but just as much consideration needs to be given to data availability, format and content.

It is only when you start to investigate and question the availability of data - when you realise it is rarely available without cost or internal infrastructure challenges, and when you start to request further information - that the costs begin to mount.

Further, when you actually analyse the availability of data sets, you will invariably find that they come in a multitude of disparate formats: our conversations with customers indicate anything up to 150 different formats across financial institutions of varying sizes.

Why does this matter? It's one thing to comply with new regulations in terms of reporting, or indeed to automate certain operational processes. But if you need to compile that information from multiple data sources and formats, this adds a cost related to the complexity of the solution you need to source or build (recent Celent analysis predicts at least a $50 billion spend), and could introduce elements of operational and regulatory risk.

So what are the options for managing this issue? The strategic path would lead you to lobby the providers of those data sources, probably alongside others, to get them to adhere to a standardised way of sharing the data. However, with each end client or recipient of information requiring different data formats, depending on their internal technology requirements, this will not be a quick fix.

Perhaps a better option is to install an application that can consume these data elements and 'normalise' them into one usable format for your purposes. Better still would be one that lets you add new format translations yourself, without relying on often expensive professional services charges, circumventing the issue in a quick, manageable and future-proof way.
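To make that idea concrete, here is a minimal sketch in Python of such a 'normalising' translation layer. Everything in it is an assumption for illustration - the format labels, the canonical field names and the two sample feeds are invented, not taken from any real product - but it shows the key point: new format translations are registered in-house, with no vendor involvement.

```python
import csv
import io
import json

# Canonical record every translator must produce. These field names are
# illustrative assumptions, not an industry standard.
CANONICAL_FIELDS = ("trade_id", "counterparty", "notional", "currency")

# Registry of format translators, keyed by a format label. A new source
# format is supported by registering one more function here.
TRANSLATORS = {}

def translator(fmt):
    def register(fn):
        TRANSLATORS[fmt] = fn
        return fn
    return register

@translator("csv")
def from_csv(payload):
    # Assumes the CSV feed already uses the canonical column names.
    reader = csv.DictReader(io.StringIO(payload))
    return [{k: row[k] for k in CANONICAL_FIELDS} for row in reader]

@translator("json")
def from_json(payload):
    # Assumes a JSON feed with vendor-specific field names; map them
    # onto the canonical record.
    return [
        {
            "trade_id": rec["id"],
            "counterparty": rec["cpty"],
            "notional": rec["amt"],
            "currency": rec["ccy"],
        }
        for rec in json.loads(payload)
    ]

def normalise(fmt, payload):
    try:
        translate = TRANSLATORS[fmt]
    except KeyError:
        raise ValueError(f"no translator registered for format {fmt!r}")
    return translate(payload)

if __name__ == "__main__":
    csv_feed = "trade_id,counterparty,notional,currency\nT1,ACME,1000000,EUR\n"
    json_feed = '[{"id": "T2", "cpty": "GLOBEX", "amt": 500000, "ccy": "USD"}]'
    records = normalise("csv", csv_feed) + normalise("json", json_feed)
    print(records)
```

The design choice that matters here is the registry: adding a 151st format is a small internal function, not a professional services engagement.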

With such a solution in place you now have the data - but what do you do with it?

Big data is said to enable the analysis of business trends, the determination of quality and the prevention of risk. However, once you have the disparate data, and perhaps have even aggregated and normalised it as suggested above, what do you do with the mountain of information at your fingertips? Is there a need for another layer of technology to actually turn that data into something usable?
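Purely as an illustration of what that extra layer might do, the sketch below takes the canonical records produced by the hypothetical normalisation layer above and rolls them up into per-counterparty exposure, flagging breaches of an entirely arbitrary limit. The field names and the limit are carried-over assumptions, not a prescription.

```python
from collections import defaultdict

def exposure_by_counterparty(records):
    # Sum notional per counterparty over canonical records, i.e. dicts
    # with 'counterparty' and 'notional' keys as sketched above.
    totals = defaultdict(float)
    for rec in records:
        totals[rec["counterparty"]] += float(rec["notional"])
    return dict(totals)

def flag_breaches(totals, limit):
    # Flag counterparties whose aggregate exposure exceeds the limit.
    # The limit is an illustrative input, not a regulatory threshold.
    return {cpty: amt for cpty, amt in totals.items() if amt > limit}

if __name__ == "__main__":
    records = [
        {"trade_id": "T1", "counterparty": "ACME", "notional": "1000000", "currency": "EUR"},
        {"trade_id": "T2", "counterparty": "ACME", "notional": "750000", "currency": "EUR"},
        {"trade_id": "T3", "counterparty": "GLOBEX", "notional": "500000", "currency": "USD"},
    ]
    totals = exposure_by_counterparty(records)
    print(flag_breaches(totals, limit=1_500_000))
```

Even a layer this thin only answers one question; the point of the sketch is that reporting and risk analytics sit on top of normalised data, they do not fall out of it automatically.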

I have a genuine concern on this point. With so much data being reported in the future landscape, are we prepared as an industry to manage that reported risk in a way that ensures institutions can no longer fail?

