
Schema Validation and Enhanced Reconciliations

Based on the work I have done on reconciliations for some of the largest investment banks in the world, including banks in America, Canada and the UK, I'm convinced that there is room for improvement in the way that Operations and Financial Controls are implemented.

Typically, a team of product managers, operations control people or operations engineering people will work with colleagues in IT to define extracts from Front Office, Middle Office and Back Office Settlement and Sub Ledger systems. These data extracts will be based on their own appreciation of the economic data associated with specific types of trade, position or cashflow.

On the other side of the reconciliation, the data could be extracted from another Sub Ledger system, or it could arrive on the SWIFT network from a trading counterparty, an Agent or a Custodian.

Custodians, Agents and SWIFT themselves are more likely to be providing data in some kind of standard format. For example, the majority of Custodians can send SWIFT MT536 statements for trades across a Fixed Income trading account. Given that SWIFT mandates the structure of the MT536, all the economic data is already present, and the only repair that may be required is where the underlying security identifier is not one of the more common schemes, such as ISIN. Other repairs may be required, but the security identifier repair is the most common one.
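To make that repair concrete, here is a minimal sketch in Python. The cross-reference table and function are illustrative assumptions only; in practice the lookup would be driven by a securities master, not a hard-coded mapping.

# Illustrative sketch of the security identifier "repair": statement lines
# quoting a CUSIP or SEDOL are normalised to ISIN before matching.
CROSS_REFERENCE = {
    # (scheme, code) -> ISIN; sample entries for one well-known security.
    ("CUSIP", "037833100"): "US0378331005",
    ("SEDOL", "2046251"): "US0378331005",
}

def repair_security_id(scheme: str, code: str) -> str:
    """Return the ISIN for a statement line, repairing non-ISIN schemes."""
    if scheme == "ISIN":
        return code
    try:
        return CROSS_REFERENCE[(scheme, code)]
    except KeyError:
        raise ValueError(f"No ISIN cross-reference for {scheme} {code}")

print(repair_security_id("CUSIP", "037833100"))  # US0378331005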

On the so-called "Ledger" side of a reconciliation against a Statement, there is no obligation on the in-house IT development team to produce a pseudo SWIFT MT536 from their Fixed Income sub ledger system. Indeed, in my experience, proprietary systems that run on mid-range technology platforms are not easily configured to produce a valid SWIFT statement message, largely because few custodians or agents would use mid-range hardware based systems to act as Custodian for an Investment Bank. The custodians are more likely to have a large proprietary mainframe system and a team of support staff to ensure that their statements are SWIFT compliant.

Given that an "Industry Standard" message can comprise as few as 10 mandatory elements or as many as 100 or more, it is easy to see that there is a potential gap between what you could reconcile between Sub Ledger and Statement, or between "our" confirmation and the trading counterparty's. This is particularly true of the OTC market. But the problem can also occur within a single organisation with disparate Sub Ledger systems for the same asset class, purchased on different dates by different legal entities before the banking "Group" was finally formed by acquisition or merger.

There are cases in the market where a single large UK, Swiss or American bank has more than one subsidiary trading on an OTC basis with other entities in the same group, and where there is no simple match between the messages that represent the same trade as they are exchanged between those entities.

A simple, and if I dare to say elegant, solution to this problem is to introduce a transformation process quite early in the lifecycle of building the control. When an Ops Control person defines the list of elements to be extracted from the sub ledger, it would be beneficial if they could run this model through a tool that understands the data and transforms it into the nearest equivalent industry standard message - for Bond Trades this would be FIXML version 5.x, with the Trade Capture Report as the sub schema. It is then relatively simple to take each message thus constructed, push it through a schema validator, and read through any resultant validation errors. It may be necessary to loop around this process a number of times until the definition of the extract is perfected, i.e. until the extract contains sufficient data to construct a bona fide FIXML message. This would ensure that the control is based on at least as much information as would be needed to exchange an industry standard message with the trading counterparty.
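As a minimal sketch of that validate-and-refine loop, the following Python uses lxml's XML Schema support. The file names are placeholders for whichever FIXML 5.x schema and generated Trade Capture Report you are working with.

from lxml import etree

# Placeholder file names: substitute your FIXML 5.x schema and the message
# generated from the sub ledger extract.
schema = etree.XMLSchema(etree.parse("fixml-5-0-sp2.xsd"))
doc = etree.parse("trade_capture_report.xml")

if schema.validate(doc):
    print("Extract produces a schema-valid Trade Capture Report")
else:
    # Each error points at a missing or malformed element, i.e. data the
    # extract definition still needs to supply. Refine the extract and rerun.
    for error in schema.error_log:
        print(f"line {error.line}: {error.message}")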

Obviously FIXML is not extensive enough to cover all asset classes, but other standards such as FpML, SWIFT and ISO 20022 are being developed and will eventually allow us to use standard schema validation on all data flows into banking operations and financial controls reconciliations, so that at least we can be sure we really do have all the necessary economic data for the reconciliation before we start to finalise the matching logic and identify and remediate the breaks.


Comments: (3)

A Finextra member | 19 November, 2014, 16:20

Cliff,

The approach you are suggesting, of using a Transformation engine (ETL) ahead of the Reconciliation tool, is one that many reconciliation practitioners will be familiar with.

Legacy reconciliation tools are primarily based around SWIFT, and the onboarding challenge is to make your data look as close to SWIFT as possible. The most efficient way to do that has been to use an ETL tool to do the transformation.

Even when this can be made to work well, there are downsides to this approach:

1. You need experts in the ETL tool to do the mapping efficiently and effectively.

2. What if the standard you're mapping to doesn't have enough attributes? E.g. I've seen reconciliations that take in data over 500 attributes wide, where all 500 need to be reconciled. If the standard you're mapping to only has 100 attributes, then that's 400 attributes that won't have a home. The poor configurer is left to shoehorn in data, combining multiple attributes into, say, a single text field (see the sketch after this list). In many cases a suitable home for the data can't be found, and the delivered reconciliation does not provide all the Operational controls required.

3. The ETL tool becomes an essential part of the architecture and is another moving part that will require ongoing support and maintenance, raising the ongoing cost of running the reconciliation.
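A hypothetical illustration of the shoehorning in point 2, in Python; the record and field names are invented for the example.

# Attributes with no home in the target standard get packed into a single
# free-text field, and must be unpacked again before they can be matched.
def shoehorn(record: dict, mapped_fields: set) -> dict:
    """Map the fields the standard knows; cram the rest into one string."""
    message = {k: v for k, v in record.items() if k in mapped_fields}
    leftovers = {k: v for k, v in record.items() if k not in mapped_fields}
    message["FreeText"] = "|".join(
        f"{k}={v}" for k, v in sorted(leftovers.items())
    )
    return message

record = {"TradeID": "T1", "Qty": "100", "DeskCode": "NYC7", "Strategy": "RV"}
print(shoehorn(record, {"TradeID", "Qty"}))
# {'TradeID': 'T1', 'Qty': '100', 'FreeText': 'DeskCode=NYC7|Strategy=RV'}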

This approach also assumes that there are appropriate standards to map to. In my experience it is financial product innovation that introduces the most operational risk, and organisations need to be able to apply appropriate controls to mitigate the operational risk arising from that innovation. Innovation comes first, then best practice, then finally standards. If you're reconciling innovative financial products then you'll have to make compromises when choosing an appropriate standard - although that is the joy of standards: there are so many to choose from.

There is an alternative: use a reconciliation tool that has been designed to work where there are no standards and where best practice is emergent. A tool that can hold any data of any complexity with no shoehorning. A tool that can analyse your data files and build an appropriate schema using modern data management techniques.
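As a rough sketch of what such schema inference might look like, the following Python uses pandas' type inference as a stand-in for the far richer profiling a real tool would perform; the file name is a placeholder.

import pandas as pd

# Placeholder file name: any delimited extract from a ledger or statement.
df = pd.read_csv("ledger_extract.csv")

# Derive a naive per-column schema from the inferred dtypes.
schema = {column: str(dtype) for column, dtype in df.dtypes.items()}
print(schema)
# e.g. {'TradeID': 'object', 'Qty': 'int64', 'Price': 'float64', ...}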

By using such a tool you can remove the need for an ETL tool in the reconciliation flow. Configurers only need to learn one tool, not two, and the resulting reconciliation is not compromised, since it has access to all the data attributes rather than the subset shoehorned into a standards-based message.

The speed of onboarding onto such a modern tool is an order of magnitude better than legacy approaches, and enables organisations to keep pace with rapid financial product innovation.

 

A Finextra member | 19 November, 2014, 17:36

Thanks Neil. I always enjoy your thoughts on Recs. I prefer to segregate data integration from data matching, to isolate the end matching application from the adverse impact of data structure and schema changes.

A Finextra member | 27 November, 2014, 17:30

Excellent article Cliff,

It is an interesting debate that actually affects more than reconciliation.

The same issues arise when having to make payments to multiple regions or via multiple service providers; when having to trade-report to various authorised bodies; when having to deal with custodians, each with their own flavour of a standard; or when the standards themselves evolve (as is the case for funds migrating from SWIFT MT messages to ISO 20022).

We implemented EAI technology for financial institutions for many years, but found the 'E' and the 'L' aspects of ETL were getting simpler as technologies such as TCP, FTP, SQL and MQ standardised; the 'T' component, conversely, became more complex, and the standard technologies available were not fit for this changing landscape.

So whether you keep it separate or embed it in the reconciliation application, there are lots of situations beyond reconciliation where you will need to be fluent with message standards, be they proprietary or published.

 
