Digitising reconciliation as a business case in itself

Alex Duggan

Capital Market Solutions Head, Cognizant

Taking a big-picture view, the data funnelling through financial institutions can be seen as the strategic asset that allows them to make the right decisions around market execution, optimal funding, and timely settlement and payments, to demonstrate compliance, and to improve client experience through accuracy, time to market, and analytics.

However, the growing volume and complexity of products entering banks’ systems has become a significant operational hurdle. The global, complex nature of large banks has resulted in a spaghetti of systems architecture that must be processed and supported.

Banks, particularly tier one institutions, are investing significant sums to remain compliant with swathes of nuanced, non-standard market infrastructure reconciliation processes, as they’re required to normalise data attributes from their internal systems, the market, and clients. Yet, despite the vast amounts being spent, banks are struggling to build a viable business case for a more dramatic, front-to-back transformation.

Transforming reconciliation processes – the challenges

Building a business case and finding the funding

A significant problem faced by large investment banks with global businesses is that reconciliations are carried out in multiple areas, resulting in fragmented processes and architecture.

Understanding the landscape is a challenge. While people understand that reconciliation processes should be centralised and standardised, something of a ‘political bonfire’ can emerge within financial institutions around how the future target operating model should look. Stakeholders understand that there is potential to save anything from 30% to 40% of the total cost of ownership, but they don’t always understand all of the components that need to be fixed in order to realise the saving.

Additionally, as the upfront cost of these projects tends to be significant and the cost reductions are often only realised 18 months to two years down the line, getting a project approved at the outset can be a huge hurdle. On top of this, banks are reluctant to commit to projects that may only deliver a 30% reduction in three years’ time, and they are wary of buying in to lengthy, multi-year programmes which, historically, have often seen delays.

Financial institutions need to consider all options. They know the broader technology is there to deliver an optimal front-to-back process, and there are experienced third parties with a demonstrable track record of delivery willing to co-invest and guarantee targeted cost reductions through different support models.

Devising a target operating model

When there are hundreds of people, products, and systems to be considered for such large-scale projects, it becomes very difficult to see the wood for the trees. In order to rectify the sticking points within an operating model, it is vital to break the process into modules and smaller components.

It is just as important for stakeholders to understand the process of decommissioning the old operating model as it is for them to understand the new one. There is a widespread issue with companies setting themselves up to fail in their approach to shifting to a new model. Companies will often decide on a new architecture or system they want to move to, and then have the same people who worked on the previous system devise the requirements for the new one. Rather than redesigning and thinking in a modern way about what the business outcomes should be, they’re merely moving legacy processes onto a new platform.

Cultural reluctance to move toward progressive models

Historically, banks have been highly motivated to build their technology platforms in-house. This has resulted in a reluctance to surrender control to third-party vendors when establishing and managing technology partnerships. Some of the more conservative banks struggle with the idea of reliance on third parties, believing they should control all facets of their operating systems rather than partnering with firms that have the experience and accelerators to help the bank move forward.

In essence, stakeholders within banks tend to believe that they are giving up responsibility for their operational losses, their clients, and their risk profile to a centralised group. This is very much a cultural attribute that is challenging to move away from within incumbent institutions, and it often cripples efforts to modernise.

AI, ML and achieving digitisation of data

While moving to a strategic system will offer more functionality and capability, the data it receives must also be of a certain quality for the system to do its job. Therefore, having tools which are able to transform data into standardised formats is a key part of a strong reconciliation solution.

Considering that most banks will have multiple front-office systems feeding in reconciliation requirements, market data, and client files, along with more complex reconciliations, there is a multitude of attributes to be reconciled.

Having the ability to transform all of this data, pre-reconciliation, is one of the keys to success.
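To make that concrete, below is a minimal sketch of a pre-reconciliation transformation layer in Python. The feed names, field names, and date formats are hypothetical; the point is simply that every source is mapped onto one canonical schema before matching begins.

```python
from dataclasses import dataclass
from datetime import date, datetime
from decimal import Decimal

@dataclass(frozen=True)
class CanonicalTrade:
    """Canonical record that every source feed is normalised into."""
    trade_id: str
    isin: str
    trade_date: date
    amount: Decimal
    currency: str

# Hypothetical per-feed mapping rules:
# (id field, isin field, date field, date format, amount field, currency field)
MAPPINGS = {
    "front_office": ("TradeRef", "ISIN", "TradeDate", "%Y%m%d", "Notional", "Ccy"),
    "custodian":    ("ref", "isin", "trade_dt", "%d/%m/%Y", "amt", "currency"),
}

def normalise(raw: dict, source: str) -> CanonicalTrade:
    """Map one raw record from a named feed onto the canonical schema."""
    id_f, isin_f, date_f, fmt, amt_f, ccy_f = MAPPINGS[source]
    return CanonicalTrade(
        trade_id=str(raw[id_f]).strip(),
        isin=str(raw[isin_f]).strip().upper(),
        trade_date=datetime.strptime(raw[date_f], fmt).date(),
        amount=Decimal(str(raw[amt_f])),  # Decimal avoids float rounding noise
        currency=str(raw[ccy_f]).strip().upper(),
    )
```

Once both sides of a reconciliation are expressed in the same canonical form, matching reduces to comparing like-for-like attributes.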

From here, it becomes a question of finding ways to improve the match rate during reconciliation. While an 85% match rate may seem high for a team processing millions of transactions each day, a 15% exception rate leaves a significant number of exceptions to resolve: on two million daily transactions, for example, that is 300,000 breaks requiring attention.

If banks can improve this figure to 95%, they will significantly reduce resource requirements, both in the central reconciliation teams and in the broader groups working to resolve the identified exceptions. Having an overlaid AI component as transactions are processed, for instance, means that the solution can learn suggested matches. When there are exceptions, AI can lean on historical data to suggest the resolution type automatically or assign the exception to a person to resolve.
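The article does not prescribe a specific technique, but one common pattern is to train a classifier on historically resolved exceptions so that new breaks arrive with a suggested resolution attached. Below is a minimal sketch using scikit-learn; the break descriptions and resolution labels are illustrative assumptions, not a real training set.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Historical exceptions: free-text break descriptions paired with how
# operations eventually resolved them (labels are purely illustrative).
history = [
    ("amount differs by accrued interest on bond trade", "adjust_accrual"),
    ("counterparty reference missing on inbound message", "enrich_reference"),
    ("duplicate booking from front-office feed", "cancel_duplicate"),
    ("settlement date mismatch due to holiday calendar", "roll_settlement_date"),
]
texts, labels = zip(*history)

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

def suggest_resolution(description: str, threshold: float = 0.6):
    """Suggest a resolution type for a new break, or return None when
    the model is unsure and the exception should go to a human."""
    probs = model.predict_proba([description])[0]
    best = probs.argmax()
    return model.classes_[best] if probs[best] >= threshold else None
```

The important design point here is the confidence threshold, which keeps low-certainty suggestions out of the automated path and routes them to a person instead.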

There is a significant amount of process automation and machine learning which can be done to clean up the data transformation and reduce manual intervention throughout the reconciliation process.

Once reconciliation has occurred, orchestration becomes the next priority: the ability to identify what the problem is and delegate it to the right person through AI learning significantly streamlines the process.
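In its simplest form, that delegation can be a routing table keyed on the classified exception type; a hedged sketch follows, with the queue names purely illustrative.

```python
# Hypothetical routing table: exception type -> owning team's work queue.
ROUTES = {
    "adjust_accrual": "fixed_income_ops",
    "enrich_reference": "static_data_team",
    "cancel_duplicate": "trade_support",
}

def route_exception(exception_type):
    """Send classified breaks to the right queue; anything that was not
    classified confidently falls back to manual triage."""
    return ROUTES.get(exception_type, "manual_triage")
```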

Finally, if banks have the data and analytics to show what the root causes of the majority of exceptions are, they can take a deep dive into the data, make the necessary upstream fixes to rectify the problem at the source, and continuously improve.
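At its core, that root-cause analysis is an aggregation over resolved exceptions. A minimal sketch, assuming each exception record carries the upstream system it originated from and a reason code (both hypothetical field names):

```python
from collections import Counter

def top_root_causes(exceptions, n=5):
    """Count exceptions by (upstream system, reason code) so the most
    frequent sources of breaks can be fixed at the origin."""
    counts = Counter((e["source_system"], e["reason_code"]) for e in exceptions)
    return counts.most_common(n)
```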

Why are retail banks trying to resolve these problems?

Retail banks have always had to perform a number of different reconciliations, and with the high volumes experienced in the UK, they also have to manage these reconciliations across multiple systems.

Further, given the regulatory push toward stronger consumer protection, banks need to demonstrate to regulators that they are undertaking adequate checks before reconciliation is completed on a real-time basis. Given the variety of reconciliation types, this presents a significant challenge, but failure to do so can ultimately lead to regulatory fines.

A flexible system that can execute these different types of controls once, using the same data, is more efficient than a system that carries out multiple core reconciliations followed by manual checks.

In practice, this can be seen when a consumer overdraws on their account. The bank has to inform the client hours before the market cut-off that they will be charged an overdraft fee, to allow them to take remedial action. If the bank fails to do so, the fee cannot be charged. While this may not sound difficult to achieve, with hundreds of thousands of accounts to monitor, banks rely on automated solutions to protect this revenue stream.
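A hedged sketch of that control, assuming a fixed market cut-off and a mandated notice window; the times, field names, and thresholds are illustrative, not the actual regulatory parameters:

```python
from datetime import datetime, time, timedelta
from decimal import Decimal

CUTOFF = time(15, 30)        # illustrative market cut-off
NOTICE = timedelta(hours=4)  # illustrative minimum notice before cut-off

def accounts_to_notify(balances: dict, now: datetime) -> list:
    """Return overdrawn accounts that must be alerted now. If the alert
    cannot go out at least NOTICE before cut-off, the client has no time
    to remediate and the fee cannot be charged."""
    deadline = datetime.combine(now.date(), CUTOFF) - NOTICE
    if now > deadline:
        return []  # too late to give clients time to act today
    return [acct for acct, bal in balances.items() if bal < Decimal("0")]
```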

The business case for automation therefore becomes clear, as banks which aren’t able to meet this requirement risk missing out on a significant revenue stream.

Building a more resilient and efficient data transformation strategy

When people think of reconciliation, they tend to imagine only matching positions and balances. But the scope of data attributes being reconciled can be complex, and demonstrating data integrity and control throughout a bank’s processes is much broader than that, demanding a far more holistic solution than merely plugging in a new system.

Banks must build a data strategy that centres on data transformation, digitally normalising data as it feeds through the bank’s systems. Not only will this position banks to compete in a digital financial services ecosystem, it also gives them the opportunity to reduce their total cost of ownership across operations and technology by anywhere between 30% and 40% under an optimal, standardised target operating model. This is a hugely compelling saving.

Before moving to a strategic platform, it’s important to consider what the bank is trying to achieve. Technology is no longer the problem; it has the capability to solve banks’ needs, but it must be approached with the right plan and design up front. This is where banks tend to fall down.

The overall data transformation strategy – orchestration and management reporting where banks can solve issues on a real-time basis – is what reduces risk for financial institutions.

Operational losses from reconciliations represent a sizeable cost for banks. With a full front-to-back solution and effective orchestration, banks can reduce their ageing profile by more than 50%. Importantly, banks need the flexibility to support future requirements, so that once a problem is fixed, it stays fixed. It is in banks’ long-term interest to build AI and ML tools which constantly improve their processes.

Striving for an optimal front-to-back solution is key, but equally important is the decommissioning of legacy systems. Otherwise, banks will find themselves running a sophisticated new system in parallel with the old one, at great expense.

 
