Technology is a starting point — the end goal is to use it to regain control of change management.
There is nothing new about banks being compelled to transform to stay relevant. As enablers of economic growth, banks are constantly under pressure to morph into something else, due to globalisation, regulation, advances in technology, or all of the above. The difference now is the sheer pace of change that digital technology imposes, as it tends to blur the present and the future.
If the new front line is digital, does this mean that banks should turn themselves into software development houses? Do they need to increase their IT budgets several times over? Not necessarily. What it does entail is shortening the equations that produce branchless services, seamless payments and speedy trade finance, and competing with Fintechs to address the needs of digitally native clienteles; in other words, a massive simplification of IT.
Simplification should not be confused with downsizing or standardisation, as both consist of narrowing down the scope of possibilities. Mathematics defines simplification as obtaining identical results with a shorter equation. Likewise, across the finance industry, simplifying is about enabling rather than shrinking, reducing the number of variables rather than the number of solutions, and cutting costs rather than value. The ultimate goal is to capture or retain market share and keep banks’ returns on equity (RoE) well above their cost of capital.
Among the banks that invested massively in online services for retail customers, the winners were the ones capable of swiftly integrating with their back-ends, minimising change management and project risks. In its Asia Pacific Banking Review 2019 (1), McKinsey estimated that banks relying heavily on digital banking reduced the cost of signing up new clients by 50% compared to branch-based relationships.
As digital banking now reaches corporate and transaction banking, the challenge is not about replacing sales desks with apps, but again about rethinking the back-end. It involves real-time pricing, treasury advisory services, derivatives, margins, collateral and more. The mere idea of an algorithm proposing services or investments across asset classes and client activities involves consuming information from a broad range of sources.
This premise rules out the vast majority of bank organisations as they are currently structured. Bank innovation is incremental in nature, therefore typically delivered by adding new layers of software on top of the existing ones (2). The resulting system
stacks use disparate data and interfaces. Building open banking, smart algorithms or machine learning on top of those is unlikely to challenge the least disruptive of the Fintechs.
Algorithms are expected to be open for business 24/7, around the world and across jurisdictions. Aside from high availability, APIs and data security, this raises challenges such as real-time views of liquidity positions, immediate settlements and payments,
preventing regulatory arbitrage, and these are just the appetizers. No bank was built this way.
Digitisation leverages technology that assumes free, ubiquitous availability of data, a huge mismatch with today’s reality at most financial institutions. To be able to design tomorrow’s value propositions, banks must be able to simplify their IT now.
It is all about timing and enabling. As a definition, we could describe simplification as “allowing an organisation to adapt to clients’ changing needs in a timely, compliant and controlled fashion”. As needs usually arise from competitive pressures, the temptation is high to rush and acquire technology, apps or infrastructure, adding new layers on top of the existing ones. Enabling does not start with spending more, but with freeing people and budgets.
Industry sources (3) generally estimate that 70% to 80% of banks’ IT resources and budgets are tied up in maintenance, a substantial part of which is allocated to upgrading existing systems. Whether developed in-house or acquired from vendors, systems have grown in size and complexity over the last decades, leading to onerous, dragged-out upgrades and lingering defects. Simplifying an organisation therefore starts with reducing maintenance, which requires shorter, less frequent upgrades.
In turn, reducing maintenance starts with cleaner integration of systems stacked on top of each other and of interfaces built as afterthoughts. Most vendors who boast the merits of their APIs generally stay at arm’s length from third-party systems and prefer to let clients develop, let alone support, their interfaces themselves. The result is an entangled set of disparate, poorly documented utilities unlikely to provide the data interchange necessary to build an adaptive, digitised client experience.
The complexity of upgrading business-critical systems is the trickiest and most expensive issue that banks of all sizes face today. Treasury and risk systems, for example, were initially designed for either front- or back-office purposes, or with specific instruments in mind. As the focus of the last two decades was to achieve straight-through processing (STP) across an ever-increasing range of assets, they grew incredibly complex and monolithic, to the point where new releases sometimes bring new defects along with the fixes. As a striking illustration of the challenge, some vendors issue press announcements celebrating the completion of a client system upgrade, as if they had reached their own Everest. How exciting!
System consolidations that aimed to bring “spaghetti” and clustered diagrams within a single “de-siloed” matrix of functionalities often result in much greater complexity, as they merely encapsulate old code stacks within a new framework; sometimes several programming languages even co-exist within a "new" system!
Simpler architecture requires simpler systems. Fintechs build on stacks of open-source components to limit their development to the essential business logic required, thus speeding up completion and minimising support. Modern components are designed for integration and connectivity. Removing complexity means doing less with independent pieces of software, rather than trying to encapsulate as many functions as possible into a single one.
WHAT DOES NOT SIMPLIFY?
The first misconception is that standardisation can make things simpler. In the domain of banks’ treasury, front-office, back-office or risk systems, for example, pre-configuring processes to suit many banks along some kind of gold standard might seem to simplify implementations and maintenance. The truth is, it does not. In most cases standardisation actually brings further complexity.
Bank business processes evolve empirically as functions of client needs, technical constraints and local regulations. They are unique in nature. Implementing a pre-configured solution often involves de-configuring the standard, only to re-customise it to client needs. This is made even worse in small organisations with little experience of change management, where trying to force standard solutions is perceived as a daunting project risk and leads to internal reluctance.
Mapping a bank’s processes onto an existing software application is necessarily more complex, if at all possible, than mapping software onto the processes. To create best practice, flexibility and agility do a much better job than out-of-the-box standards.
The second misleading intuition is that consolidating a maximum of functionalities into a versatile monolithic system can streamline operations. Here too, the reality has been very different: having fewer systems doing more things brought far more complexity and cost than even a do-nothing strategy would have. A simple explanation is that there were reasons in the first place why several systems were set up to co-exist. Trying to fit financial instruments with very different types of sensitivities or data processing into a single application usually leads to developing workarounds that, at scale, will likely prove problematic and require bespoke support. Other types of consolidation, such as trying to shoehorn real-time valuations or compute-intensive risk calculations into systems originally designed for trade processing, amount to teaching elephants to dance.
Of course, without getting too caricatural, some levels of standardisation, integration or consolidation do bring benefits. Yet if the ultimate goal is to simplify, the basic rule to keep in mind is that any endeavour carried out at the expense of agility eventually brings more costs and more complexity, not less. So the question to ask when contemplating the whiteboards should be: "does this make us more agile in the future, when things change?"
HOW TO SIMPLIFY?
How can a bank simplify its IT architecture, compete with Fintechs, boost profitability, cut costs and thrive in the digital economy? How can it support fluid strategies and find the shortest path to a Target Operating Model (TOM) that is yet to be defined?
The above comments and experience-driven observations point to modern technology, such as containerised microservices, cloud computing and open-source software, as the way to develop nimble architectures and agile strategies, as opposed to recycling old code dressed up as a new standard. Yet technology alone will not cut it. It is not about being the first to bring new tech in house, but about finding the shortest path toward the infrastructure that enables tomorrow’s bank. It therefore involves, first and foremost, regaining control of transition and change management. Excessive reliance on vendors leaves a bank with little control over its architecture and how it evolves.
A microservice architecture allows new services to be deployed independently, possibly using cloud-based technology. This provides new paths for change management, as an organisation can quickly deploy new service environments with little impact on the existing infrastructure. Intuitively, adding new components to an existing portfolio would seem to complicate it, but from a dynamic perspective it is exactly the opposite.
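As a minimal sketch of the idea, an independent service can be stood up alongside the existing estate without touching it. The `/margin` endpoint, its payload and the flat 10% margin rule below are purely illustrative assumptions, not a real bank API; the point is only that the service is self-contained and deployable on its own.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class MarginHandler(BaseHTTPRequestHandler):
    """Toy, self-contained "margin quote" microservice (illustrative only)."""

    def do_POST(self):
        if self.path != "/margin":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        trades = json.loads(self.rfile.read(length))
        # Placeholder logic: margin = 10% of gross notional.
        margin = 0.10 * sum(abs(t["notional"]) for t in trades)
        body = json.dumps({"margin": margin}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the sketch quiet

def start_service():
    """Start the service on an ephemeral port; return (server, port)."""
    server = HTTPServer(("127.0.0.1", 0), MarginHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server, server.server_address[1]
```

Because the service owns its whole stack, it can be containerised, deployed, versioned and retired independently of the trade-processing estate it sits next to.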
Let’s take the example of a firm that needs to implement the ISDA SIMM initial margin approach, transforming a simple bilateral flow into a complex data interchange involving multiple custodians, data sources and pricing libraries, securities and cash collateral, with impacts on limits as well as XVA calculations. A project of this nature touches on just about every functional part of the front office, processing systems, valuations, risk management and inventories. Whether the bank runs on a cluster of disparate applications or on a handful of monolithic systems does not make a huge difference. The only certainty is that multiple dependencies will add to the project’s duration, contingencies and cost; and that is the most favourable scenario, where no system upgrade is required.
Imagine now decentralising the new process into a new, independent, dedicated microservice designed for multilateral connectivity, managing multiple price sources, extracting the necessary data and running compute-intensive tasks outside the existing trade entry, risk or processing architecture. Adding one or several microservices, as opposed to asking existing software to carry out tasks it was never designed for, removes a lot of dependencies and considerably de-risks the project.
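To make the compute-intensive part concrete, the heart of a SIMM-style delta calculation is combining risk-weighted sensitivities under a correlation structure. The official ISDA SIMM methodology is far richer (buckets, risk classes, concentration thresholds, calibrated parameters); the weights and single correlation below are illustrative placeholders, but the aggregation shape is the kind of task such a service would isolate.

```python
import math

def bucket_margin(sensitivities, risk_weights, correlation):
    """SIMM-style aggregation of delta sensitivities within one bucket:
        WS_k = RW_k * s_k
        K    = sqrt(sum_k WS_k^2 + sum_{k != l} rho * WS_k * WS_l)
    Risk weights and the flat correlation here are illustrative only,
    not the calibrated ISDA SIMM parameters."""
    ws = [rw * s for rw, s in zip(risk_weights, sensitivities)]
    total = sum(w * w for w in ws)  # diagonal terms
    for k in range(len(ws)):       # off-diagonal cross terms
        for l in range(len(ws)):
            if k != l:
                total += correlation * ws[k] * ws[l]
    return math.sqrt(total)
```

Isolating this in its own service means new parameter calibrations or extra risk classes only require redeploying one component, not upgrading the trade-processing estate.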
As long as they can share content and connect to one another, multiple microservices individually implemented under the same data model constitute a form of “app store”, a much simpler and more agile approach than endlessly upgrading stacks or monolithic giants.
The main point is not the new technology itself, but restoring the bank's agility to address business and regulatory needs in a timely manner, with full control over migration, project costs and dependencies.
(1) McKinsey & Company Asia Pacific Banking Review July 2019
(2) Deloitte 2019 Capital Market & Banking Outlook
(3) Computer Weekly / IDC 2018