How Open Banking will blow core systems out of the water

Banks operating on legacy architecture will find it increasingly difficult to compete in an API-driven business environment, says Hans Tesselaar, executive director at BIAN.

There are three key acronyms that have topped the agenda in banking technology discussions in recent years - APIs, PSD2 (the second Payment Services Directive) and DLT (distributed ledger technology). Although DLT is challenging the other two for media interest, the technology is still in its infancy and not yet powerful or cost-effective enough to justify implementation within mainstream banking - particularly as a replacement for the core banking systems currently in place. With that in mind, I don't expect to see major headlines here until significant developments arise.

APIs and PSD2, however, will continue to dominate the banking agenda this year and beyond. The key difference with these topics, both of which play into the open banking agenda, is that they enable banks to innovate with the systems they already have in place. Under PSD2, a fundamental piece of payments-related legislation in Europe, digital disruption will accelerate and contribute to the re-shaping of the retail banking industry across the region. Through this initiative, we are already seeing the start of a more competitive, collaborative and innovative banking industry.

The two key aims behind PSD2 are increased competition and, with it, greater customer choice. These objectives will be achieved by facilitating market entry for regulated non-bank players and by driving increased transparency and customer protection. This new 'sharing' culture will be underpinned by open APIs aligned with the ISO 20022 standardisation approach for open banking. Because that standard is recognised by banks universally, it will enable these institutions to open up their payment services to third-party payment service providers.
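
To make the idea of an open, standards-aligned payment API concrete, here is a minimal sketch of a payment initiation from a third-party provider's point of view. The endpoint path, headers and field names are illustrative assumptions only - they are loosely modelled on ISO 20022 payment-initiation elements rather than taken from any particular bank's or scheme's published interface.

```typescript
// Illustrative sketch only: a payment-initiation payload loosely modelled on
// ISO 20022 pain.001 element names, sent to a hypothetical TPP-facing endpoint.

interface PaymentInitiation {
  endToEndId: string;                              // creditor-visible reference (EndToEndId)
  debtorAccount: { iban: string };                 // DbtrAcct
  creditorAccount: { iban: string };               // CdtrAcct
  creditorName: string;                            // Cdtr/Nm
  instructedAmount: { currency: string; amount: string }; // InstdAmt
  remittanceInformation?: string;                  // RmtInf/Ustrd
}

async function initiatePayment(
  baseUrl: string,
  accessToken: string,
  payment: PaymentInitiation,
) {
  // The path and headers are placeholders; real PSD2 interfaces (e.g. Berlin
  // Group NextGenPSD2 or UK Open Banking) define their own resource names,
  // headers and consent flows.
  const response = await fetch(`${baseUrl}/payments/sepa-credit-transfers`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${accessToken}`,
    },
    body: JSON.stringify(payment),
  });
  if (!response.ok) throw new Error(`Payment initiation failed: ${response.status}`);
  return response.json();
}
```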

When one door closes, many more open

The requirement to open up data to third parties is one of the most hotly discussed elements of the PSD2 regulation, and a worrying concept for many banks, which fear competition from flexible, tech-savvy challengers.

This development will of course increase competition as intended, but it will also present significant opportunities for banks to grow new revenue streams, capture customer ownership and progress towards an extended ecosystem. As well as facilitating collaboration with FinTech companies or broader tech organisations, developing and opening APIs will finally give banks the opportunity to adopt cloud-based solutions, enabling them to operate a more streamlined and cost-effective model.

It has become evident that the banking industry will need to build new functionality not only to comply with the regulatory requirements but also to reap the benefits outlined above. APIs are a relatively new concept, and their workings remain a mystery to many. There is often a misconception that APIs simply sit dormant behind the scenes, waiting for someone to cut a ribbon and declare them 'open'. The reality, however, is significantly more complicated, tied up as it is with banks' tangled and archaic systems. This was evidenced by a recent spot survey by BIAN, in which over 60% of respondents expressed concerns that banks will struggle to open up their APIs because of the "current state of banks' core architecture."

A time for change

The age-old spaghetti-like systems that banks have been relying on since the dawn of banking technology are finally putting a spanner in the works when it comes to innovation. Financial institutions now face the costly and time-consuming next step of untangling their old and inefficient infrastructure in a bid to streamline their core banking processes before they can even think of opening up their systems to newcomers. In principle, for open banking to work, each API should be designed to sync up with the core architecture. There are, however, more cost-effective ways of reaching the same goal: in many cases the bank puts an API handler between the API world and its "legacy" back end, an approach sometimes referred to as a "service fabric". There are good examples of this already implemented by BIAN members; a minimal sketch of the pattern follows below.
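
The sketch below illustrates the API-handler idea under stated assumptions: a thin façade exposes a clean, resource-oriented operation and translates it into whatever the legacy core actually understands. Every name here - the request format, the field layout, the class names - is hypothetical and chosen purely for illustration, not drawn from any real core banking system.

```typescript
// A minimal sketch of an "API handler" / service-fabric façade in front of a
// legacy core. The modern side sees a clean balance resource; the legacy side
// only ever sees its own fixed-format transactions.

interface BalanceView {
  accountId: string;
  currency: string;
  available: number;
}

// Stand-in for the legacy core: imagine a host bridge that accepts
// pipe-delimited request strings and returns positional text records.
interface LegacyCoreClient {
  send(request: string): Promise<string>;
}

class AccountApiHandler {
  constructor(private core: LegacyCoreClient) {}

  async getBalance(accountId: string): Promise<BalanceView> {
    // Translate the modern request into the legacy transaction format...
    const raw = await this.core.send(`BALINQ|${accountId.padEnd(34)}`);
    // ...and map the positional response back into an API-friendly shape.
    const [id, currency, available] = raw.split("|");
    return { accountId: id.trim(), currency, available: Number(available) };
  }
}

// Fake legacy core, included only so the sketch runs end-to-end.
const fakeCore: LegacyCoreClient = {
  send: async () => "GB33BUKB20201555555555            |EUR|1523.40",
};

new AccountApiHandler(fakeCore)
  .getBalance("GB33BUKB20201555555555")
  .then((balance) => console.log(balance));
```

The design point is that the core is never exposed directly: the handler owns the translation, so the bank can evolve or eventually replace the back end without breaking the published API.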

The problem at the moment is that every bank is defining its own set of APIs, thereby hindering connectivity - which rather defeats the object of PSD2: openness and transparency with data. Only a limited set of participants in the financial services industry are aware of a universally adoptable reference model or taxonomy that lays out clear, standard definitions for the various banking business functions. Without such a model, it is almost impossible for banks to visualise the different information flows across their banking capabilities, let alone how these are connected and which should be taken up for API enablement. The toy example after this paragraph shows the cost of that divergence.
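
As a small illustration of why a shared reference model matters, consider two hypothetical banks exposing the same business capabilities under entirely different names and paths. The capability names and URLs below are invented for the sketch; they are not the BIAN service landscape or any real bank's API catalogue.

```typescript
// Why a shared taxonomy matters: if every bank names the same business
// capability differently, connecting to N banks means maintaining N mappings.

type Capability = "PaymentOrder" | "CurrentAccount" | "CustomerOffer";

// One bank's private naming of its API surface...
const bankA: Record<Capability, string> = {
  PaymentOrder: "/v1/payments/instructions",
  CurrentAccount: "/v1/accounts/current",
  CustomerOffer: "/v1/products/offers",
};

// ...and another's. A third party integrating with both must maintain both
// maps; a shared reference model collapses this to one reusable definition.
const bankB: Record<Capability, string> = {
  PaymentOrder: "/openapi/pmt/initiate",
  CurrentAccount: "/openapi/acct/dda",
  CustomerOffer: "/openapi/sales/offers",
};

// Same business capability, two incompatible shapes.
console.log(bankA.PaymentOrder, "vs", bankB.PaymentOrder);
```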

Once a bank is sufficiently API-enabled, its next step will be to decide whether to keep these various business capabilities in-house, or simply consume them from the cloud as and when required.
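
One way to picture that choice, sketched under assumed names only: once a capability sits behind a stable interface, sourcing it in-house or from the cloud becomes a configuration decision rather than surgery on the core. The FX-rate capability, the provider endpoint and the class names below are hypothetical.

```typescript
// Sourcing a business capability: the same interface can be satisfied by an
// in-house system or a cloud provider, chosen per capability and environment.

interface FxRateService {
  getRate(base: string, quote: string): Promise<number>;
}

class InHouseFxRateService implements FxRateService {
  async getRate(base: string, quote: string): Promise<number> {
    // Would call the bank's own treasury system; hard-coded for the sketch.
    return base === quote ? 1 : 1.0842;
  }
}

class CloudFxRateService implements FxRateService {
  constructor(private endpoint: string) {}
  async getRate(base: string, quote: string): Promise<number> {
    // Hypothetical cloud provider; response shape assumed for illustration.
    const res = await fetch(`${this.endpoint}/rates?base=${base}&quote=${quote}`);
    const body = await res.json();
    return body.rate;
  }
}

// The sourcing decision becomes configuration, not re-architecture.
function buildFxService(source: "in-house" | "cloud"): FxRateService {
  return source === "cloud"
    ? new CloudFxRateService("https://fx.example-provider.com")
    : new InHouseFxRateService();
}

buildFxService("in-house").getRate("EUR", "USD").then((rate) => console.log(rate));
```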

The problem they face right now, though, is that without clear sight of what they have in play, how can they possibly move ahead? Agreeing on and adopting a global standard and model for APIs will be crucial in taking this innovation forward and allowing banks to realise their full potential.

PSD2 came into effect on 13 January this year and, despite forewarning from the Competition and Markets Authority that banks should be ready for open banking, six of the nine largest UK current account providers were not. Five of these asked for an extension to the deadline, with one missing the cut-off completely. More than anything, this shows that in order for financial institutions to be compliant, they must undergo a significant overhaul. Although this will cost them both time and money, it will leave them in much healthier shape than they have ever been.

My hope is that the real shift that we see in 2018 will be one of mind-set, in terms of banks’ approach to innovation. In short, they should be thinking about customers first, then business, then technology. All updates and change must come from market demand.

Comments: (13)

Harri Rantanen - SEB Transaction Services - Helsinki | 26 February 2018, 09:38

Thanks Hans!

Here, as a reference, is the fresh ISO 20022 guideline for JSON transformation within RESTful APIs: https://www.iso20022.org/sites/default/files/documents/general/ISO20022_API_JSON_Whitepaper_Final_20180129.pdf

 

Behzod Sabirov - Sanscrit LLP - Almaty | 26 February 2018, 09:45

The EU regulatory authorities remind me of an elephant in a china shop; PSD2 is as unwise a move as their earlier ban on mobile roaming fees in Europe. Anyway, although PSD2 is backed by, presumably, good intentions, its implementation is highly prone to data security breaches given its technical design. You don't have to be a Sherlock to deduce that data shared among many parties is more likely to be lost/hacked, despite all security efforts.

As to the article's central point, I can't get why banks have to necessarily switch to cloud-computing because of the PSD2 requirements.

Melvin Haskins - Haston International Limited | 26 February 2018, 10:37 | 1 like

How does this analysis fit with a recent survey published in Finextra showing that 73% of bank customers said that they definitely would not use open banking? The other 27% did not say that they would.

Behzod Sabirov - Sanscrit LLP - Almaty | 26 February 2018, 10:46

Open Banking should be a natural initiative from the banking community, not something pushed by the authorities.

Ole Hansen - Temenos - Copenhagen | 27 February 2018, 07:19 | 2 likes

Fully agree! Banks need to replace their legacy core in order to enable a fully digital (and cost-efficient) customer experience. Digital to the core. But I might be a bit biased here...

A Finextra member | 27 February 2018, 08:04 | 1 like

I agree that the #legacydragon in the data centre is holding back progress. From our experience of migrating clients from legacy core banking to the 21st century, the real obstacle is the 20th-century application lifecycle that has grown up around the legacy. You can't replace what you don't understand. Capturing legacy behaviour in a modern testing solution is step 1 to successful replacement.

Ketharaman Swaminathan - GTM360 Marketing Solutions - Pune | 27 February 2018, 13:39 | 1 like

tl;dr. 

Total BS. In my three decades of observing technology adoption at BFSI, I've heard people make the same statement w.r.t. Unix minicomputers, OOPS, CORBA, SOA, Web, Web 2.0, RIA, Mobile, Social Media, etc. During this period, banks have not only continued with their mainframe CBS but have launched innovative products, introduced cutting edge technologies, outperformed wannabe TELCO and fintech disruptors and, above all, remained the most profitable sector in Fortune 500.

IMO, people with vested interest in replacing mainframes should do a better job of shilling their alternative offerings. 

João Bohner - Independent Consultant - Carapicuiba | 27 February 2018, 13:42 | 1 like


@James Tomaney,

"You can't replace what you don't understand."

And, frankly, you will not want to replace the behavior of the "legacy dragon", so you don't need to spend resources understanding it.
As most opinions and discussions converge on blaming the "dragon", I'll repeat part of my comment on Nick Fernando's blog "Legacy - Good or Bad?":

= = = =

"It’s vital to rebuild that capability (to understand the 'legacy landscape') otherwise the journey ahead could be very bumpy indeed."

From my experience, this is a waste of time and money.

I've had a solution for a very complex international Citicorp system.
People bought the idea and hired CapGemini to make a 'functionality report' on the old system.
One year later, hundreds of pages of gap report, original managers relocated, one more book on the shelf and no more replacement project (Our senior management ... has altered our plans to replace...).

= = = =

The right approach to cope with the 21st Century is to draw a clear Business Scenario, and then make a plan for how to get there, Corporately.

(Don't think out-of-the-box. Forget the box!)

A Finextra member | 27 February 2018, 13:51

@Joao Bohner

Absolutely agree that replacing what you have isn't the game, but turning off what you don't understand leads to unexpected results.

Building something new and migrating also requires resources (both human and capital) that have to be found from somewhere, and most organisations have to reduce the TCO of the old in order to generate the desired ROI of the new.

I base my caution on spending the last 10 years replacing legacy in a number of payment sites (Core Banking, & Channels like Cards & ATM).  

João Bohner - Independent Consultant - Carapicuiba | 27 February 2018, 18:15


@Ketharaman,

It's not a question of replacing the mainframes; au contraire, new kinds of big iron will be needed in the future to handle integrated, corporate processes and data.

The matter is one of replacing the old VERY EXPENSIVE way of handling the financial business (more than 80% of the operations budget goes just to maintaining the current systems), even if banks have remained the most profitable sector in the Fortune 500.
Deutsche Bank: €4 billion/year
Lloyds Bank: £8 billion/year

     No 'unbanked' will endure that cost!

The replacement of the old banking way will come from a cheap way of handling the financial business, one in which those BILLIONS of unbanked will be able to use banking systems.

And that requires rewiring the current 'legacy dragons' by eliminating the manual and batch processes, drastically reducing operating costs.

In my proposal for the "Bank of the Future" architecture, huge 'big irons' will be required to handle the Single Source of Knowledge and the Single State Machine, and to handle process and storage replication, Corporately.

@James Tomaney,

You won't "turn off what you don't understand" unless your new option is tested, approved and running!

And for "reducing the TCO of the old in order to generate the desired ROI of the new", you must strongly invest in the new, eliminating the old...



Open for discussions in this exciting matter!

Rodney Farmer - Realtime Transactions - Little Rock | 03 March 2018, 10:49

We can all agree that the PSD2 Open API is not easy to integrate with old CBSs. And both speed and security must be maximized in the process.

Each bank has its own way of integrating with the CBS and its own regulatory guidelines for integrating with the world... with the expectation of publishing their interface specifications. The killer app will be the application of Strong Customer Authentication in a realtime interface using Open APIs to perform SCT Inst transactions originated by millions of devices that are not necessarily known to the bank, by regulated providers also not known to the bank. A new industry is developing around this effort. In the meantime, banks will be cautious and PISPs will have limited success.

To your point about the customers' interest in using Open Banking, the customer is not likely to know if he/she is using Open APIs or not.  

As for replacing the CBS, the product/functional silos of these systems are being re-written with each day that passes. Incumbents are breaking apart their Universal Banking Systems to look like micro-services that can be licensed function-by-function, while others are starting from the ground up, building new banking systems with modern architecture, data structures, coding techniques, etc. The cost to do either has reached an equilibrium where many challengers are attempting to build from scratch rather than implement a UBS from an incumbent. Creating micro-service farms that can rationalize the functions and interdependencies of existing UBSs is the challenge. Time will tell if it is even possible, much less whether they can deliver the features in a distributed fashion where best-in-class services are bought and sold among banks and vendors to serve targeted customer segments. But the march to replace the UBS has begun.

More interesting is to determine what, if any, CBS functions are really needed any longer. The Millennials appear to be saying they do not care about the nuanced features we perfected and used to differentiate our products from the competition at a time when life was not digital. They just want an account with basic core functions. Differentiation is handled on the front end, with service levels like nothing we have seen since the days of personal customer care in the bank branch, interacting with the world's best CRM tool, our local banker. With a technological/digital focus on customer care, and mobile devices to deliver the goods, only now can we expect to be fully satisfied by the level of care provided by FIs in the digital age. Technology is no longer about the cost/income ratio but rather about digital customer engagement and lifetime customer value.

Eventually, the CBS services will be distributed micro-services delivered by highly specialized teams/systems capable of flexing and scaling their product(s) to the demands of the consumer.  Banks will specialize and segment the markets in order to cut costs and up their game for "their" target customers.  

My guess is 10 years to reach the tipping point and another 10 before today's CBS is fully retired.  

Ketharaman Swaminathan - GTM360 Marketing Solutions - Pune | 05 March 2018, 08:42 | 1 like

@RodneyFarmer:

Is there a typo in your last line? Did you actually mean 10 months? Having worked for a CBS company, I've been hearing predictions of the death of CBS for 10-15 years, yet CBS is still around. For something like CBS to go away, a disruptive shock needs to come along. If such a shock happens, in today's fast-moving world CBS will get uprooted in a few months. If such a shock does not happen, CBS will only get more entrenched than it is today and won't go away even after 20 years. Millennials, microservices, and many other drivers mentioned by you are all flavors of the current season. They themselves will get replaced by some other flavors of the season in the next 3-5 years. No way will they impact CBS after 10-20 years.

A Finextra member | 05 March 2018, 08:55

@Joao - I don't think we are as far apart as you think. I don't, for example, think (or say in my comments) that mainframes will necessarily be replaced - more that the applications (CBS, cards switches, ATM driving etc.) that run on them (and on HP NonStop, Stratus etc.) will be replaced.

It is my experience of delivering such replacements that the applications have often been customised so extensively that no one really knows exactly how they work - this is the issue I refer to when I say that we need to understand what we have before we try to turn off the application.

I agree with you that the winners in this phase of the renewal will be those that design new business services to meet the new market requirements and then seek to eliminate the old. 

But as others have commented, this will be a long process over many years, and the old and new will co-exist for some time, which is why better management of the old is still important - if only to smooth the transition to the new.
