
Can financial services afford the cost of avoiding legacy retirement?

Over the past 20 years, every time a new technology has been introduced or has evolved, financial services firms have been bootstrapping legacy systems to try to stay afloat. But, increasingly, they're finding that's not sufficient. Today, businesses need deeper analytics, better customer insight and more value from their data. And that can't be achieved with their current ultra-structured and inflexible core system environments.

So, now, if you're a financial services or insurance company, one of your top priorities is to get data out of those legacy systems and then slowly decommission them. But this is a risky proposition in enterprise technology. It's risky in terms of potential data loss. It's risky in terms of the systems upstream and downstream that need access to that data. And finally, the process is delicate, as users need to acclimatise to accessing data and systems through the revamped applications.

So what can be done? Foremost, there needs to be a way to insulate the organisation from the risk of this shift. In my opinion, this can only be done with a hub technology. A hub architecture decouples data sources from destinations, enabling applications to publish once and effortlessly support a one-to-many relationship with consuming applications. Crucially, the hub must support both publishing data to a central catalogue and subscribing to the data within it, as well as integrating that data with other systems. Data can be validated once on publication and then replicated to other applications. Providing that publish-and-subscribe buffer between old and new systems, as well as the connection to all other dependent systems that feed data into the process, is critical. If you can find a technology that does all that, alongside data quality processes, then you're onto a winner.
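To make the idea concrete, here is a minimal sketch of that publish-once, validate-once, one-to-many pattern. All names here (DataHub, validate_policy, the record fields) are hypothetical, for illustration only; a real hub product would add persistence, security and delivery guarantees.

```python
# Minimal sketch of a publish/subscribe data hub (all names hypothetical).
# A record is validated once at publication, stored in a central catalogue,
# then fanned out to every subscriber -- decoupling sources from destinations.

class DataHub:
    def __init__(self):
        self.subscribers = {}   # topic -> list of callbacks
        self.catalogue = {}     # topic -> list of validated records

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, record, validate):
        # Validate once, at the point of publication...
        errors = validate(record)
        if errors:
            raise ValueError(f"rejected record for '{topic}': {errors}")
        # ...then store in the catalogue and replicate one-to-many.
        self.catalogue.setdefault(topic, []).append(record)
        for deliver in self.subscribers.get(topic, []):
            deliver(record)

def validate_policy(record):
    """Toy data-quality rule: required fields must be present."""
    return [f for f in ("policy_id", "holder") if f not in record]

hub = DataHub()
received = []
hub.subscribe("policies", received.append)   # e.g. the new core system
hub.subscribe("policies", lambda r: None)    # e.g. a downstream report feed

hub.publish("policies", {"policy_id": "P-1", "holder": "A. Smith"},
            validate_policy)
```

Because the old system publishes into the hub rather than talking to each consumer directly, new applications can subscribe and legacy ones can be retired without either side knowing about the other.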

Most of all, don't listen to vendors that tell you legacy migration is a "click a button and move the data" type of operation. Financial services and insurance businesses need to be vigilant about the systems connected to both the old and new sources. The enterprise needs a robust data migration practice, including great technology. Your data hub should let you not just track data changes as you migrate, but also control the integration and data quality tasks and processes after the move.
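One piece of that post-move quality control can be sketched simply: reconcile what left the legacy system against what landed in the new one. The function and field names below are hypothetical; the point is that comparing per-record fingerprints surfaces exactly which records went missing, appeared unexpectedly, or drifted during the move.

```python
# Hypothetical post-migration reconciliation check: compare per-record
# hashes between the legacy extract and the new system's load.

import hashlib

def fingerprint(record):
    """Stable hash of a record's sorted fields."""
    canonical = "|".join(f"{k}={record[k]}" for k in sorted(record))
    return hashlib.sha256(canonical.encode()).hexdigest()

def reconcile(legacy_records, migrated_records, key="policy_id"):
    legacy = {r[key]: fingerprint(r) for r in legacy_records}
    migrated = {r[key]: fingerprint(r) for r in migrated_records}
    return {
        "missing": sorted(legacy.keys() - migrated.keys()),
        "unexpected": sorted(migrated.keys() - legacy.keys()),
        "changed": sorted(k for k in legacy.keys() & migrated.keys()
                          if legacy[k] != migrated[k]),
    }

old = [{"policy_id": "P-1", "holder": "A. Smith"},
       {"policy_id": "P-2", "holder": "B. Jones"}]
new = [{"policy_id": "P-1", "holder": "A. Smith"},
       {"policy_id": "P-2", "holder": "B. Janes"}]  # typo introduced in load

report = reconcile(old, new)
```

Run continuously during a phased migration, a check like this gives the evidence needed before anyone dares switch off the old system.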

So let's listen to the experts. The majority agree that legacy retirement or migration is a big investment. But can financial services afford the cost of not making it?




Comments: (2)

Ketharaman Swaminathan - GTM360 Marketing Solutions - Pune | 29 July, 2015, 16:36

Legacy hasn't come in the way of banking becoming the most profitable industry, or of it introducing a slew of innovative products over the decades, like credit cards, ATMs, Mobile RDC, CDO, CDS, and so on.

FORTUNE Magazine article

6 Reasons Why Banks Can't Transform Legacy Applications

Now, I know open system vendors are facing stunted growth because banks have stuck to legacy. As a Finextra reader, should I care?

Graham Seel - BankTech Consulting - Concord | 31 July, 2015, 20:29

There is no question that ultimately the commercial banks need to migrate away from legacy technologies. They've known that for decades. The problem is the age-old "how do you fund infrastructure?" problem - creating a business case is notoriously difficult. There would be great value in some bright accountant coming up with a credible way to value the future operational risk, future lost revenue opportunities and future reduction in retained revenues that can be attributed to a failure to update technology. PhD thesis, anyone?
