Can financial services afford the cost of avoiding legacy retirement?


Over the past 20 years, every time a new technology has been introduced or has evolved, financial services firms have bootstrapped their legacy systems to try to stay afloat. Increasingly, they're finding that isn't sufficient. Businesses today need deeper analytics, better customer insight and more value from their data, and that can't be achieved with their current ultra-structured, inflexible core system environment.

So, if you're a financial services or insurance company, one of your top priorities is to get data out of those legacy systems and then gradually decommission them. But this is one of the riskier propositions in enterprise technology. It's risky in terms of potential data loss. It's risky in terms of the upstream and downstream systems that need access to that data. And it's delicate, because users need time to acclimatise to the new data and systems exposed in the revamped applications.

So what can be done? Above all, there needs to be a way to insulate the organisation from the risk of this shift. In my opinion, that can only be done with a hub technology. A hub architecture decouples data sources from destinations, so an application can publish data once and support a one-to-many relationship with the applications that consume it. Crucially, the hub must support both publishing data to a central catalogue and subscribing to the data within it, as well as integrating that data with other systems. Data can be validated once on publication and then replicated to the other applications. Providing that publish-and-subscribe buffer between old and new systems, together with the connection to every dependent system that feeds data into the process, is critical. If you can find a technology that does all of that, alongside data quality processes, you're onto a winner.
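To make the pattern concrete, here is a minimal, hypothetical sketch in Python of the publish-once, validate-once, one-to-many behaviour described above. The DataHub class, topic name and validation rule are illustrative assumptions, not any vendor's product or API.

```python
from typing import Callable, Dict, List

Record = Dict[str, object]
Validator = Callable[[Record], bool]
Subscriber = Callable[[Record], None]


class DataHub:
    """Toy data hub: validate once on publish, then fan out to every subscriber."""

    def __init__(self) -> None:
        self._validators: Dict[str, List[Validator]] = {}
        self._subscribers: Dict[str, List[Subscriber]] = {}

    def add_validator(self, topic: str, rule: Validator) -> None:
        self._validators.setdefault(topic, []).append(rule)

    def subscribe(self, topic: str, handler: Subscriber) -> None:
        self._subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic: str, record: Record) -> None:
        # Data quality is enforced once, at the point of publication.
        for rule in self._validators.get(topic, []):
            if not rule(record):
                raise ValueError(f"record failed validation on '{topic}': {record}")
        # The same validated record is then replicated to every consumer.
        for handler in self._subscribers.get(topic, []):
            handler(record)


# Illustrative usage: a legacy policy system publishes once; the new core
# system and a downstream reporting feed both consume the same record.
hub = DataHub()
hub.add_validator("policies", lambda r: bool(r.get("policy_id")))
hub.subscribe("policies", lambda r: print("new core system received:", r))
hub.subscribe("policies", lambda r: print("reporting feed received:", r))
hub.publish("policies", {"policy_id": "P-1001", "premium": 420.0})
```

The point of the sketch is the decoupling: the publishing system knows nothing about how many consumers exist, and every consumer receives data that has already passed the same quality checks.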

Most of all, don't listen to vendors who tell you that legacy migration is a "click a button and move the data" type of operation. Financial services and insurance businesses need to be vigilant about the systems connected to both the old and the new sources. The enterprise needs a robust data migration practice, backed by strong technology. Your data hub should let you track data changes as you migrate, and also control the integration and data quality tasks and processes after the move.
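As one illustration of that kind of post-migration control, the hypothetical Python snippet below reconciles a legacy extract against the migrated target, flagging records that went missing or whose checked fields drifted during the move. The record and field names are assumed for the example, not a prescribed process.

```python
from typing import Dict, Iterable, List, Tuple

Row = Dict[str, object]


def reconcile(legacy_rows: Iterable[Row], migrated_rows: Iterable[Row],
              key: str, fields: Tuple[str, ...]) -> Dict[str, List[object]]:
    """Compare a legacy extract with the migrated data set.

    Returns the keys of records missing from the target and of records
    whose checked fields no longer match the legacy values.
    """
    legacy = {row[key]: row for row in legacy_rows}
    migrated = {row[key]: row for row in migrated_rows}

    missing = [k for k in legacy if k not in migrated]
    mismatched = [
        k for k in legacy
        if k in migrated and any(legacy[k].get(f) != migrated[k].get(f) for f in fields)
    ]
    return {"missing": missing, "mismatched": mismatched}


# Illustrative usage with assumed record and field names.
report = reconcile(
    legacy_rows=[{"policy_id": "P-1001", "premium": 420.0}],
    migrated_rows=[{"policy_id": "P-1001", "premium": 419.0}],
    key="policy_id",
    fields=("premium",),
)
print(report)  # {'missing': [], 'mismatched': ['P-1001']}
```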

So let's listen to the experts. The majority agree that legacy retirement or migration is a big investment. But can financial services afford the cost of not making it?
