The insurance industry is ripe for innovation. Tempted by the fruits of technological change, insurers are pursuing greater opportunities in data collection, smart adaptive policies and fraud detection, all of which can lead to better risk identification and mitigation.
For traditional players, existing IT infrastructure stands firmly in the way of this drive. Legacy mainframe systems, commonly buried deep in the bedrock of enterprise IT, are at the heart of their challenge.
Many insurers rely on mainframe applications, developed over decades and tied to policy-based algorithms, to manage fundamental business processes: long-term policies, customer information controls and a range of background (but vital) transactional processes.
And this is where the challenge lies: the core business processes of large insurance companies are usually tightly coupled to their underlying application portfolio. This means their ability to develop new capabilities is tied to how well they can modernise those core applications.
Yes, these old systems run reliably, but attempts to build agile digital services, analytics and AI on top of them often surface technological challenges too vast to overcome. So intertwined is this web of legacy languages, data types and interdependencies that rewriting even individual programs to integrate with modern distributed systems based on Java, Linux and the cloud has often proved fruitless. Rewriting or recompiling one program to run on a different operating system typically means transforming the entire portfolio.
For this reason, digital transformation is still widely perceived as an uphill battle, driven by the prevailing assumption that it requires a ‘big-bang’ overhaul of IT, with little recognition of the complications involved in modernising a complex web of intertwined systems and data.
This assumption has unfortunately left many large insurers reluctant to change and modernise legacy technology: throwing out the old in favour of wholesale migration or all-encompassing package solutions is seen as simply too risky or expensive. In other areas of the financial sector, horror stories such as the infamous 2018 TSB IT upgrade have dampened the appetite of many enterprises for modernisation, reinforcing the perception that a big ‘transformative’ leap may be required.
Recent technological developments have surfaced a way around some of these challenges. Technologies such as the Software Defined Mainframe enable organisations to bring legacy application modernisation into the same development pipelines used for modern application development. So sophisticated are these systems that the only difference between modernising legacy applications running on a Software Defined Mainframe and those on any other Linux-based system is the language of choice.
Organisations can use the same development pipeline technologies, development methodologies, open-source projects and even organisational structures. These mainframe applications, which no longer need be referred to as legacy, can evolve just as quickly as the Java applications that target the latest containerised digital initiative.
Rather than adopting an ‘all or nothing’ mindset, a Software Defined Mainframe allows the insurance industry to modernise its system-of-record applications incrementally, in small steps. This more graceful approach delivers gradual benefits whilst reducing the risk of a “big bang” migration, making modernisation a continuum within the company rather than a destination.
Insurers must first decide which products and services they still want to offer, and in what form. Based on this analysis, they can determine which portfolio systems should be migrated, which are no longer needed and which new ones should be introduced. In each case, the process offers a straightforward shift of the existing inventory system, without large changes, onto an effective foundation on which to modernise applications as they see fit.
The end result brings a whole host of benefits for incumbent insurers. Infrastructure modernisation delivers the expected benefits of reduced costs and an escape route from mainframe vendor lock-in, but the real value comes in the liberation of legacy applications and their associated data. Whilst mainframe-native applications in the insurance world are typically associated with ‘bread and butter’ business processes, the data within them is the lifeblood of the business, and failure to capitalise on it is a missed opportunity. When mainframe data is liberated from its legacy enclosure, it can be used for modern data analytics and combined with new forms of customer data to achieve true business intelligence for innovation.
The opening up of core insurance applications to modern environments and development practices is essential for faster application and service iteration, a prerequisite for keeping pace in the market as competition increases and customer demands intensify.
Until now, the most common method of legacy IT modernisation has been procrastination. Now is the time for insurance firms to embrace the tools at their disposal and reap the benefits of graceful, incremental migration to their technological future.