We all recognize that the pandemic has made digital transformation a more urgent priority for financial services organizations. With employees working from home and many more customers wanting to do business virtually, the pressure to innovate and roll out digital and virtual services to stay ahead of the competition is intense. However, a major barrier to IT innovation is the legacy application challenge.
One study has found that maintaining legacy systems typically accounts for 78 per cent of a bank’s IT budget, diverting resources away from digital initiatives. And 70 per cent of bankers believe their core processes cannot quickly adapt to change, which hinders transformation.
Modernize or retire?
The term ‘legacy system’ can refer to any software written for older platforms, or in need of modernization. But legacy systems are a mixed bag. They include the old, yet bulletproof, transaction processing workhorses that banks continue to rely on, alongside
the terminally obsolete applications that are no longer actively updated with live data. The former can be transformed using low-code and no-code modernization techniques (a topic for another article). The latter can be safely retired, provided that you first
remove any data that’s needed by the business.
Obsolete applications: a hidden opportunity
The potential for savings through decommissioning obsolete applications is considerable, but often underestimated. The total impact – measured in terms of support, hardware and software costs, operational (in)efficiency and business risk – can be hard to
assess because legacy systems are typically distributed around an organization and owned by different business units.
The distributed nature of the legacy problem, combined with a lack of overall ownership, and a genuine need to keep hold of historical data, has led to financial organizations keeping many systems alive far longer than necessary.
To give an example: one financial services organization that Macro 4 has been working with had accumulated dozens of legacy applications over a number of years, mainly as a result of various mergers and acquisitions. Important historical information from these systems
had to be retained to meet the company’s regulatory compliance obligations, and to allow account managers to view the data when advising their clients. Additionally, whenever a new business application had been introduced, the old application had been kept
running purely to retain access to the data.
This is a common scenario for established financial services providers, especially those undergoing regular organizational restructuring or new systems implementations, which leave them with a growing number of redundant systems.
A systematic approach reduces risk
Concerns about losing access to important data by shutting down legacy systems are an understandable cause of inertia. But if organizations follow a structured decommissioning program, that need not be a problem – and in reality, doing nothing carries a far
greater risk. The longer you hold onto a legacy application, the less likely it is that there will be anybody within the organization who understands how it works, or has the right skills to fix it if it goes wrong. There may also come a point where the legacy
software – or the hardware and operating systems it runs on – can no longer be upgraded. Equally, the security capabilities of legacy applications often lag behind current standards, making them more vulnerable to cyber attacks.
A move to reduce the business risk posed by keeping obsolete legacy systems running is one of the most common reasons for application decommissioning among the financial services organizations that Macro 4 works with. Another common motivator is to stop
spending precious time and money on maintaining systems that no longer add value to the business.
Once the decision to start decommissioning has been made, it can be tempting to jump straight to a technology solution (‘Where shall I put all the data I need to retain?’). While finding a suitable data repository – and providing ongoing access for business
users – is certainly important, the key factor that determines the success of a decommissioning project is the process that is followed. Only by taking a structured, systematic approach to application retirement can you be sure of delivering continuity of
access to the information your business needs.
Work closely with business users
The first step in the decommissioning process is for the IT team to engage with business users to gain a clear understanding of what legacy information needs to be retained. This will include taking a detailed look at how users currently access and work
with the data in each legacy system, and discussing what their requirements will be for the future. This focus on business users is essential: projects often run into resistance when IT teams underestimate how reliant business teams are on historical data for business-critical tasks such as engaging with clients. Understanding the business priorities up front helps to prevent problems further down the line.
Next, it is important to consider how to format the data so it is easy to use once it has been removed from the legacy application. A common mistake is to dump the data straight into a database or other repository where it can no longer be viewed in its
original business context. Separated from the application logic, the data can start to lose meaning. For example, applications often apply extra processing to the data – such as creating on-screen totals, or translating codes and abbreviations into plain English.
It is therefore better to ‘bake in’ this additional information at the time of decommissioning, creating meaningful views of the data that can be understood by business users, without additional interpretation from system experts.
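As a simple illustration of what ‘baking in’ might look like, an extraction step could translate legacy codes into plain English and pre-compute totals at the moment the data is exported, rather than leaving raw codes in the repository. This is a minimal sketch – the code table, field names and currency handling are all invented for illustration, not taken from any particular banking system:

```python
# Hypothetical sketch: translate legacy codes and pre-compute totals at
# extraction time, so the archived rows stay meaningful on their own.

# Invented code table - a real system would load this from the legacy
# application's own reference data.
TRANSACTION_CODES = {
    "DD": "Direct debit",
    "SO": "Standing order",
    "TFR": "Transfer",
}

def bake_in_context(raw_rows):
    """Convert raw legacy records into self-describing rows plus a summary.

    Each raw row is assumed to be (code, amount_in_pence). The output
    spells out the code in plain English and converts the amount to
    pounds, reproducing what the legacy screens used to do.
    """
    rows = []
    total_pence = 0
    for code, amount_pence in raw_rows:
        rows.append({
            "transaction_type": TRANSACTION_CODES.get(code, f"Unknown ({code})"),
            "amount_gbp": amount_pence / 100,
        })
        total_pence += amount_pence
    # The on-screen total the old application would have calculated is
    # stored alongside the rows instead of being lost at decommissioning.
    summary = {"record_count": len(rows), "total_gbp": total_pence / 100}
    return rows, summary

rows, summary = bake_in_context([("DD", 1250), ("SO", 4999), ("TFR", 100000)])
```

The point of the sketch is the design choice: interpretation happens once, at extraction, so business users never need a system expert to decode the archive.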
Focus on compliance and usability
After extracting the data and applying any necessary formatting to keep it in context, it can be transferred to a secure repository such as an enterprise content management (ECM) system – either on premises or in the cloud – ready for user access. The original
legacy application can then be retired. To maintain regulatory compliance, the repository should enable data to be actively managed throughout its lifecycle, including retention for the appropriate statutory period, data protection in line with the GDPR and
financial regulations, and secure storage with strong user authentication and access controls. Another important consideration is ease of access for users – so a simple, intuitive interface is a must.
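To make the lifecycle-management requirement concrete, a repository typically attaches a retention rule to each record type and checks it before anything can be deleted. The sketch below is purely illustrative – the record types and retention periods are invented examples, not statutory guidance:

```python
from datetime import date

# Hypothetical retention rules, in years - real periods would come from
# the applicable regulations and the organization's retention schedule.
RETENTION_YEARS = {
    "account_statement": 7,
    "kyc_document": 5,
}

def is_retention_expired(record_type, closed_on, today):
    """True once the record's retention period has elapsed, meaning it can
    be considered for secure deletion (subject to any legal holds).

    Day is clamped to 28 to sidestep leap-year edge cases in this sketch.
    """
    years = RETENTION_YEARS[record_type]
    expiry = date(closed_on.year + years, closed_on.month, min(closed_on.day, 28))
    return today >= expiry
```

Driving deletion from explicit rules like these, rather than ad-hoc clean-ups, is what allows the repository to demonstrate compliance for the whole statutory period and no longer.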
The success of a decommissioning project should be measured by the positive impact on business users, alongside the more obvious financial benefits of eliminating support costs and freeing up IT resources for digital initiatives. Following decommissioning,
teams who may have previously struggled to locate information stored in multiple legacy applications should find that they are able to access it quickly and easily from one central location, with no loss of context and no need for retraining.
Renew, retire, repeat
Legacy systems are the ongoing and inevitable result of organizational change and technology advancement. Mergers and acquisitions continue to give rise to duplicate systems and even today’s shiny new banking applications will be superseded eventually. Obsolescence
is a natural part of the technology lifecycle, so it is important to be prepared. However, while most financial organizations have rigorous processes in place for implementing new applications, very few have a systematic approach for retiring them at end of
life. That poses a risk – but also an opportunity. Putting in place a repeatable program to decommission legacy applications offers a competitive advantage: it allows businesses to simplify their IT landscape, eliminate the redundant technology that inhibits
progress and focus more of their energy on innovation.