What does the future hold for a hybrid, multi-cloud future?

In the face of incessant economic volatility, political challenges and the pressure to execute continuous, complex technological change, the financial services industry must acknowledge that the cloud will frame its future.

In fact, $5 billion of enterprise IT spending will transfer from legacy infrastructure to cloud services by 2022, according to the NetApp whitepaper ‘Data Fabric: Weaving Together A Hybrid, Multicloud Future.’

While the numbers confirm a trend toward cloud, they also cast doubt on forecasts of a cloud-only future. Financial organisations must embrace a fusion of on-premises and public cloud, referred to as a hybrid model, and utilise services from two or more cloud providers, known as a multi-cloud approach. This co-existence will result in data silos, management complexity and fragmented user experiences, but these can be remedied with a bespoke data fabric.

A data fabric is a set of data services built on a unified architecture, providing consistent capabilities across chosen endpoints that span on-premises infrastructure and multiple cloud environments. It integrates and simplifies data management, accelerating digital transformation.
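
As a loose illustration of the concept, the sketch below models a data fabric as one consistent interface over interchangeable endpoints. The class and function names are hypothetical and do not come from NetApp’s products.

```python
# Hypothetical sketch of the data-fabric idea: one consistent interface
# over interchangeable endpoints. Names are illustrative, not NetApp APIs.
from abc import ABC, abstractmethod


class DataEndpoint(ABC):
    """Any storage endpoint: an on-premises array or a cloud object store."""

    @abstractmethod
    def read(self, key: str) -> bytes: ...

    @abstractmethod
    def write(self, key: str, data: bytes) -> None: ...


class OnPremVolume(DataEndpoint):
    def __init__(self) -> None:
        self._store = {}  # stands in for a local storage array

    def read(self, key: str) -> bytes:
        return self._store[key]

    def write(self, key: str, data: bytes) -> None:
        self._store[key] = data


class CloudBucket(DataEndpoint):
    def __init__(self, provider: str) -> None:
        self.provider = provider
        self._store = {}  # stands in for a provider's object store

    def read(self, key: str) -> bytes:
        return self._store[key]

    def write(self, key: str, data: bytes) -> None:
        self._store[key] = data


def replicate(src: DataEndpoint, dst: DataEndpoint, key: str) -> None:
    """Move a dataset to wherever the analytics or ML workload runs."""
    dst.write(key, src.read(key))


on_prem = OnPremVolume()
aws = CloudBucket("aws")
on_prem.write("transactions/2019.csv", b"...")
replicate(on_prem, aws, "transactions/2019.csv")  # burst out for analytics
replicate(aws, on_prem, "transactions/2019.csv")  # repatriate on demand
```

Because every endpoint exposes the same operations, the same replication logic covers bursting out to a cloud, moving between clouds and repatriating data on-premises.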

Finextra Research spoke to NetApp’s principal consultant John Hickman, global account strategist Benoit Malherbe, senior solutions engineering manager Steven Rackham and cloud solutions marketing manager Martin Warren about the benefits of building a bespoke data fabric, decoupling the cloud from location, controlling where applications run and where data resides, and of course, the importance of repatriating data to on-premises infrastructure and of bi-directional portability.

The cloud umbrella

While there has been a growing appetite for cloud technology across the financial services industry worldwide, many banks have adopted a cloud-first strategy built on a hybrid, multi-cloud approach with a strong focus on keeping data in the private cloud, as Malherbe explains.

Furthermore, the sharp rise in the use of public cloud platform-as-a-service and infrastructure-as-a-service products, such as relational databases and data warehouses, combined with private cloud options from the public cloud providers and support from regulators, has made financial institutions less hesitant to use the cloud.

Rackham takes this point further: “Banks chose to implement a cloud-first strategy without really knowing how, what, when, where; referring to it as a ‘strategy’ was optimistic. What they have started to do now is look at how they can use cloud to change their business, acknowledge how the traditional banking marketplace and competition is changing and adapt to that.”

Banks are also diving deeper into the individual business cases of different parts of the organisation and looking into what would work in a public cloud, a private cloud or a hybrid model, to see how different areas of this “cloud umbrella” can be used. “Because the competition has changed so much, they’re having to adapt differently, which is difficult due to a decades-old mentality. Culturally, it’s as big a change as anything.”

This change has manifested itself as a shift to hybrid cloud. As Warren explains, 90% of business leaders are looking to use a combination of public and private cloud.

A bespoke data fabric

Amid this period of disruption, in which new players are entering the financial services industry, it is crucial to highlight that these organisations think about and use data very differently. As Hickman says, new entrants are at a disadvantage in comparison to their incumbent counterparts, which have decades’ worth of customer and transaction data. However, traditional banks are “unable to learn from the existing systems that they have.

“The notion of a data fabric is being able to take data from where it is staged and run and transfer it somewhere else where you can do something different, whether that is advanced analytics or machine learning - extracting value.

“If you were attempting to build this model yourself and without a data fabric, it would take 18 or 24 months before you would be able to tell if there was any value to be extracted. Using a data fabric would help an incumbent bank speed up the discovery to value inordinately,” Hickman says.

Rackham adds: “For traditional banks, what is key moving forward is flexibility and adaptability. A data fabric gives banks the ability to change their business model and the way in which they are approaching the market, in a way that they couldn’t do if they stuck to their on-premise environment.”

Using a single cloud provider is the same as using a single on-premises solution; it does not provide the bank with the additional flexibility and adaptability needed in times of economic volatility. Being in control of data and having the ability to move it to where it is most appropriate is important, not only between the data centre and the cloud, but also between cloud providers.
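
To make “moving data to where it is most appropriate” concrete, here is a toy placement policy; the regions, rules and locations are invented for illustration.

```python
# Toy placement policy: decide where a dataset should live based on simple
# attributes. Regions, rules and locations are invented for illustration.
def choose_location(sovereignty_region: str,
                    latency_sensitive: bool,
                    burst_analytics: bool) -> str:
    if sovereignty_region == "EU" and latency_sensitive:
        return "on-premises data centre"   # keep it close and compliant
    if burst_analytics:
        return "public cloud (best-priced analytics provider this quarter)"
    return "private cloud"


print(choose_location("EU", latency_sensitive=True, burst_analytics=False))
# -> on-premises data centre
```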

Warren says: “Good data management means that you can be more efficient in the way that you manage your business: you know what you’ve got, you know where it’s stored and therefore, when compliance comes in and the regulator comes looking, you’re already lean and mean in the way that data is structured.”

Renting the cloud

Over the years, those in the financial services industry have been encouraged to rationalise the number of technology vendors they work with. “The more vendors you have, the more contracts and overarching terms of business there are that need to be dealt with. You don’t see this capital cost as a line entry on a P&L, but what you do see is the complexity, which results in the business slowing down,” Hickman explains.

IT departments, or certain infrastructures, will soon move to a consumption-led approach, following the cloud’s current pay-as-you-go model. This simplification will become apparent as banks shift from a Capex model to an Opex model where IT equipment is concerned, and as regulators ask to see how much is being spent on leasing equipment.
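
A back-of-the-envelope comparison illustrates the Capex-to-Opex shift; every figure below is made up.

```python
# Back-of-the-envelope Capex vs Opex comparison; every figure is made up.
capex_purchase = 500_000            # up-front hardware spend
amortisation_years = 5
capex_per_year = capex_purchase / amortisation_years   # 100,000 per year

opex_rate_per_hour = 9.0            # pay-as-you-go rate
hours_consumed_per_year = 10_000    # pay only for what is actually used
opex_per_year = opex_rate_per_hour * hours_consumed_per_year  # 90,000

print(f"Capex: {capex_per_year:,.0f}/yr vs Opex: {opex_per_year:,.0f}/yr")
```

Beyond the totals, the Opex figure appears as a clean, usage-based line entry, which is exactly the visibility a consumption-led model promises.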

Procurement departments are currently tied up in the complexity of “how much they owe, for how much of it and how much they are paying for it. Stripping a lot of that cost and complexity out of the customer’s business is going to become important, helping to standardise as much as possible. This transition will take time, but it will be into a consumption-led approach,” as Hickman explores.

Taking this further, Rackham speaks to the challenges that emerge when moving data from one environment to another, especially when sharing business intelligence across silos, focusing on issues of data latency, data sovereignty and data transfer. “You can’t lift and shift everything. Or you could, but it very quickly becomes economically not viable.”
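
A rough worked example shows why wholesale movement quickly stops adding up; the dataset size and egress price are assumptions, not real provider rates.

```python
# Rough illustration of lift-and-shift economics; the egress price is an
# assumption, not a real provider rate.
dataset_tb = 500
egress_price_per_gb = 0.08          # assumed cost to move 1 GB out

one_full_move = dataset_tb * 1024 * egress_price_per_gb
print(f"One full copy out: {one_full_move:,.0f}")   # ~40,960

# Shuffling the same data between silos every month multiplies the bill,
# before counting transfer time, latency and sovereignty checks.
print(f"Twelve monthly moves: {one_full_move * 12:,.0f}")
```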

While Rackham believes that ‘lifting and shifting’ does not help the business in terms of data insight and data awareness, Hickman points out that it is also a step laden with risk. “Prior to the ability to move data, nobody moved data, so at best, it could be an asset because you understood where it sat, you secured the perimeter, no bad agents were allowed access to that infrastructure. But as soon as you move it, the data becomes a liability because it is exposed.

“As well as the logistical difficulty of picking it up and shifting it from one place to another, you need to exquisitely understand what this means because you’re already at risk as soon as you’ve started moving it. I’d much rather know about the risk before I act.”

Warren summarises by saying that when migrating to the public cloud, “you’re effectively in someone else’s data centre and you need to understand how to manage that both practically and economically.” He adds that many organisations end up paying more than they need to in the cloud due to a lack of insight into which applications and storage services they are using, and at what cost. It is therefore important that a bank has a clear view and control of its entire enterprise, across both cloud and on-premises, to help avoid cloud sprawl and optimise costs and efficiencies.
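
The visibility Warren describes can start with something as simple as rolling spend up by application; the usage records below are invented for illustration.

```python
# Toy spend roll-up per application; the usage records are invented.
from collections import defaultdict

usage = [
    {"app": "risk-engine",          "provider": "aws",   "monthly_cost": 12_400},
    {"app": "risk-engine",          "provider": "azure", "monthly_cost": 3_100},
    {"app": "crm-archive",          "provider": "aws",   "monthly_cost": 7_900},
    {"app": "orphaned-dev-cluster", "provider": "azure", "monthly_cost": 5_600},
]

totals = defaultdict(float)
for record in usage:
    totals[record["app"]] += record["monthly_cost"]

# Anything unowned or unexpected in this list is a cloud-sprawl candidate.
for app, cost in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{app:22s} {cost:>8,.0f}/month")
```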

Malherbe highlights that this is why the cloud is under regulatory scrutiny. “It is new for regulators to control, and they are working to make this environment safer by providing risk management guidance to banks and other financial institutions. Returning to the topic of public and private cloud, it is also at the regulator’s request that banks have to implement an exit strategy in order to bring data back on-premises if necessary.”
