
Understand the value of data before you try to use it

Information is one of the most important weapons in the armoury of any bank. Most institutions understand this well, as evidenced by the huge investment they have made in trying to leverage the oceans of data at their disposal for commercial benefit. In this endeavour they have not always achieved the success they hoped for. Even where they have managed to extract insights from their data, it has all too often taken so long that the practical value of those insights has depreciated. The moment has passed.

Where banks may be letting themselves down is in not spending enough time and thought on understanding the value of that data in the context of their business objectives. It might sound obvious, but sorting a small amount of good data from a huge mass of less valuable data is an essential preliminary to unlocking significant benefits from it. Too many are charging into expensive platform implementation programmes without doing the due diligence to understand data's value at the sharp end of the operation, and so they are seeing little or no upside.

Let's consider why this matters so much for the long-term success of banks. Without the tools to assess the business value of their data, banks find themselves short of the high-quality data that is central to profiting from crucial technologies like artificial intelligence (AI) and machine learning (ML). AI and ML are central to growing revenue, targeting customers, creating efficiencies and mitigating threats like financial crime and cyberattacks. They count when it comes to analysing trends ahead of the competition and keeping abreast of regulatory pressures. Good-quality data is essential if AI and ML are to address all these challenges and fulfil their promise as truly disruptive forces.

Recent research found that 53% of survey respondents identified a lack of sufficient high-quality data as an impediment to successful AI and ML implementation. If banks want to develop AI-driven applications that are relevant, accurate and scalable, then they must be sure that the data those applications are built on is of the first rank, and also easily available where it is needed. It needs to be accessible in near real time within the organisation, and also at the disposal of a broader ecosystem that includes partners, suppliers and, of course, customers.

An obvious way to understand the value of data is in terms of how it can be monetised: data should be assessed in terms of whether it will empower an organisation to generate future economic benefit. Consulting firm PwC divides data monetisation strategies into three groups: first, using data to enhance the current business, perhaps enriching an existing offer with new data modelling; second, using data to enter an adjacent business opportunity, for example by finding new partners across the value chain or generating fresh insights to drive better return on investment; third, developing incremental business by leveraging new data sets and analytics tools.
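To make the three categories concrete, a team might keep a simple tagged inventory of candidate data initiatives. The sketch below is purely illustrative: the enum names and example entries are invented for this post, not drawn from PwC or any particular bank.

```python
from enum import Enum

class MonetisationStrategy(Enum):
    """PwC-style grouping of data monetisation strategies (labels paraphrased)."""
    ENHANCE_CURRENT = "enhance current business"
    ADJACENT_OPPORTUNITY = "enter adjacent business opportunity"
    INCREMENTAL_BUSINESS = "develop incremental business"

# Hypothetical inventory of data initiatives, each tagged with the strategy it serves.
initiatives = [
    ("enrich credit-risk scoring with transaction data", MonetisationStrategy.ENHANCE_CURRENT),
    ("share anonymised spend insights with retail partners", MonetisationStrategy.ADJACENT_OPPORTUNITY),
    ("launch an analytics product on new open-banking data sets", MonetisationStrategy.INCREMENTAL_BUSINESS),
]

# Group the inventory by strategy so gaps in any one category are visible.
for strategy in MonetisationStrategy:
    print(strategy.value)
    for description, tag in initiatives:
        if tag is strategy:
            print(f"  - {description}")
```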

So what techniques and tools are at the disposal of banks as they seek to prioritise the assessment of data value? With the rise of active metadata management, the continuous analysis of all available metadata about users, data and systems/infrastructure, combined with a virtualised data architecture, banks can avoid the traditional pitfall of deploying too many use cases and technologies at the same time and ending up profiting from none of them. Or worse, getting caught in the tech headlights and ending up doing nothing. Active metadata management can also help banks deploy scarce data engineering resources efficiently so that value is delivered early. If banks can manage this, backed by C-level sponsorship, then they have made great strides towards measurable data-fuelled gains.
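As a loose illustration of how active metadata can be turned into a prioritisation signal, the minimal sketch below combines usage, quality, freshness and reach metrics into a single value score per dataset. This is not any vendor's product or API; all field names, weights and example figures are assumptions chosen for illustration.

```python
from dataclasses import dataclass

@dataclass
class DatasetMetadata:
    """Active metadata collected continuously for one dataset (fields hypothetical)."""
    name: str
    monthly_queries: int       # how often users actually touch the data
    completeness: float        # share of non-null, validated records (0 to 1)
    freshness_hours: float     # age of the most recent load
    downstream_consumers: int  # systems and reports that depend on it

def value_score(md: DatasetMetadata) -> float:
    """Blend usage, quality, timeliness and reach into one prioritisation score.

    The caps and weights are illustrative; a real programme would calibrate
    them against observed business outcomes.
    """
    usage = min(md.monthly_queries / 1000, 1.0)
    timeliness = 1.0 / (1.0 + md.freshness_hours / 24)
    reach = min(md.downstream_consumers / 20, 1.0)
    return 0.4 * usage + 0.3 * md.completeness + 0.2 * timeliness + 0.1 * reach

# Rank candidate datasets so the highest-value data is tackled first.
datasets = [
    DatasetMetadata("card_transactions", 12_000, 0.97, 2, 35),
    DatasetMetadata("legacy_branch_logs", 40, 0.60, 720, 1),
]
for md in sorted(datasets, key=value_score, reverse=True):
    print(f"{md.name}: {value_score(md):.2f}")
```

Scoring datasets this way gives a defensible answer to "which data do we work on first?" before any platform spend is committed, which is exactly the pitfall described above.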

Insurance provider AA Ireland is a great example of a financial services business that has successfully rethought its approach to data by addressing its value and its practical application to business problems. The software it formerly deployed to model data could take up to a year to draw usable conclusions. The replacement solution takes that power out of the dusty realms of the IT department and gives it to employees on the ground who understand how the business works. They can build their own models and use them in near real time to win new business and generate measurable value for the operation. Models can be spun up to identify fraud attempts or create embedded customer services, and then rolled out instantly in live environments. Opportunities and risks are visible to all: business insight and action, driven by data.

There is so much to gain here, and so much to lose by continuing to get it wrong. PwC estimates that organisations taking the right approach to data are twice as likely to be in the top quartile of performance within their respective industries.
