
Making Data Work for Banks

Data is everywhere. Every day, 2.5 quintillion bytes of data are created around the world. To put that into perspective, a quintillion is a one followed by 18 zeros. In 2013, we were already creating enough data to fill a stack of iPads reaching two-thirds of the way to the Moon. By 2020, the stack of iPads needed to store the world's data could reach to the Moon and back three times. For banks, these figures point to something practical: the transactional data created through every customer interaction, whether monetary or correspondence, holds vital information for improving customer experience and operational efficiency. Evaluating how that data is stored and analysed is essential to improving online banking performance.

So how is this managed most effectively? First, businesses need safe and reliable data storage. Databases are the foundation on which banks build a strong architectural framework, and that framework determines how accessible data is for analytics. As Big Data and Internet of Things technologies continue to roll out across banking, most visibly in the growing adoption of mobile banking, it becomes clear that the days of using one database for all business needs are over. Akin to the urban legend of the 'library that sank', a business that does not account for the volume and variety of its data is bound to overload and sink under the weight of its treasured archive.

Polyglot Data Management answers this call for a new approach to data use and storage. Like the definition of a polyglot, someone who speaks several languages fluently, the strategy holds that a data management framework is better served by using several different database "languages". By drawing on the distinct problem-solving capabilities of the databases they deploy, from NoSQL and Relational Database Management Systems (RDBMS) to in-memory databases, banks can avoid overworked and inefficient systems. In their place they gain agility, faster response times and effective analytics, leading to better customer experience and retention. UK banks such as RBS are already investing in polyglot data management, recognising the value of open source databases like MariaDB, MySQL and MongoDB for evaluating data and improving conversations with customers.
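
To make the idea concrete, here is a minimal sketch in Python of what polyglot persistence can look like in practice. The account IDs, table and collection names are hypothetical; sqlite3 stands in for a relational engine such as MariaDB or MySQL so the sketch stays self-contained, and the pymongo call assumes a MongoDB instance is reachable locally.

```python
import sqlite3
from datetime import datetime, timezone

from pymongo import MongoClient  # assumes a local MongoDB instance is running

# Relational store for structured, monetary records.
# sqlite3 is a stand-in here for MariaDB/MySQL so the sketch runs anywhere.
ledger = sqlite3.connect("ledger.db")
ledger.execute(
    """CREATE TABLE IF NOT EXISTS payments (
           id INTEGER PRIMARY KEY,
           account_id TEXT NOT NULL,
           amount_pence INTEGER NOT NULL,
           created_at TEXT NOT NULL
       )"""
)

# Document store for flexible, semi-structured customer correspondence.
interactions = MongoClient("mongodb://localhost:27017")["bank"]["interactions"]


def record_payment(account_id: str, amount_pence: int) -> None:
    """Write a monetary transaction to the relational ledger."""
    ledger.execute(
        "INSERT INTO payments (account_id, amount_pence, created_at) VALUES (?, ?, ?)",
        (account_id, amount_pence, datetime.now(timezone.utc).isoformat()),
    )
    ledger.commit()


def record_interaction(account_id: str, channel: str, message: str) -> None:
    """Write a correspondence event (chat, email, app message) to the document store."""
    interactions.insert_one(
        {
            "account_id": account_id,
            "channel": channel,
            "message": message,
            "created_at": datetime.now(timezone.utc),
        }
    )


record_payment("ACC-1001", 2500)
record_interaction("ACC-1001", "mobile_app", "Customer asked about overdraft limits")
```

The point is not the specific engines but the division of labour: structured monetary records go to a store that guarantees transactional integrity, while free-form correspondence goes to a store that tolerates changing shapes.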

A second requirement for effective data management is availability. Banks depend on their databases to handle millions of transactions around the clock, so it is paramount that data is swiftly obtainable. Failures here have caused customers to miss mortgage and household bill payments, making unwanted headline news for the banks involved.

A mixture of databases enables greater availability and streamlined analytics, with each database running its own natively built analytics and the results consolidated on one shared platform. A middle layer sitting on top of the different databases helps gather data quickly for front-line staff and analytics platforms. Before adopting such a system, banks should define which data is critical to business operations, such as financial records and user activity logs. That insight then determines which types of data each part of the application creates, and which database should be employed for each task.
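
As a hedged illustration of that middle layer, the sketch below routes each record type to a designated store and consolidates every store's view of a customer for front-line staff. The record types, store names and the InMemoryStore stand-ins are hypothetical; in production each entry would wrap a real database client.

```python
from collections import defaultdict
from typing import Any, Dict, List


class InMemoryStore:
    """Stand-in for a real database client; keeps records per account in memory."""

    def __init__(self) -> None:
        self._records: Dict[str, List[Dict[str, Any]]] = defaultdict(list)

    def save(self, account_id: str, record: Dict[str, Any]) -> None:
        self._records[account_id].append(record)

    def find(self, account_id: str) -> List[Dict[str, Any]]:
        return list(self._records[account_id])


# Hypothetical mapping from record type to the store best suited to it.
STORE_FOR = {
    "payment": "relational",         # critical financial records need ACID guarantees
    "interaction": "document",       # schema-light customer correspondence
    "session_metrics": "in_memory",  # hot, short-lived activity data
}


class CustomerDataFacade:
    """Middle layer that hides which database each record type lives in."""

    def __init__(self, stores: Dict[str, InMemoryStore]) -> None:
        self._stores = stores

    def save(self, record_type: str, account_id: str, record: Dict[str, Any]) -> None:
        # Route each write to the store designated for that record type.
        self._stores[STORE_FOR[record_type]].save(account_id, record)

    def customer_view(self, account_id: str) -> Dict[str, List[Dict[str, Any]]]:
        # Consolidate every store's records into one view for front-line staff.
        return {name: store.find(account_id) for name, store in self._stores.items()}


facade = CustomerDataFacade(
    {"relational": InMemoryStore(), "document": InMemoryStore(), "in_memory": InMemoryStore()}
)
facade.save("payment", "ACC-1001", {"amount_pence": 2500})
facade.save("interaction", "ACC-1001", {"channel": "mobile_app", "message": "Overdraft query"})
print(facade.customer_view("ACC-1001"))
```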

Implementing a Polyglot Data Management system is not without hurdles, though: an array of databases needs to be administered by a person or a team, which carries a cost. By making that investment, however, banks can exploit the innate problem-solving capabilities of multiple data stores and analyse data with ease. The result is better business intelligence: queries are resolved sooner and real-time offerings can be created and delivered to the right customers.

As you can see, the benefits of a Polyglot Data Management system are plentiful, and to avoid the fate of the sinking library, banks need to be built on a strong foundation of agile yet resilient IT infrastructure. By making use of all the materials available in a Polyglot Persistent future, businesses can avoid sinking under the growing influx of data from social networks and connected machines, and be the library that stands. In our next blog, we will delve further into data analytics within banking, to help ensure banking systems are optimised for world-class performance.

For more insight into the Financial Services industry, please follow @knkumar
