The amount of data created in our society is increasing exponentially and is predicted to reach 44 zettabytes by 2020. As discussed in my previous blog, that much data could fill enough iPads to build a stack reaching two-thirds of the way to the Moon. As our digital universe continues to grow, the amount of data in existence is outpacing storage, and although most of it is transient, such as streaming and media data, businesses that want to stay on top need to be prepared for the volume and diversity of data coming their way.
For banks, having a strong architectural framework in place can mean the difference between consistent, reliable operations and overloaded systems resulting in service failures. Failures that cause customers to miss mortgage payments or household bills can severely bruise the customer experience and lead to adverse headlines in the press.
We have looked into how digital architectures can be optimised by taking advantage of the distinctive problem-solving capabilities of different database technologies. A Polyglot Database Management (PDBM) system can effectively match each type of business intelligence data, from customer orders to transactional requests, with the storage technology best suited to it, whether NoSQL, a Relational Database Management System (RDBMS) or an in-memory database. Because a PDBM system is able to "speak" each language, information can easily be recorded and accessed across the full database cluster, allowing banks to ensure accurate market coverage through effective product management.
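The routing idea behind polyglot persistence can be sketched in a few lines. The backends below are hypothetical in-memory stand-ins for real stores (an RDBMS for transactions, a document store for customer profiles, an in-memory database for session state); a production system would wrap actual database clients behind the same interface.

```python
class RelationalStore:
    """Stand-in for an RDBMS: ordered rows, suited to transactions."""
    def __init__(self):
        self.rows = []

    def save(self, record):
        self.rows.append(record)


class DocumentStore:
    """Stand-in for a NoSQL document store: flexible customer profiles."""
    def __init__(self):
        self.docs = {}

    def save(self, record):
        self.docs[record["id"]] = record


class InMemoryCache:
    """Stand-in for an in-memory database: low-latency session state."""
    def __init__(self):
        self.entries = {}

    def save(self, record):
        self.entries[record["id"]] = record


class PolyglotRouter:
    """Dispatch each record to the backend registered for its category."""
    def __init__(self):
        self.backends = {
            "transaction": RelationalStore(),
            "profile": DocumentStore(),
            "session": InMemoryCache(),
        }

    def save(self, category, record):
        self.backends[category].save(record)


router = PolyglotRouter()
router.save("transaction", {"id": 1, "amount": 250.0})
router.save("profile", {"id": "cust-42", "name": "A. Customer"})
```

The key design choice is that callers only name a data category; which engine actually stores the record is an implementation detail the router can change without touching application code.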
This is one of the most effective data storage approaches for banks, maintaining accessibility and agile communication across the network. Each customer request can be handled swiftly, satisfying customer expectations for speed, and information is securely stored for future business analytics. For banks looking to improve operational performance and customer experience, effective information storage and analytics are crucial for maintaining a reliable source of actionable business insight.
The potential strategic value of analytics in banking is enormous. It could offer up strategic market agility by using contextual insights to drive innovative offers for selected segments.
According to industry analyst Gartner, there are four types of analytics: descriptive, which describes what is happening; diagnostic, which explains why something happened; predictive, which forecasts what is likely to happen; and prescriptive, which gives guidance on what to do based on current information. Each type touches the customer journey at some point. For example, prescriptive analytics is useful where self-service banking is prevalent, giving customers robo-advice when they make spending decisions. This is something to consider when trying to deliver an optimal customer experience.
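The four types can be illustrated on a toy example. The spending figures below are invented, and the methods are deliberately naive (a real system would use proper statistical models); the sketch only shows where each type of analytics fits.

```python
from statistics import mean

monthly_spend = [820, 900, 1010, 1150]  # a customer's last four months, GBP
by_category = {"rent": 600, "dining": 320, "travel": 230}  # latest month

# Descriptive: what is happening? Summarise recent spending.
avg_spend = mean(monthly_spend)

# Diagnostic: why is spending high? Find the largest category.
top_category = max(by_category, key=by_category.get)

# Predictive: what is likely next? Naive linear extrapolation.
trend = monthly_spend[-1] - monthly_spend[-2]
forecast = monthly_spend[-1] + trend

# Prescriptive: what should the customer do about it?
advice = ("Consider a budget alert" if forecast > avg_spend * 1.2
          else "Spending is on track")
```

Even at this scale the progression is visible: each type builds on the previous one, ending with the robo-advice style of guidance described above.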
One way data analytics helps established banks stay ahead of their competition is by predicting customer behaviours and trends and preempting risks. It is surprising that only 13% of organisations are using predictive analytics, given how useful it is in the tailoring and pricing of products. This benefits not only customers, who are more likely to be attracted to personalised banking products, but banks too, as retained customers increase their lifetime value.
Debt recovery can also be improved through better customer targeting, reducing costs while improving recovery rates.
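A minimal sketch of predictive analytics for retention: score each customer's churn risk so that high-risk customers can be offered a tailored product. The weights and customer data are invented for illustration; a real bank would fit such a model on historical behaviour.

```python
import math


def churn_score(months_inactive, complaints, products_held):
    """Toy logistic score in [0, 1]; higher means more likely to churn."""
    # Hypothetical weights: inactivity and complaints raise risk,
    # holding more products lowers it.
    z = 0.6 * months_inactive + 0.9 * complaints - 0.7 * products_held
    return 1 / (1 + math.exp(-z))


customers = [
    {"id": "c1", "months_inactive": 6, "complaints": 2, "products_held": 1},
    {"id": "c2", "months_inactive": 0, "complaints": 0, "products_held": 3},
]

# Flag customers above a chosen risk threshold for a retention offer.
at_risk = [
    c["id"] for c in customers
    if churn_score(c["months_inactive"], c["complaints"],
                   c["products_held"]) > 0.5
]
```

The output of such a score feeds directly into the personalisation described above: rather than a blanket campaign, offers go only to the customers the model flags.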
There is no question that the right data management strategy is vital for world-class banking performance, especially when 2.5 quintillion bytes of data are created every day. To reach the next level of customer experience, banks should master analytics: all four forms have a role to play in providing customers with personalised products and services. Banks that deploy effective analytics will set the pace in a market where adoption is still relatively low, vastly improving their interactions with customers and leading to better retention and profitability over time.