Everything about financial services is driven by data: data we consume, data we produce and data we deliver to our customers. The proliferation of data raises endless questions and opportunities for analysis.
Here I will explore questions across four categories of data that are top of mind for 2024, and identify a fifth category that could offer an answer to the rising data management challenge.
Data supply chain
Financial institutions are, in effect, vast data factories: they ingest raw data, create data products and then deliver those products; but the industry has not historically thought in those terms. As industry-wide fee compression and cost challenges bite, it becomes more important to understand:
- Who is supplying your commercial data? How much, how often and, most importantly, how much cheaper could you source data of the same quality?
- Which of your systems are operating at peak capacity? Can you absorb additional transaction volumes within existing capacity? If not, how much investment in current infrastructure would it take to gain processing headroom?
- How well are you meeting service level agreements (SLAs) across your business? Is this captured, monitored and reported automatically, with zero human input? Can you point to operational efficiency? Do you understand where manual processing is adding cost? (A minimal monitoring sketch follows this list.)
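To make the SLA question concrete, here is a minimal sketch of automated SLA monitoring. It assumes each processing event carries received and completed timestamps and that an SLA reduces to a turnaround time in minutes; the `ProcessingEvent`, `SlaRule` and `check_sla` names are illustrative, not a real vendor API.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Illustrative shapes only; a real platform would source these from
# workflow systems rather than hand-built records.
@dataclass
class ProcessingEvent:
    process: str            # e.g. "client_valuation_report"
    received: datetime      # when the work item arrived
    completed: datetime     # when it was delivered

@dataclass
class SlaRule:
    process: str
    max_minutes: int        # agreed turnaround time

def check_sla(events: list[ProcessingEvent],
              rules: dict[str, SlaRule]) -> list[str]:
    """Return human-readable breach records; an empty list means all SLAs met."""
    breaches = []
    for e in events:
        rule = rules.get(e.process)
        if rule is None:
            continue  # no SLA agreed for this process
        taken = e.completed - e.received
        if taken > timedelta(minutes=rule.max_minutes):
            breaches.append(
                f"{e.process}: took {taken}, SLA is {rule.max_minutes} min"
            )
    return breaches
```

Run on a schedule and fed into a dashboard, a check like this makes SLA reporting a by-product of normal processing rather than a manual exercise.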
Transactional data
As financial markets shift to T+X, where X is an ever-decreasing integer, can we move data through our systems and organisations quickly enough to support customers? Can our customers consume that data quickly themselves?
Are organisations now at a nexus point where we stop dealing in positions and holdings reports and instead deal in transactional records that give a real-time perspective on status?
Positions and holdings are a point-in-time capture, vital for reporting events, but they are rapidly decaying data. Transactions are the components that create a position. If we amalgamate transactions into a position, we can create “Books of Records” from the core components and view them with ease through any lens, be it trade date, settlement date or custodial delivery date.
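A minimal sketch of that idea, assuming a simplified transaction shape (real books of record carry far more detail): the same raw transactions yield a trade-date or a settlement-date view just by changing the lens.

```python
from collections import defaultdict
from dataclasses import dataclass
from datetime import date

# Hypothetical transaction record; field names are illustrative.
@dataclass
class Transaction:
    security: str
    quantity: float
    trade_date: date
    settlement_date: date

def positions_as_of(txns: list[Transaction], as_of: date,
                    lens: str = "trade_date") -> dict[str, float]:
    """Amalgamate transactions into positions under a chosen date lens.

    lens="trade_date" approximates a trading book of record;
    lens="settlement_date" a settlement-date view. The same raw
    transactions support both; only the filter changes."""
    positions: dict[str, float] = defaultdict(float)
    for t in txns:
        effective = getattr(t, lens)
        if effective <= as_of:
            positions[t.security] += t.quantity
    return dict(positions)
```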
Event-based data movement through an organisation will become the standard, using tools such as Kafka, or cloud data platforms from the likes of Snowflake, Azure and AWS.
The ability not only to move data around in a transactional, real-time fashion but also to see and visualise it through real-time dashboards will ultimately remove the need for traditional reconciliation, replacing it with transaction-level automated reconciliation or atomic delivery options.
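As a sketch of what event-based movement can look like in practice, the snippet below publishes each transaction as a Kafka event using the confluent-kafka Python client; the broker address and topic name are illustrative assumptions, not a prescribed setup.

```python
import json
from confluent_kafka import Producer

# Broker address is an assumption for the sketch.
producer = Producer({"bootstrap.servers": "localhost:9092"})

def publish_transaction(txn: dict) -> None:
    """Emit each transaction as an event the moment it occurs, so
    downstream consumers (positions, dashboards, reconciliation)
    see the same record at the same time."""
    producer.produce(
        "settlement.transactions",             # illustrative topic name
        key=txn["transaction_id"].encode(),
        value=json.dumps(txn).encode(),
    )
    producer.flush()  # flushed per event for clarity; batch in production
```

Because positions, dashboards and reconciliation all consume the same event stream, they work from one record rather than from competing copies.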
Commercial data and commercial data vendors
As data becomes more comprehensive in terms of asset and analytical information, the ability to consume, ingest and distribute it within an organisation becomes more demanding. As niche and alternative data sets come to fruition (geospatial, inventory and logistics tracking, and imagery, for example), do organisations have the structure, capabilities and technology to fully utilise and distribute such data to the investment teams that need it?
Success requires abstraction of this data from the underlying provider formats and the ability to switch providers quickly with zero impact on downstream systems. That in turn will depend on whether the firm has developed a standardised data taxonomy, or business data model, which acts as the ‘lingua franca’ of the organisation, allowing data to be ingested and distributed without the point-to-point translation and transformation that would otherwise distort it.
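A minimal sketch of the pattern, assuming two hypothetical vendor feeds: each provider format is translated at the edge into one canonical record, so downstream systems never see vendor-specific fields. All field names here are illustrative.

```python
from dataclasses import dataclass
from datetime import date

# Canonical record: the firm's 'lingua franca'. Fields are illustrative.
@dataclass
class CanonicalPrice:
    instrument_id: str   # internal identifier
    price: float
    currency: str
    as_of: date

# One adapter per vendor translates the provider format at the edge;
# everything downstream only ever sees CanonicalPrice.
def from_vendor_a(row: dict) -> CanonicalPrice:
    return CanonicalPrice(
        instrument_id=row["ISIN"],
        price=float(row["ClosePx"]),
        currency=row["Ccy"],
        as_of=date.fromisoformat(row["PriceDate"]),
    )

def from_vendor_b(row: dict) -> CanonicalPrice:
    return CanonicalPrice(
        instrument_id=row["identifier"],
        price=row["close"],
        currency=row["currency"].upper(),
        as_of=date.fromisoformat(row["date"]),
    )
```

Switching providers then means writing one new adapter, not re-plumbing every downstream consumer.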
Most data problems in firms are not solved with technology, but by treating data as an asset with value to the firm, and ensuring it is treated, supported and defined accordingly.
Data to support artificial intelligence
The most exciting topic at the moment is artificial intelligence, in its myriad forms, including Generative AI. But is the financial services world, an ecosystem based on fact and data, ready for Generative AI, a technology famous for ‘making things up’?
Therein lies the rub. AI is an amazing technology, but how do you apply it to your organisation? The accuracy and reliability of your AI capabilities will be directly related to how accurate, dependable and trustworthy your data is. How comprehensive, accessible and timely that information is to consume through different channels will be driven by technology capabilities that support in-depth transparency on the data, complete with ‘explainability’ of how a conclusion, output or signal was reached.
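One way to make explainability concrete is to ensure an AI output never travels without its evidence. The sketch below is illustrative rather than a real framework: a signal record that carries its data lineage and feature attributions alongside the conclusion itself.

```python
from dataclasses import dataclass, field
from datetime import datetime

# Hypothetical shape: an AI output packaged with the evidence behind it.
@dataclass
class ExplainedSignal:
    signal: str                    # e.g. "reduce exposure to issuer X"
    confidence: float              # model's own confidence score
    produced_at: datetime
    source_datasets: list[str]     # lineage: which data fed the model
    feature_attributions: dict[str, float] = field(default_factory=dict)
    # e.g. {"credit_spread_widening": 0.6, "downgrade_watch": 0.3}

    def is_auditable(self) -> bool:
        """A minimal bar: no lineage or attributions, no signal."""
        return bool(self.source_datasets) and bool(self.feature_attributions)
```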
Trust in AI might be easy when you are not in a regulated industry dependent on the accuracy of your records, and your business produces cat videos for the internet. In financial services, however, we have to focus on that trust in data to support any AI capabilities we may develop.
Data democratisation
This is arguably the key to resolving many of the questions and issues above in a sustainable and scalable way. Data in organisations is typically siloed, locked down and secured, which means a wealth of data is unavailable or inaccessible to those who could use it.
Abstracting away from underlying technology restrictions to make data accessible and consumable across all parts of the organisation (within regulatory and legal constraints) requires an overarching data strategy, combined with the overarching data technology to deliver it.
The process of democratisation starts with education and the evolution of data culture within a firm. Delivering this change in an organisation requires a focus on developing an enterprise strategy that embeds transparency and trust in the data (for example, metadata and lineage), as well as discoverability and cataloguing capabilities that allow customers to find data products without needing heavy development skills and tools.
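As a sketch of what ‘discoverable without heavy tooling’ can mean, here is a toy in-memory catalogue with keyword search; real catalogue platforms add access control, lineage graphs and far richer search, and every name here is hypothetical.

```python
from dataclasses import dataclass

# Toy catalogue entry; real platforms hold far richer metadata.
@dataclass
class DataProduct:
    name: str
    description: str
    owner: str
    lineage: list[str]   # upstream sources this product is derived from

CATALOGUE: list[DataProduct] = [
    DataProduct("settled_positions", "Positions on a settlement-date lens",
                "ops-data-team", ["settlement.transactions"]),
    DataProduct("vendor_prices_canonical", "All vendor prices in the firm model",
                "market-data-team", ["vendor_a_feed", "vendor_b_feed"]),
]

def find(term: str) -> list[DataProduct]:
    """Keyword search so a business user can discover data products
    without writing queries against the underlying systems."""
    t = term.lower()
    return [p for p in CATALOGUE
            if t in p.name.lower() or t in p.description.lower()]
```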
Data sits at the centre of financial services for a reason, and democratising it is core to creating value over time. Effective solutions will need to combine cloud-based technology with cultural change to achieve greater business agility and insights grounded in an empirical understanding of the vast trove of data financial organisations generate.