It’s not uncommon for data management projects to result in 15-20 siloed systems being retired, and that figure could reasonably be extrapolated upwards for global projects. With so many systems in so many territories, it is easy to understand why financial institutions are likely to need data audits. Similarly, different sets of users may be making the same data requests from different regions without ever knowing the organisation has already paid for the data. Cost savings here can be significant.
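One way to picture the duplicated-spend problem is as a simple grouping exercise over purchase records. The sketch below is purely illustrative: the regions, dataset names and costs are invented, not drawn from any real inventory.

```python
from collections import defaultdict

# Hypothetical purchase records: (region, dataset, annual_cost).
# All names and figures here are illustrative assumptions.
purchases = [
    ("EMEA", "FX benchmark rates", 120_000),
    ("APAC", "FX benchmark rates", 120_000),
    ("Americas", "Corporate bond evaluations", 95_000),
    ("EMEA", "Corporate bond evaluations", 95_000),
    ("APAC", "Equity reference data", 60_000),
]

def duplicate_spend(purchases):
    """Group purchases by dataset; any dataset bought by more than one
    region is a candidate duplicate, and every copy beyond the first
    is potential savings."""
    by_dataset = defaultdict(list)
    for region, dataset, cost in purchases:
        by_dataset[dataset].append((region, cost))
    report = {}
    for dataset, buyers in by_dataset.items():
        if len(buyers) > 1:
            # Keep one licence, count the rest as avoidable spend.
            report[dataset] = sum(cost for _, cost in buyers[1:])
    return report

print(duplicate_spend(purchases))
# → {'FX benchmark rates': 120000, 'Corporate bond evaluations': 95000}
```

In practice the hard part is matching records across regional procurement systems, but once purchases flow through one hub, the check itself is this straightforward.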
Data is expensive, but it is also valuable. According to Investit, global spend on pricing, reference, valuation and benchmark products has achieved a five-year compound annual growth rate of over 12%. This growth in expense, together with the increase in data volumes, means firms need to consider how they access data internally and how they share it with clients.
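To put that growth rate in concrete terms, a 12% rate compounded over five years implies spend multiplying by roughly 1.76, i.e. around 76% total growth:

```python
# Illustrative compounding: a 12% CAGR sustained for five years
# multiplies total spend by (1 + 0.12) ** 5.
growth = (1 + 0.12) ** 5
print(round(growth, 2))  # → 1.76
```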
The global footprint of most financial institutions could quite reasonably be described as sprawling. There are many entities that generate many requests for data feeds. If all the data isn’t being pushed through a central hub, how can you easily determine
what data you have, what you are using and what you actually need? A central data hub, which doesn’t need to move the data, can provide invaluable insights through data audits.
One of these is that firms will be able to accurately charge data costs back across their organisations, based on actual usage rather than estimated or historical usage. A 21st century data management system can meter data coming into the organisation, track it by any attribute from import to export, and determine who accessed or changed it.
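A usage-based chargeback reduces to apportioning each feed's cost across business units in proportion to metered accesses. The following is a minimal sketch under assumed inputs: the access log, unit names, feed names and costs are all hypothetical.

```python
from collections import Counter

# Hypothetical access log: one (business_unit, feed) tuple per metered
# access event. Unit and feed names are illustrative assumptions.
access_log = [
    ("Rates desk", "benchmark_feed"),
    ("Rates desk", "benchmark_feed"),
    ("Credit desk", "benchmark_feed"),
    ("Rates desk", "valuation_feed"),
]

# Assumed annual cost per feed.
feed_costs = {"benchmark_feed": 90_000, "valuation_feed": 30_000}

def charge_back(access_log, feed_costs):
    """Apportion each feed's cost across business units in proportion
    to metered accesses -- actual usage, not estimates."""
    usage = Counter(access_log)                       # (unit, feed) -> hits
    totals = Counter(feed for _, feed in access_log)  # feed -> total hits
    bills = Counter()
    for (unit, feed), hits in usage.items():
        bills[unit] += feed_costs[feed] * hits / totals[feed]
    return dict(bills)

print(charge_back(access_log, feed_costs))
# → {'Rates desk': 90000.0, 'Credit desk': 30000.0}
```

The same metered log also answers the audit questions of who accessed or changed a given record, since each event can carry arbitrary attributes.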
Another important insight could be around which data feeds are really being used and where similar data points can be arbitraged in terms of pricing. If a firm has 20 data feeds, for example, an audit could determine that two of these are in fact redundant, or that it is more economical to use source A rather than source B for the same values.
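The redundancy check described above can be sketched as a coverage comparison: a feed is a candidate for retirement if a cheaper feed supplies every data point it provides. The inventory below is invented for illustration; real audits would compare at the level of individual instruments and fields.

```python
# Hypothetical feed inventory: feed -> (annual_cost, data points covered).
# Names, costs and coverage sets are illustrative assumptions.
feeds = {
    "source_A": (100_000, {"govt_curve", "swap_curve", "fx_spot"}),
    "source_B": (140_000, {"govt_curve", "swap_curve"}),
    "source_C": (80_000,  {"equity_close"}),
}

def redundant_feeds(feeds):
    """Flag a feed as redundant if another feed covers every data point
    it provides at equal or lower cost; map it to the cheaper source."""
    flagged = {}
    for name, (cost, points) in feeds.items():
        for other, (other_cost, other_points) in feeds.items():
            if other != name and points <= other_points and other_cost <= cost:
                flagged[name] = other
    return flagged

print(redundant_feeds(feeds))  # → {'source_B': 'source_A'}
```

Here source_B duplicates a subset of source_A's coverage at a higher price, so the audit would flag it as the uneconomical choice for those values.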