20 December 2014

Getting down to the data

Paul McPhater - Markit


EDM blind spot number one: Data silos, just ask Madoff

17 July 2014

Data silos were a key failure in identifying Madoff's fraud, according to the Wall Street Journal. While they were clearly not the only issue, silos provided useful hiding places that stopped valuable data from surfacing. Data silos persist across the financial services industry: lack of data consistency, validation and cleanliness remains a big problem, as do duplicate costs and variable usage rates. Silos may well have emerged for good reasons: Chinese walls, M&A, or the best technology for a particular use case in a specific part of the business. As a result they can be found lurking within single entities, across geographies and throughout the entire data lifecycle. It is a complicated and increasingly urgent issue as regulations demand quality data. So how should this ongoing risk be tackled?

Silos have typically built up over time, which makes them complicated and costly to unwind. Unabated data growth adds to the problem. The financial crisis shone a light on the importance of standardised, validated, quality data, so action is needed. The solution is to create a single data hub, and this needs to be a strategic decision to get the best return on investment.

Poor data management adds risk, creates duplicate data and adds cost. Whereas decades ago data management projects were often a poisoned chalice, today there are many and varied examples of successful EDM projects. Getting buy-in for budget and resource can be a big part of the challenge, but useful tools are to hand: customer references are invaluable, enabling conversations with similar-sized entities facing similar issues, and proofs of concept are an insightful way of demonstrating the ability to rapidly deliver tangible business results.

Getting a single hub that can acquire data from all sources, validate it, store it and distribute it in the format that the end user requires is the ultimate goal. This needs to happen in a consistent, fully-audited environment in order to satisfy the raft of regulations that are focussing on quality data as well as to reduce operational risk. The good news is it is achievable.
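To make the hub concept concrete, here is a minimal, purely illustrative sketch of the acquire, validate, store and distribute flow with an audit trail at each step. All class, field and feed names are hypothetical, not a description of any Markit product.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Toy "single data hub": acquire from any source, validate, store one golden
# copy, and distribute in the format the end user requires, with every action
# recorded in an audit log. Names and validation rules are illustrative only.

@dataclass
class DataHub:
    store: dict = field(default_factory=dict)
    audit_log: list = field(default_factory=list)

    def _audit(self, action: str, key: str) -> None:
        # Timestamped, append-only record of every action on every item.
        self.audit_log.append((datetime.now(timezone.utc).isoformat(), action, key))

    def acquire(self, source: str, key: str, record: dict) -> None:
        # Validate before storing: reject records missing an id or a value.
        if not record.get("id") or "value" not in record:
            self._audit(f"rejected:{source}", key)
            raise ValueError(f"invalid record from {source}: {record}")
        self.store[key] = record  # single validated copy, no silo duplicates
        self._audit(f"acquired:{source}", key)

    def distribute(self, key: str, fmt: str = "dict"):
        # Serve the golden copy in whatever format the consumer requires.
        record = self.store[key]
        self._audit(f"distributed:{fmt}", key)
        if fmt == "csv":
            return f'{record["id"]},{record["value"]}'
        return dict(record)

hub = DataHub()
hub.acquire("vendor_feed", "ISIN:XS0001", {"id": "XS0001", "value": 101.25})
print(hub.distribute("ISIN:XS0001", fmt="csv"))  # prints "XS0001,101.25"
```

The point of the sketch is the shape, not the detail: one validated store, consumer-specific output formats, and an audit log that regulators can inspect.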

