Data silos were a key failure in identifying Madoff's fraud, according to
the Wall Street Journal. While clearly they were not the only issue, silos provided useful hiding places that stopped valuable data from surfacing. Data silos persist across the financial services industry: lack of data consistency, validation and cleanliness remains
a big problem, as do duplicate costs and variable usage rates.
a big problem as does duplicate costs and variable usage rates. Silos may have well emerged with good reason: Chinese walls, M&A, the best technology for that particular case in that specific bit of the business and so on. Subsequently they can be found lurking
in single entities, across geographies and throughout the entire data lifecycle. It's a complicated and increasingly urgent issue as regulations demand quality data. So how to tackle this ongoing risk?
Silos typically build up over time, which makes dismantling them complicated and costly. Unabated data growth adds to the problem. The financial crisis shone a light on the importance of standardised, validated, quality data, so action is needed.
The solution is to create a single data hub, and treating this as a strategic decision delivers the best ROI.
Poor data management adds risk, creates duplicate data and increases costs. Whereas decades ago data management projects were often a poisoned chalice, today there are many and varied examples of successful enterprise data management (EDM) projects.
Getting buy-in for budget and resources can be a big part of the challenge. Useful tools are to hand: customer references are invaluable, enabling conversations with similar-sized entities facing similar issues, and proofs of concept are a rapid, insightful
way of demonstrating the ability to deliver tangible business results quickly.
The ultimate goal is a single hub that can acquire data from all sources, validate it, store it and distribute it in the format each end user requires. This needs to happen in a consistent, fully audited environment, both to satisfy the raft
of regulations focused on data quality and to reduce operational risk. The good news is that it is achievable.
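To make the hub concept concrete, the acquire/validate/store/distribute cycle with an audit trail can be sketched in a few lines of Python. This is a minimal illustration only, not a production design: the `DataHub` class, its record layout and its validation rules are all hypothetical, invented here to show the shape of the workflow.

```python
# Minimal sketch of a single data hub: acquire -> validate -> store -> distribute,
# with every action appended to an audit trail. All names are illustrative.
import csv
import io
import json
from datetime import datetime, timezone

class DataHub:
    def __init__(self):
        self.store = {}      # validated "golden copy" records, keyed by ID
        self.audit_log = []  # append-only record of every action taken

    def _audit(self, action, detail):
        self.audit_log.append({
            "time": datetime.now(timezone.utc).isoformat(),
            "action": action,
            "detail": detail,
        })

    def acquire(self, source, records):
        """Accept raw records from any source, keeping only those that validate."""
        for rec in records:
            if self._validate(rec):
                self.store[rec["id"]] = rec  # duplicates collapse onto one golden copy
                self._audit("stored", {"source": source, "id": rec["id"]})
            else:
                self._audit("rejected", {"source": source, "record": rec})

    def _validate(self, rec):
        # Toy rule set: a non-empty string ID and a positive numeric price.
        return (
            isinstance(rec.get("id"), str)
            and rec["id"] != ""
            and isinstance(rec.get("price"), (int, float))
            and rec["price"] > 0
        )

    def distribute(self, fmt="json"):
        """Export the validated data in the format the consumer requires."""
        records = sorted(self.store.values(), key=lambda r: r["id"])
        self._audit("distributed", {"format": fmt, "count": len(records)})
        if fmt == "json":
            return json.dumps(records)
        if fmt == "csv":
            buf = io.StringIO()
            writer = csv.DictWriter(buf, fieldnames=["id", "price"])
            writer.writeheader()
            writer.writerows(records)
            return buf.getvalue()
        raise ValueError(f"unsupported format: {fmt}")

hub = DataHub()
hub.acquire("vendor_a", [{"id": "ABC", "price": 101.5}, {"id": "", "price": 3}])
hub.acquire("vendor_b", [{"id": "ABC", "price": 101.5}, {"id": "XYZ", "price": -1}])
print(hub.distribute("json"))
print(len(hub.audit_log))
```

Even in this toy form, the key properties the article calls for are visible: one validated golden copy regardless of how many feeds supply the same record, consumer-specific output formats, and an audit log that records rejections as well as successes, so nothing disappears silently into a silo.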