Part 1 – From 'war room' to Master Data Management
I want to start my blog with a series of posts that analyse how modern enterprise content management systems can help the financial services industry stay on top of its information.
The 2008 credit crisis highlighted the importance of global financial institutions being able to manage their risk exposure. One of the biggest technology failures during the crisis was, in my opinion, that when the banks were executing counterparty transactions with each other, critical information such as the interest rates they charged each other was only available as “unstructured data” inside contracts. Unstructured data refers to text-heavy information that does not fit into traditional databases. In this case, critical information was hidden inside individual Master Transaction Agreements rather than held in a central repository, so it was not readily accessible or quantifiable from the paper or electronic documents. At the height of the crisis, when the banks were on the precipice of failure, they could not work out what their risk exposure was to each other. To solve this, the banks hired teams of MBA interns to work 24/7 in a 'war room', manually reading through the documents, pulling out the numbers and calculating the risk exposure. In the case of banks like Lehman Brothers, this did not happen fast enough.
As Master Transaction Agreements and other contracts are legal agreements that define how business is done between financial institutions, now and in the future, the pressure has increased significantly to find ways to quickly and easily share critical information whilst fulfilling security and compliance requirements.
IT industry analyst Gartner warns that “by 2016, 20 percent of CIOs in regulated industries will lose their jobs for failing to implement the discipline of information governance successfully.”
Master Data Management
However, there are already sound approaches in place to address these issues.
The Enterprise Data Management (EDM) Council, a not-for-profit business forum for financial institutions that aims to elevate the practice of data management as a mandate for efficient business operations, and the Carnegie Mellon University Software Engineering Institute (SEI) have been working with data management professionals since 2009 to develop a Data Management Maturity (DMM) model to help financial services firms improve risk management and operational efficiency. The DMM model defines requirements for developing a data management strategy, implementing governance, managing data operations, improving data quality and integrating data effectively into business processes.
One example from the DMM model is the process of Master Data Management, which most of the big banks and two regulators have already voluntarily introduced. In this process, a bank creates a master data file that includes information on all the counterparties it deals with, gives each a unique ID, and then references that ID across all of its line-of-business applications. It is also possible to link this information to non-relational data held in e-mails and documents. This means that all trades with another entity can be instantly accessed and quantified.
The global nature of financial services increases the need for Master Data Management. For example, if Bank “A” has contracts with Bank “B” in several countries, and the local legal entities have different names that are referenced in contracts stored locally, Bank “A” has no way of finding out its total risk exposure to Bank “B”.
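To make the idea concrete, here is a minimal sketch of how a master data file resolves this problem. All bank names, counterparty IDs and exposure figures are hypothetical illustrations, not real data; the point is simply that once every local legal-entity name maps to one unique ID, exposure can be aggregated instantly across systems.

```python
# Hypothetical master data file: every known legal-entity name,
# including local variants in different countries, maps to one
# unique counterparty ID.
MASTER_IDS = {
    "Bank B AG": "CP-0042",        # German entity
    "Bank B (UK) Ltd": "CP-0042",  # UK entity, same counterparty
    "Banque B S.A.": "CP-0042",    # French entity
    "Bank C Inc": "CP-0077",
}

# Trades pulled from separate line-of-business systems, each
# referencing the counterparty by its local legal-entity name.
trades = [
    {"counterparty": "Bank B AG", "exposure": 120.0},
    {"counterparty": "Banque B S.A.", "exposure": 75.5},
    {"counterparty": "Bank C Inc", "exposure": 30.0},
]

def exposure_by_master_id(trades, master_ids):
    """Aggregate exposure per unique counterparty ID."""
    totals = {}
    for trade in trades:
        cp_id = master_ids[trade["counterparty"]]
        totals[cp_id] = totals.get(cp_id, 0.0) + trade["exposure"]
    return totals

print(exposure_by_master_id(trades, MASTER_IDS))
# {'CP-0042': 195.5, 'CP-0077': 30.0}
```

Without the master data mapping, the three "Bank B" entities would look like three unrelated counterparties, and the total exposure of 195.5 would never surface.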
In the next post I will analyse how a new approach to enterprise content management (ECM) helps to support Master Data Management.
Quoted Gartner source: http://www.gartner.com/newsroom/id/1898914