Enterprise data pipe dreams

Source: Arnaud Picut, Fernbach

Too often, financial institutions have failed with enterprise data management projects due to over-ambitious plans to build vast, all-encompassing centralised repositories. Arnaud Picut, managing director of Fernbach, believes traditional static information repositories must move with the times.

For many organisations in the financial sector, seamless enterprise data management (EDM) has become something of a pipe dream. In theory, procuring and configuring a set of technical processes by which all the data in a given institution is managed from source through storage to end of life should be achievable. In practice, it has proven extremely difficult, because financial data is by nature transactional, not static.

Banks must adopt a new approach to EDM, recognising that finance and risk applications must share data. That’s the conclusion of a survey of 40 major European and Asian banks conducted by Fernbach.

Almost two-thirds of the banks we surveyed had begun projects to build a cross-data warehouse, yet half of these admitted their projects had failed and their scope had to be cut back. Over 30 per cent of those surveyed have now decided to focus on specific domains such as risk, compliance and finance, or customer relationship management (CRM), rather than tackling the entire data repository. Meanwhile, many of those who have pressed ahead have adapted their original plans for the data warehouse to incorporate other commercial functions such as CRM and credit scoring or granting processes.

Why do so many banks fail in their attempts to implement all-singing, all-dancing cross-data warehouses? Simply put, vast static repositories do not meet many of today’s business challenges. Although cross-data warehouses are still relevant for static information, they are wholly inadequate for finance and risk management because they lack transactional information. Because existing and new regulations (including ICAAP) require banks to gather and combine such complex information, they are finding they need to refocus data management projects to span finance, risk and compliance.

Banks face demand for more accurate and deeper audit capabilities for both financial and risk management. They need to be able to drill down, for example, from Basel II ratios to the relevant contracts, or to find out easily which accounting entries have been generated by any given contract or group of contracts. In the near future, they will need to build a capital framework with which to model the balance between economic capital and regulatory capital. Meeting these needs requires a matrix of information that combines value and risk: true expected cash flows alongside static risk data, a combination that static EDM systems cannot deliver. Over 60 per cent of banks surveyed by Fernbach said they were still struggling to reconcile data because their EDM could only store static data for risk, not all the individual cash flows for each contract.
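To make the drill-down requirement concrete, here is a minimal Python sketch of the linkage such a repository would need. The class and method names (AuditableRepository, record_ratio and so on) are illustrative assumptions, not part of any Fernbach product:

from dataclasses import dataclass, field

@dataclass
class AccountingEntry:
    entry_id: str
    account: str
    amount: float

@dataclass
class Contract:
    contract_id: str
    exposure: float                                  # static risk attribute
    cash_flows: list = field(default_factory=list)   # (date, amount) pairs
    entries: list = field(default_factory=list)      # AccountingEntry objects

class AuditableRepository:
    """Keeps the ratio -> contract -> entry links needed for drill-down."""

    def __init__(self):
        self.contracts = {}        # contract_id -> Contract
        self.ratio_inputs = {}     # ratio name -> contract ids that fed it

    def add_contract(self, contract):
        self.contracts[contract.contract_id] = contract

    def record_ratio(self, ratio_name, contract_ids):
        self.ratio_inputs[ratio_name] = list(contract_ids)

    def drill_down(self, ratio_name):
        """Trace a reported ratio back to contracts and accounting entries."""
        for cid in self.ratio_inputs.get(ratio_name, []):
            contract = self.contracts[cid]
            yield contract, contract.entries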

Banks have told us that they would like one single, centralised IFRS engine which captures all the transactions emerging from all the source systems. These banks are struggling to comply efficiently with International Financial Reporting Standards (IFRS) because transactions are typically stored in multiple source systems, requiring bespoke extensions for calculating IFRS ratios. The most effective route to a centralised IFRS engine would be to build a true transaction repository which can supply information to both the risk and financial departments. That would also solve the problem of how to reconcile risk and financial accounts, creating sizeable savings. Some banks have as many as 20 people working full-time on data reconciliation, at great cost.

Almost three-quarters of those we spoke to described the unified system they required, one that would empower banks to industrialise the reconciliation process, as follows (a sketch of such a repository follows the list):
  • Capable of handling both banking and trading books
  • Incorporating an event-driven repository interpreting payment, pre-payment and other transactions
  • Handling a combination of real time (for larger contracts) and batch processing (for smaller retail banking transactions)
  • Scalable to handle tens of millions of transactions
  • Generating cash flows at the lowest level of granularity (there could be as many as ten different cash flows for a single contract)
  • Archiving information
  • Delivering common information for financial and risk departments
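
Under those requirements, a minimal Python sketch of such an event-driven repository might look as follows. The thresholds, event shapes and names (TransactionRepository, on_event) are assumptions for illustration, not the surveyed banks’ designs:

class TransactionRepository:
    """Event-driven store shared by the financial and risk departments."""

    def __init__(self, realtime_threshold=1_000_000, batch_size=10_000):
        self.realtime_threshold = realtime_threshold  # large deals: real time
        self.batch_size = batch_size                  # retail deals: batches
        self._batch = []
        self.store = []                               # archived transactions

    def on_event(self, event):
        """event: dict with 'type' (payment, pre-payment, ...),
        'book' ('banking' or 'trading') and 'amount'."""
        if event["amount"] >= self.realtime_threshold:
            self._process([event])         # real time for larger contracts
        else:
            self._batch.append(event)      # batch for retail transactions
            if len(self._batch) >= self.batch_size:
                self.flush()

    def flush(self):
        self._process(self._batch)
        self._batch = []

    def _process(self, events):
        # One archived record serves both financial and risk reporting.
        self.store.extend(events)

repo = TransactionRepository()
repo.on_event({"type": "payment", "book": "banking", "amount": 2_500_000.0})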


The first step is easy to identify, and three-quarters of those answering our survey put their finger on it: a centralised, true cash-flow generator would be a common denominator for feeding the general ledger, performing IFRS calculations, producing accurate ALCO reports, analysing liquidity gaps and making economic capital simulations. It’s not uncommon for banks today to use different analytics (including ALM, FTP, liquidity, Controlling and IFRS/Multi-GAAP) to calculate cash flows for use as a basis for most of the ratios that the bank must calculate. This means that the same events lead to different cash-flow calculations in different systems, based on different assumptions. How can that be reconciled? How can accurate reports be produced that combine figures coming from different systems?
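A toy example shows how quickly those assumption differences bite. Suppose two systems compute interest for the same 91-day period on a 1,000,000 loan at 5 per cent, one using a 30/360 day-count convention and the other actual/365 (the figures and conventions here are illustrative, not taken from the survey):

# Same loan event, two systems, two day-count assumptions.
principal, annual_rate, actual_days = 1_000_000.0, 0.05, 91

interest_30_360 = principal * annual_rate * 90 / 360             # e.g. ALM system
interest_act_365 = principal * annual_rate * actual_days / 365   # e.g. IFRS engine

print(f"30/360:     {interest_30_360:,.2f}")    # 12,500.00
print(f"actual/365: {interest_act_365:,.2f}")   # 12,465.75

The 34.25 difference is trivial on one contract, but multiplied across millions of transactions and dozens of ratios it is exactly the gap those full-time reconciliation teams spend their days chasing.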

A new style of EDM would operate like this (a pipeline sketch in Python follows the list):
  • Step 1: Data collection from all the source systems (including characteristics of contracts, market data and other static information) and quality validation
  • Step 2: Product splitting (break each deal down according to its characteristics, such as interest and capital - for example, securities with different options and structured products)
  • Step 3: Cash-flow generation at the characteristic level
  • Step 4: Ratio calculations (some basic ratios must be calculated at this level)
  • Step 5: Storage of the new information, available for multiple purposes, ideally in separate data marts for each department
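
As a rough illustration of those five steps, here is a minimal Python pipeline. The record shapes, function names and the toy ratio are our own assumptions, chosen only to show how the stages chain together:

def collect(sources):
    """Step 1: gather contract, market and static data; validate quality."""
    records = [r for source in sources for r in source]
    return [r for r in records
            if r.get("notional") and r.get("rate") is not None]

def split_products(records):
    """Step 2: break each deal into its characteristic legs."""
    legs = []
    for r in records:
        legs.append({"deal": r["id"], "kind": "capital",
                     "amount": r["notional"]})
        legs.append({"deal": r["id"], "kind": "interest",
                     "amount": r["notional"] * r["rate"]})
    return legs

def generate_cash_flows(legs):
    """Step 3: one cash flow per characteristic, the lowest granularity."""
    return [{"deal": leg["deal"], "kind": leg["kind"],
             "cash_flow": leg["amount"]} for leg in legs]

def basic_ratios(cash_flows):
    """Step 4: basic ratios computed once, here, for every consumer."""
    total = sum(cf["cash_flow"] for cf in cash_flows)
    interest = sum(cf["cash_flow"] for cf in cash_flows
                   if cf["kind"] == "interest")
    return {"interest_share": interest / total if total else 0.0}

def publish(cash_flows, ratios, marts):
    """Step 5: the same figures go to each department's data mart."""
    for mart in marts.values():
        mart.append({"cash_flows": cash_flows, "ratios": ratios})

# Wiring the steps together on a single toy contract:
marts = {"finance": [], "risk": []}
records = collect([[{"id": "L-1", "notional": 1_000_000.0, "rate": 0.05}]])
flows = generate_cash_flows(split_products(records))
publish(flows, basic_ratios(flows), marts)

Because finance and risk read from marts fed by the same generated cash flows, there is nothing left to reconcile downstream.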


The most important change in this fresh approach is the point in the process at which reconciliation takes place. It now happens at the data-source level, instead of at the analytics level, where any number of assumptions have already been introduced by different systems. This new approach feeds transactional data into all the other systems, ensuring they are consistent and current.

It is essential that any banks that have previously run arduous and ultimately unsuccessful EDM projects now come to terms with the fact that the problem was not the concept of centralising data, but the implementation. One thing is certain: the next generation of EDM projects will meet their business and regulatory challenges by enabling cash-flow generation and reconciliation at the source level, gaining real-time connectivity with their operational systems.
