
A Crisis Needs a Utility?

I heard Francis Gross of the ECB speak on a panel at the XTrakter Conference last week, and I couldn't avoid asking him whether the aims of the ECB's "Data Utility" initiative could be better separated from the means by which the ECB proposes to achieve them. At the moment, the industry's reference data issues and the data utility seem to be presented as a single "package". I can't say the response to my question was a clear one to my understanding; however, Francis was helpful after the panel had finished and provided a recent presentation of the ECB's ideas, a copy of which you can find here.

Looking through the presentation, the motivations put forward for why the industry needs a reference data utility seem to include:

  • Data processing must be done in an automated manner, since data volumes have moved beyond the capabilities of manual processing.
    - can't see anyone arguing with this
  • Data is a major bottleneck, with multiple providers/sources each with their own "data dialect"
    - agreed, and to some extent this is what keeps data/data management vendors in business, but it sounds sensible to standardise if possible as there are plenty of other problems to address (a small sketch of what "dialect" translation looks like in practice follows this list)
  • These data dialects lead to increased cost, operational risk and reduced responsiveness
    - agreed, mainly a cost aspect I would suggest
  • The recent crisis was not helped by weak data management in the industry
    - but nor was weak data management the cause of the crisis, so not a great premise for a data utility
    • lack of transparency of data
      - "transparency" is an over-used word at the moment, but certainly clarity and quality were/are needed
    • systemic risk could not be assessed due to the lack of available data
      - using terms like "systemic risk" seems to imply the regulators could calculate something, whereas this discipline is new, so I guess we are really talking about simply knowing who is exposed to whom and how.
  • We need the capability to run large-scale computing analysis on a vast pool of micro data, sometimes on an ad-hoc basis when a crisis begins
    - fundamentally agreed, but it would also be good to qualify this with what you propose to calculate - having a set of "numbers" doesn't seem to have helped much recently...
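As an aside on the "data dialect" point above, here is a minimal sketch of the kind of translation every data consumer ends up writing. The vendor names, field names and values are entirely hypothetical; the point is simply that the same instrument arrives described differently from each source, and someone has to pay for the mapping.

```python
from datetime import datetime

# Hypothetical records for the same bond from two vendors, each in its own "dialect".
vendor_a = {"id": "XS0000000000", "cpn": "4.25", "mat": "15/06/2020"}
vendor_b = {"isin": "XS0000000000", "coupon_pct": 4.25, "maturity": "2020-06-15"}

def normalise_a(rec: dict) -> dict:
    """Map vendor A's field names, string coupon and dd/mm/yyyy date onto a common schema."""
    return {
        "isin": rec["id"],
        "coupon": float(rec["cpn"]),
        "maturity": datetime.strptime(rec["mat"], "%d/%m/%Y").date(),
    }

def normalise_b(rec: dict) -> dict:
    """Vendor B is closer to the common schema but still needs its own mapping."""
    return {
        "isin": rec["isin"],
        "coupon": rec["coupon_pct"],
        "maturity": datetime.strptime(rec["maturity"], "%Y-%m-%d").date(),
    }

assert normalise_a(vendor_a) == normalise_b(vendor_b)  # the same instrument, after translation
```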

I started the above bullet point list by saying it contains the motivations for "why the industry needs a data utility", but looking at the list again, the points really support the more general aim of "why we need better industry-level data management". In the presentation the above points are then used to state:

"We all need the same good basic reference data. Why build more than one infrastructure?"

Maybe "Why build more than one infrastructure?" should really be changed to say "Why maintain more than one infrastructure?" given that Bloomberg, Thomson Reuters, Six Telekurs, Interactive, Markit and all the other vendors already infrastructure to do this. Not sure if I should read anything into the wording but more logical leaps of faith are to follow.

The presentation then moves on to state that shared reference data standards are a must, a statement I cannot see many consumers of data disagreeing with. I am not sure I agree, though, with the overly simplistic statement that "Data will be good for all users or good for none". Try telling that to the accountancy and risk departments, for example; but I suppose what we are talking about here is basic reference data, not the more subjective price and valuation data. Reference data on instruments and entities is either right or wrong, and the presentation makes the good point that no amount of "data cleaning" can help this, i.e. if the data is wrong, it needs to be re-captured from an accurate source (the small example below illustrates why local validation is no substitute for that).
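To illustrate the limits of "data cleaning": the sketch below computes an ISIN's check digit (letters map to 10-35, then the Luhn algorithm runs over the resulting digit string). Cleaning of this kind can catch a mis-keyed character, but a checksum-valid ISIN sitting on the wrong record passes every local test, which is exactly why wrong data has to be re-captured from an accurate source rather than "cleaned".

```python
def isin_check_digit(body: str) -> int:
    """Check digit for the first 11 characters of an ISIN: letters become
    10-35, then the Luhn algorithm runs over the digit string."""
    digits = "".join(str(int(c, 36)) for c in body)
    total = 0
    for i, d in enumerate(reversed(digits)):
        n = int(d)
        if i % 2 == 0:  # double every other digit, rightmost first
            n = n * 2 - 9 if n * 2 > 9 else n * 2
        total += n
    return (10 - total % 10) % 10

# Apple's ISIN is US0378331005; the computed digit matches the final 5.
assert isin_check_digit("US037833100") == 5
# But a structurally valid ISIN can still be the *wrong* ISIN for the
# record it sits on - no checksum can detect that.
```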

The call for the establishment and use of reference data standards in the presentation then seems to "slide" into a call for a standard reference data infrastructure. These are not necessarily the same thing, so a logical leap seems to have been taken here. The presentation talks about the possible necessity of "top down" legal compulsion for the industry, again something I could agree with and see the need for, but neither the issues nor legal compulsion automatically make a "data utility" the only option. Why couldn't legal compulsion be applied to the existing data vendors to standardise on common IDs, for instance? ISIN is proposed as a standard in the presentation, but I can only assume this is due to the ECB being mainly focussed on the bond world, where to a large degree ISINs work (i.e. are unique), whereas in the world of equities an ISIN needs a lot of qualification (currency, exchange, share class...) before it uniquely identifies a quoted equity, as the sketch below shows.
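A minimal sketch of that equities point, using a hypothetical ISIN and made-up listings: the same ISIN legitimately maps to several quoted lines, so any standard built on ISIN alone needs extra qualifying fields before it names a single listing.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ListingKey:
    """An ISIN names the instrument; for a quoted equity it must be
    qualified (here with exchange MIC and trading currency) before it
    identifies a single listing."""
    isin: str
    mic: str       # ISO 10383 market identifier code
    currency: str  # ISO 4217 trading currency

# One illustrative ISIN, two listings: the bare ISIN is ambiguous.
listings = {
    ListingKey("GB0000000001", "XLON", "GBP"): "London line",
    ListingKey("GB0000000001", "XETR", "EUR"): "Frankfurt line",
}

def find_listings(isin: str) -> list[ListingKey]:
    """Return every listing key sharing the given ISIN."""
    return [k for k in listings if k.isin == isin]

print(find_listings("GB0000000001"))  # two keys, so ISIN alone is not unique
```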

In summary, the presentation starts by showing how good the ECB's Centralised Securities Database is (7 million securities, 3 million record updates/day etc...), and it does look good, but in some ways, so what? The data issues for the industry seem clear, although I think the "crisis" is a bit of a red herring relative to the aim of data cost reduction. However, the logical jump from industry need to effectively "we must have a data utility" is an interesting one, and one where I would prefer that more options were discussed. It seems ironic that in these days of "transparency" it is not at all transparent to me why more alternative solutions are not being discussed and a choice justified. Talking of choice, and as a final thought, I am also not sure why the data vendors are not up in arms about this initiative - are they frantically lobbying behind the scenes? Do they simply think the utility won't go ahead? Or are they afraid of upsetting the EU? Any thoughts much appreciated!
