
Sibos 2013: Big Data: innovation, magic bullet or BAU?

An interesting clash appeared during the earliest conference sessions at Sibos: the standing-room-only Innotribe kickoff was scheduled against a session on Big Data. I can only conclude that Big Data has now moved from innovation to business as usual, and certainly that is what the participants seemed to be saying. Innotribe does have a session on Big Data on Tuesday and will delve deeper, so perhaps it's not quite mainstream yet, or maybe there is more to be exploited.

I have to confess that I'm fascinated by the concept of Big Data, but perhaps my definition is broader than other commonly-held ones. The best of the common definitions, the one cited by Brian Caplen, comes from Gartner: "Big data is high-volume, high-velocity and high-variety information assets that demand cost-effective, innovative forms of information processing for enhanced insight and decision making."

I think this has a lot going for it, but it misses a key attribute: to be big, it has to be too big to store long-term. Perhaps this is covered by Gartner's 'velocity', i.e. the data is not only coming in at high speed, but leaving at high speed too. There is no dam of unlimited capacity in the data river, so there is only a limited timeframe in which we can process the data before we have to throw it away.
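To make that trade-off concrete, here is a minimal sketch, assuming records arrive as a one-pass numeric stream; the function name, record format and summary fields are all illustrative, not taken from any real system. The raw data flows through exactly once and only a small running summary survives it.

```python
def summarise_stream(records):
    """Fold a one-pass stream of numeric readings into a compact summary.

    Raw records are never stored: once a value has been folded into the
    running summary it is discarded, trading the full data set for fewer,
    more valuable insights.
    """
    count, total, peak = 0, 0.0, float("-inf")
    for value in records:  # the data river flows past exactly once
        count += 1
        total += value
        peak = max(peak, value)
    mean = total / count if count else 0.0
    return {"count": count, "mean": mean, "max": peak}

# Usage: terabytes a day can flow through this without ever being kept.
print(summarise_stream(iter([3.1, 4.7, 2.2])))
```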

A good example is the UK Meteorological Office (www.meto.gov.uk), which not only provides weather forecasting for the UK but also global wind forecasts everywhere but the US. According to their CIO they process over eight terabytes of data per day, which is possibly not as big as some big data sources. What is more interesting is that their whole archive is only 40 terabytes: at eight terabytes a day, that archive could hold less than a week of raw input, so they have to process each day's information into insights which can be stored long-term.

So for me, Big Data goes hand in hand with the willingness to process and discard the raw data in favour of fewer, more valuable insights. For me it's not just the size of the data, but the size of the processing task. For example, searching payments transactions for payments to a list of accounts should be quite simple. However searching for fraud networks based on analysis of flows is exponentially more complicated. In this case the complexity of processing makes it big, not just the storage size.
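As a rough illustration of why the second task is so much heavier, here is a toy sketch of both workloads. The account names, payment tuples and watch list are hypothetical, and a real fraud engine would use far richer flow analysis than the connected-components search shown here.

```python
from collections import defaultdict

payments = [("acct_a", "acct_b"), ("acct_b", "acct_c"), ("acct_d", "acct_e")]
watch_list = {"acct_c"}

# Workload 1: screening against a list of accounts -- one linear pass.
flagged = [p for p in payments if p[0] in watch_list or p[1] in watch_list]

# Workload 2: network analysis -- treat payments as a graph and pull out
# connected clusters, each a candidate "fraud network".
graph = defaultdict(set)
for src, dst in payments:
    graph[src].add(dst)
    graph[dst].add(src)

def component(start, seen):
    """Depth-first walk collecting every account reachable from `start`."""
    stack, members = [start], set()
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        members.add(node)
        stack.extend(graph[node])
    return members

seen, networks = set(), []
for account in graph:
    if account not in seen:
        networks.append(component(account, seen))

print(flagged)   # [('acct_b', 'acct_c')]
print(networks)  # [{'acct_a', 'acct_b', 'acct_c'}, {'acct_d', 'acct_e'}]
```

The first scan touches each payment once; the graph search has to consider how payments combine across the whole flow, which is where the processing, rather than the storage, becomes the big part.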

So, back to the session. Three eminent bankers were discussing what Big Data means to them. In summary, Big Data is important to understanding how we should and do operate, and it's what we're doing now. What was interesting was that the audience thought the IT challenges had mostly been conquered, but that the issues still to be addressed were skills and governance. The audience also thought that Big Data was more appropriate for the less tractable problems of fraud detection and product management.

In essence, I think that Big Data is seen as a magic bullet by many banks, and the reality may fall short. So I look forward to the Innotribe session on Tuesday, which will look more into the art of the possible.
