
Big Data in 2012

Over the last few years, we have seen the dawn of ‘big data’ – a term popularised by EMC to describe the large volumes of information generated by organisations today. The sheer volume of this ‘big data’ – weather readings, passenger information on public transport, or sales and marketing records – can make it very hard to analyse.

However, the financial sector has always had to deal with big data and is consequently well placed to take advantage of this trend in 2012. To make good investments, brokers, fund managers and the like need an in-depth knowledge of individual company performance, macro-economic trends, geographical trends and a wealth of other information. In short – big data.

2012 may well be the year when IT in financial institutions catches up with what their employees have been doing for years. Although trading tools are extremely sophisticated, the rest of the industry is only now developing tools which can synthesise and analyse similarly large amounts of information, for one simple reason: big data, rather like investment information, can be the key to gaining a competitive advantage.

Companies which can take advantage of the market intelligence available to them – as well as the data within their organisation (including sales and marketing data) – will push themselves ahead of the pack during 2012 as they learn from their mistakes, use market data and track forthcoming trends in the market. After all, post-recession, all companies have been making the most of their human resources – now they need to turn their attention to their informational resources. 


Comments: (3)

A Finextra member, 10 February, 2012, 09:27

I don't understand the fuss about "big data". Why not just take a sample of it, rather than tackle the whole lot? Scientists have been doing this for centuries.
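The sampling point can be illustrated with a minimal sketch (the figures below are simulated, not real market data): a modest random sample often estimates a population statistic closely, without touching the whole dataset.

```python
import random

random.seed(42)

# Hypothetical "big data": one million simulated transaction values,
# drawn from a normal distribution with mean 100 and std dev 20.
population = [random.gauss(100, 20) for _ in range(1_000_000)]

# A random sample of just 1,000 records.
sample = random.sample(population, 1_000)

pop_mean = sum(population) / len(population)
sample_mean = sum(sample) / len(sample)

print(f"population mean: {pop_mean:.2f}")
print(f"sample mean:     {sample_mean:.2f}")
```

With a sample of 1,000 from this distribution, the standard error of the mean is roughly 0.6, so the sample estimate typically lands within a unit or two of the true mean – the scientist's point in a nutshell. The counter-argument, of course, is that aggregate estimates are exactly what sampling is good at, while per-customer "micro conclusions" require the individual records.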

Ketharaman Swaminathan - GTM360 Marketing Solutions - Pune, 10 February, 2012, 11:32

@Finextra Member:

Interesting point!

In the past, due to shortages of computing power or its exorbitant cost, scientists reached "big conclusions" on the basis of small samples that were often not statistically representative. With growing computing power and falling costs, we've perhaps reached a stage where we can actually analyze *all the data* and come to (a) "big conclusions" that are truly applicable to the entire population and / or (b) "micro conclusions" that are applicable to an audience of one or a handful of people. This might explain all the buzz around "big data" these days.

A Finextra member, 10 February, 2012, 12:20

I don't think that you can blithely disregard scientific progress – and indeed the science of statistics – without some evidence. "Big data" is simply, as you point out, trendy at the moment, but I get the feeling that the emperor has no clothes.

Ultimately, there is no point in analysing all the data unless the dataset is so small that a sample could not deliver the level of statistical confidence you are willing to accept. And in the business world, that threshold is, in my experience, much, much fuzzier than what a scientist would accept.
