“Beware of the geeks bearing formulas,” said Warren Buffett. Although aimed at quants and their complex financial models, I sometimes feel the statement applies equally to members of the IT community like myself who are tasked with helping make such complexities a reality. This year, my colleagues and I have had many a discussion on technology and its role in the current crisis. Is it to blame?
Technology proponents out there would say most certainly not. Way back in 2008, Hendershott et al. showed in a Journal of Finance study that algorithmic trading narrowed spreads, reduced adverse selection, and sped up price discovery. They concluded that algorithmic trading improved liquidity and made quotes more informative.
Since then, however, we’ve seen the advent of high-frequency trading (HFT) and microsecond trading. Opponents need only point to the flash crash of 2010 to highlight the risks inherent in such technology. Just today, there was news that Infinium Capital Management,
a high-frequency trading firm, was fined $850,000 for computer errors that disrupted trading on the Chicago Mercantile Exchange.
And if HFT weren’t enough, I’ve recently been reading about Field Programmable Gate Arrays (FPGAs) - chips whose circuitry can be reconfigured for a specific task, executing trading logic directly in hardware. This may well take the debate to the next level, with trading no longer constrained by the limits of traditional
Central Processing Units (CPUs). Scary stuff!
Or maybe not. Whilst such technology enables trades to be executed in vast volumes, it also offers the ability to analyse massive amounts of data in real time. Applied correctly, I believe the technology of today - and tomorrow - offers new and
better ways to manage risk. And the general feeling among the people I talk to is that the fault lies not with the technology, but with how it is applied.