
The Hunt for Speed

I was recently at the Dealing With Technology (DWT) conference in London. In fact, I had the enviable (or should I say unenviable, after standing for seven hours straight!) task of manning our company's stand for the day; it never hurts to get closer to the customer, I thought. Looking around the exhibitors and the seminars, one of the key themes of the event was speed: "High Performance Computing", "Event Driven Processing" and "Millisecond Response Times" were claims plastered across the marketing materials of numerous software and network technology companies looking to catch the eye of a passing MD.

However, an interesting conversation with a nameless Quant, who was trying to educate me on methods for speeding up inverse matrix calculations (I never asked!), led to a rather interesting point: how much time is actually being spent on making those pricing and risk calculations run faster? OK, so you could plug in an in-memory database, adopt methodologies from the relatively new field of event-driven processing, or run the whole thing over a dedicated cluster, but what if the underlying calculations being executed are in need of some re-engineering?
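To make the Quant's point concrete, here is a minimal sketch in Python/NumPy (my own illustration, not his actual method, which I never saw): when solving a linear system, explicitly computing a matrix inverse is both slower and numerically worse than a direct solve, so the "re-engineering" can be as simple as changing one call.

```python
import numpy as np

# Hypothetical example: solving A x = b, as crops up in pricing and risk calcs.
rng = np.random.default_rng(42)
n = 500
A = rng.standard_normal((n, n)) + n * np.eye(n)  # well-conditioned test system
b = rng.standard_normal(n)

x_inv = np.linalg.inv(A) @ b     # the "obvious" approach: invert, then multiply
x_solve = np.linalg.solve(A, b)  # LU factorisation, no explicit inverse

# Both give the same answer, but solve() does far less floating-point
# work and carries better numerical error bounds than forming inv(A).
assert np.allclose(x_inv, x_solve)
```

No faster database or bigger cluster required; the same hardware simply does less arithmetic.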

These range from the copy-and-pasted VBA code in spreadsheets to the analytics libraries that cross desks and systems within the banks. Could these applications be posing an overlooked bottleneck? As always, one has to consider whether the benefit justifies the cost: do I really need to make my system operate sub-second? If so, where should I be spending my investment to reach this target? It's amazing how often these basic questions are overlooked in the hunt for speed.

Next Topic (most likely): Microsoft's Excel Services – Any Good?



This post is from a series of posts in the group:

Capital Markets Technology

Front Office Trading Trends and Technologies...
