"....Or, do we need to adjust our spot computing buying optimisation algorithms to lower our daily computing spend?"
When was the last time that question came up in an operating committee meeting or a budget review? If it has, you are well ahead of the rest of us. The reality is that today few CTOs and CFOs in the financial services industry have the transparency or the tools in their computing infrastructure to have that conversation. But times are changing.
The shift to the cloud has transformed how computing can be bought and sold, and will in time turn the majority of the market into a commodity-based one. (A quick refresher on what a commodity is and how commodities are traded can be found on InvestingAnswers.) According to InvestingAnswers, to be considered a commodity, an item must satisfy three conditions:
1. It must be standardised. Commodity trading requires agreed upon standards so that trades can be executed without visual inspection.
2. It must be usable (i.e. have a shelf life) upon delivery.
3. Its price must vary enough to justify creating a market for the item.
What most people don't realise is that computing (both on premise and in the cloud) has already met these conditions:
1. I know of at least one standard for computing that is being adopted. The kWAC (Workload Allocation Cube) is a vendor-agnostic, market-standard unit of measurement for IT. It works by comparing real-time utilisation (workload) against a fixed baseline (allocation) across six vectors (the cube): CPU, memory, storage, disk I/O, LAN I/O and WAN I/O.
2. Computing is usable... no need for me to go into more detail here.
3. Pricing varies considerably across traditional providers (e.g. Dell, IBM, HP, Rackspace etc.) and the wider cloud (AWS, Azure, Google, etc.).
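To make the measurement idea concrete, here is a minimal sketch of comparing workload against allocation across six vectors. The actual kWAC formula and weightings are defined by the standard itself and are not reproduced here; the equal weighting and all figures below are illustrative assumptions.

```python
# Hypothetical sketch of a kWAC-style measurement: compare real-time
# utilisation (workload) against a fixed baseline (allocation) across
# the six vectors of the cube. Equal weighting is an assumption made
# for illustration, not the standard's actual formula.
from dataclasses import dataclass

VECTORS = ("cpu", "memory", "storage", "disk_io", "lan_io", "wan_io")

@dataclass
class Measurement:
    workload: dict    # observed utilisation per vector
    allocation: dict  # fixed baseline per vector

    def utilisation(self) -> float:
        """Average utilisation ratio across the six vectors."""
        ratios = [self.workload[v] / self.allocation[v] for v in VECTORS]
        return sum(ratios) / len(ratios)

m = Measurement(
    workload={"cpu": 40, "memory": 32, "storage": 500,
              "disk_io": 20, "lan_io": 10, "wan_io": 5},
    allocation={"cpu": 100, "memory": 64, "storage": 1000,
                "disk_io": 100, "lan_io": 100, "wan_io": 100},
)
print(round(m.utilisation(), 3))  # → 0.292
```

The point of a standardised unit like this is that two buyers can compare offers from different providers without inspecting the underlying hardware, which is exactly the first commodity condition above.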
In fact, there is already a CME-backed exchange, the UCX, that is making markets in computing spot and forward rates based on the kWAC. It may well prove to be the reference implementation and market foundation for this change in enterprise computing buying behaviour.
So, if we are heading towards a market where firms purchase computing power the same way they buy electricity or telecommunications (by consumption), and the market infrastructure to support this evolution is already emerging, it raises the question:
Have we adjusted our spot computing buying optimisation algorithms to lower our daily computing spend?
Or, for the traders or risk managers amongst us: Did we properly hedge our computing exposure today?
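For readers who want to see what such a hedge might look like, here is a toy sketch: lock in forward contracts for predictable baseline demand and buy the remainder on the spot market. Every price, demand figure and function name here is invented for illustration; a real hedging programme would be driven by actual forward curves and demand forecasts.

```python
# Illustrative sketch of hedging computing exposure: baseline demand is
# covered at a locked-in forward price, and any excess is bought at the
# prevailing spot price. All figures are made up for the example.
def blended_cost(demand, baseline, forward_price, spot_prices):
    """Cost per period: baseline units at the forward price,
    any excess demand at that period's spot price."""
    costs = []
    for d, spot in zip(demand, spot_prices):
        hedged = min(d, baseline) * forward_price
        unhedged = max(d - baseline, 0) * spot
        costs.append(hedged + unhedged)
    return costs

demand      = [100, 120, 150, 110]    # hypothetical units consumed per day
spot_prices = [1.00, 1.40, 1.80, 0.90]
costs = blended_cost(demand, baseline=100, forward_price=1.10,
                     spot_prices=spot_prices)
print([round(c, 2) for c in costs])
```

Note the trade-off the sketch exposes: on the cheap day the hedge costs more than pure spot buying would, but on the expensive days it caps the exposure, which is precisely the conversation a CFO and a risk manager would want to have about daily computing spend.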