A post relating to this item from Finextra:
24 June 2009
The recession has spurred Wall street firms to investigate the use of new technology, particularly cloud computing, in a bid to overcome budgetary restrictions and skills shortages, according to a sur...
How many "silver bullets" have there been in the last 20 or so years?
- Open Source
The list goes on and on, starting with Babbage's Difference Engine on up to Web 3.0.
Whether you believe that cloud computing is simply the next evolution of distributed computing, or something new altogether, the fact remains that disruptive technologies have always been boldly trumpeted as the cure-all for what ails us. We will finally realize all the enormous cost savings we have promised our business counterparts, if they will just invest $X million in this new cure-all (while forgetting about the previous $X million they invested in our last recommendation).
Strategists, futurists and visionaries, whether technically or business oriented, are rarely pragmatists. The hordes of voices and consulting firms clamoring to declare cloud computing "where we must go" are definitely not the same hordes of workers who will have to design, implement and care for the ultimate solution. What the champions of the bleeding edge fail to recognize is that technologies aren't displaced, and they rarely, if ever, die. Consider our own technology organizations. How many legacy
applications and solutions are you continuing to support year after year? Somebody, somewhere in your organization, bought that shiny new toy on the promise of a brave new world where all business problems are solved and all technology is cheap to support
and sustain. And, inevitably, that promise fails to play out exactly the way we hope or plan.
Each new technology added to an organization introduces the probability of a whole host of unintended consequences. Take mainframes, for instance. Most of us can recall the bold pronouncements that mainframes were dead. The advent of client/server technologies caused prognosticators to boldly declare that mainframes were officially on the path to extinction. And yet they survive. They continue to shrink in physical form, and they remain the platform of choice for processing enormous transactional volumes.
So, corporations all over the world eliminated mainframe specialists. Colleges stopped teaching mainframe programming languages. Everyone moved on to the "new-new". The unintended consequence? We still have mainframes, and now we pay through the nose for
the talent necessary to support them. So, we have all the new stuff - and still have to manage all the old stuff.
Ideally, I'd like to see less marketing hype and more focus on which functions within the financial services realm would most benefit from cloud computing approaches. Rather than curing all of our ills, how about curing the most painful and time-consuming issues first? Take reporting, for example: the bane of existence for virtually every senior leader in financial services technology. Whether for audit and compliance or fund reporting, moving towards a more dynamic, self-service-oriented set of reporting interfaces
would most certainly reduce costs and headaches, while improving our customers' satisfaction levels.
If we find the right use for cloud computing, maybe we can introduce a set of capabilities to our constituents and stakeholders that complements our existing capabilities, instead of throwing yet another technology on the pile, whose unintended consequences
will add to our woes and cost for years to come.
That is, until the next "silver bullet" comes along to save us.