IBM is working with Canada's TD Bank to test its new 'stream computing' software, which uses supercomputing power to manage larger volumes of data than traditional technology.
IBM says the new software utilises the Blue Gene supercomputer to examine thousands of real-time information sources, enabling the bank to capitalise on market conditions as they change.
The stream computing architecture uses algorithms to analyse live structured and unstructured data from any source. The system can integrate text, voice, images, video, databases, market data feeds and application data, automatically determining which information is relevant.
This reduces the work required "downstream" and provides a framework for continuously refining analysis and easily including new data sources as they become available, says IBM.
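IBM has not published the programming interface behind this architecture, but the general pattern it describes — filtering a heterogeneous event stream for relevant records and updating analytics incrementally as events arrive, rather than pausing to re-query a store of past data — can be sketched in Python. All of the names below (`Event`, `relevant`, `RunningMean`, `analyse`) are illustrative assumptions, not IBM's API:

```python
from dataclasses import dataclass
from typing import Dict, Iterable, Set

@dataclass
class Event:
    """One record from any live source (market feed, text, application data)."""
    source: str
    symbol: str
    value: float

def relevant(event: Event, watched: Set[str]) -> bool:
    """Illustrative relevance test: keep only events for watched symbols.

    In a real stream system this step stands in for the automatic
    determination of which information matters, done before any
    heavier downstream processing.
    """
    return event.symbol in watched

class RunningMean:
    """Incrementally updated mean -- no pause-and-query over past data."""
    def __init__(self) -> None:
        self.count = 0
        self.total = 0.0

    def update(self, value: float) -> float:
        self.count += 1
        self.total += value
        return self.total / self.count

def analyse(stream: Iterable[Event], watched: Set[str]) -> Dict[str, float]:
    """Consume a live stream, discarding irrelevant events upstream and
    keeping per-symbol analytics current after every single event."""
    means: Dict[str, RunningMean] = {}
    latest: Dict[str, float] = {}
    for event in stream:
        if not relevant(event, watched):
            continue  # discarded early, reducing the work needed "downstream"
        mean = means.setdefault(event.symbol, RunningMean())
        latest[event.symbol] = mean.update(event.value)
    return latest

# Usage: events arrive continuously; the analysis is current after each tick.
ticks = [Event("market_feed", "TD", 61.0),
         Event("news_text", "IBM", 130.0),
         Event("market_feed", "TD", 63.0)]
print(analyse(iter(ticks), {"TD"}))  # -> {'TD': 62.0}
```

The design point the sketch illustrates is the one Halim draws below: results are maintained continuously as events happen, and adding a new data source is just another producer of `Event` records feeding the same loop.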
Rizwan Khalfan, CIO, TD Wholesale Banking, says: "The combination of the stream computing architecture and the Blue Gene supercomputer allows enhancements to our real-time messaging and analytical capabilities while simplifying the underlying infrastructure."
"Today's hardware and software are not optimised for real-time analysis of data. They came from a paradigm that paused and then queried a database of past information to give an answer, while new information that might affect the outcome was still coming in," adds Nagui Halim, chief scientist of the stream computing project, IBM Research. "We've designed and constructed a computing system from the ground up to provide continual analytics as events happen, which has very powerful applications in financial services, government, and many other scientific and business areas."