Nigel Farmer - Software AG

2016: Capital Markets Firms will be Buried in Data

11 February 2016

This is the year when capital markets firms will have to get to grips with mountains of data; if they don’t get it under control, they will have one Big Data mess.

And Big Data can only grow. Complex sets of data arising from requirements for reporting transactions, communications surveillance, risk management (BCBS239) and swaps data repository reporting will add to the fire hose of market and trade data coming from social media, emails, instant messages, news headlines, internal video and audio data.

This immense volume of data creates a bureaucratic nightmare for capital markets firms, few of which are prepared to handle the basic regulatory requirements, never mind take advantage of “golden nuggets” that could be useful for market monitoring and transparency. Therefore the ability to analyse and act upon the data before it becomes stale and loses value will be a key differentiator for capital markets firms going forward.

The ability to monitor and make sense of fast Big Data hinges on connecting to multiple, disparate, live data sources. Some of these sources may be internal to your firm; some may live in the cloud or stream from sensors. You must be able to ingest and digest all of these data streams, and be able to handle the peaks and troughs of velocity while detecting actionable patterns.
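To make the idea concrete, here is a minimal Python sketch of that kind of windowed ingestion: events from any source are merged into one stream, stale events are evicted as the window slides, and a deliberately simple volume-spike rule stands in for a real actionable pattern. All names, field layouts and thresholds are illustrative assumptions, not a reference to any particular product.

```python
from collections import deque
from datetime import datetime, timedelta

class StreamMonitor:
    """Toy sliding-window monitor over a merged event stream.

    Events are dicts with 'source', 'time' and 'payload' keys;
    the window length and alert rule are illustrative only."""

    def __init__(self, window=timedelta(seconds=60), threshold=100):
        self.window = window
        self.threshold = threshold  # events per window before alerting
        self.events = deque()

    def ingest(self, event):
        self.events.append(event)
        # Evict events that have fallen out of the sliding window.
        cutoff = event["time"] - self.window
        while self.events and self.events[0]["time"] < cutoff:
            self.events.popleft()
        return self.check()

    def check(self):
        # The "actionable pattern" here is just a volume spike;
        # real deployments match far richer conditions.
        if len(self.events) > self.threshold:
            return f"ALERT: {len(self.events)} events in {self.window}"
        return None
```

The same skeleton copes with velocity peaks naturally: a burst simply fills the window faster, and the eviction loop keeps memory bounded.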

Actionable patterns can help capital markets firms detect fraud before it happens, or sniff out non-compliant trades or behaviour. This becomes critical as regulators crack down, so much so that a compliance officer could go to jail for failing to stop an illegal act by one of their colleagues, be it a rogue trader or a wild algorithm.

Communications surveillance is one area that poses many problems, with the need to record and analyse data from different channels such as voice, email and instant messaging. The need to retain these records for years means a vast amount of data must be collected. Analysing this data is complicated, as you must look for patterns across all of the different sources, combining and correlating communications with trading activity.
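The correlation step can be sketched as a simple time-windowed join between messages and trades. The field names ('trader', 'time') and the five-minute window are assumptions for the sketch; production surveillance joins many more attributes, including counterparties, instruments and message content.

```python
from datetime import datetime, timedelta

def correlate(messages, trades, window=timedelta(minutes=5)):
    """Pair each trade with messages sent by the same trader shortly
    before it. A toy illustration of correlating communications data
    with trading activity."""
    hits = []
    for trade in trades:
        for msg in messages:
            same_trader = msg["trader"] == trade["trader"]
            gap = trade["time"] - msg["time"]
            if same_trader and timedelta(0) <= gap <= window:
                hits.append((msg, trade))
    return hits
```

Even this naive nested-loop join hints at the scale problem: with years of retained records across voice, email and chat, the correlation has to run incrementally on streams rather than over the full archive.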

Fraud becomes more predictable when compliance officers have enough transparency to see the trail leading to a possible crime. Complex event processing has typically looked at patterns over short-ish time windows, but next-generation features and integration with large in-memory data caching technologies enable correlation and pattern detection across much longer time windows. This opens up the technology to a wider range of problems such as money laundering, where the process of cleaning dirty money can take time as transactions occur across multiple assets and channels to hide the trail of money and its source.

In pre-trade risk, cases such as wild algorithms or fat-finger errors can be spotted using monitoring technology that rings alarm bells and stops the program before the market is impacted. Firms are increasingly looking at how streaming analytics can be combined with predictive analytics and in-memory caching technologies to increase the intelligence of systems whilst retaining low latency.
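At its simplest, such a pre-trade check is a pair of limit tests applied before an order reaches the market. The limits below are made-up numbers for illustration; a real system would load them per client, instrument and venue, and back them with the kind of in-memory reference data mentioned above.

```python
def pretrade_check(order, ref_price, max_qty=100_000, price_band=0.10):
    """Reject orders whose size or price deviation suggests a
    fat-finger error or a runaway algorithm. Limits are illustrative,
    not regulatory values."""
    if order["qty"] > max_qty:
        return "REJECT: quantity above limit"
    deviation = abs(order["price"] - ref_price) / ref_price
    if deviation > price_band:
        return f"REJECT: price {deviation:.0%} away from reference"
    return "ACCEPT"
```

The low-latency constraint is why these checks tend to be a handful of in-memory comparisons: anything that touches disk or a remote service on the order path adds latency the trading desk will not accept.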

Combining such technologies can help firms react to extreme market conditions and minimise losses, such as when the Swiss National Bank unpegged the Swiss franc from the euro.  Not only could firms not react quickly to this stress in the market, but once the dust settled many firms could not report on their positions or losses due to this event for several days.

A key aspect of BCBS 239 is that risk systems should provide accurate and timely risk data under both normal and extreme or stressful market conditions. In-memory caching technologies can be vital in providing large risk data sets for calculation extremely quickly, and can also provide a much cheaper alternative to ripping out and replacing legacy systems.

These technologies will also be increasingly used on the trading floor, for example aggregating data from a myriad of databases and systems to provide real-time insight and streaming analytics to aid traders' decision-making, or to provide custom pricing to clients based on historical patterns.

Rather than being buried in data, firms taking advantage of streaming and predictive analytics and in-memory caching technologies across all asset classes and systems will be the ones that unearth the golden nuggets inside it. 

 

Tags: Risk & regulation
