
In-memory analytics - operational risk management 2.0

The likes of the RBS IT glitch, PPI mis-selling claims, and the Libor rate-setting scandal have kept operational risk prominent on the British news agenda for some time. Against this backdrop, it is unsurprising that the Financial Stability Board (FSB) advised that the Basel Committee on Banking Supervision prioritise the management of operational risk in its current agenda.

Several challenges lead to operational problems across a number of business areas, most notably fraud, overly complex and poorly managed processes, and inadequate IT infrastructure. However, many banks still lack access to the information needed to monitor all of these areas effectively, and traditional risk limits are no longer sufficient to meet the new requirements. Most banks run tens of thousands of rules and checks every day, so it is imperative that they can monitor trends and identify breaches as and when they occur.

To use an example, an individual trader might have a daily position that makes an average of £8,000 profit. If this moved up to £20,000 a day over a period of weeks or months, the change would signal an operational risk because it falls outside the trend: risk limits may be being breached or protocols ignored, meaning the activity is no longer 'business as usual'.
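To make that idea concrete, here is a minimal sketch in Python of trend-based breach detection on daily PnL. The window, threshold and figures are assumptions chosen for illustration, not a prescribed method.

from statistics import mean, stdev

def flag_trend_breaches(daily_pnl, window=60, z_threshold=3.0):
    # Flag days whose PnL deviates sharply from the trailing trend.
    breaches = []
    for i in range(window, len(daily_pnl)):
        history = daily_pnl[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma and abs(daily_pnl[i] - mu) > z_threshold * sigma:
            breaches.append((i, daily_pnl[i]))
    return breaches

# Illustrative data: a trader averaging roughly £8,000 a day,
# then suddenly printing around £20,000 a day.
usual = [8000 + (-1) ** d * 500 for d in range(60)]
recent = [20000, 21000, 19500]
print(flag_trend_breaches(usual + recent))  # each jump is flagged

The point is not the statistics but the principle: the check compares today's activity against its own recent history rather than against a static limit.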

Adding to this picture, there are massive sets of data involved, of different types, spanning the three Vs of 'Big Data': variety, velocity and volume. This makes identifying operational changes even more difficult. Banks need to analyse heterogeneous data every day, in real time, and present it in a simple, consolidated format that is easily understood by staff across different business areas. Furthermore, in light of Basel III, banks are facing up to the need for a consolidated view of market risk and profit and loss (PnL) in one place, an approach that is becoming increasingly important.
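As a rough illustration of what such a consolidated view might look like, the Python sketch below joins two illustrative feeds, desk-level PnL and market risk, into a single table. The column names and figures are assumptions, not any particular bank's schema.

import pandas as pd

# Stand-ins for two separate feeds; schemas and figures are illustrative.
pnl = pd.DataFrame({
    "desk": ["Rates", "FX", "Equities"],
    "daily_pnl": [120_000, -35_000, 64_000],
})
risk = pd.DataFrame({
    "desk": ["Rates", "FX", "Equities"],
    "var_95": [450_000, 180_000, 220_000],   # 95% value-at-risk
    "open_limit_breaches": [0, 2, 0],
})

# One consolidated table that reads the same way across business areas.
consolidated = pnl.merge(risk, on="desk")
consolidated["pnl_to_var"] = consolidated["daily_pnl"] / consolidated["var_95"]
print(consolidated.to_string(index=False))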

In order to make sense of Big Data and its relationship with operational risk, banks are increasingly turning to in-memory analytics, which draws data from multiple disparate silos. The data is analysed 'in memory', that is, held in cache or RAM rather than read from disk. Querying data sets in memory dramatically speeds up response times while increasing reliability. For regulatory reporting, it is vital for CROs to have real-time risk data at their fingertips, and that data needs to be easy to manipulate 'on the fly'.
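A minimal sketch of that idea, assuming pandas and entirely illustrative schemas: data is pulled from disparate silos once, held in RAM, and then queried 'on the fly' without further trips to disk.

import pandas as pd

# Stand-ins for two disparate silos; in practice these would be one-off
# loads from files, databases or feeds.
trades = pd.DataFrame({
    "trader_id": [1, 1, 2],
    "desk": ["Rates", "Rates", "FX"],
    "position": [5_000_000, 2_000_000, 9_000_000],
    "daily_pnl": [8_000, 3_500, 21_000],
})
limits = pd.DataFrame({
    "trader_id": [1, 2],
    "position_limit": [10_000_000, 6_000_000],
})

# Joined once and held in memory; every question after this point is
# answered from RAM rather than by re-reading the underlying silos.
in_memory = trades.merge(limits, on="trader_id")

# Ad-hoc, 'on the fly' queries against the in-memory frame.
breaches = in_memory[in_memory["position"] > in_memory["position_limit"]]
pnl_by_desk = in_memory.groupby("desk")["daily_pnl"].sum()
print(breaches[["trader_id", "position", "position_limit"]])
print(pnl_by_desk)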

It is unlikely that banks will be able to protect themselves entirely against every possible risk. However, many aspects of operational risk are preventable, and banks can and should do more to address them. The outlook is positive: banks typically have 80% of the problem solved already, since they calculate PnL and market risk as a matter of course. What they must do now is look at the technology and data available to them in a consolidated manner, to make the most of the information they already have. Managing operational risk is ultimately about getting ahead of the risk trend, and that is where the right technology can pay dividends.

 
