
It Takes a Village to Manage Operational Risk

Whose job is it to manage operational risk?

  • The Chief Risk Officer? Check
  • Operations Management? Check
  • Compliance? Check
  • Data Management? Check
  • Technology Providers, FinTech and RegTech? Check
  • Yours and mine? Check!

Yesterday I had the pleasure and privilege of moderating a great panel discussion on Big Data Analytics for Risk Management. The distinguished panel included Dr. Bob Mark, who was at one time CRO for a Tier 1 bank; Dr. Ravi Kalakota, who advises some of the biggest banks in the world on data strategy; and Shirish Netke, who is CEO of Amberoon, a San Francisco-based startup that uses advanced machine learning for a highly visual AML monitoring solution.

On this one panel we were able to speak from the perspective of the Chief Risk Officer, Chief Data Officer, Operations Executive and Technology Provider. It turns out that all of our roles are necessary for risk management.

Data-driven operational risk management is absolutely critical. Regulators expect it, boards and shareholders expect it, and it significantly impacts the bottom line through risk-adjusted capital requirements and actual losses.

For too long, operational risk management has been characterized by qualitative, rather than quantitative methods. Heat maps, Key Risk Indicators, and prose statements abound in banks of all sizes.

What are the big challenges?

  1. The necessary quality and timeliness of data simply isn’t there. Those banks that have invested in enterprise-wide Master Data Management projects have found the task overwhelming and, in many cases, abandoned the effort. In any case, for most, there simply isn’t sufficient funding for such a massive project unless it is regulator-required.
  2. The tools for analyzing the limited data that is available tend to use brute-force rules-based filtering. Because of the huge potential cost of missing something important, banks have to be very conservative in the setting of rules, which results in huge volumes of false positives.
  3. Development organizations do not have the subject matter expertise to develop point solutions for specific operational risk management projects. This is exacerbated by trends to outsource operational development.
  4. Enterprise data management organizations lack the funding for a series of focused projects to collect data from multiple sources, harmonize and cleanse it, and manage it for ongoing quality.
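The brute-force filtering problem in challenge 2 can be sketched in a few lines. This is an illustrative toy only, not any bank's actual rules engine; the thresholds, transaction fields, and sample transactions are all hypothetical. Conservative static rules cannot weigh context, so routine activity trips them constantly:

```python
# Illustrative only: a toy brute-force, rules-based AML filter.
# All thresholds and transaction fields are hypothetical.

from dataclasses import dataclass

@dataclass
class Transaction:
    amount: float
    country_risk: str   # "low" | "medium" | "high"
    is_cash: bool

def flag_transaction(tx: Transaction) -> bool:
    """Conservative static rules: cheap to run, but context-blind,
    so legitimate activity gets flagged alongside suspicious activity."""
    if tx.amount >= 10_000:                   # reporting-style threshold
        return True
    if tx.is_cash and tx.amount >= 3_000:     # crude structuring heuristic
        return True
    if tx.country_risk == "high" and tx.amount >= 1_000:
        return True
    return False

# A stream of mostly ordinary transactions...
txs = [
    Transaction(9_500, "low", False),    # large but routine payroll run
    Transaction(3_200, "low", True),     # cash deposit from a retail shop
    Transaction(1_100, "high", False),   # small remittance home
    Transaction(12_000, "low", False),   # genuine house deposit
]
alerts = [tx for tx in txs if flag_transaction(tx)]
print(f"{len(alerts)} of {len(txs)} flagged for manual review")
```

Three of the four perfectly ordinary transactions end up in the manual-review queue, which is exactly the false-positive volume problem: tightening any threshold to cut the noise raises the chance of missing a true positive.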

We explored a particular area of risk management – Anti-Money Laundering monitoring – as an example that applies to a wide range of risk management activities. What we discovered in our conversation is that:

  • The CRO can’t manage money-laundering risk without due diligence from operations and compliance executives. They in turn need good data, no matter what processes are in place. The CRO and ops/compliance execs need the CDO.
  • The Ops and Compliance execs can’t monitor transactional activity effectively without throwing huge numbers of people at the problem, and even that is in itself error-prone. Ops and Compliance need the CDO and technology providers.
  • The CDO can drive targeted, domain-specific programs for data management only with the right funding. Since line of business priorities typically drive project funding, it is up to the CRO to fund risk-driven data projects. (The lines of business will benefit, but not enough for them to provide significant funding).
  • The internal technology organization is overwhelmed with demand for product enhancements, with limited scope for operational and reporting needs. Risk and compliance subject matter expertise is also in great demand and very expensive. RegTech has a great opportunity to address this need.
    • False positives represent a high cost in managing AML operations.
    • False negatives represent a significant regulatory risk. 
    • Regulators continue to push for a risk-based approach to AML, but it is hard to implement.
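The trade-off in the bullets above - false-positive cost, false-negative risk, and the regulatory push toward risk-based monitoring - can be illustrated with a toy scoring function. The factors and weights here are hypothetical, not a regulatory model; the point is only that a score blending customer due diligence, deviation from the customer's own baseline, and geography can rank alerts rather than fire binary rules:

```python
# Illustrative only: a toy risk-based score in place of hard rules.
# Factor names and weights are hypothetical, not a regulatory model.

def risk_score(customer_risk: float, amount: float,
               typical_amount: float, country_weight: float) -> float:
    """Combine customer due-diligence risk, deviation from the customer's
    own transaction baseline, and geography into one score in [0, 1]."""
    # Cap deviation at 5x the customer's typical amount, normalized to [0, 1].
    deviation = min(amount / max(typical_amount, 1.0), 5.0) / 5.0
    return min(1.0, 0.4 * customer_risk + 0.4 * deviation + 0.2 * country_weight)

# A large payment from a low-risk customer, in line with their baseline,
# scores low; a modest payment wildly out of pattern for a higher-risk
# customer in a riskier corridor scores high.
low = risk_score(customer_risk=0.1, amount=12_000,
                 typical_amount=10_000, country_weight=0.1)
high = risk_score(customer_risk=0.7, amount=9_000,
                  typical_amount=500, country_weight=0.6)
print(round(low, 2), round(high, 2))
```

Investigators can then work the queue from the highest score down, which is the essence of a risk-based approach - though, as the panel noted, implementing it well is hard precisely because it depends on good customer and baseline data.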

The RegTech challenge, in partnership with its banking customers, is to minimize costs without increasing risk.

So, our conclusion is that only through multiple village chiefs (CRO, CDO, Ops and Compliance Execs, CIO with RegTech) can big progress be made in managing operational risk.

With good, well-analyzed data, operational risk will be far better managed in the future. But whichever perspective you have, don’t forget to partner with the rest of the village!


Comments: (7)

A Finextra member
A Finextra member 25 July, 2016, 06:35

Excellent article! ORM initiatives inevitably fall short of adequate funding. Reason? Effective strategic risk management (the CRO's role) is contingent on articulating a quantifiable value proposition for the business - "sizing" the benefit (business line role) and "sizing" the cost (Compliance role) of compliance and of a risk event occurring. The CRO and business lines must share joint ownership of ORM budgets as key stakeholders; the Executive Board must assure proactive governance to ensure the health of a continuous ORM program.
João Bohner
João Bohner - Independent Consultant - Carapicuiba 25 July, 2016, 20:20

@Graham Seel,

As Sanjeev Ahuja said: Excellent article!

The complete, fundamental obstacle is the one described in 'big challenge No. 1' of your article.
All the others are 'consequences'.

The obstacle lies in the architecture of banking processes, inherited from the start!

I have long argued that banking business processing must be performed corporately, rather than by line of business.

Corporate-wide banking business processing dramatically reduces the huge redundancy of data and tasks generated by line-of-business processing.

A corporately performed process deals with a single source of knowledge, updated by a single state machine.

To illustrate, consider Zhiwei Jiang's predicament at Deutsche Bank (shared, like his, by many other banks' CxOs):

Zhiwei Jiang, global head of accounting and finance IT at Deutsche Bank, was keen to highlight the challenges faced by a traditional banking IT system, in February 2013:

“At the end of the day we still have a huge installation of IBM mainframes and hundreds of millions of pounds of investment with Oracle.
What do we do with that?
We have 46 data warehouses, which all have terabytes and petabytes of storage, where there is 90% overlap of data.
What do we do with that?” he said.

46 data warehouses with all the peripheral paraphernalia involved, replaced by a single source of knowledge, updated by a single state machine.
That yes, is a change!

So, the obstacle lies in the Processes Architecture and in the Master Data Management, corporately.

The "Bank of the Future" Architecture, constructed over BOA - Business Oriented Architecture - provides all these needed improvements in the banking business, reducing Operational Expenses tenfold (at least) - not by 10%!

This is a hot topic and I am open to exchange ideas on this matter.

joao.bohner@gmail.com

Graham Seel
Graham Seel - BankTech Consulting - Concord 25 July, 2016, 20:38

Joao, I agree that large banks have a huge challenge with their organizational structure, their legacy technology infrastructure, and even their culture. That's why Ravi Kalakota recommends tactical approaches, which should be driven by the CRO but in partnership with technology, operations and creative external technology providers (e.g. Regtech). In the long term, strategic solutions to the technology infrastructure are essential, but they're not going to happen in one gigantic "let's simplify and replace everything" project. What is needed is a prioritized and phased strategy that has enterprise-wide support (from CRO, CEO, CIO and all lines of business), priority (it cannot be bumped by "urgent" projects) and funding (Board-level funding support). This is probably a 10 year program at the biggest banks! But it is essential.

In the meantime, though, there are urgent regulatory projects that require more focused MDM, development and partnering to address specific solutions. These are tactical projects. Good tactical projects are carried out in the context of the overall strategy, and are designed (even at higher cost) so that they contribute permanently to the overall strategy. This isn't easy. but it is worth the effort if the bank has sufficiently knowledgeable and skilled leadership.

A Finextra member
A Finextra member 25 July, 2016, 21:43

Quite right that a "big bang" is not a viable approach; a tactical approach to addressing selected aspects of the larger issue (e.g., which database to start with, or which data set to give higher priority) is clearly practical. But clubbing everything together and solving a slice of the larger problem piecemeal is also not viable. We know the latter is not possible without an uninterrupted "10 year game plan" with earmarked, steady funding. That, in turn, is unrealistic in today's business context of short-term valuations and quarterly performance reports. It's a data-driven brute-force approach.

Banks and other businesses at risk of data-compliance issues must first ask the question, "What is it that makes this problem seem so BIG?" It is neither the amount of data nor the amount of duplication in the data that is the crux of the issue. The challenge is really found in the 4 V's of today's data-driven commerce - volume, velocity, variety and veracity of data. The first two are addressed easily through computing power and automation; they don't make the problem BIG. It's the latter two - the different types of data (and consequently, degrees of compliance) and the wide range of data quality (and consequently, the extent of data cleansing and completion needed) - that have to be addressed in one fell swoop. It is this that makes the problem BIG. These require neither inordinate time nor speed to resolve; both require business knowledge, and that is either scarce or fragmented in any given organization!

The challenge then is of first defining and sizing the problem correctly, then of articulating a strategic approach for addressing it, followed by gaining executive commitment and adequate funding against a realistic and achievable plan and finally, finding and applying the best and most relevant business expertise available internally or externally. This is a knowledge-driven approach that finesses the issues of volume and velocity. It works every time!

João Bohner
João Bohner - Independent Consultant - Carapicuiba 26 July, 2016, 15:40


@Graham Seel,

Peter F. Drucker once said: "strategy is doing the right things, tactics is doing things right."

Nowadays, where technological evolution is exponential, a "10 year game plan" is unacceptable.

A 'Business Architecture' - the Overall Strategy - independent of technology 'is essential',
so that new technologies help the business rather than hinder it.

I strongly believe that, with the Enterprise Architecture - "The Bank of the Future" - a time frame of two to five years, depending on the actual back office entanglement, is reasonable.

And then, yes, new tactical projects (urgent regulatory projects) are carried out in the context of the Overall Strategy.
The old tactical procedures are phased out under the Overall Strategy until they are extinguished...

Graham Seel
Graham Seel - BankTech Consulting - Concord 26 July, 2016, 16:54

Sanjeev and Joao, perhaps I should clarify my 10 year plan. I completely agree that a project or program with that kind of timeline simply cannot work. I would propose, though, that the technology culture of our banks has to incorporate a long-term view as well as a short-term one. Initiatives that address immediate point solutions must also be chipping away at the long-term issues. Otherwise, the long-term infrastructure challenges of banks will not be resolved until one by one they become urgent issues as the infrastructure crumbles.

A Finextra member
A Finextra member 26 July, 2016, 18:12

I think we may be in vehement agreement - your reference to a "10-year" plan was quite rightly highlighting the need for continuous improvement (as I interpreted it), which is absolutely right; we are also on the same page regarding long- and short-term perspectives.

What I recommend banks do differently are two things:

1) taking a concerted crack at foundational components of the strategy and under it addressing the tactical wins; not the other way round, and

2) getting executive buy-in on a sizeable albeit value-based scoping of the step change.

Addressing the above by identifying a minimal useful size as a threshold on the one hand, and the maximum achievable size for dramatic gains on the other, ensures executive ownership (lock-in), measurable and lasting business impact, and continuous improvement that is sustainable indefinitely.

Graham Seel

Principal Consultant, BankTech Consulting

Member since: 17 Apr 2015 | Location: Concord | Blog posts: 44 | Comments: 50

This post is from a series of posts in the group:

Innovation in Financial Services

A discussion of trends in innovation management within financial institutions, and the key processes, technology and cultural shifts driving innovation.

