The results of the Accenture 2016 Compliance Risk Study (https://www.accenture.com/us-en/insight-compliance-risk-study-2016-financial-services#block-pivoting)
raise some important points worth considering. The two specific action points suggested by Accenture that I found of particular interest are:
- The improvement in resource utilization on compliance tasks
- The prioritizing of obtaining high quality data and making good use of technology architecture advancements
When I consider these two points and dig deeper into what they might mean, I find the following:
Although compliance is a vital component of managing operational and reputational risk, it has always been considered an overhead within the business. Important, for sure – but an overhead nonetheless, and therefore always a temptation to take 'calculated' short cuts to trim costs and improve margins. However, since the financial meltdown of 2008, management teams within financial institutions have been grudgingly increasing expenditure on compliance, and in doing so, pushing back a number of other strategic initiatives. To remain competitive, strategic initiatives cannot stay on the back burner for long, so institutions are looking at all manner of ways to cut costs. Compliance, important as it is, has therefore become a major focus area for cost reduction through improved resource utilization.
By the same token, in a bid to make processes more efficient, and therefore more cost effective over the medium and longer term, institutions have been spending heavily on modernizing the technology architecture underpinning their data management operations.
This area of technology refresh continues to be a struggle for various reasons, including:
Heavy reliance on manual processes and legacy systems:
Institutions still rely heavily on manual processes and interventions, making data quality a major concern. Legacy systems don't help matters, as they rarely allow easy integration with new data sources. This in turn encourages yet more manual processes and interventions, which create more Excel spreadsheets and deepen the dependency on cumbersome reconciliation processes, along with the pain that goes with them.
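The reconciliation pain described above can be made concrete with a small sketch. The trade IDs, amounts and tolerance below are purely hypothetical; the point is simply how "breaks" arise once two copies of the same data drift apart:

```python
# Hypothetical extracts of the same trades from two sources:
# a legacy system of record and a spreadsheet upload, keyed by trade ID.
legacy = {"T1001": 250000.00, "T1002": 500000.00, "T1003": 125000.00}
spreadsheet = {"T1001": 250000.00, "T1002": 499000.00, "T1004": 75000.00}

def reconcile(a, b, tolerance=0.01):
    """Return breaks: IDs missing from either side, or amounts that differ."""
    breaks = {}
    for key in set(a) | set(b):
        if key not in a:
            breaks[key] = ("missing in legacy", b[key])
        elif key not in b:
            breaks[key] = ("missing in spreadsheet", a[key])
        elif abs(a[key] - b[key]) > tolerance:
            breaks[key] = ("amount mismatch", a[key], b[key])
    return breaks

print(reconcile(legacy, spreadsheet))
```

Every manual touch point multiplies the number of such copies, and each pair of copies needs its own reconciliation like this one.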
Data architecture that cannot keep pace with the speed of regulatory change:
The last few years have seen drastic change in the compliance and regulatory space, with a barrage of new rules hitting financial industry operations in quick succession. Banks have been struggling to firm up their data architecture in this ever-changing market structure, driven by the need for much tighter and more rigorous compliance practices.
Gap between vision and execution:
More often than not, strategic architecture initiatives remain on paper longer than they should. By the time of execution, the authors of the proposed architecture changes have either moved to another project or left the company. When the execution of a new system architecture is not led by its proponents, the benefits that are meant to accrue over a given span of time are rarely measured, let alone achieved.
Defining a single point of data ownership for each piece of data:
This is a perennial problem in large organizations where multiple units operate in silos. Having multiple versions of the same data in different parts of the same organization is obviously a nightmare to manage.
Multiple standards of data formats:
Each asset class uses different data standards, e.g. FIX, FpML, SWIFT, etc., each with its own rule book for every data element. Added to this, there are proprietary data formats in XML, Cobol Copybook, Excel, CSV, and so on. This multiplicity of formats makes any aggregation exercise very complex.
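As an illustration of why this multiplicity hurts, the sketch below maps two simplified source formats onto one hypothetical canonical schema. The CSV column names are invented for the example; the FIX fragment uses real tag numbers (17 ExecID, 32 LastQty, 15 Currency) but a "|" separator in place of the standard SOH control character, purely for readability:

```python
import csv
import io

def from_csv(text):
    """Map a hypothetical CSV extract onto a canonical trade schema."""
    reader = csv.DictReader(io.StringIO(text))
    return [{"trade_id": r["TradeRef"],
             "notional": float(r["Amount"]),
             "currency": r["Ccy"]} for r in reader]

def from_fix(message):
    """Map a simplified FIX tag=value message onto the same schema.
    Tags: 17 = ExecID, 32 = LastQty, 15 = Currency."""
    fields = dict(p.split("=", 1) for p in message.split("|") if p)
    return {"trade_id": fields["17"],
            "notional": float(fields["32"]),
            "currency": fields["15"]}

csv_rows = from_csv("TradeRef,Amount,Ccy\nT1001,250000,USD\n")
fix_row = from_fix("17=T1002|32=500000|15=EUR|")
print(csv_rows + [fix_row])
```

Each additional source format means another translation like these two, each with its own rule book to interpret, before any firm-wide aggregation can even begin.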
Multiple regulatory reporting - globally expanding compliance obligations:
Irrespective of their place of business, firms with financial trading exposure in any geography are now forced to comply with that region's reporting regulations. This typically results in multiple reporting solutions, one specific to each geography. With a constant stream of new releases and patches to manage, keeping track of the changes and updating the architecture is a major IT challenge.
A Solution Approach:
Banks need to focus on building high-quality infrastructure rather than resorting to temporary "band-aid" fixes to meet implementation deadlines. A short-term tactical approach may help overcome the challenge of the immediate deadline, but in the long run the cost of maintaining such fixes will be very high, and it may also lead to heavy financial penalties if any reported data is found to be wrong.
Choosing the right data integration tool, one that is easy to use and driven mainly by configuration rather than extensive programming, is a key decision the institution needs to make. The tool must be versatile enough to allow the solution to be built incrementally as situations demand, to accommodate frequent changes, and to provide for future additions. It must handle not only the incoming data formats but also the various reporting standards mandated by the regulatory agencies. Since compliance is global, the solution should be deployable across the organization's diverse IT infrastructure worldwide. Finally, the tool should improve the productivity of the team, save costs and increase business agility while reducing risk.
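A rough sketch of the configuration-over-code idea follows: the field mapping lives in data, so supporting a new source format or report means editing the mapping rather than the engine. All field names and transforms here are illustrative assumptions, not any particular vendor's API:

```python
# Declarative field mapping: each target field names its source column
# and an optional transform. New formats or reports mean a new MAPPING
# dictionary; the apply_mapping engine itself never changes.
MAPPING = {
    "trade_id": {"source": "TradeRef"},
    "notional": {"source": "Amount", "transform": float},
    "currency": {"source": "Ccy", "transform": str.upper},
}

def apply_mapping(record, mapping):
    """Apply a declarative field mapping to one source record."""
    out = {}
    for target, rule in mapping.items():
        value = record[rule["source"]]
        transform = rule.get("transform")
        out[target] = transform(value) if transform else value
    return out

print(apply_mapping({"TradeRef": "T1001", "Amount": "250000", "Ccy": "usd"}, MAPPING))
```

In a real tool the mapping would live in external configuration (and carry validation rules), but the division of labor is the same: analysts maintain mappings as regulations change, while the engine stays stable.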