The BCBS 248 monitoring requirement asks banks to gather, process and report on a significant amount of data on a monthly basis. Finextra spoke to industry experts Richard Morris, product manager, cash and liquidity management at SmartStream; Henrik Lang, head of global liquidity, global transaction services, Bank of America Merrill Lynch; and Daniel Moon, a risk consultant with over 10 years of liquidity risk experience, about how technology can prevent legacy systems from impeding the progress of real-time intraday liquidity.
The availability of cloud-based infrastructure and solutions is making it easier for firms to manage increasingly complex regulatory demands and to transition from a batch world to one where payments and liquidity move in real time. However, the volume of data that needs to be analysed under the BCBS 248 standards remains an obstacle.
Morris believes that technology can play a part in monitoring intraday movements and real-time settlements throughout the day. “The implementation of technology is a significant upfront investment, as well as an ongoing maintenance cost in terms of running these solutions. To the extent that these solutions can be run in the cloud, this opens up the potential for organisations that would otherwise find such a solution expensive to implement, but that want to track intraday liquidity without relying on manual processes.”
Lang adds that banks will need to keep implementing regulatory intraday liquidity reporting as it evolves and becomes more complex. “In addition to storing end of day liquidity positions, banks are increasingly required to monitor and maintain records of intraday liquidity usage.
“The regulatory information banks need to monitor and report on could include: maximum intraday liquidity in normal and stressed market conditions, total payments during the day, time-specific and other critical liquidity obligations, value of customer payments made on behalf of correspondent banks and intraday credit lines extended to customers,” Lang says.
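One of the metrics Lang lists, maximum intraday liquidity usage, can be derived from timestamped payment flows. The sketch below is illustrative only (the `Flow` type and function names are hypothetical, not any bank's actual reporting code): it computes the day's largest net negative cumulative position from a list of signed flows.

```python
from dataclasses import dataclass

# Hypothetical sketch of one BCBS 248 monitoring metric: the largest net
# cumulative outflow reached during the day ("maximum intraday liquidity
# usage"). Names and data shapes are illustrative assumptions.

@dataclass
class Flow:
    time: str      # settlement time, e.g. "09:30"
    amount: float  # positive = inflow, negative = outflow

def max_intraday_usage(flows: list[Flow]) -> float:
    """Largest net cumulative outflow over the day, reported as a positive number."""
    balance = 0.0
    worst = 0.0
    for f in sorted(flows, key=lambda f: f.time):
        balance += f.amount
        worst = min(worst, balance)   # track the deepest net negative position
    return -worst

flows = [Flow("08:05", -40.0), Flow("09:10", 25.0),
         Flow("10:30", -60.0), Flow("14:00", 90.0)]
print(max_intraday_usage(flows))  # → 75.0, the net position reached at 10:30
```

A production system would work from settlement-system feeds rather than a flat list, but the metric itself reduces to this running-minimum calculation.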
Moon explores how, while cloud computing provides scalability and elasticity on demand, AI and ML help automate this process, freeing the human element to focus on analysis and liquidity management. They also increase the scope and depth of analysis enormously, not only alerting humans faster to trends and patterns, but also optimising intraday liquidity management.
“The advantages are clearly there. We can see that from other industries and we are increasingly seeing innovative adoption of the technology within FI, especially from newer ‘challenger’ firms.
“One of the reasons we are not seeing the adoption of cloud-based infrastructure as rapidly as some non-FI firms is that both FI companies and regulators are still getting comfortable with the technology. This is fair enough - we need to make sure we have the controls and understand the potential risks that this technology brings. It is about striking that balance between managing risk and not stifling innovation.
“Hopefully this phase will end soon and firms will start to be brave so we can see real innovation and borrow the best ideas and concepts from industries that have already embraced cloud-based infrastructure.”
However, legacy bank systems remain an impediment to fulfilling new regulatory reporting requirements quickly and efficiently; centralised data that is granular and easily accessible will also create efficiencies. “It will also allow banks to collect and store data points regarding clients’ intraday liquidity positions and the overall bank position across all business units, correspondent banks and other market counterparties.”
Morris goes on to explain that by applying AI and ML techniques to historical transactions, banks can gain insight into their liquidity profiles. Rather than waiting until the end of the day to discover that there has been a sudden demand on liquidity at a certain point, this technology allows banks to plan their response before the situation arises.
“This can also be extended into a fail-prediction exercise to gain a real-time understanding of the settlement state of transactions as they progress, so the business can focus its resources and efforts where they will have the most effect,” Morris says.
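The profiling idea Morris describes can be sketched in a few lines. This is not SmartStream's actual method, just a minimal illustration of the principle: bucket historical signed flows by hour so that a recurring demand spike can be pre-funded instead of discovered at end of day. All names here are assumptions.

```python
from collections import defaultdict
from statistics import mean

# Illustrative sketch: mine historical transactions for a per-hour liquidity
# profile. A recurring net outflow spike at, say, 10:00 then becomes visible
# in advance, so funds can be positioned before it hits.

def liquidity_profile(history: list[tuple[str, float]]) -> dict[str, float]:
    """history: ("HH:MM", signed amount) pairs across many days.
    Returns mean net flow per hour bucket (negative = typical net outflow)."""
    buckets: dict[str, list[float]] = defaultdict(list)
    for ts, amount in history:
        buckets[ts[:2]].append(amount)       # bucket by hour
    return {hour: mean(vals) for hour, vals in sorted(buckets.items())}

history = [("09:15", -20.0), ("09:45", -30.0),   # day 1
           ("10:05", -80.0), ("10:40", -70.0),
           ("09:20", -10.0), ("10:10", -90.0)]   # day 2
profile = liquidity_profile(history)
peak_hour = min(profile, key=profile.get)        # hour with largest average outflow
print(peak_hour, profile[peak_hour])             # → 10 -80.0
```

A real ML approach would condition on day-of-week, counterparties and market events; the averaging here simply shows where the predictive signal comes from.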
Lang has a similar view and believes that the first step of optimising intraday liquidity is to have visibility of intraday liquidity use by geography, market, business and client segment. “Technologies such as APIs and cloud-based infrastructures can improve the use of data including visibility, access, standardisation and increased storage capacity.
“The second step is to be able to take action by influencing, prioritising and optimising how intraday liquidity is being used across the businesses. Technologies such as AI and ML can allow banks to build more robust and dynamic predictive models, helping improve the accuracy of intraday liquidity forecasts.”
Banks need to be able to unlock capital and leverage intraday liquidity, especially in areas with negative yields and declining rates. AI and ML can help banks create dynamic predictive analytics, which could assist in analysing client behaviour, market behaviour and liquidity trends, as Lang explains.
Nevertheless, the progression of intraday liquidity to real time must not be seen as a linear success: intraday risk must also be measured in both quantitative and behavioural terms, layering the data together so that behavioural patterns and trends emerge.
Moon says: “The result can initially look like a confusing mess. This is where AI and ML can come into their own - spotting insight at greater volume and speed - then transforming this insight into clearer metrics for the human decision maker to act upon. For example, using AI/ML we can see what gets us to those peak outflow events, as well as detecting them (and other abnormal events) earlier. Eventually this could lead to the AI/ML taking over some of the decisions themselves, e.g. using algos to manage the liquidity.”
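The early-warning mechanism Moon alludes to can be illustrated with a simple statistical test. A deployed system would use a trained model; the z-score check below is only a sketch of the idea, with assumed names and thresholds: flag an intraday outflow as abnormal when it sits far outside the historical distribution for that time of day.

```python
from statistics import mean, stdev

# Hedged sketch of outflow anomaly detection: an observation more than
# `threshold` standard deviations from the historical mean for this time
# slot triggers an alert for the liquidity manager.

def is_abnormal(outflow: float, historical: list[float], threshold: float = 3.0) -> bool:
    mu, sigma = mean(historical), stdev(historical)
    if sigma == 0:
        return outflow != mu          # no historical variation at all
    return abs(outflow - mu) / sigma > threshold

typical_10am_outflows = [80.0, 75.0, 85.0, 78.0, 82.0]
print(is_abnormal(260.0, typical_10am_outflows))  # True: well outside the norm
print(is_abnormal(81.0, typical_10am_outflows))   # False: ordinary day
```

The value of catching this intraday rather than at end of day is exactly Moon's point: the alert arrives while there is still time to act.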
He adds that with ML, it is important to understand the limitations of training on historical data alone: “What happens when there are events that are not captured within the historical data - or how to manage bias within the data. But again, being aware of these risks can often lead to better risk management.”
Lang concludes his argument by explaining that “when new technologies, such as APIs and cloud-based infrastructures, are integrated successfully, they can improve both centralisation and access to intraday liquidity data as data is brought together across platforms. Layering in AI and ML could also help enable banks to reach the next level in intraday liquidity management and forecasting.

“However, banks must understand how to build new technology into the existing infrastructure without compromising information security standards and controls, but we are not there yet.”

Moon says that “it will come as no surprise to people within the industry just how long some of the legacy IT systems and processes can hang on for, but we are slowly but surely moving in the right direction.
“To see real progress, the skills and attributes of those working in liquidity risk have to evolve. Firms need to drop ‘VBA Experience’ from job specifications and start hiring people with experience in Python, R, TensorFlow, etc - especially in non-IT roles; combining risk management skills with data science is incredibly powerful.”
While learning to code is not required, Moon advises liquidity risk managers to increase their understanding of how the cloud, AI and ML could benefit how they lead and drive risk management - a trend emerging in newer challenger firms and increasingly in the larger firms too.