While Silicon Valley is always awash with new technology buzzwords, one that has become increasingly dominant in recent years is Stream Processing: a type of software designed to transport, process and react in real time to the massive streams of event data at the heart of large internet companies. It isn't wholly surprising that a company like Netflix, responsible for 36% of US internet traffic, would be interested in technology that processes data as it moves. What may be more surprising is the uptake in traditional markets like finance, which doesn't typically generate the 'trillions of events per day' workloads seen at the giant tech firms. But this uptake demonstrates a more subtle and intriguing shift in the way companies are reshaping their IT functions. In much the same way that the internet led to a range of products whose value came from connecting people together, the industry may have discovered a second revolution in the way that the applications that make big companies work discover one another, interact and, most importantly, share data.
The Digital Customer Experience
When a customer interacts with their bank, be it to make a payment, withdraw cash or trade a stock, that single action, or 'event,' triggers a slew of activity: accounts are updated, postings hit ledgers, fraud checks are run, risk is recalculated and settlement processes are initialised. The value of that event to the organization is therefore concentrated at the moment it was created: the moment when this broad set of operations all need to happen at once. The more widely and immediately available these events are, the easier it is for a company to adapt and innovate.
A number of upstart finserv firms were quick to notice this. Monzo, for example, focuses on a mobile-first, immersive user experience. When customers buy goods with their Monzo card, the mobile application alerts them to the transaction, often before the receipt has finished printing at the checkout, complete with details of how much they spent and where they spent it. The bank performs real-time categorization on purchases so users know what they are spending their money on, and the mobile notification even includes a purchase-correlated emoji: a coffee cup for Starbucks, a T-shirt for apparel, and so on. A cynic might view this as an extraneous trick, but it highlights a far more important shift in customers' needs. Customers no longer wish to drive to a branch and talk to their account manager face to face; instead they expect a rich and personal digital experience, one that makes them feel in touch with their finances and partnered with a bank that knows what they are doing.
But knowing your customer isn't simply about collecting their digital footprint, it's about interacting with it, and them, in real time. Monzo, Zopa and Funding Circle achieve this by repurposing the same streaming technologies used by tech giants like Netflix and LinkedIn. In Monzo's case, as soon as a payment is acquired and the balance updated, the payment 'event' is stored in a streaming platform, which pushes those events (along with any earlier ones) into a wide variety of separate IT services: categorization, fraud detection, notification, spending patterns and, of course, identifying the correct, all-important emoji.
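As a rough illustration of this fan-out pattern, the minimal sketch below uses Apache Kafka's Python client: a payments service publishes each payment as an event, and downstream services read the same stream independently. The broker address, topic name, event fields and consumer-group names are assumptions for the example; Monzo's actual implementation is not public.

# A minimal sketch of the fan-out pattern described above, using the
# confluent-kafka Python client. Topic and field names are illustrative.
import json
from confluent_kafka import Producer, Consumer

BROKER = "localhost:9092"   # assumed local Kafka broker

# --- The payments service publishes each payment as an event ---
producer = Producer({"bootstrap.servers": BROKER})

payment_event = {
    "account_id": "acc-42",
    "amount": 3.20,
    "currency": "GBP",
    "merchant": "Starbucks",
}
producer.produce("payments",
                 key=payment_event["account_id"],
                 value=json.dumps(payment_event))
producer.flush()

# --- A downstream service reads the same stream independently ---
# Each service uses its own consumer group ("categorization",
# "fraud-detection", "notification", ...), so every service receives
# every event and new services can be added without changing the
# payments service.
consumer = Consumer({
    "bootstrap.servers": BROKER,
    "group.id": "categorization",
    "auto.offset.reset": "earliest",  # also allows replaying earlier events
})
consumer.subscribe(["payments"])

msg = consumer.poll(5.0)
if msg is not None and msg.error() is None:
    event = json.loads(msg.value())
    # e.g. attach a category and emoji, then publish an enriched event
    print(f"categorizing purchase at {event['merchant']}: coffee cup")
consumer.close()

Because the stream retains history, a newly added service can start from the earliest offset and process past payments as well as live ones, which is what makes the "or anything earlier" part of the pattern possible.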
But event streaming isn't just about upstarts leveraging coffee cup emojis and immersive user experiences; more traditional institutions like ING, The Royal Bank of Canada (RBC) and Nordea have made the shift to streaming systems too. What makes their approach interesting is their interpretation of the event stream, not as some abstract piece of technology infrastructure, but rather as a mechanism for modelling the business itself. A financial institution has an intrinsic flow that is punctuated with real-world events, be it the processing of a mortgage application, a payment or the settlement of an interest rate swap. By rethinking their business not as a set of discrete applications or silos, but rather as an evolving flow of business events, they have created a more streamlined and reactive IT service that better models how the business actually works in a holistic sense.
The introduction of the MiFID II regulation, which requires that institutions report trading activity within a minute of execution, provided a useful litmus test. This was notoriously difficult for banks to implement due to their siloed, fragmented and batch-oriented IT systems. For the banks that had taken the trouble to install real-time event streaming, and had plumbed it across all their product-aligned silos, the task was far simpler.
From Data Warehousing to Event Stream Processing
Describing streaming systems as technology that simply transports business events around a company may actually be doing them a disservice. On paper these streaming systems look much like databases: both store data and both support queries written in Structured Query Language (SQL). But in practice they are quite different. While data warehouses are designed to accumulate large data sets, typically in the back office of the company where daily reports are run, streaming systems are about processing and distributing large data sets so they are available throughout the company, instantaneously.
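To make that contrast concrete, the sketch below compares an end-of-day batch aggregate with a running aggregate that is updated as each event arrives. It is a simplified, self-contained illustration of the two styles, not any particular vendor's API; the event shape and function names are invented for the example.

# Simplified contrast: batch aggregation vs. per-event stream processing.
from collections import defaultdict

def batch_spend_report(events):
    """Data-warehouse style: accumulate everything, then report once (e.g. nightly)."""
    totals = defaultdict(float)
    for e in events:
        totals[e["merchant"]] += e["amount"]
    return dict(totals)

def streaming_spend_totals(events):
    """Streaming style: update and publish the totals as each event arrives."""
    totals = defaultdict(float)
    for e in events:
        totals[e["merchant"]] += e["amount"]
        yield dict(totals)  # downstream services see fresh results immediately

payments = [
    {"merchant": "Starbucks", "amount": 3.20},
    {"merchant": "Transport", "amount": 2.50},
    {"merchant": "Starbucks", "amount": 4.10},
]

print(batch_spend_report(payments))      # one answer, hours after the events
for snapshot in streaming_spend_totals(payments):
    print(snapshot)                      # a fresh answer after every event

The logic in both functions is identical; the difference is when the results become available to the rest of the company.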
For the adopters this powers unified customer experiences that cut across silos, machine learning models that automate decisions, and new collaborative capabilities (fraud, pricing, compliance and so on) available enterprise-wide: opportunities where the multi-hour delays of batch processing systems would make them non-starters.
Unlocking Agility from the Assets Within
While the immediacy of streaming lets banks like ING or RBC respond instantaneously to their customers, the potentially greater benefit comes from an increased ability to adapt and innovate. For the last two decades, IT culture has shifted towards more agile approaches to building and delivering software. Waterfall processes, where requirements are locked down before software development commences, have been replaced with more iterative approaches in which new software is developed in short, often week-long cycles, after which it is visible, testable and available for feedback. These methods are beneficial because they reassign the notion of value away from fixed goals (software that does ABC, delivered on date XYZ) towards process goals: a process that builds software incrementally, lets customers provide fast feedback, and lets the business redefine and optimise the end product as the world around it changes.
But while this mindset works well for individual software projects, for most enterprises, already backed by many billions of dollars of technology built up over decades of existence, embracing the same kind of agility at an organizational level is all but impossible. In fact, finserv boardrooms are far more likely to be graced with proposals for reinvention through the latest shiny new software platform than with the less bonus-optimized steps needed to move an aging technology stack incrementally into the future.
But one of the greatest ills facing the IT industry today is the misattribution of blame for a dysfunctional IT organization to dysfunctional software, when dysfunctional data is far more culpable. A company may have best-in-class software engineers, but if the first thing they must do when starting a new project is identify, collect, import and translate a swathe of flawed and hard-to-access corporate datasets, it is reasonable to expect the same painful release schedules seen in the days of waterfall, not to mention the likelihood that those best-in-class engineers will slowly migrate elsewhere.
Companies that take the stream processing route observe a subtly different dynamic, because the event streams breed agility. These platforms come with tools that transform static databases into event streams, unlocking data hidden deep inside legacy systems and connecting it directly to applications, company-wide. For companies like RBC, this means pulling data out of their mainframe and making it available to any project that needs it, immediately, as well as retrospectively. The insight is simple, but powerful: if the data is always available, then the organization is always free to evolve.
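One common way to turn static databases into event streams is change data capture (CDC). In practice this is usually done with log-based tooling such as Debezium rather than polling, but the miniature sketch below illustrates the idea: watch a table for changes and emit each change as an event. The table, columns and watermark logic are assumptions made purely for the example.

# A miniature, polling-based illustration of change data capture (CDC):
# turn new or updated rows in a relational table into a stream of events.
# Real deployments typically use log-based CDC tools (e.g. Debezium).
import json
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE accounts (
        id INTEGER PRIMARY KEY,
        balance REAL,
        updated_at REAL
    )
""")

def capture_changes(conn, since):
    """Return (events, new_watermark) for rows modified after `since`."""
    rows = conn.execute(
        "SELECT id, balance, updated_at FROM accounts WHERE updated_at > ?",
        (since,),
    ).fetchall()
    events = [{"account_id": r[0], "balance": r[1], "ts": r[2]} for r in rows]
    watermark = max((r[2] for r in rows), default=since)
    return events, watermark

# The legacy application keeps writing to its database as usual...
conn.execute("INSERT INTO accounts VALUES (1, 100.0, ?)", (time.time(),))
conn.commit()

# ...while the capture loop turns those writes into events that would be
# published to a streaming platform for any downstream service to consume.
watermark = 0.0
events, watermark = capture_changes(conn, watermark)
for event in events:
    print("would publish:", json.dumps(event))

Because every change becomes an event on a shared, replayable stream, a new project can consume both live updates and the historical record without touching the source system or waiting on its release cycle.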
So the implication for the embattled incumbents of the finance industry is that their best strategy may not be to beat the upstarts at their own game. Instead, they are better off unlocking the hidden potential of the assets they already own. As Mike Krolnik, RBC's head of engineering, enterprise cloud, put it: "We needed a way to rescue data off of these accumulated assets, including the mainframe, in a cloud native, microservice-based fashion." This means rethinking IT systems not as a collection of independent islands that feed some far-off data warehouse, but as a densely connected organism, more like a central nervous system that collects and connects all company data and moves it in real time. So while buzzword bingo might point to a future of internet-velocity data sets, the more valuable benefits of streaming platforms lie in the subtler, systemic effects of denser data connectivity. This may well pass our data-fetishist technology community by, but the wise money is on the long game, and the long game is won with value.