Two weeks ago, I attended and presented at the Big Data & Analytics for Financial Services conference in London. The two-day event brought together banks, product vendors, and service providers like us to share what is happening in the industry today — and the vision of what is to come tomorrow.
What struck me most at the event was the gap between vision and reality. While the product vendors spoke of the great potential of big data technologies and the banks highlighted clear use cases, few spoke of concrete projects and success stories. This is not a criticism, but a clear reaffirmation of the state of big data in financial services – it’s still early days.
HSBC was an exception; its Chief Data Officer gave us a taste of some of the first projects built on the bank’s “data lake”, which brings together disparate data from across the organization. Smaller banks clearly had less to show, and many admitted to being in an experimental phase: just testing the waters, or only weighing options and next steps.
GFT is also an exception. In my presentation, I tried to “keep it real” and talk about projects we have delivered in the last 24 months and the lessons we have learned. These included tips on how to prepare the right infrastructure and project team to ensure data quality and operational buy-in.
Banks can and should look outside their organizations for help. Big data is not only a big technological challenge, but a functional and organizational one as well. Partners can guide them through the journey by bringing sound advice, experience, and best practices.
This is true no matter what stage a bank is at, whether it is an early adopter like HSBC or a late entrant like many of the smaller banks. An objective viewpoint can be tremendously helpful in these challenging times.