
How can banks turn data into insights?

Written by 10x | 25 August 2022

Raw data on its own is worthless - the value is generated by what a bank does with that data to create actionable insights. And it is how fast a bank can turn raw data into actionable insights that determines whether it gains a competitive edge - an area where, today, many banks are still lagging.

While traditional banking processes hold data in an operational system, extract it via a daily batch process and move it to a data warehouse for analysis, the next generation of cloud-based core banking systems takes a different approach. Every time a customer performs an action - a card transaction, say - a series of ‘events’ is generated in real time. For instance, when a customer taps their card on a payment terminal, an event is generated and sent to the bank within milliseconds. That data can then be integrated however a bank wishes, giving banks real-time context and analysis of what their customers are doing, instead of waiting for a batch process to extract that data from the system many hours later.
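To make this concrete, the sketch below shows roughly what emitting such an event could look like with a standard Kafka producer. The broker address, topic name and payload fields are illustrative assumptions for the example, not the actual 10x interface.

```java
// Minimal sketch: publishing a card-transaction event the moment the tap is authorised.
// Topic, broker and field names are placeholders chosen for illustration.
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import java.util.Properties;

public class CardTransactionEmitter {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        // Event payload as JSON; a production platform would more likely use Avro or
        // Protobuf with a schema registry rather than a raw string.
        String event = "{\"accountId\":\"acc-123\",\"merchant\":\"SuperMart\","
                     + "\"amount\":24.99,\"currency\":\"GBP\",\"timestamp\":\"2022-08-25T10:15:03Z\"}";

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Keyed by account so all of one customer's events land on the same
            // partition and are consumed in order.
            producer.send(new ProducerRecord<>("card-transactions", "acc-123", event));
        }
    }
}
```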

This is effectively a new way for banks to organize data, in which source data domains – for instance the customer domain, ledger domain or accounts domain – emit events to so-called aggregator domains, which allow banks to provide real-time experiences for customers: geofenced offers, dynamic products and better fraud detection, for example. With this real-time, event-driven microservice approach, data should be viewed as a product that business domain teams share with other domains, which then consume that data. The process is decentralized: no central team is responsible for making the data available; each individual business domain is responsible for exposing its data in a format that others can easily use.

This brings in the concept of the data mesh - a way to organize and distribute data so that it can be accessed by anyone in the business as and when they need it. The benefit of this approach is that a data mesh lets a business scale more easily than a centralized data-sharing model, because the owners of the data are best placed to make it available to consumers with the right level of documentation and context, allowing faster innovation.
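As a rough illustration of the aggregator-domain pattern described above, the following Kafka consumer sketch subscribes to event streams published by several source domains and folds them into a real-time view. The topic names, consumer group and processing step are all assumptions made for the example.

```java
// Hedged sketch of an "aggregator domain": it subscribes to event streams owned by
// source domains (customer, ledger, accounts) and builds a real-time view from them.
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import java.time.Duration;
import java.util.Arrays;
import java.util.Properties;

public class RealTimeOffersAggregator {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "offers-aggregator");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // Each source domain publishes its own topic; the aggregator consumes them all.
            consumer.subscribe(Arrays.asList("customer-events", "ledger-events", "account-events"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // e.g. update a geofenced-offer view or a fraud-scoring feature store
                    System.out.printf("domain=%s key=%s value=%s%n",
                            record.topic(), record.key(), record.value());
                }
            }
        }
    }
}
```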

Choosing the right solution to support event streaming architecture

Now that the technology behind an event streaming architecture is understood, it is equally important to understand the technology solution chosen to execute it. As discussed, event-driven architecture enables decoupled microservices, and Kafka is the obvious technology leader in that domain: battle-tested, used by many large companies, and backed by a rich ecosystem and community. However, Kafka alone is not enough. Companies embracing event-driven architecture should choose a solution that allows them to focus on their core product while the solution takes care of running the distributed, resilient data store. As an example, Confluent offers a fully managed, cloud-native data streaming platform that can stream massive volumes of data while maintaining the scaling, resilience and availability banks need. Banks should choose a technology partner whose solution is built on this kind of cloud-native data streaming platform.
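The practical upshot is that, with a managed platform, producing and consuming events becomes largely a client-configuration exercise. The hedged sketch below shows what pointing a standard Java Kafka client at a managed cluster such as Confluent Cloud might look like; the endpoint, API key and secret are placeholders, and the exact settings depend on the provider's documentation.

```java
// Illustrative only: connecting a standard Kafka client to a fully managed cluster.
// Endpoint and credentials below are placeholders, not real values.
import java.util.Properties;

public class ManagedClusterConfig {
    public static Properties producerProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "pkc-xxxxx.europe-west2.gcp.confluent.cloud:9092"); // placeholder
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
              + "username=\"<API_KEY>\" password=\"<API_SECRET>\";");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        // Durability settings stay under the bank's control; broker operations,
        // scaling and patching stay with the platform provider.
        props.put("acks", "all");
        props.put("enable.idempotence", "true");
        return props;
    }
}
```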

The technology that makes this ‘event’ streaming approach possible is a publish-subscribe message bus - a ‘pub-sub’ model in which the owner of the data publishes it and other services subscribe to it. Whenever an event happens on the cloud banking platform, it is sent to the message bus, a single integration point that allows all of a bank’s different business functions to connect with each other and share their data. Data platform Confluent describes the message bus as a nervous system, with the services that connect to it acting like neurons. What makes this approach powerful is that it allows multiple consumers of the same data - in this case, the different services within a bank.

Take the example of a customer making a transaction in a supermarket. That transaction is sent to the ledger as the core accounting entry, and the ledger then publishes the data for other services to subscribe to. The customer’s banking app uses that information to show where the transaction was made, with other microservices adding details such as a dot on a map and the logo of the supermarket. In other words, the data only needs to be captured once but can be reused many times by different microservices to enrich the customer experience.
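To tie the supermarket example back to code, here is a hedged sketch of one such downstream service - say the mobile-app feed - subscribing to the postings the ledger publishes and enriching each one before display. The ‘ledger-postings’ topic and the enrichment step are hypothetical names used for illustration, not a documented interface.

```java
// Sketch of a pub-sub subscriber: the app-feed service is just one of many
// independent consumers (fraud detection, analytics, ...) of the same ledger events.
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class AppFeedEnricher {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "mobile-app-feed"); // its own consumer group, independent of other subscribers
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("ledger-postings"));
            while (true) {
                for (ConsumerRecord<String, String> posting : consumer.poll(Duration.ofMillis(500))) {
                    // The same posting can be read by many services; this one would add
                    // the merchant logo and a map pin before pushing to the app feed.
                    String enriched = posting.value() + " + merchant logo + geo pin"; // placeholder enrichment
                    System.out.println(enriched);
                }
            }
        }
    }
}
```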

This article is an extract from our whitepaper "The Power of Data". To read on, please download the full whitepaper.
