<img alt="" src="https://secure.meet3monk.com/218648.png" style="display:none;">

Addressing legacy challenges – a core issue for banks

Generations of banking technology

For many banks, outdated approaches to technology are a major impediment to delivering a transformational approach to serving customers.

After decades of growth in computing power, technology is now sufficiently sophisticated at a much lower price point and is so widespread that new ways of managing data are possible.

We call this challenge the ‘legacy technology crisis’: it exists at a time when the industry is on the cusp of a radical, technology-led transformation. But why did many banks choose not to tackle this challenge earlier? To answer that question, and to understand the challenges of the future, it is important to explore how the past has brought us to where we are today.

The first wave

Over the past six decades, the banking industry has progressed through three generations of technology, reflecting ever-growing sophistication of the underlying IT infrastructure and increasing customer demands.

During the 1960s and 1970s, the key requirement driving the introduction of First-Generation technology was the need to centralise data storage across head office and branch operations.

Technology innovators at the time, such as IBM and Burroughs, produced monolithic mainframes, maintained by a small number of specialists and offering data-entry terminals to operators. They were instrumental in meeting demand for centralised ledgers. A key feature of these systems was custom-designed chips that processed every calculation twice and compared the results to avoid errors. These systems processed a daily batch to settle accounts, while additional features were constrained by the need to process all the bank's transactions in the overnight batch “window”.

The second wave

During the 1980s and 1990s, Second-Generation technology (still mainframe-based) from providers such as IBM, Hitachi and Fujitsu continued the evolution of banking technology into highly reliable and well-engineered core banking systems. Additional technology stacks were created to support newly introduced telephone banking functionality. Workarounds to the core were introduced to allow this new channel to integrate with a system built around a branch-network paradigm. The architecture worked, but it was not optimal.

Today, layers of technological workarounds continue to be built to ensure that Second-Generation technology can still address current industry demands. Some software from the First and Second generations has run continuously for decades despite several “hot swaps” of the underlying hardware.

The third wave

In the late 1990s and early 2000s, Third-Generation commoditised solutions from vendors such as Infosys (Finacle), Oracle and SAP began capturing increasing market share. This generation was optimised for cheaper commodity ‘mid-range’ infrastructure, typically based on the UNIX or Windows Server operating systems. Despite this progress, the transformational breakthrough still hadn’t come: banking technology was still held back by the design decisions made in the previous century.

At the same time, technology architecture became more complex as new technology stacks were built to support internet banking through the 2000s. Selected sub-sets of a bank’s technology stack (such as general ledger functionality) were eventually redesigned from scratch to serve as “skinny cores” of updated platforms. A root-and-branch overhaul of the architecture, however, was postponed in favour of getting customers online.

Banks built significant internal capabilities to manage applications and technology infrastructure. The dominance of on-premise solutions required vast private data centres to support operations, not only adding to the overall operating cost of a bank but also limiting the scalability and agility of its operations.

New challenges

In recent years, the complexity and scalability issues of Third-Generation technology have presented an opportunity for new market entrants. Some of these providers began experimenting with cloud technology to solve the agility, data-storage and customer-experience issues of legacy banking platforms.

Nevertheless, the solutions offered by incumbent technology providers do not represent a radical, long-term or sustainable improvement to the legacy systems within banks. Cloud technology capabilities have yet to be fully utilised, with much technology development continuing to rely on pre-cloud architectural decisions and thinking. This means:

  • Real-time processing remains a goal realised only in pockets, not a bank-wide capability, hindering the in-the-moment decisions expected by digitally savvy customers;
  • On-premise infrastructure remains the norm, constraining application change and requiring huge efforts to maintain and protect against cyber threats;
  • An overly modular or “skinny core” approach retains the significant risks and costs associated with integrating hundreds of third-party applications;
  • Data models remain siloed and fragmented, preventing the exploitation of AI and machine learning, and continuing to frustrate customers and colleagues where data cannot be reconciled across the enterprise;
  • Channel and product applications continue to proliferate and diverge to satisfy the latest business requirements. As a result, further technology-driven silos are generated alongside business-driven silos, leaving a typical Tier 1 bank with potentially thousands of different applications to manage.

Given these challenges, it is not surprising that banks are looking at how they can protect their margins and pursue new revenue opportunities. Now more than ever, they need a unified technological approach that can manage all data held across their organisations.

Cloud-based capabilities – and specifically cloud-native technology – will be at the heart of the next revolution in banking.

Cloud-native software is designed to run in a cloud environment and can be deployed, operated and scaled on cloud platforms without involving significant infrastructure projects. A cloud-native approach links systems together in a flexible, microservices-based architecture, allowing banks to collate information they hold about their customers in a single place to offer more personalised, timely and relevant products and services. It provides a level of resilience, recoverability and security that goes beyond what was previously available.
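To make this concrete, the sketch below shows what a single microservice in such an architecture might look like. It is a minimal illustration only, written in Go: the service name, route and data fields are assumptions for the example and do not describe the 10x platform's actual services or APIs.

    // A minimal sketch of one microservice in a cloud-native architecture.
    // The route and fields are illustrative, not 10x's actual API.
    package main

    import (
        "encoding/json"
        "log"
        "net/http"
    )

    // CustomerProfile is an illustrative consolidated view of customer data
    // that, in a cloud-native core, would be served from a single data model.
    type CustomerProfile struct {
        CustomerID string   `json:"customerId"`
        Name       string   `json:"name"`
        Products   []string `json:"products"`
    }

    func profileHandler(w http.ResponseWriter, r *http.Request) {
        // In a real deployment this would query the platform's data services;
        // here a static record stands in for that call.
        profile := CustomerProfile{
            CustomerID: "c-123",
            Name:       "Example Customer",
            Products:   []string{"current-account", "savings"},
        }
        w.Header().Set("Content-Type", "application/json")
        json.NewEncoder(w).Encode(profile)
    }

    func main() {
        // Each microservice owns one narrow capability and exposes it over HTTP;
        // the cloud platform handles deployment, scaling and restarts.
        http.HandleFunc("/customers/c-123/profile", profileHandler)
        log.Fatal(http.ListenAndServe(":8080", nil))
    }

The point of this design is that each service owns one narrow capability and can be deployed, scaled and replaced independently by the cloud platform, rather than being bound to a monolithic core.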

At 10x, our platform uses the same Cloud Native Computing Foundation tools used by Google, Netflix and Airbnb to reliably run billions of application instances. A microservices-based architecture is also built into the core of our platform, allowing developers to focus on services they are building and deploying rather than the underlying infrastructure, network or hardware.

User-friendly development toolkits enable quick and easy product creation, so products can be released to market in minutes rather than months. These toolkits help reduce product development costs while embedding flexibility into a business. Agility is built in at many levels and can be exercised in real time to address new market opportunities and challenges. With legacy technology, that was not the case.
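As an illustration of how a declarative product definition can shorten time to market, the hypothetical sketch below models a savings product as configuration rather than bespoke core-banking code. The type and field names are invented for this example and are not 10x's actual product toolkit.

    // A hypothetical, declarative product definition: the type and field
    // names are invented for illustration and do not reflect 10x's toolkit.
    package main

    import "fmt"

    // SavingsProduct captures the parameters a product team might configure
    // instead of writing bespoke core-banking code.
    type SavingsProduct struct {
        Name           string
        Currency       string
        InterestRatePA float64 // annual interest rate, e.g. 0.035 = 3.5%
        MinBalance     float64
        InstantAccess  bool
    }

    func main() {
        // Defining a new product becomes a configuration change rather than
        // a multi-month development project.
        easySaver := SavingsProduct{
            Name:           "Easy Saver",
            Currency:       "GBP",
            InterestRatePA: 0.035,
            MinBalance:     1.00,
            InstantAccess:  true,
        }
        fmt.Printf("Configured product: %+v\n", easySaver)
    }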

Banks have always wanted to make better business decisions and offer their customers more compelling services. With powerful innovative technology at their fingertips, new opportunities are there for the taking. Helping banks meet these opportunities was why 10x was created in the first place.