Why AI won’t solve legacy problems
At a great briefing on AI recently, I was made aware of IBM's WatsonX Code Assistant – a generative AI solution that can help companies migrate legacy systems running on old COBOL code to more modern programming languages like Java.
The release highlights that the platform will be enhanced to cover other legacy languages as well and states that “at 20 billion parameters, it is on target to become one of the largest generative AI foundation models for code automation”.
Its goal is to accelerate application modernisation and reduce the cost, risk and complexity of development while also improving quality and maintainability.
On the face of it, this is a game changer for banks struggling for resources and agility when it comes to maintaining their legacy core banking solutions. The same presumably applies to incumbent core banking software companies.
So, has AI saved the day for banks and incumbent software companies?
Hmmm. Personally, I don’t think so. Why? Because this seems like a classic case of what I call reinventing the past with new technology. It misses the point of moving to a Generation 2 (MACH/BIAN-compliant) modern core (Generation 1 being simply a rewrite into microservices to move old code into the cloud).
Legacy core banking solutions were designed with specific products in mind – one of the reasons no bank has just a single core banking solution. They weren't designed to be:
- Real-time, operating 24/7/365.
- Highly scalable to serve millions of customers, not just thousands of staff.
- Flexible enough to create products limited only by your imagination (and of course, regulations), with a single core for every product.
- Agile enough so that changes can be made and deployed in hours not months, without taking the whole platform down.
- Composable in a way such that any part of the system can be swapped out to use best-of-breed solutions from different vendors.
- Future-proofed by business logic leveraging AI to drive decisions, rather than having fixed business rules and parameters.
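To make the last two points concrete, here is a minimal, hypothetical sketch (all names and thresholds are mine, not from any vendor or standard) of the difference between business rules hard-coded into a core and decision logic exposed behind a swappable interface:

```python
from dataclasses import dataclass
from typing import Callable, Protocol


@dataclass
class LoanApplication:
    income: float
    requested_amount: float
    credit_score: int


class DecisionEngine(Protocol):
    """Any engine that can approve or decline an application."""
    def approve(self, app: LoanApplication) -> bool: ...


class FixedRulesEngine:
    """Legacy style: thresholds baked into the core itself."""
    def approve(self, app: LoanApplication) -> bool:
        return app.credit_score >= 650 and app.requested_amount <= app.income * 4


class ScoredEngine:
    """Composable style: the decision is delegated to an external
    scoring function, which in practice could be a trained model."""
    def __init__(self, score_fn: Callable[[LoanApplication], float]):
        self.score_fn = score_fn

    def approve(self, app: LoanApplication) -> bool:
        return self.score_fn(app) > 0.5


def decide(engine: DecisionEngine, app: LoanApplication) -> bool:
    # The core depends only on the interface, so the engine can be
    # swapped without redeploying the whole platform.
    return engine.approve(app)
```

The point of the sketch is the `decide` function: because it only knows about the interface, a bank could replace `FixedRulesEngine` with `ScoredEngine` (or a best-of-breed third-party service) in hours, not months.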
In other words, I’d argue the requirements for core banking have so fundamentally changed that migrating a legacy core banking solution to a new programming language is pointless, as they need to be reimagined and not just rewritten.
Aside from AI and MACH, there are other technologies like blockchain and smart contracts that could also provide greater flexibility in core banking.
When you look at the new requirements, the challenge is not just COBOL – it's anything developed as recently as five years ago. Even modern platforms written by neobanks are potentially already outdated, let alone those originally designed as client-server solutions.
Such client-server designs have been wrapped in containers so they can run in a cloud environment, but that does NOT make them cloud native. Early developments using microservices do at least have a chance of being modernised, but it will be through human effort, not automation.
Visionary banks are already planning ahead for their cores to not only be a store of currency, but a store of anything of value – whether that’s initially a CBDC or a crypto token, or even something like tokenised real estate. While some of this may not be used in the near future, it’s necessary to imagine the future to avoid creating the next legacy platform.
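A "store of anything of value" is easier to picture with a sketch. The following is purely illustrative (the asset kinds and class names are my own invention, not any bank's design): instead of assuming every balance is a currency amount, the ledger keys balances by a generic asset, so a CBDC, a crypto token or a tokenised property share are all handled the same way.

```python
from collections import defaultdict
from dataclasses import dataclass


@dataclass(frozen=True)
class Asset:
    """A generic thing of value, not necessarily a currency."""
    kind: str    # e.g. "fiat", "cbdc", "crypto", "real_estate_token"
    symbol: str  # e.g. "GBP", "eGBP", "BTC", "PROP-123"


class ValueStore:
    """Balances keyed by (account, asset) rather than by currency alone."""
    def __init__(self) -> None:
        self._balances: defaultdict = defaultdict(float)

    def deposit(self, account: str, asset: Asset, amount: float) -> None:
        self._balances[(account, asset)] += amount

    def balance(self, account: str, asset: Asset) -> float:
        return self._balances[(account, asset)]
```

Nothing in the store changes when a new asset kind appears – which is exactly the kind of flexibility that protects a core from becoming the next legacy platform.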
With the speed at which technology is evolving, this is more important now than it has ever been. In the past, building in flexibility was seen as an unnecessary cost and delay to development. But today, it must be seen as creating competitive advantage and agility for the future.
This week, I’m sticking my neck out and saying that when it comes to AI, not every problem is a nail in need of a hammer. Certainly in the case of legacy core banking software, building the next generation of solutions will require fresh thinking, modern technology and a new approach.
Currently, I’m seeing too much of the “faster horses” approach to using AI, where in fact we have the opportunity to truly transform banking for the good of everyone.
About the author
Dharmesh Mistry has been in banking for more than 30 years both in senior positions at Tier 1 banks and as a serial entrepreneur. He has been at the forefront of banking technology and innovation, from the very first internet and mobile banking apps to artificial intelligence (AI) and virtual reality (VR).
He has been on both sides of the fence and he’s not afraid to share his opinions.
He founded proptech start-up AskHomey (sold to a private investor in spring 2023) and is an investor and mentor in proptech and fintech. He also co-hosts the Demystify Podcast.
Follow Dharmesh on Twitter @dharmeshmistry and LinkedIn.
Read all his “I’m just saying” musings here.