Buyers' brief: Insurance advances

20 August 2019

A recent report by PwC stated that “the urgent need for business and technology modernisation poses the greatest threat to the global insurance industry over the next two to three years”. Consultancy firms have a habit of drafting frightening summaries of the dangers that await insurers who fail to react to emerging market trends. And while those tasked with avoiding such pitfalls do understand the sentiment behind reports like these, in this instance their predictions are somewhat overblown.

Countless articles have been written about how large insurers are burdened by legacy mainframe platforms, but they often overlook why insurers have stuck with them for so long. Up until the mid-1990s, mainframes were the only viable option for handling the data processing requirements of businesses operating at scale. The tried and tested mainframe owes its popularity and longevity among insurers to the stability and reliability it offers.

“No other computer architecture can claim as much continuous, evolutionary improvement, while maintaining compatibility with previous releases,” according to IBM, which introduced the System/360 mainframe computer in April 1964.

Another plus point of these legacy systems is the price: the cost of running the mainframe is annuitised over many years and represents a relatively small blip on insurers’ balance sheets.

In with the new

Despite all its benefits, insurers will eventually, albeit reluctantly, be forced to bid farewell to the beloved mainframe. Not only to gain access to the added functionality offered by new technologies, but because in less than a decade from now there won’t be enough people left who know how these systems work.

“Where I see the real challenge over the next five to eight years is on resource and talent,” says Matt Potashnick, chief information officer for AXA UK and Ireland. “The traditional mainframe models from the likes of Capgemini and Tata Consultancy Services specialise in having great quality programmers that have cut their cloth and trained in legacy languages like ALGOL, COBOL and Java.”

The burning issue for insurers, therefore, is not the mainframe itself, which is rock solid, though admittedly less capable of agile development and rapid change due to the nature of legacy systems. Instead, the impetus for the insurance industry to decommission the mainframe is a new generation of programmers looking to cut their cloth according to the latest market demand.

“What we are seeing is that many programmers coming out of top schools and universities don’t want to learn these legacy systems and languages anymore,” says Potashnick. “They want to learn machine learning, Python, R and newer technologies like C# and .Net.”

“We are certainly seeing a cultural and skills shift that is forcing the insurance industry to modernise because the skills required to maintain these legacy systems will simply dry up. Look at COBOL-74 for example. The clue is in the title… it was created in the 70s and is still around today.”

Decommissioning the mainframe

Despite legacy systems not being under any immediate threat, insurers have already started carrying out platform assessments, partnering with companies like IBM to see which components can be decommissioned, simplified or moved to the cloud.

Last summer, AXA UK simplified its core systems with the help of New York-based UiPath. The pair worked together to implement robotic process automation (RPA) software to streamline various administrative tasks. By doing so, the insurer was able to save 18,000 person-hours – equating to a cost saving of around £140,000 – with the help of 13 RPA bots deployed over a six-month period.
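As a rough sanity check on those figures (an illustration only, not from the article), the quoted savings imply an hourly rate and a per-bot workload along these lines:

```python
# Back-of-the-envelope check of the reported RPA savings:
# 18,000 person-hours saved, ~GBP 140,000, 13 bots over six months.
hours_saved = 18_000
cost_saved_gbp = 140_000
bots = 13
months = 6

# Implied cost per person-hour saved (~GBP 7.78)
implied_rate = cost_saved_gbp / hours_saved

# Hours of admin work each bot absorbed per month (~231)
hours_per_bot_per_month = hours_saved / bots / months

print(f"Implied hourly rate: GBP {implied_rate:.2f}")
print(f"Hours saved per bot per month: {hours_per_bot_per_month:.0f}")
```

The implied hourly rate is well below typical fully loaded staff costs, which suggests the £140,000 figure reflects only a partial or early-stage saving.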

“Questions tend to come from IT people that ask if this is a long-term solution or a plaster over the cracks involved in not investing in ageing technology and diverting that resource to replacing our ten-year-old policy administration system, for example. They are concerned by that,” Simon Clayden, chief operating officer (technology) at AXA UK told Computerworld UK.

“We don’t build a robot to mask technical debt, and we still invest in the core platform as much as the robots,” he added.

Insurers have also begun simplifying or decommissioning aspects of their core legacy IT systems in favour of cloud services like Microsoft Azure and Amazon Web Services (AWS) to improve their processing capabilities and reduce costs.

Getting off mainframe systems and bringing these core legacy systems’ old applications up to speed is no easy feat, however. It requires a lot of heavy lifting, with insurers turning to the services of tried and tested software vendors like Salesforce and Duck Creek that are used to working at scale.

“There is so much (often bespoke) functionality in mainframes (and in the processes that underpin them) built up over many years that it is not feasible for a small start-up to replicate or disrupt easily given the complexity,” says Mark Budd, head of innovation at Zurich UK.

“If insurance companies want to ‘digitise the core’ to enable greater agility and innovation, they need to either accept that replicating what they have in a digital platform is a long road, or choose to simplify their products and processes at an organisational level. Neither is easy.”

The pace of innovation within insurers’ core IT systems can look painfully slow to outsiders looking in. But insurance companies require their core to remain reliable and stable to permit them to innovate on the fringes with the help of smaller, more agile start-ups.

Innovation on the fringes

Three years ago, there was a nervousness among insurers that insurtech was going to disrupt the entire industry and bring about the collapse of traditional insurance providers. However, very few of these tech start-ups opted to take on the full value chain of insurance. Instead, these fledgling companies focused on enhancing various insurance products and services and, rather than overthrowing insurers, entered into partnerships that ultimately ended up disrupting traditional technology suppliers.

Insurers and insurtechs opted to cooperate rather than compete because they both have what the other needs. Large-scale insurers lack the pace, agility and technical capabilities of these nimbler start-ups due to their size, but they do boast a strong brand and a large customer base – making for a perfect partnership. But not one without its challenges.

The biggest challenge for insurers working with smaller vendors and start-ups often comes down to the issue of working at scale.

“A lot of start-ups simply are not aware of the regulatory restraints that we are under; they just want to move fast, so there is often tension that we aren’t moving as quickly as they would like. But they have to understand that we are bound by regulation due to the size and complexity of our business,” says Budd. “The more mature start-ups tend to recognise the burdens we bear from a regulatory standpoint.”

Working at scale for many fintechs is often a complete culture shock, explains Potashnick. It also tends to highlight any weaknesses in their technology, with many smaller third-party platforms not up to scratch in terms of stability and security when working at scale.

“Insurers will often ask start-ups questions like: when did you last have your system penetration tested? What does your disaster recovery plan look like?” says Potashnick, with vendors often left scratching their heads.

“We ask the same questions whether it is a £7,000 or £7m piece of software to get them to demonstrate their control at scale. We don’t ask them to be an annoyance. We ask them because we care about protecting our customers.”

Most vendors are clued up on delivering their product to market; that is their sole focus. But many struggle, or are found lacking, when it comes to demonstrating simple controls at scale such as penetration testing, particularly with regards to SaaS cloud technologies.

For this reason, when insurers look to innovate through partnerships with fintechs, much of it happens from humble beginnings, with insurers opting to create minimum viable products (MVPs) based on a pilot sub-set of customers. If the pilot succeeds, and after multiple rounds of testing, conversations between the two parties become serious and reach a make-or-break stage: either the insurer believes the new system is robust enough to be implemented across the business at scale, or it is forced to ditch it in favour of a similar capability being developed elsewhere, and the process starts afresh.
