IT meltdowns are becoming an all-too-familiar feature of the banking landscape. High demand for banking services on payday should be a predictable scenario, yet in February we saw three of the UK’s leading banks suffer performance issues, leaving mobile banking customers unable to make payments and transfers. A separate incident just two months earlier left three quarters of a million customers of RBS, NatWest and Ulster Bank unable to use their payment cards for three hours on Cyber Monday, one of the busiest online shopping days of the year. These are not isolated incidents: performance problems are strangling the industry as a whole, and no one seems to be immune.
I have great sympathy for bank IT departments; they are dealing with creaking legacy systems, soaring transaction volumes, and pressure to innovate in order to compete. Today’s customers expect 24/7 availability. Failure to meet these expectations results in very public condemnation, as beleaguered customers flock to Twitter to make a spectacle of their dissatisfaction. Yet to understand why these problems are occurring, you need to look at how the banking industry has been transformed over recent years.
The shift towards customer-centric banking
Times have certainly changed in the world of banking, driven predominantly by new technology and shifting customer expectations. Traditionally, people would visit their local branch to conduct transactions; now we expect the bank to come to us. People expect round-the-clock access, no matter where they are and from whichever device they choose. Added to this, customer loyalty is continually weakening. Gone are the days of the ‘bank for life’; switching providers can be completed at the click of a button, as consumers Google their way to a better deal. It is therefore much harder to differentiate on rates; instead, service now holds the key to standing out in a crowded marketplace. As such, banks have to think carefully about the customer experience in a way they never have before.
Technological warfare: legacy vs. innovation
While banks recognise the need for high performance across multiple platforms, delivering it is no mean feat. Given the nature of the data banks store, stringent regulations apply and security needs to be a top priority. Added to this is the sheer size of banking institutions and the broad range of specialist applications required to operate the business; making changes is a huge undertaking, involving thousands of employees and customers across widely dispersed geographies. This means many banks are encumbered by legacy technologies, as replacing applications and infrastructure is prohibitively expensive and risky.
Yet this reliance on legacy technology is creating difficulties of its own. For stability and security purposes, most customer data still resides on 30-year-old mainframes; yet as IT becomes more consumer-focused, banks have to deploy newer, more agile technology in order to compete. As a result, IT departments are constantly trying to find fixes and bridges to marry old- and new-world technologies, while also ensuring round-the-clock availability; no easy task! Added to this, IT departments are siloed into different technology disciplines, with specialists focusing on different parts of the customer journey; this means the introduction of new technologies into the IT infrastructure is often fragmented. As a result, inefficiencies can easily slip through the cracks, and problem resolution can take far longer than it should.
No more excuses: The need for embedded performance management
As banks increasingly integrate new technologies with legacy systems, an added layer of complexity certainly results, but this is no excuse for slow systems or a poor end-user experience. Legacy infrastructure is a reality, but it should not stand in the way of innovation. Customers should expect the same high levels of performance whether they are accessing their accounts on the web or on mobile, with browser-based and mobile applications designed with performance at their core, not as an afterthought. It is unacceptable for customers to be unable to access their accounts, even for an hour; excuses around high levels of demand and glitches are wearing decidedly thin.
Banks need to start reviewing technology performance from the perspective that counts: the customer’s. To gain a true picture of the end-user experience, banks need to stop looking at technology in isolation. What is needed is a holistic approach, with end-to-end visibility from the mainframe through to whichever device the customer is using. Performance monitoring and optimisation need to be built in at every stage: from development, to implementation, right through to management and customer service. IT cannot keep passing the blame from one department to another when something goes wrong; teams need to form a united front, working together to identify and fix problems swiftly and effectively. To gain this cross-silo visibility, they need to deploy next-generation performance management tools.
In short, for customer loyalty and satisfaction to grow, IT must act before the business suffers and customers are affected. Banks need to get a handle on this situation; otherwise, we will see more failures and more unhappy customers.
Michael Allen has been working in the IT industry for 20 years, the last 16 with Compuware. Michael has held many commercial, technical and leadership positions at Compuware, working within the Application Performance Management (APM) business unit. Today, he is EMEA Vice President & CTO of APM Solutions at Compuware and is regarded as a key visionary in the performance management market. Michael works closely with the financial services sector, helping organisations to meet rising performance expectations, in particular focusing on areas like mobile and next-generation online innovations and applications, such as alerts, remote deposit capture and person-to-person payments.