By Denise Montgomery, research director, Ovum financial services research and advisory, Asia-Pacific.
Banks continue to invest in new mobile front-end applications, online development, and agility. To improve the speed, quality, and relevance of decision-making and customer experience in such fast-paced environments, analytics are increasingly embedded in these customer-facing processes. While new technologies, particularly in the data management arena, have the potential to be game changers, current use cases mostly resemble ‘faster mousetraps’: delivering on old promises of a 360-degree view of bank customers and appropriately tailored service. To truly improve customer service, however, you need to look at the back end.
Banks are paying insufficient attention to how and when the back-end applications that supply the data will be real-time enabled and integrated. Much time and money has been spent building ‘industrial-strength’ business intelligence (BI) environments, but these continue to be designed and implemented as batch environments, and are neither timely nor agile. Lack of clarity around real-time use cases, the level of investment required, legacy issues, and a profusion of new vendor offerings are resulting in end-to-end real-time enablement being deferred or consigned to the ‘too hard’ basket.
BI is becoming fast and operational
Banking organisations are now starting to embed BI in operational processes as a way to react quickly to customer demands and unexpected events, and gain competitive advantage. To be effective, such BI must operate at a transactional pace and be able to analyse real-time data streams that reflect the current state of the business. Ovum believes ‘real-time’ is best construed as having the appropriate interaction data quickly enough to affect the interaction to which it relates. The terms ‘real-time’ and ‘near real-time’ in the retail customer interaction environment have hence become, in effect, interchangeable. This is in contrast to the investment banking and financial markets arena, in which competitive pressure is pushing requirements for ultra-low-latency (sub-second) transactions.
Big data is not necessarily 'fast data'
The immense volumes of ‘big data’ generated from increasingly unstructured data sources, such as remote machine sensors, social networks, multimedia, video streaming, and the digitisation of value exchanges, are estimated to reach eight zettabytes by 2015. Methods of interrogating this data are being piloted, or productionised, in most banks, but the processing of this data is not necessarily fast. In most cases it is being done offline, as humans and machines search for patterns and relevance in the data.
These customer insights then need to be operationalised in a faster, more responsive environment. This may mean coding decision rules into real-time applications or machine-learning environments. Either way, the volume of data increases even as the acceptable response time for using it decreases.
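To make the idea of coding offline insights into real-time decision rules concrete, here is a minimal sketch. The event fields, rule names, thresholds, and actions are all illustrative assumptions, not any bank's actual model:

```python
# Minimal sketch: offline customer insights operationalised as decision
# rules evaluated against each incoming event in real time.
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class CustomerEvent:
    customer_id: str
    channel: str        # e.g. "mobile", "branch", "web"
    balance: float
    txn_amount: float

# Each rule pairs a predicate with the action to trigger when it fires.
Rule = Tuple[str, Callable[[CustomerEvent], bool], str]

RULES: List[Rule] = [
    ("low_balance_alert",
     lambda e: e.balance - e.txn_amount < 100.0,
     "offer_overdraft_protection"),
    ("large_mobile_txn",
     lambda e: e.channel == "mobile" and e.txn_amount > 5000.0,
     "step_up_authentication"),
]

def decide(event: CustomerEvent) -> List[str]:
    """Return the actions whose rules fire for this event."""
    return [action for name, pred, action in RULES if pred(event)]

event = CustomerEvent("c-42", "mobile", 1200.0, 1150.0)
print(decide(event))  # ['offer_overdraft_protection']
```

In a production environment the rule set would be generated or tuned by the offline analytics described above, then pushed into the transactional path so every interaction is scored as it happens.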
Technological developments helping real-time analytics and processing
Developments surrounding in-memory technology, in-memory analytics, machine capabilities such as natural language processing (think Watson), the commercialisation of financial markets' complex event processing (CEP) technologies, visualisation tools, and the NoSQL juggernaut, combined with increased processing power and lower-cost grid computing, mean that it is now both possible and commercially viable for banks to real-time enable both the front and back office.
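The CEP pattern mentioned above can be sketched very simply: keep a sliding time window of events in memory, evict anything that has aged out, and maintain a running aggregate on every arrival. The window length and event shape here are assumptions for illustration:

```python
# CEP-style sliding time window held in memory: events older than the
# window are evicted and a running aggregate is updated per arrival.
from collections import deque

class SlidingWindowSum:
    def __init__(self, window_seconds: float):
        self.window = window_seconds
        self.events = deque()   # (timestamp, value) pairs, oldest first
        self.total = 0.0

    def add(self, timestamp: float, value: float) -> float:
        # Evict events that have fallen out of the window.
        while self.events and self.events[0][0] <= timestamp - self.window:
            _, old_val = self.events.popleft()
            self.total -= old_val
        self.events.append((timestamp, value))
        self.total += value
        return self.total

w = SlidingWindowSum(window_seconds=60.0)
w.add(0.0, 100.0)
w.add(30.0, 50.0)
print(w.add(70.0, 25.0))  # the t=0 event has aged out: 75.0
```

Commercial CEP engines add pattern matching, joins across streams, and out-of-order handling on top of this basic windowing idea.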
In the face of this technology proliferation, banks need to be clear about the degree of latency, complexity, and integration required, as the available technology solutions carry significantly different cost implications.
Is it really new?
A range of use cases is developing but, in many ways, these are not new products or goals. Rather, new developments in technology are enabling banks to deliver on past promises. Churn analysis, credit risk, fraud detection, and targeted individual marketing are the primary frontline end uses. The ability to offer guidance to agents (such as call centre staff) during banking interactions, to deliver individually customised offers or prompts, to anticipate and support customer self-service, and to route calls seamlessly across channels requires real-time decision engines. So far, however, the analytics supporting frontline retail processes have largely been highly summarised, based on lag or hindsight data, and often delivered too late to have much actual influence.
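At its simplest, the agent-guidance piece of such a decision engine is a fast lookup from real-time call context to a prioritised prompt. The segments, intents, and suggested actions below are invented for illustration:

```python
# Hypothetical agent-guidance lookup: map the live context of a call
# (customer segment, detected intent) to a prompt for the agent.
OFFERS = {
    ("high_value", "mortgage_enquiry"): "Offer a rate review with a specialist",
    ("high_value", "card_complaint"):   "Waive the fee and flag for retention follow-up",
    ("standard",   "mortgage_enquiry"): "Send the online pre-approval link",
}

def next_best_action(segment: str, intent: str) -> str:
    # Fall back to the standard flow when no targeted guidance exists.
    return OFFERS.get((segment, intent), "Route to standard service flow")

print(next_best_action("high_value", "mortgage_enquiry"))
```

The hard part, as the article argues, is not the lookup itself but feeding it with current rather than hindsight data: the segment and intent must be derived from live interaction streams for the prompt to arrive while it can still influence the call.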
Risk management outcomes can also be significantly enhanced via the ability to process large quantities of information in near real-time and then immediately analyse the results. Credit ratings can be checked, high-risk customers identified, and action anticipated or taken far sooner.
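One way to flag high-risk activity as transactions stream in is to maintain running statistics per customer and flag large deviations immediately. This sketch uses Welford's online algorithm for the running mean and variance; the 3-sigma threshold is an assumption for illustration, not a recommended policy:

```python
# Illustrative near-real-time risk check: keep a running mean and
# variance of a customer's transaction amounts (Welford's algorithm)
# and flag any amount far outside the customer's normal range.
import math

class RiskMonitor:
    def __init__(self, sigma_threshold: float = 3.0):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0            # running sum of squared deviations
        self.threshold = sigma_threshold

    def observe(self, amount: float) -> bool:
        """Return True if the amount looks anomalous, then update stats."""
        flagged = False
        if self.n >= 2:
            std = math.sqrt(self.m2 / (self.n - 1))
            if std > 0 and abs(amount - self.mean) > self.threshold * std:
                flagged = True
        # Welford's incremental update.
        self.n += 1
        delta = amount - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (amount - self.mean)
        return flagged

m = RiskMonitor()
for amount in [100, 110, 95, 105, 98]:
    m.observe(amount)
print(m.observe(5000))  # far outside this customer's normal range: True
```

Because the statistics update incrementally, the check adds only constant work per transaction, which is what makes it viable in the transactional path rather than in an overnight batch.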
Perhaps of more bottom-line impact is the ability to incorporate more dynamic event interaction into analytic models, allowing banks to do more integrated, more informed, and more accurate scenario analysis on the fly. Such capabilities allow banks to enter and exit markets faster, speed up and sharpen impact analysis for corporate deals, manage counterparty risk and spreads, and increase the frequency and accuracy of collateral valuations. Greater immediacy leads to the ability to narrow margins and improve financial performance. Inferences about how events are affecting markets, portfolios, and customers in real time simply cannot be made with traditional summary-reporting technology.
Technology is not the inhibitor
The tools and technology that enable data to loop rapidly through an enterprise are available. The real sticking points in delivering this capability will be the usual suspects: cross-enterprise cooperation, cross-silo data integration, and cultural transformations to deliver agile, customer-facing processes.
Banks may continue firing off new applications at the front end, but are they paying attention to how to integrate data and processes across the enterprise? If not, it is time they did.