The dictionary entry for the word 'utility' defines it as 'the state of being useful, profitable, or beneficial', but delivering all three in financial markets in a single, mutual proposition is a challenge. It is also the root cause holding the industry back from a big-bang shift to utility services.
So why is the industry once again beating the drum on outsourcing to utilities?
Maybe it’s the advancements in technology, or new commercial models, that are either solving persistent issues or offering a paradigm shift in how things are done.
There is a well-documented history of how IT management has yo-yoed between in-house and outsourced over the decades, flip-flopping as regularly as the cyclical movement of the planets. In contrast, the steady but incremental progression of data management utility offerings reflects resistance from trading system practitioners, whose world has been built on control of process with reconciliation at its core, but where post-trade friction has been high, touching many surfaces.
Breaking ranks from well-worn paradigms has needed both developments in enabling technology and a significant shift in management attitudes, to provide the oxygen and the dried ink for the recent, high-profile utility wins. Outsourcing is a mature and well-used component of operating a modern business, but by any comparison these recent financial services contracts won by well-known utilities were long in the making and even longer in the execution.
The seeds of this modernising outlook were strewn far and wide in the wake of the market crisis a decade ago, after a forced period of self-analysis of what banks are really in business for, culminating in a simple statement of being: focus talent, intellect and effort on the creation of wealth (through trading), and protect that wealth through highly risk-managed and efficient operations at the lowest possible cost. The core tenet of most successful business strategies, 'only do what we can do', is beginning to ring louder and clearer in the financial industry as it assesses which functions and processes are viable to outsource to a utility.
For independent software vendors of post-trade systems, upgrades to existing systems or replacements with new solutions are few and far between, yet their market is increasingly challenged by new fintechs cutting and splicing legacy systems with best-of-breed component plug-ins. Agile new entrants will provide these new solutions to dislodge incumbents, or at the very least force them to really up their game.
The move up the proposition curve by independent software vendors into value-added managed services, such as utilities, is not new, but it now reflects more starkly an acknowledged risk that competitive solutions will succeed as part of a more viable change agenda, where newly available technology is met with C-level will to implement it.
Control the environment and you control the solution; an interesting old school plan for modern times.
To justify a utility outsourcing decision, a number of conditions must be met before sign-off: lower operating costs than existing operations, not just a transfer of cost along with risk; improved trade processing leading to lower costs; lower operating costs than your competitors; and the re-tasking of domain-skilled staff to higher-value workflows. Justified benefits must include a net cash saving on existing operational costs; the debate is whether both direct and indirect costs should be included, and whether contract set-up costs should be recovered. Add the calculation of a real rate of return for contracts spanning many years, and all of a sudden the accountants have their work cut out.
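The rate-of-return problem the accountants face can be sketched as a simple net present value calculation. Every figure below is a hypothetical assumption for illustration, not a number from this article:

```python
# Hypothetical NPV of a multi-year utility contract: a one-off set-up
# cost is recovered against discounted annual savings.
def npv(setup_cost: float, annual_saving: float, years: int, rate: float) -> float:
    # Discount each year's saving back to today and net off the set-up cost.
    return -setup_cost + sum(
        annual_saving / (1 + rate) ** t for t in range(1, years + 1)
    )

# A notional 10-year contract: 5m set-up, 1m saved per year, 8% discount rate.
print(round(npv(5.0, 1.0, 10, 0.08), 2))  # positive, but only thinly
```

On these assumptions the contract only just clears its hurdle; fold in indirect costs, or a stricter recovery of set-up costs, and the sign can flip, which is exactly the debate described above.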
Most of the utility models in play today are based on classical, functional lines but are always customised - and so are mainly hybrid - which is somewhat contradictory, given the accepted definition of a utility and how its benefits are derived. It is new technology that is breaking down this throwback concept, as customisation can now be delivered at very small scale and at low cost.
However, this is having a profound effect on the configuration and success of utility offerings, where clients pick and mix to retain certain aspects of their operation alongside the utility, either to protect their intellectual property, to meet individual client service level agreements, to fulfil their fiduciary responsibilities, or to comply with financial market and general business regulations.
The challenge for the utility is to assist the delivery of its clients’ obligations and end-user promises, and to provide the technology to route data workflows in, out and back with its client. The efficiency offered by the utility is offset against the client’s desire to retain certain workflows in-house, which is driving the disaggregation of process and data to produce a best of both worlds.
A data management utility cannot be all things to all involved, so any commercial agreement will be a compromise; the defining mark of such a collaboration will be the compromises each party is willing to make to sign the service agreement.
Post-trade is built on a reconciliation model, but it is exactly that model which is being challenged: why plan around reconciliation? Why not aim for touchless trade processing, at least in human terms, and deploy artificial intelligence (AI) and machine learning (ML) to benefit the organisation, with all the necessary governance and controls?
Any investment in AI/ML technology requires not just financial but intellectual capital too, and the latter is arguably better provided by the operating entity that commissions it, at least for the initial training. While a utility may provide intelligent technology, it will rely on the client user to train the systems collaboratively, which raises the valid issue of ownership and transfer of intellectual property, and of the commercial benefits that may flow from it.
A key difference between existing utility models is whether common reference or market data is democratised - held in a central repository and accessible by any application or function that needs it, crossing the boundaries of the consuming entities serviced by that utility - or siloed by subscription channels chosen by the contract owner, with data managed within that specific envelope.
As clients have a duty of care to ensure operational processes underwrite their company’s service level agreements, the focus should be on applying new technology to reduce the internal costs of operation. One proven solution is to integrate common, community-quality-assured data sets with no differential value at the point of entry into clients’ workflows, leaving further enrichment and unique benefits specific to each client and its ability to service them.
This poses the question of what data is actually common and what data is unique to an entity and highlights the significant challenge a utility has in realising the full benefits of its service.
It is at this point that utility economics start to get interesting. The potential targets for these types of outsourced service are the usual top-tier suspects, whose distinct trading system configurations are almost public knowledge within the industry. Simple napkin maths shows that, from the top 15 or so tier 1 banks - assuming one in four won’t outsource - there is a prospect list of only around 12 banks to share between three viable utility service providers, with four clients each.
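The napkin maths can be written out explicitly. The bank count and the one-in-four ratio are the article's own illustrative figures, not market data:

```python
# Prospect-pool arithmetic behind the napkin maths (illustrative figures).
tier1_banks = 15                    # "top 15 or so" tier 1 banks
wont_outsource = tier1_banks // 4   # assume one in four won't outsource
prospects = tier1_banks - wont_outsource

providers = 3                       # viable utility service providers
clients_each = prospects // providers

print(prospects, clients_each)  # 12 prospects, four clients per provider
```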
While each utility service contract may run into the millions, so do the potential operating costs, collared by the baseline cost of still having to run siloed services to avoid commingling unique client data. The reality is that economics drives commingling, but market forces, governance and regulation are currently driving segregation.
The technical and economic challenge is how to share common data sets between service users to realise operational savings and sweat every thin margin from low-margin contracts. Typical clients demand savings of at least 30% to justify the outsource, so the utility service provider has to carry heavily weighted operational costs until at least three other similar contracts are in place and profits start rolling in. The utility provider must be committed to the long haul and have deep pockets, as it will be challenged by new technology racing to empower clients, unless it can leverage distinct service level agreement benefits across disparate client profiles while using AI/ML to manage separate data siloes for those clients effectively and profitably.
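That margin squeeze can be sketched with a toy model, in which every figure is a hypothetical assumption: each client pays 70% of its prior in-house cost (the demanded 30% saving), while the provider carries a fixed shared-platform cost plus a per-client cost of running segregated siloes:

```python
# Illustrative utility-provider economics (all figures hypothetical).
CLIENT_PRIOR_COST = 10.0            # a client's in-house annual cost
FEE = CLIENT_PRIOR_COST * 0.7       # revenue per client after the 30% saving
PLATFORM_COST = 15.0                # fixed cost of the shared platform
SILO_COST_PER_CLIENT = 3.0          # cost of uncommingled, per-client siloes

def provider_profit(n_clients: int) -> float:
    revenue = n_clients * FEE
    costs = PLATFORM_COST + n_clients * SILO_COST_PER_CLIENT
    return revenue - costs

for n in range(1, 6):
    print(n, provider_profit(n))
```

On these assumed numbers, profit only turns positive at the fourth client - consistent with the observation above that the provider carries its costs until at least three other similar contracts are in place.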
Are we therefore at a stalemate?
The broad economics that make a utility profitable are driven simply by the number of participating clients - of the right type, of course - and while there may be several breakeven points for different mixes of client number and quality, the bread and butter will be the larger firms. Shareholder patience will be interesting to observe, as this big bet for utilities faces a potentially very long tail before achieving viable returns in a changing environment, where enabling technology will allow clients to service their processes internally at lower cost and potentially invalidate the utility argument.
Assisted technology may be going full circle by introducing better tech to self-operate in-house, where AI/ML saves on headcount while the intellectual head of the beast is retained within the firm’s walls; this may have great appeal to certain firms. AI needs structured data, by all accounts, to perform its magic, but together with its cousin, big data, it could provide some unbelievable solutions that could easily shift our paradigm far, far off the page.
Are we destined to see the emergence of UtaaS - Utility as a Service - models, whose role is to manage a ready market for post-trade services on the fly, either in the guise of an industry super-custodian managing competing processing services, or a number of brokers in the same role?
This does, however, assume a future state where trade encryption is solved by blockchain, allowing instant switching of data flows between processing providers on a bid and offer post-trade services market, where learned machines ingest disparate workflows and process risks are monitored by all-seeing AI controllers. Not so far-fetched in a future world where symbol identity standards are unified across assets, instruments, vendors and trade participants.
Dystopia or Utopia…you decide.