The consumerisation of IT has arrived and is having a deep and dramatic effect on our everyday lives. In fact, whether we realise it or not, digitalisation has seeped into our subconscious to the point that it has altered both how we interact digitally and what we expect from those interactions.
We expect a seamless client experience, which translates primarily into “always on” availability and confidence that platforms will learn and evolve over time. Personalisation of applications based on users’ behaviour is perhaps more commonplace than people realise. At home or in the office, the technology and digital platforms we use are focused on personalising the experience and improving efficiency to get the task done faster.
Whether we realise it or not, it is our micro experiences that are critical to the success of the platform. This personalisation drives the value of the technology we use – it may be the ease of finding a movie on Netflix that suits our individual preferences or the ability to settle a credit card with a simple swipe or tap.
The challenge, however, is that old architectures cannot be wrapped or encapsulated to deliver these strategies. Legacy platforms were built to support a different set of use cases, and “cloud wrapping” or encapsulating old capabilities into new paradigms will not deliver the desired business outcome. These dated systems are often based on batch processes, lack a clear separation of concerns, and have historically focused on a small set of super users rather than information consumers and citizen developers.
When it comes to the technology that will empower those in financial services, we need to re-imagine rather than replicate. We cannot repeat the last 20 years, in which monolithic applications have been stretched and bent to support new requirements, new regulations and increasingly global needs. To succeed, these transformative projects must not conflict with broader platform and data strategies; they must be delivered with agility while also spanning a multi-year roadmap.
At the macro level, the platform has evolved to a federated, highly orchestrated set of services, interacting in real time and, in some cases, over multiple geographies to create a highly resilient, scalable ecosystem.
Last November, at Eagle’s Engage client conference, we outlined our vision for our cloud-native, deployment-model transformation. Ten months later, we are seeing positive results both in terms of employee engagement and client feedback. We have added key talent to further increase our velocity and deliver meaningful, people-driven change to our technology platform. I have spent quite a bit of time on the road this year and we see distinct synergies between our journey and that of our clients. We are all on a transformative journey—the speed of which is greater than I can remember in the 20 years I have been in this industry.
A platform is an experience that can stitch together and orchestrate an organisation’s diverse data needs with a common set of APIs and data taxonomies across a global, resilient and scalable foundation. Technology strategy and service abstraction should map to business capabilities. The concept of “separation of concerns” and avoiding service conflation is critical to both business and technology agility as well as longevity. This also does not mean simply adopting an Infrastructure-as-a-Service strategy—it is certainly far more than just “moving to the cloud”.
The platform and the software that operates on it need to be designed for this new frontier: they need to support data residency and locality concerns; they need to be scalable; and, critically, they must always be resilient.
Data is no longer confined to single applications; instead, it is democratised through next-generation architectures. The data platform is paramount, and it needs to support a broader, more diverse set of needs than ever before. We continue to see support for canonical data structures using open standards to simplify integration and create a consistent taxonomy as data flows across the enterprise.
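To make the idea of a canonical data structure concrete, here is a minimal sketch: two hypothetical vendor feeds, each with its own field names and conventions, are mapped into one shared record type so downstream consumers see a single taxonomy. The record fields and vendor key names are illustrative assumptions, not taken from any specific open standard or Eagle schema.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical canonical position record; field names are illustrative.
@dataclass(frozen=True)
class CanonicalPosition:
    security_id: str   # e.g. an ISIN
    quantity: float
    as_of: date
    currency: str

def from_vendor_a(row: dict) -> CanonicalPosition:
    # Vendor A (hypothetical) delivers an "isin" key and ISO dates.
    return CanonicalPosition(
        security_id=row["isin"],
        quantity=float(row["qty"]),
        as_of=date.fromisoformat(row["asOfDate"]),
        currency=row["ccy"],
    )

def from_vendor_b(row: dict) -> CanonicalPosition:
    # Vendor B (hypothetical) uses longer, differently named fields.
    return CanonicalPosition(
        security_id=row["security_identifier"],
        quantity=float(row["position_quantity"]),
        as_of=date.fromisoformat(row["business_date"]),
        currency=row["currency_code"],
    )
```

However many sources feed the platform, everything downstream of these adapters works against one consistent shape, which is what simplifies integration as data flows across the enterprise.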
Consumers should also be able to consume as little or as much as needed and, over time, applications will be abstracted to services that encapsulate combinations of technology, business and operations.
Services need to expand beyond the traditional confines of an application. They need to be built as scalable, discoverable utilities and it is incredibly important that the granularity of that service is carefully managed and aligned to the experience needed.
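As a simple illustration of services as discoverable utilities, the sketch below registers each service under a coarse-grained business capability and lets consumers search for it by keyword. The registry class, capability names and handlers are hypothetical assumptions for illustration only, not an Eagle or NEXEN API.

```python
from typing import Callable, Dict, List

class ServiceRegistry:
    """Minimal sketch of a discoverable service catalogue."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable] = {}
        self._descriptions: Dict[str, str] = {}

    def register(self, capability: str, description: str, handler: Callable) -> None:
        # One entry per business capability keeps granularity aligned
        # to the experience needed, not to implementation details.
        self._handlers[capability] = handler
        self._descriptions[capability] = description

    def discover(self, keyword: str) -> List[str]:
        # Intuitive discovery: match capability names and descriptions.
        kw = keyword.lower()
        return sorted(
            name for name, desc in self._descriptions.items()
            if kw in name.lower() or kw in desc.lower()
        )

    def invoke(self, capability: str, **kwargs):
        return self._handlers[capability](**kwargs)

# Illustrative usage with a hypothetical valuation capability:
registry = ServiceRegistry()
registry.register(
    "positions.valuation",
    "Value a portfolio's positions as of a date",
    lambda portfolio: {"portfolio": portfolio, "status": "valued"},
)
```

Registering against named capabilities rather than applications is one way to make services expand beyond an application's traditional confines while keeping duplication visible.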
A single service may not add material business value in isolation – I view it as analogous to the flywheel concept introduced by Jim Collins. Over time we create a set of capabilities that can be integrated, curated and improved quickly to deliver business value faster and more consistently. It is also important to mitigate duplication across an enterprise and the platform needs to offer intuitive, accessible and robust discovery capabilities. As an industry, our ideal should be to share more of these services that, by themselves, offer no unique competitive advantage. Our collective end goal should be to build a modular architecture that simplifies the integration of external services with a still loftier objective of creating efficiencies across the entire industry.
This is precisely why Eagle is working so closely with our BNY Mellon colleagues to leverage existing NEXEN capabilities while also building new cloud-native services that can be leveraged by both BNY Mellon and all external clients. The unique opportunity for Eagle to plug into this platform to consume big data, leverage machine learning, contribute APIs and data strategies, and enhance reporting capabilities is truly transformative.
We recognise that as time moves forward, we will all be considered digital natives. It may still be years away, but the heightened assumptions that come with this transformation will be unavoidable, which is why we at Eagle are intent on setting expectations rather than responding to them.