Speaking to bobsguide, James Corcoran, chief technology officer (CTO) of Solutions at Kx – a vendor of in-memory, streaming analytics technology utilised within the financial services industry – explains how automation plays a key role in surveillance solutions and assesses the future challenges arising from the changing landscape.
Why is effective surveillance so important at the moment? What is the driving factor for people to be adopting this technology now?
There's never been more regulatory scrutiny than there is now. Firms are under huge amounts of pressure to make sure they comply with today’s regulations, and to implement systems that are future-proofed against the evolving regulatory landscape.
Getting surveillance right is difficult – organisations often have different solutions for different business units, there tends to be a lack of cohesion across different locations, and there's usually an operational overhead associated with investigating potential breaches of compliance or malpractice.
It’s incredibly important to have a surveillance strategy that’s effective right across your organisation in order to capture and make sense of all of the interactions, dependencies, changes, communications, patterns and behaviours across the entire trade lifecycle.
How does the Kx approach to this differ from that of your competitors?
Most vendors take a narrow approach to solving a specific problem for a subset of the market’s overall surveillance needs. We take a different approach – we look at surveillance as a streaming analytics challenge.
Having an effective surveillance solution requires firms to capture and analyse high-volume data streams across multiple functional areas within the organisation. From that perspective, we put data at the heart of our design. That's why we say you can think of us not simply as an application, but as a platform – one that allows us to continuously apply streaming analytics to new datasets for deeper insights as markets and behaviours evolve.
What role does automation and machine learning play in your monitoring tools?
The initial driver for incorporating machine learning techniques in our solution was to reduce the number of false positive alerts that the system generates, and we do that by learning from analysts’ and users’ behaviour over time. We’ve recently released Kx Auto ML, a module that automates the task of applying machine learning solutions to real-world problems. These tools are designed to automate data preparation, feature extraction, model optimisation and deployment, so that we can streamline the user experience and allow our customers to focus their resources on higher-value problem solving.
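To make the false-positive-reduction idea concrete, here is a minimal Python sketch of learning from analysts’ dispositions over time. This is purely illustrative – the class, field names and thresholds are hypothetical, not Kx Auto ML’s actual API: the system tracks how often each detection rule’s alerts are confirmed as actionable and suppresses rules with a proven false-positive record.

```python
# Illustrative sketch (not Kx Auto ML's API): reducing false positives by
# learning from analyst dispositions. Each alert carries the rule that
# fired it; analysts mark closed alerts as actionable or not, and rules
# whose historical precision falls below a threshold are down-ranked.
from collections import defaultdict


class AlertTriage:
    def __init__(self, min_precision=0.2, min_samples=5):
        self.fired = defaultdict(int)       # alerts raised per rule
        self.actionable = defaultdict(int)  # alerts confirmed per rule
        self.min_precision = min_precision
        self.min_samples = min_samples

    def record_disposition(self, rule, was_actionable):
        """Learn from an analyst's review of a closed alert."""
        self.fired[rule] += 1
        if was_actionable:
            self.actionable[rule] += 1

    def precision(self, rule):
        """Fraction of this rule's alerts that analysts confirmed."""
        return self.actionable[rule] / self.fired[rule] if self.fired[rule] else 1.0

    def should_escalate(self, rule):
        """Escalate unless the rule has a proven false-positive record."""
        if self.fired[rule] < self.min_samples:
            return True  # not enough history yet: escalate everything
        return self.precision(rule) >= self.min_precision


triage = AlertTriage()
for _ in range(10):
    triage.record_disposition("wash_trade", was_actionable=False)
for _ in range(10):
    triage.record_disposition("spoofing", was_actionable=True)

print(triage.should_escalate("wash_trade"))  # False: 0% precision over 10 reviews
print(triage.should_escalate("spoofing"))    # True
```

In practice the learning step would use richer features than a per-rule precision counter, but the feedback loop – analyst reviews flowing back into the alerting logic – is the essential mechanism.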
How does the system remain up to date? How often is it updated?
Our platform-first approach gives us a strategic edge when it comes to anticipating our customers’ future needs. Many of our clients start by adopting Kx for a specific set of requirements – either implementing the system in a specific environment or for a particular asset class or business line. But over time, most of our customers scale their implementation by adding additional datasets and analytics, which allows them to stay ahead of the evolving needs of their businesses.
How many modifications or specifications can a user make to the platform to customise it to their requirements?
One of our key differentiators is flexibility. Our solution is highly configurable from the front end through to the back end, and our customers can integrate market data, transaction data, reference data, emails, data from instant messaging systems, and news feeds into our platform. Once the data’s available, users can build their own analytics, or use our off-the-shelf models to customise workflows for an experience that’s tailored to their business needs and specific requirements.
If an organisation wanted to investigate its data from the platform, is that possible?
Because effective surveillance requires the integration and analysis of multiple large data streams across the organisation, we end up creating large platforms containing rich data assets that can be mined for insights across a variety of use cases. We open up access to the underlying data and analytics via dashboards and APIs to unlock these insights, and most of our customers use this new source of cleansed, merged data to bootstrap new data-driven applications.
Are there any limitations in the platform, specifically around cross-asset monitoring? How do you solve these issues?
We don’t impose any technical limitations. The reason this type of analysis tends to be a problem for the industry is that firms often have different operational silos, with different IT teams working within those silos rather than across them – so the data is siloed too. Our most effective implementations are those where customers deploy Kx as a platform horizontally across the enterprise. Compared with other platforms, Kx can combine these disparate datasets with a smaller footprint and less computing power. Cross-asset monitoring also requires a great deal of domain expertise, and we apply proprietary algorithms across these larger datasets to monitor for micro-patterns which may seem innocuous when observed on a single market but could be deemed predatory when viewed through a broader lens.
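The cross-venue point can be sketched in a few lines of Python. This is a hypothetical toy, not Kx’s proprietary algorithms: order flow that looks innocuous on each venue alone (a buy here, a sell there) is merged by trader across venues, revealing coordinated two-sided activity inside a short window. Field names, the event schema and the 500 ms threshold are all illustrative.

```python
# Hypothetical sketch of cross-market pattern detection: merge events by
# trader across venues, then flag opposing orders placed on *different*
# venues within a short time window. Thresholds and fields are invented.
from collections import defaultdict

WINDOW_MS = 500  # look for opposing orders within half a second


def flag_cross_venue(events):
    """events: iterable of (timestamp_ms, trader, venue, side) tuples."""
    by_trader = defaultdict(list)
    for ts, trader, venue, side in sorted(events):
        by_trader[trader].append((ts, venue, side))

    flagged = set()
    for trader, evts in by_trader.items():
        for i, (ts, venue, side) in enumerate(evts):
            for ts2, venue2, side2 in evts[i + 1:]:
                if ts2 - ts > WINDOW_MS:
                    break  # events are time-ordered; nothing later qualifies
                if venue2 != venue and side2 != side:
                    flagged.add(trader)
    return flagged


events = [
    (100, "T1", "venueA", "buy"),
    (350, "T1", "venueB", "sell"),  # opposing side, other venue, within 500ms
    (100, "T2", "venueA", "buy"),
    (900, "T2", "venueA", "sell"),  # same venue, outside window: not flagged
]
print(flag_cross_venue(events))  # {'T1'}
```

Viewed per venue, neither of T1’s orders is suspicious; only the merged, cross-venue view surfaces the pattern – which is why siloed deployments struggle with this class of monitoring.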
If a firm wants to utilise your platform, what benefits can they expect to realise?
Our solution delivers insights on data when it’s most valuable – in real-time. With Kx streaming analytics, users can analyse data at a breadth and scale that was previously inaccessible to them. This delivers benefits from a total cost of ownership perspective, but perhaps more importantly, allows our customers to implement strategic platforms that put data and analytics at the heart of their businesses.
Is there a limit as to how much data can be stored in the system, and is there any difficulty in consolidating it?
There’s no limit to the amount of data that can be stored in our platform. Most of our customers deploy Kx across their various on-prem, cloud and hybrid cloud environments, capturing and acting on vast quantities of streaming data. The Kx platform consolidates these streams of data in real-time, allowing users to make in-the-moment decisions – something which traditional software applications really struggle with.
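The core consolidation step – turning several time-ordered streams into one chronologically merged stream that can be acted on as it arrives – can be illustrated with a short Python sketch. This is not Kx’s actual engine (which is built on kdb+/q); the stream contents are invented for illustration.

```python
# Illustrative sketch (not Kx's engine): consolidating several
# time-ordered (timestamp, payload) streams into one merged stream, the
# kind of operation a streaming platform performs continuously so that
# decisions can be made on the combined view in real time.
import heapq


def consolidate(*streams):
    """Lazily merge time-stamped streams in chronological order."""
    yield from heapq.merge(*streams, key=lambda event: event[0])


trades = [(1, "trade AAPL 100"), (4, "trade MSFT 50")]
quotes = [(2, "quote AAPL 99.9/100.1"), (3, "quote MSFT 49.9/50.1")]

merged = list(consolidate(trades, quotes))
print([ts for ts, _ in merged])  # [1, 2, 3, 4]
```

Because `heapq.merge` is lazy, the merged view is produced event by event rather than after the fact – a small-scale analogue of acting on data "when it’s most valuable".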
How do you expect surveillance requirements and concerns to be changing in the next five years? Is there a specific area that you believe is already moving and shifting drastically?
The regulatory landscape is constantly evolving, with regulators demanding increased levels of monitoring, reporting and transparency. Firms operating across multiple regulatory jurisdictions have to constantly adapt to new requirements. Therefore, as a vendor in this space, we don’t stand still. Markets and behaviours evolve over time, and software platforms must be built to evolve too.
Specifically, we think the FX and fixed income markets are undergoing significant structural changes, and as traditional over-the-counter markets become more electronic, there’s an emerging need to automate monitoring and surveillance workloads that have historically been manual in nature.
The industry is also undergoing a seismic shift to the cloud, with new application deployments being cloud-first and existing on-prem applications being migrated at an accelerating pace. The latest version of our software includes features and tools to accelerate cloud adoption and we're working closely with our customers to ensure they benefit from all the functionality the cloud has to offer.