Regulators are putting data quality under the microscope

Reconciliation and data quality are often seen as an operations or back-office topic, but their implications stretch across the enterprise, particularly as regulatory scrutiny increases.

November 22, 2021 | Gresham Technologies

Data may offer a world of exciting potential for financial institutions – but what about the more onerous, practical back-end work required to enable firms to actually use this data? Understanding the journey that your data takes and how it is used is key. Without this, firms risk reporting and making decisions based on data in which they are not completely confident – a disaster waiting to happen.

Obtaining this level of understanding means knowing your data inside-out. How does the way it was collected affect it? What data validation – if any – has taken place? Where have changes occurred and are you able to track them, and demonstrate that you have done this to auditors and regulators? It’s not enough to simply focus on the minimum data requirements for timely report submission: it’s only by looking at the bigger picture and examining your processes that you can identify data quality issues before they turn into a problem for your business.
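The validation and change-tracking described above can be sketched in a few lines. This is purely illustrative – the record fields, rule names, and classes here are assumptions for the example, not any particular firm's or vendor's data model – but it shows the two ingredients the paragraph names: checking records against explicit rules, and keeping an audit trail of every change so it can be demonstrated later.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any, Callable

@dataclass
class AuditedRecord:
    """A record whose every change is timestamped, so corrections can be shown to auditors."""
    data: dict[str, Any]
    history: list[tuple[str, Any, Any, str]] = field(default_factory=list)

    def update(self, key: str, value: Any) -> None:
        old = self.data.get(key)
        stamp = datetime.now(timezone.utc).isoformat()
        self.data[key] = value
        self.history.append((key, old, value, stamp))  # (field, before, after, when)

def validate(record: dict[str, Any], rules: dict[str, Callable]) -> list[str]:
    """Return the names of the rules this record fails."""
    return [name for name, check in rules.items() if not check(record)]

# Hypothetical validation rules for a trade record.
rules = {
    "notional_positive": lambda r: r.get("notional", 0) > 0,
    "currency_present": lambda r: bool(r.get("currency")),
}

trade = AuditedRecord({"notional": -100, "currency": "EUR"})
print(validate(trade.data, rules))  # the negative notional fails validation
trade.update("notional", 100)       # correction is recorded in the history
print(validate(trade.data, rules))  # record now passes all rules
```

The point of the sketch is that validation and lineage live together: the same object that holds the data also holds the evidence of how it changed.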

It isn’t just the desire to leverage the commercial potential of data that is driving this topic higher up on the priority agenda. Regulators are scrutinising the data that financial institutions send them more closely than ever. Increasingly, regulators don’t just want to know what your numbers are: they want to see where they have come from and how you know that they are correct.

Conducting a detailed analysis of your data and how your organisation uses it is only part of the story. This activity must be treated as a living process, not a one-off task to be forgotten about later. Documents need to be kept up to date, changes managed, and data monitored in real-time. Taking a holistic approach avoids the fragmentation that creates unnecessary complexity and expense and leads to errors further down the line.

A perfect example of this is in regulatory reporting: a process where handling everything from the ingestion and submission of data to the management of exceptions, optimisation, and reconciliation in one integrated system allows you to see and verify what you are reporting all the way back to the source. With regulatory scrutiny and vigilance ever-increasing, a granular understanding of your data and controls is more important than ever.
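At its core, the reconciliation step above means matching what was reported back against the source and raising exceptions for anything that does not tie out. A minimal sketch, assuming simple trade-ID-keyed totals (the data shapes and field names here are illustrative, not a real reporting schema):

```python
def reconcile(source: dict[str, float], reported: dict[str, float],
              tolerance: float = 0.01) -> dict[str, list]:
    """Match reported figures back to source records and collect exception breaks."""
    breaks = {"missing_from_report": [], "missing_from_source": [], "mismatched": []}
    for trade_id, amount in source.items():
        if trade_id not in reported:
            breaks["missing_from_report"].append(trade_id)
        elif abs(reported[trade_id] - amount) > tolerance:
            breaks["mismatched"].append((trade_id, amount, reported[trade_id]))
    for trade_id in reported:
        if trade_id not in source:
            breaks["missing_from_source"].append(trade_id)
    return breaks

# Illustrative data: T3 was never reported, T2 was reported wrong, T4 has no source.
source = {"T1": 1000.00, "T2": 250.50, "T3": 75.25}
reported = {"T1": 1000.00, "T2": 250.00, "T4": 10.00}
exceptions = reconcile(source, reported)
```

Each entry in `exceptions` is an item for the exception-management workflow; because the check runs against the source records, every reported number can be traced back to where it came from.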

To discover what upcoming regulatory changes will mean for your data and controls, download our industry report: Data integrity: Your key to confidence in a complex regulatory environment.
