Why data integrity holds the key to regulatory confidence

Philip Flood, business development director, regulatory and STP services at Gresham Tech

November 1, 2021 | Gresham Technologies

Data is the lifeblood of today’s financial institutions, but if your activities and decisions are data-dependent, then you need to know that this data is complete and correct. Understanding the processes that shape your data, and the contexts in which it is used, is key.

Questions that you should be able to answer include: How is your data impacted and changed by the way it was collected? What data validation – if any – has taken place? Can you track where changes have occurred and demonstrate this to auditors? Only by analysing the answers to these questions can you identify critical data quality issues and inaccuracies.

The growth of big data and analytics technology means that firms are keen to leverage the commercial value of their information. For financial institutions, however, this desire is rapidly being overtaken on the data quality agenda by regulatory reporting and compliance concerns.

Regulators are scrutinising the data that they receive from financial institutions more closely than ever before. Increasingly, regulators don’t just want to know what your reporting numbers are. They want to know where they have come from and how you know that they are correct. Add this to the increasing complexity of regulatory regimes and reporting and the number of different rules and data fields involved, and you can see why increased regulatory interest in data quality is giving some firms cause for concern.

Managing data integrity must be treated as a living process, not a one-off task. Documentation must be kept up to date, changes managed, and data monitored in real time. This holistic strategy avoids many of the problems associated with more siloed approaches, which create fragmentation and unnecessary complexity, giving rise to errors and a lack of confidence in data.

With regulatory scrutiny and vigilance only set to increase, a granular understanding of your data and controls is more important than ever. This is why many firms are coming to realise that reliance on point solutions and systems, developed in-house during the previous decade or so in response to new, urgent and fast-changing regulatory requirements, no longer serves their best interests.

Instead, they are looking to handle everything from the ingestion and submission of data to the management of exceptions, optimisation, and reconciliation in one integrated system, enabling them to see and verify what they are reporting on, all the way back to the source. This enables them to gain the confidence in their data that regulators now expect to see – and have begun challenging firms on.


To learn more about how data integrity can support your regulatory compliance and reporting, download our white paper: Data integrity: Your key to confidence in a complex regulatory environment.
