BoE and FCA to revise data collection

By Rebekah Tunstead | 20 November 2019

UK regulators are set to review how they approach data collection, according to the Bank of England’s (BoE) chief data officer and the head of central data services in the Financial Conduct Authority’s (FCA) innovation, strategy and competition division.

“Some of the collections were done many years ago, and were there for different reasons and different ways of working. We are at the point now where we are finding some of those collections hard to read for different questions than the ones they were originally raised for, because the data has been aggregated before we collect it and we can’t trace back to the granular data that underpins the aggregates,” said the FCA’s Steven Green.

Both regulators were speaking on a panel at the Financial Information Management conference in central London this week.

Green did not point to a specific regime, but suggested the regulator was reviewing older reporting obligations.

“So, over the next year or two we are expecting to go through some of our collections and go ‘actually we would be better off if we made those more granular because that will answer 50 questions instead of the one question that it was raised for.’”
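Green’s point about aggregation can be illustrated with a minimal sketch. The example below uses entirely hypothetical data and field names, not any actual FCA or BoE return: a figure aggregated before submission answers only the question it was built for, whereas granular records can be re-aggregated later to answer questions that were never anticipated.

```python
# Illustrative sketch only (hypothetical data and fields, not an FCA/BoE specification).
from collections import defaultdict

# Granular, transaction-level records a firm might hold internally.
transactions = [
    {"product": "mortgage", "region": "UK", "notional": 250_000},
    {"product": "mortgage", "region": "EU", "notional": 180_000},
    {"product": "derivative", "region": "UK", "notional": 1_000_000},
]

# Legacy-style return: a single aggregate, fixed at collection time.
total_notional = sum(t["notional"] for t in transactions)
print(f"Aggregate return: {total_notional}")  # answers only the original question

# Granular collection: the same data can be sliced by any dimension after the fact.
by_product = defaultdict(int)
for t in transactions:
    by_product[t["product"]] += t["notional"]
print(dict(by_product))  # e.g. exposure by product, a question not asked originally
```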

The FCA and BoE are running pilot phases of digital regulatory reporting, which it is hoped will enable real-time regulatory reporting, increase efficiency, and reduce costs. In March, the two regulators published a report on the first pilot phase of digital regulatory reporting. During a six-month period, Barclays, Credit Suisse, Lloyds, Nationwide, NatWest, and Santander experimented with the use of distributed ledger technology for live regulatory reporting. The FCA and BoE are to publish an update on the digital regulatory reporting initiative in January 2020, according to Oliver Burrows, chief data officer at the BoE.

Responding to a point about trade reporting efficiency, Burrows said the regulators’ focus had been on creating an effective framework, but acknowledged that “if we had our time again, we would probably have looked at the processes by which we collected data to make them more efficient, before this explosion of data, to improve the effectiveness.

“When I think about enabling analytics at the bank, helping us make better policy decisions, a very interesting limitation on that is the data that we take in through the front door, the data that we find and collect from financial institutions. It is critical for us to think about how that might evolve.

“[We’ve] never really stepped back with a white sheet of paper and said, ‘well actually there is not so much a need for abstraction nowadays; we could base things more on real-world descriptions of data, operational descriptions of data, and work out how we answer questions given a conformed description of the data at a much more granular level,’” he said.

The FCA announced in July that it would replace its main regulatory data collection system, Gabriel. Green said the new platform aims to improve the user experience.

“It is about making error control easier so when people are submitting it is not such an onerous process,” he said.

Also on the panel, Patrick Hogan, head of section in the banking supervisory division of the European Central Bank (ECB), said the central bank had legacy issues it needed to address, referring to the two different systems used to collect supervisory data and data for monetary and financial stability purposes.

* This article was updated on November 21, to reflect that those quoted in this piece were speaking on a panel at the Financial Information Management conference in central London this week.
