The Fifth Day of BCBS 239: Quality

By Stephen Engdahl | 10 December 2014

G. Sibley now has his Fundamentals in place – data definitions, governance, architecture, and scope. It’s time for us to begin the next phase: building data management capabilities. We are going to touch on four specific topics during this phase of the 12 Days of BCBS 239.

For those ready to take a deeper dive into data management capabilities, we highly recommend the EDM Council’s Data Management Capability Assessment Model (DCAM). It addresses financial institution strategy, organisational structures, technology and operational best practices, organised into 35 primary capabilities and 109 sub-capabilities, each backed by objective-based measurement criteria.

G. Sibley chooses to start with data quality. BCBS 239 calls for accuracy and integrity in both risk reporting practices and risk data aggregation processes.

  • BCBS 239 Principle 3 asks banks to measure and monitor accuracy of data.
  • BCBS 239 Principle 7 calls for reasonableness checks on quantitative information, and for exception reports to explain data errors.

What does quality mean to you? How do you measure it?

Quality is more than just whether G. Sibley’s present falls apart within a week of receipt. Measuring data quality means assessing characteristics such as reasonableness (for example, a price within x% of yesterday’s price), completeness (required fields are not missing), alignment with valid domain values (currency codes map to known currencies, market codes to known markets, and so on), and other factors.
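
To make that concrete, here is a minimal sketch of how those three kinds of checks might be expressed. It is purely illustrative Python: the field names, tolerance and reference sets are assumptions, not features of any particular EDM/MDM product.

```python
# Hypothetical sketch of the three checks described above.
# Field names, tolerance and reference sets are illustrative assumptions.

KNOWN_CURRENCIES = {"USD", "EUR", "GBP", "JPY"}            # valid domain values
REQUIRED_FIELDS = {"instrument_id", "price", "currency"}   # completeness rule
PRICE_TOLERANCE = 0.05  # "within x% of yesterday's price", here x = 5

def check_completeness(record: dict) -> list[str]:
    """Return any required fields that are missing or empty."""
    return [f for f in REQUIRED_FIELDS if not record.get(f)]

def check_reasonableness(price: float, yesterdays_price: float) -> bool:
    """True if today's price is within the tolerance of yesterday's price."""
    return abs(price - yesterdays_price) <= PRICE_TOLERANCE * yesterdays_price

def check_domain(record: dict) -> bool:
    """True if the currency code maps to a known currency."""
    return record.get("currency") in KNOWN_CURRENCIES

record = {"instrument_id": "XS1234567890", "price": 101.2, "currency": "USD"}
print(check_completeness(record))          # []
print(check_reasonableness(101.2, 100.0))  # True (a 1.2% move)
print(check_domain(record))                # True
```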

With an EDM/MDM system in place, data quality can be measured at multiple points in the data supply chain – at the source level, after enrichment and standardisation, during gold copy creation, and at the point of distribution to the business, to ensure it is fit for purpose.
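
As a rough illustration of that idea, the sketch below re-runs the same quality checks at each stage and keeps a per-checkpoint tally, so quality can be compared along the supply chain. The checkpoint names and the placeholder check are hypothetical, not the behaviour of any specific EDM/MDM system.

```python
# Hypothetical sketch: run quality checks at each point in the supply chain
# and keep a per-checkpoint tally of passes and failures.
# Checkpoint names and the placeholder check are illustrative assumptions.

from collections import defaultdict

CHECKPOINTS = ["source", "post_enrichment", "gold_copy", "distribution"]

def run_checks(record: dict) -> bool:
    """Placeholder for the completeness/reasonableness/domain checks above."""
    return bool(record.get("instrument_id")) and bool(record.get("currency"))

quality_scorecard = defaultdict(lambda: {"passed": 0, "failed": 0})

def measure(checkpoint: str, records: list[dict]) -> None:
    """Record pass/fail counts for a batch of records at one checkpoint."""
    for record in records:
        outcome = "passed" if run_checks(record) else "failed"
        quality_scorecard[checkpoint][outcome] += 1

batch = [{"instrument_id": "XS1234567890", "currency": "USD"},
         {"instrument_id": "", "currency": "USD"}]
for checkpoint in CHECKPOINTS:
    measure(checkpoint, batch)

print(dict(quality_scorecard))
# e.g. {'source': {'passed': 1, 'failed': 1}, 'post_enrichment': {...}, ...}
```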

You can’t manage what you cannot measure, so measuring quality is an important first step. But measuring quality alone isn’t sufficient: quality must be actively managed. This requires efficient workflows for error detection, research and resolution, as well as a framework for root cause analysis and continuous improvement, to ensure that data quality remains at acceptable levels or follows an upward trend.
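
One minimal way to picture such a workflow is an exception record that moves through detection, research and resolution, capturing a root cause along the way. The states, fields and example values below are illustrative assumptions only, not a prescribed process.

```python
# Hypothetical exception lifecycle for active data quality management.
# States, fields and root-cause values are illustrative assumptions.

from dataclasses import dataclass, field
from datetime import datetime

STATES = ("detected", "under_research", "resolved")

@dataclass
class DataQualityException:
    record_id: str
    failed_check: str                       # e.g. "reasonableness"
    state: str = "detected"
    root_cause: str = ""                    # filled in during research
    history: list[str] = field(default_factory=list)

    def advance(self, new_state: str, note: str = "") -> None:
        """Move the exception forward and keep an audit trail."""
        assert new_state in STATES
        self.state = new_state
        self.history.append(f"{datetime.now().isoformat()} {new_state} {note}")

exc = DataQualityException("XS1234567890", "reasonableness")
exc.advance("under_research", "price moved 12% vs yesterday")
exc.root_cause = "stale vendor feed"
exc.advance("resolved", "re-sourced price from secondary vendor")
```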

Look out for day six tomorrow.
 

By Steve Engdahl, SVP, Product Strategy, GoldenSource
