Solvency II – making demands on your data quality process.

By Andrew Sexton | 22 March 2016

Solvency II is finally upon us and much has been said about how the pressures will ripple across insurers and their asset managers. But how can asset managers ensure that they are managing the data quality demands of Solvency II and meeting the high standards of their insurance clients?

For asset managers, Solvency II has presented well-documented challenges, requiring them to deliver a variety of often complex asset and reference data in a consistent format, and faster than they are used to. Those challenges include: collating data across multiple sources; sourcing new data such as classifications, credit ratings, benchmark curves and new data taxonomies for securities instruments, such as CIC and NACE; and providing underlying performance data for asset-backed instruments, or for instruments held within funds of funds, to meet the fund look-through requirement of Solvency II.

In order to deliver the right depth of data at the right time, the underlying foundation of an asset manager’s data management process needs to be solid. For example, the fund look-through process of providing an expanded view of a fund and all its underlying constituents may expose anomalies in the asset manager’s data, such as five holdings in the same stock, each with a different valuation performed at a different time.
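The kind of look-through anomaly described above is straightforward to detect programmatically. The sketch below groups holdings by ISIN and flags any instrument that appears with more than one distinct valuation or valuation date; the record structure and field names (`isin`, `price`, `valued_at`) are illustrative assumptions, not taken from any particular vendor feed.

```python
from collections import defaultdict

# Hypothetical holding records as they might emerge from a fund look-through.
holdings = [
    {"isin": "GB0001234567", "price": 101.5, "valued_at": "2016-02-29"},
    {"isin": "GB0001234567", "price": 99.8,  "valued_at": "2016-03-15"},
    {"isin": "US9876543210", "price": 54.2,  "valued_at": "2016-03-15"},
]

def find_valuation_anomalies(holdings):
    """Group look-through holdings by ISIN and return those instruments
    that carry more than one distinct price or valuation date."""
    by_isin = defaultdict(list)
    for h in holdings:
        by_isin[h["isin"]].append(h)
    anomalies = {}
    for isin, rows in by_isin.items():
        prices = {r["price"] for r in rows}
        dates = {r["valued_at"] for r in rows}
        if len(prices) > 1 or len(dates) > 1:
            anomalies[isin] = rows
    return anomalies

print(sorted(find_valuation_anomalies(holdings)))  # -> ['GB0001234567']
```

In practice the grouping key would be richer (portfolio, custodian, source system), but the principle is the same: aggregate, compare, and surface the discrepancies rather than silently accepting the first value seen.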

Such anomalies in your data need to be identified, validated and cross-checked, and the data re-aggregated. Achieving this kind of control over your data requires processes and workflows that can identify the issues and enable the user to investigate, escalate (where necessary) and resolve them in a consistent, reliable and systematic way.
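That identify, investigate, escalate, resolve cycle can be modelled as a small state machine. The sketch below is a minimal illustration, not any particular platform's implementation; the status names and the `QualityIssue` class are hypothetical, and the history list stands in for the fuller audit trail a real system would keep.

```python
from enum import Enum

class Status(Enum):
    OPEN = "open"
    INVESTIGATING = "investigating"
    ESCALATED = "escalated"
    RESOLVED = "resolved"

# Allowed transitions mirror the identify -> investigate ->
# escalate (where necessary) -> resolve flow described above.
TRANSITIONS = {
    Status.OPEN: {Status.INVESTIGATING},
    Status.INVESTIGATING: {Status.ESCALATED, Status.RESOLVED},
    Status.ESCALATED: {Status.RESOLVED},
    Status.RESOLVED: set(),
}

class QualityIssue:
    def __init__(self, description):
        self.description = description
        self.status = Status.OPEN
        self.history = [Status.OPEN]  # audit trail of state changes

    def move_to(self, new_status):
        if new_status not in TRANSITIONS[self.status]:
            raise ValueError(f"cannot move from {self.status} to {new_status}")
        self.status = new_status
        self.history.append(new_status)

issue = QualityIssue("Five holdings in one stock with differing valuations")
issue.move_to(Status.INVESTIGATING)
issue.move_to(Status.ESCALATED)
issue.move_to(Status.RESOLVED)
print([s.value for s in issue.history])
# -> ['open', 'investigating', 'escalated', 'resolved']
```

Disallowing arbitrary jumps between states is what makes the approach systematic: an issue cannot be marked resolved without having passed through the steps the governance procedure prescribes.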

When you are able to demonstrate this level of control over your data, your insurance clients can be confident that you are providing a thorough and reliable service, which helps them with their regulatory reporting obligations. Naturally, those asset managers that can successfully demonstrate control over their processes and are investing in servicing their clients will be favoured by the insurance industry.

This approach also provides a means to execute on broader data governance strategies, and delivers the strong foundation you need to be able to respond quickly and more easily to future regulatory requirements.

What does a process-led, systematic approach entail?

So how can an asset manager put in place the right longer-term processes to help themselves and their clients? The following are key data management process capabilities to help with meeting Solvency II and other regulatory requirements:

  • Data Visibility: You need to have visibility across all points of the data architecture – from raw data sources to data hubs to warehouses to consuming applications and more – without the need to physically move or transform that data. This can be done with advanced metadata modelling.
  • Workflow: Any effective governance process needs high levels of process control and workflow, enabling ownership of issues, escalation and resolution according to the data governance procedures. Event-driven workflow (“if that happens, do this”) provides even greater control over your data, and workflows give you scalability for current and future data quality projects.
  • Data Quality: With the visibility and workflow capabilities in place, setting up the checks and balances (for example, testing for completeness and accuracy, and comparing data across all sources and endpoints of the asset and reference data) becomes a relatively easy step. Exceptions all feed into a logical workflow owned and controlled by the business users, ensuring you reach the right level of data quality.
  • Audit Trail, Management Info and Business Intelligence: Tie these capabilities together with a full audit trail and the MI and BI capabilities to generate the reports and analysis, and you will be able to demonstrate the rigour in your data management processes to both your insurance clients and the regulators.
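To make the Data Quality point above concrete, the sketch below shows one common pattern: a set of named rules, each testing a record for completeness or cross-source consistency, whose failures become exceptions for the workflow to own. The rule names, record fields and the five per cent tolerance are illustrative assumptions only.

```python
def check_completeness(record):
    """Flag records missing fields that Solvency II reporting typically
    needs (the exact field list here is illustrative)."""
    missing = [f for f in ("isin", "cic", "nace", "rating") if not record.get(f)]
    return f"missing fields: {missing}" if missing else None

def check_price_tolerance(record, tolerance=0.05):
    """Compare prices from two sources and flag divergence beyond tolerance."""
    a, b = record.get("price_source_a"), record.get("price_source_b")
    if a is None or b is None:
        return None  # absence is the completeness rule's concern
    if abs(a - b) / max(abs(a), abs(b)) > tolerance:
        return f"price mismatch: {a} vs {b}"
    return None

RULES = [check_completeness, check_price_tolerance]

def run_checks(records):
    """Run every rule against every record; each failure becomes an
    exception tuple (record index, rule name, message) for the workflow."""
    exceptions = []
    for i, rec in enumerate(records):
        for rule in RULES:
            msg = rule(rec)
            if msg:
                exceptions.append((i, rule.__name__, msg))
    return exceptions

records = [
    {"isin": "X1", "cic": "11", "nace": "K", "rating": "AA",
     "price_source_a": 100.0, "price_source_b": 100.2},
    {"isin": "Y2", "cic": "", "nace": "K", "rating": "A",
     "price_source_a": 100.0, "price_source_b": 120.0},
]
print([name for _, name, _ in run_checks(records)])
# -> ['check_completeness', 'check_price_tolerance']
```

Because every exception carries the rule that raised it, the output feeds naturally into the ownership, escalation and audit-trail capabilities described above.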

Does your data quality management process shape up? Whatever data management platform you have in place, you need to make sure that you have the tools to address these key factors in order to better manage your ongoing client requirements around Solvency II and other regulations.

To read more about Curium Data Systems and its commentary on this and other data management challenges within financial services companies, click here.

By Andrew Sexton, Sales Director, Curium Data Systems.
