Global Derivatives Trade Reporting is Hard! What Firms are Doing to Get it Right

By Brian Lynch | 5 November 2014

'Major Bank Fined $** Million for Inaccurate and Incomplete Trade Reporting.' How long will it be before we see that headline describing a CFTC (Dodd-Frank) or ESMA (EMIR) regulatory action? A recent £4.7 million fine levied by the FCA for MiFID-related trade misreporting shows the regulators are serious. Headlines like this highlight the real costs of failing to get trade reporting data right.

Regulators know that the terabytes of data they receive through their new derivatives reporting processes do not always meet the standards of timeliness, accuracy and completeness they expected when they drafted their regulations. With trade matching rates as low as 3% in Europe, the cat is out of the bag and pressure for improved data quality is mounting.

Why is accurate derivatives trade reporting hard to achieve? If you have not examined the FpML specification for a simple interest rate swap, or reviewed the Core CSV specification for commodities swap trades, you would be forgiven for thinking 'this shouldn't be hard.' There are no analytics, matching, allocations or computations involved. It is a 'simple' case of taking the data in the firm's internal trade repository and sending it, in a market-standard format, to one or more global trade repositories such as CME, DTCC, UnaVista, ICE or Bloomberg.

Unfortunately, a number of the assumptions in that statement simply don't hold. For example:

  1. Few (if any) top-tier market participants have a single internal trade repository they can rely on for regulatory reporting.
  2. To meet the gamut of global regulations it is events, not trades, that firms must report, sometimes in real time.
  3. Reporting parties may choose to outsource some of their reporting obligations to other parties in the trade flow.
  4. Few trade repositories offer a true global trade repository spanning all asset classes. Where they do, it is usually the product of multiple regional, and sometimes asset-class-level, implementations, each of which may behave differently.
  5. Lastly, which 'data standards' are we talking about? FpML, CSV, FIXML, anotherML?

Trade Reporting indirectly exposes a reporting firm’s processes and infrastructure to the regulators.  Many reporting firms have invested in a reporting gateway or hub that centralises reporting to the trade repository.  But even with a central gateway, the data is ultimately fed by a multitude of disparate trade capture systems.

Data quality as experienced by the regulators is the result of numerous ‘front to back’ data flows and multiple mappings and transformations.

Data Quality Issues

The structural issues described above already complicate a reporting party's ability to get the data right. Low matching rates are an indicator of the real problems reporting firms are facing. Firms have been struggling with the quality of trade bookings for many years. Regulators have recently added to the pain by introducing several new fields: the LEI (Legal Entity Identifier), Product Taxonomy (Product, Sub-Product) and the UTI/USI (Unique Trade/Swap Identifier):

LEIs are mandated by ESMA. However, not all trading parties have LEIs, and market participants cannot force their clients to obtain one. Trade repositories do not take responsibility for validating the LEI for either content or form. LEIs appear high on the 'mismatch' culprit list.
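
Firms can at least police the form of an LEI themselves before submission. A minimal sketch in Python, assuming the ISO 17442 structure (20 alphanumeric characters ending in two check digits verified MOD 97-10, as with IBANs):

```python
import re

def lei_format_ok(lei: str) -> bool:
    """Format-only check: 18 alphanumeric characters plus two check digits,
    verified with the MOD 97-10 checksum of ISO 17442. Says nothing about
    whether the LEI is registered or belongs to the right entity."""
    if not re.fullmatch(r"[A-Z0-9]{18}\d{2}", lei):
        return False
    # Map letters to 10..35 (int(c, 36) handles digits too), then check MOD 97-10
    numeric = "".join(str(int(c, 36)) for c in lei)
    return int(numeric) % 97 == 1
```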

Product Taxonomy requires that every OTC trade be allocated a clear product definition. No doubt standardisation has made this easier. However, the flexibility of the various OTC models and the myriad trade booking options that exist mean that putting a trade into a single bucket can be quite subjective.

UTI/USI: The field name differs across jurisdictions, and the field appears in different places in the CSV and FpML specifications. The rules for who creates the UTI vary with the execution, clearing and jurisdiction of the trade. The timing of the trade lifecycle, and the need to create records for risk reporting before a USI exists, has made this one of the most debated topics in the trade reporting space.

These are all new complications layered on top of historical challenges. As someone who supported Credit Default Swap trade bookings at a time when achieving a T+3 confirmation standard was considered a high bar, I think the industry has come a long way to get to where it is today.

So how are organisations meeting this challenge and what are the best practices that should be observed?

Embrace Standards

FpML is the best standard the industry has when it comes to swaps reporting. It has been around since the first draft was announced in 1999 and is now mature, with v5.7 recently published. FpML offers clients a robust messaging standard for communicating trades 'on the wire' in a structured format. Many participants have embraced FpML as their internal model for messaging and, in some cases, for the persistence of trades.

There are 'good' and 'bad' practices associated with the use of FpML. Because it is an external standard, most clients find they need to extend FpML for internal use. This is not a problem in itself, since it is XML (eXtensible Markup Language) and designed to be extended. However, there are common mistakes made when extending FpML that should be avoided. (I won't discuss them here; talk to an expert!)

One issue we see with the industry's use of FpML is over-dependence on schema validation as the only measure of the quality of an FpML message. Schema validation goes without saying and, provided the TR being reported to doesn't break the standard itself, it should be applied to every message.
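
As a baseline, a minimal schema-validation sketch in Python using lxml (the XSD filename is illustrative; point it at the one shipped with your FpML distribution):

```python
from lxml import etree

# Path is illustrative: use the XSD from your FpML 5.7 distribution
FPML_SCHEMA = etree.XMLSchema(etree.parse("fpml-main-5-7.xsd"))

def schema_valid(message_xml: bytes) -> bool:
    """Gate one: the message must parse and conform to the FpML schema."""
    try:
        doc = etree.fromstring(message_xml)
    except etree.XMLSyntaxError:
        return False
    return FPML_SCHEMA.validate(doc)
```

However, there are two further levels of validation required to ensure the quality of your FpML trade message: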

Scheme Validation: (Not to be confused with Schema validation.) The FpML standard allows certain data fields to be further defined by 'schemes', soft enumeration lists that describe the allowable values for a particular field and the perspective of the field itself, e.g. whose trade ID is this, mine or my trading party's? That is defined by the partyId scheme. Market participants can download or reference the ISDA-published schemes, or create their own, to ensure their content matches an allowable set of values.
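
A sketch of what a scheme-level check can look like. The namespace is the FpML 5 confirmation view; the allow-list here is illustrative rather than a definitive set of scheme URIs:

```python
from lxml import etree

NS = {"f": "http://www.fpml.org/FpML-5/confirmation"}

# Illustrative allow-list; in practice, load the ISDA-published scheme values
ALLOWED_PARTY_ID_SCHEMES = {
    "http://www.fpml.org/coding-scheme/external/iso17442",  # LEI-based party IDs
}

def scheme_problems(doc):
    """Gate two: coding-scheme URIs on key fields must match what we expect."""
    problems = []
    for pid in doc.findall(".//f:party/f:partyId", NS):
        scheme = pid.get("partyIdScheme", "")
        if scheme not in ALLOWED_PARTY_ID_SCHEMES:
            problems.append("unexpected partyIdScheme: " + scheme)
    return problems
```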

Semantic Validation: ISDA has defined a comprehensive set of rules that separate 'good' FpML trade messages from 'bad' ones. A number of vendor applications implement these validation rules and allow clients to truly validate the robustness of their trade messages (see Handcoded.com or http://validate.trade).
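
To give a flavour of the kind of rule involved, a simplified sketch over an assumed internal trade dict; these are illustrative cross-field rules in the spirit of the ISDA rule set, not the published rules themselves:

```python
def semantic_problems(swap: dict) -> list:
    """Gate three: cross-field business rules (simplified illustrations only)."""
    problems = []
    if swap["effectiveDate"] >= swap["terminationDate"]:
        problems.append("effectiveDate must precede terminationDate")
    for leg in swap["legs"]:
        if leg["notional"] <= 0:
            problems.append("leg notional must be positive")
        if leg["payerPartyReference"] == leg["receiverPartyReference"]:
            problems.append("payer and receiver must reference different parties")
    return problems
```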

Automate, Automate, Automate

Trade Reporting is the ideal candidate for no-touch test automation (regression testing at a minimum). Depending on asset class and jurisdictional coverage, the number of distinct trade permutations is high. However, the results are deterministic and the inputs can be scripted. An upfront investment in automated testing infrastructure, with broad coverage of test inputs versus expected results, will yield a return on investment in the form of lower costs, improved quality and reduced risk.
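
One way to structure this, sketched with pytest and 'golden' files; `generate_report` stands in for whatever entry point your reporting pipeline exposes, and the directory layout is an assumption:

```python
import pathlib
import pytest

from reporting import generate_report  # hypothetical hook into your pipeline

CASES = sorted(p.parent for p in pathlib.Path("testcases").glob("*/input.json"))

@pytest.mark.parametrize("case", CASES, ids=lambda p: p.name)
def test_output_matches_golden(case):
    """Deterministic inputs -> deterministic FpML: every supported trade
    permutation is compared against a checked-in expected message."""
    produced = generate_report((case / "input.json").read_text())
    assert produced == (case / "expected.xml").read_text()
```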

Software delivery processes are improving. Continuous integration and build tools allow automated nightly software builds and unit tests to execute before the development team comes in each morning. This facilitates more agile development and rapid identification of data, coding or build issues.

One obstacle to 'no touch' automation is the availability of a robust test environment against which to run tests. Testing participants need a permanent and reliable test environment. Some firms are investing in internal test harnesses that act as a proxy for the trade repository of their choice. This approach gives firms flexibility over the timing and scale of the testing they run, and allows them to configure which environment they test against (a toy sketch of such a stub follows).
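
A toy version of the idea using only the standard library. Real TRs use their own transports and ACK formats, so treat this purely as a shape to build on:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class TradeRepositoryStub(BaseHTTPRequestHandler):
    """Stands in for the TR in tests: answer every submission with a canned
    ACK or NACK so test runs never depend on a shared UAT environment."""

    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        ok = body.lstrip().startswith(b"<")  # naive stand-in for real validation
        self.send_response(200 if ok else 400)
        self.send_header("Content-Type", "application/xml")
        self.end_headers()
        self.wfile.write(b"<ack/>" if ok else b"<nack/>")

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), TradeRepositoryStub).serve_forever()
```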

If the trade repository offers a UAT environment, it will generally reflect the 'next' TR release and may include features, or bugs, that are incompatible with today's production reporting. To do valid regression testing, reporting firms should test against a harness that reflects the environment their next release will actually 'go live' into.

Visualise the Data

Data visualisation is one of the industry's hot buttons right now. Firms can leverage exciting new visualisation tools and techniques to rapidly identify and diagnose errors and anomalies. Tableau and QlikView are two commonly discussed BI tools, and these tools and techniques can be applied to any data.

NoSQL (schemaless) technologies are a cost-effective alternative to the overhead of designing and maintaining a structured relational database to capture non-transactional reporting output from real-time systems. Options such as Splunk or the open-source ELK stack are a great way to rapidly add a comprehensive reporting and monitoring capability. By capturing test and production data output in Splunk, it is possible to put together rich management reports and metrics that articulate the overall quality and stability of a trade reporting system.
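
A sketch of the capture side, assuming one JSON event per reporting action (the field names are invented for illustration). Splunk and the ELK stack can index output like this with no up-front schema design:

```python
import json
import logging
import time

log = logging.getLogger("trade_reporting")
logging.basicConfig(filename="trade_reporting.log", level=logging.INFO,
                    format="%(message)s")

def emit_event(event_type: str, **fields):
    """Write one self-describing JSON event per reporting action."""
    log.info(json.dumps({"ts": time.time(), "event": event_type, **fields}))

# e.g. called from the submission path:
emit_event("tr_submission", asset_class="IR", product="InterestRateSwap",
           gateway="ldn-gw-02", status="ACK", latency_ms=184)
```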

With a monitoring solution able to consume 'any' system-generated data, firms can easily integrate and correlate infrastructure and process-flow output with the underlying business data. For example:

 - How many IR swap Confirm events were processed yesterday, and through which server or gateway?

 - How many CDS trades failed to meet CFTC timing requirements, and where in the system was the delay (messaging, transformation, mapping, etc.)?

The right transparency solution allows reporting parties to build both current and historical reports. Historical analysis using graphical visualisation tools lets reporting participants quickly identify performance issues and manage the scalability of their platforms. The right monitoring technology allows a firm to see immediately whether a new build improves or degrades performance in terms of timing and data quality, so support teams can quickly identify an offending software release and roll back the change.

Rich, real-time transparency is critical to the stability, supportability and auditability of any enterprise trade reporting system.

Production Data Quality Assurance

A common approach to production data quality monitoring is schema validation of FpML messages coupled with reliance on the trade repository to ACK, NACK or WACK the trade message. Unfortunately, firms using this approach are not taking direct control of the risks associated with their reporting data.

All Trade Repositories have implemented levels of validation into their solutions, but there are gaps, and the final responsibility for the quality of trade reporting lies with participants, not the TR.

Reliance on the Trade Repository for data quality validation creates a number of potential risks:

 - The TR's validation fails (bugs or feature gaps), producing false positives: messages accepted that should have been rejected.

 - The TR does not take responsibility for rich, semantic trade validation.

 - The TR does not have access to internal reference data, e.g. LEI, UTI prefix and format.

 - Participants' failure rates (NACKs) are exposed to the regulators, inviting unwanted attention.

 - TR error codes may be incomplete or unclear, slowing issue resolution.

Given these risks, how can firms achieve additional protection and confidence in their data?

Many participants have already taken advantage of the FpML validation tools mentioned earlier.  Applying comprehensive FpML validation is a great start, ensuring the trade message is structurally, syntactically and semantically correct.

There are additional layers of validation that can be applied to Production Data:

- Trade Repository specification rules - apply validation rules that replicate the TR's 'next version' specification, staying ahead of bug fixes and updates.

  • This allows reporting parties to test new requirements, by including the latest specifications in their automated testing processes, before the TR's UAT environment is even made available.

- Additional Regulatory rules - some regulatory rules are not yet captured in the TR specifications, or are not yet mandatory but have been signalled by regulators. For example:

  • Trades subject to mandatory clearing require a clearing party or an exception – CFTC
  • The UTI must be prefixed with 0000, E01, E02 or E03 – ESMA

- Additional 'Sense and Sensibility' checks (a combined sketch of these and the regulatory rules above follows this list):

  • Oversized or undersized notionals (large or small thresholds)
  • Duplicate trade checks
  • Internal reference data checks – LEI, account number, book, trader name, etc.
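
A combined sketch of these checks; the notional bounds, field names and duplicate-tracking approach are illustrative assumptions, and the UTI prefixes mirror the ESMA example above:

```python
SEEN_UTIS: set = set()

def production_check_problems(trade: dict) -> list:
    """Extra-layer checks run over outbound reports; thresholds and field
    names are assumptions, not a definitive rule set."""
    problems = []
    uti = trade.get("uti", "")
    if not uti.startswith(("0000", "E01", "E02", "E03")):
        problems.append("unrecognised UTI prefix: " + uti[:4])
    if uti in SEEN_UTIS:
        problems.append("possible duplicate trade: " + uti)
    SEEN_UTIS.add(uti)
    notional = trade.get("notional", 0)
    if not 1_000 <= notional <= 10_000_000_000:  # illustrative sanity bounds
        problems.append("notional outside sanity bounds: %s" % notional)
    if trade.get("lei") and not lei_format_ok(trade["lei"]):  # LEI sketch above
        problems.append("counterparty LEI fails format check")
    return problems
```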

Applying and monitoring a comprehensive set of reporting validations mitigates the risk of misreporting.  Coupled with the transparency and visualisation tools described above, these additional data quality validations deliver a robust platform for management reporting, audit and regulatory compliance.

These additional data quality checks can be run asynchronously, without interrupting the reporting process. This provides 'in flight' monitoring of production data quality issues, which can then be resolved through upstream improvements. Only a sufficiently critical data issue should warrant direct interference with regulatory reporting obligations.
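
A minimal shape for that asynchronous hand-off, with `send_to_tr` and `run_all_checks` as hypothetical hooks into the pieces sketched earlier:

```python
import queue
import threading

check_queue: queue.Queue = queue.Queue()

def submit(message: bytes):
    """Critical path: report to the TR immediately; quality checks run later."""
    send_to_tr(message)       # hypothetical transport call
    check_queue.put(message)  # non-blocking hand-off to the DQ worker

def dq_worker():
    """Off the critical path: run the full validation stack and route findings
    to monitoring (emit_event, sketched earlier) rather than halt submission."""
    while True:
        msg = check_queue.get()
        for problem in run_all_checks(msg):  # hypothetical validation stack
            emit_event("dq_finding", detail=problem)
        check_queue.task_done()

threading.Thread(target=dq_worker, daemon=True).start()
```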

Conclusion

Building and maintaining a robust global trade reporting platform that delivers timely, accurate and complete trade reporting across all asset classes and regulatory jurisdictions is not easy! Some firms are out of the woods with respect to the initial implementation, but the level of churn remains high. As regulators analyse the data they receive, they are making reporting changes and imposing new standards that must be coded to. Trade repositories will continue to add new features and products. Reporting firms will continue to evolve their internal trade processing and workflow solutions, requiring ongoing regression testing.

Organisations that wish to remain out of the headlines must invest in this area of the business process, apply best practices and take proactive responsibility for the quality of their trade reporting data.

Standards > Automation > Visualisation > Validation

By Brian Lynch, CEO, Risk Focus
