73% of respondents say they have re-evaluated, or plan to re-evaluate, data management processes due to the growing use of OTC derivatives
OTC derivatives are shaking up data management processes in financial services firms, according to an international survey by A-Team Group. The research, commissioned by GoldenSource Corporation, studied trends in the management of instrument data and found that almost three quarters of respondents had re-evaluated, or were planning to re-evaluate, their data management processes because of the growing use of OTC derivatives.
85% of respondents cited derivatives/options when asked to identify the top five most difficult data types to source and manage. Structured products - including mortgage-backed securities, collateralized mortgage obligations, loan-level detail for structured products, and collateralized debt obligations (CDOs) for U.S. and European markets - followed with 45% of the votes.
One senior data manager suggested that 50% of the firm's manual data effort went into sourcing and resolving issues in OTC products. Similarly, a respondent from a large US broker/dealer commented: "We have quality data - no question. But, it is the last 1-2% that drains our resources. Of that small percentage, new OTC instruments need a lot of the focus."
Other key findings of the report were:
* Applying standardized reference data policies enterprise-wide remains a leading challenge, cited by nearly 86% of respondents.
* Flexibility in data models and hierarchies for data selection is of growing importance in a competitive and increasingly complex environment. Although 75% of respondents felt that their firm had the flexibility to define instruments in a way that matches business requirements - for example, being able to combine the identifier with the place of settlement or place of trading - 25% of these stated that doing so still required some manual intervention.
* Data terminals are in decline - 60% of firms surveyed said that reliance on terminal-based services as a source of reference data will decrease, 20% thought it would stay the same, and just 10% thought it would increase.
* Industry-level collaboration on instrument data validation is desired - 89% of respondents thought there was a role for industry-level peer-to-peer validation of base-level instrument data. However, concerns were expressed over a lack of likely participants and incentives for institutions to share information.
"In our discussions with survey participants we found that financial institutions unanimously recognize the need to establish Enterprise Data Management, with standard data processes, practices, and controls to maintain firm-wide consistency, driven in particular by the requirements of risk management and regulatory compliance," says Maryann Houglet, Vice President of Strategic Consulting at A-Team Group. "However, they are still facing challenges in ensuring consistency and quality of the instrument data being consumed across the enterprise - particularly due to the growth in use of complex instruments such as OTC derivatives."
"Complexity in the markets driven by the boom in OTC derivatives is reopening age-old problems such as securities identification and data linkage. This research confirms that a flexible data management platform is key to ensuring business growth remains unimpeded," says Paul Kennedy, VP Product Management at GoldenSource Corporation.
The report, Static Instrument Data Turns Dynamic, is the first of three targeted EDM surveys of senior reference data managers and focuses on the challenges of managing instrument data. A-Team Group held structured discussions with more than 40 senior individuals involved in reference data management at financial services firms. Geographically, the survey sampled respondents from around the globe, in particular from the U.K., mainland Europe (with an emphasis on France, Germany, Switzerland, and Benelux), the U.S., and Canada, with the majority of respondents (78%) having global responsibility for data management.