A stronger focus by regulators on data management and the replication of risk calculations is increasing the strain on risk management departments, claims Xenomorph. Regulators are becoming more concerned with the quality of data going into risk management systems, encompassing data validation processes, the replication of past risk calculations and an audit trail on all inputs that contribute to an institution's risk reports.
Given the volumes of raw data involved, this "data on data" is adding to the data management challenge, particularly when dealing with the storage and management of real-time prices and rates. Xenomorph has recently seen clients' requirements change from simply an audit trail on static data and end-of-day prices, to encompass all kinds of data, from high-volume market ticks to more complex data such as volatility surfaces.
Moving beyond raw data, regulators want institutions to control and reproduce calculation, model and process outputs. For instance, if a piece of market data is altered due to a change in a data cleansing process, not only must the data change itself be stored, but so too must the change in process that caused the correction. On the pricing model and risk calculation front, the outputs themselves are valuable data that need to be reproducible.
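The idea of auditing both the data change and the process that caused it can be sketched in code. This is a minimal illustration only, not a description of Xenomorph's product; all class and field names here are hypothetical. Each correction is stored as an immutable audit entry that records the old and new values together with the name and version of the cleansing rule that produced the change, so that any past value can be reproduced later.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass(frozen=True)
class AuditEntry:
    """One immutable audit record: the data change and the process that caused it."""
    instrument: str
    old_value: float
    new_value: float
    process_name: str     # hypothetical: the cleansing rule that made the correction
    process_version: str  # the rule's version, audited alongside the data itself
    timestamp: datetime

@dataclass
class AuditedPrice:
    """A price whose full correction history is retained, not overwritten."""
    instrument: str
    value: float
    history: List[AuditEntry] = field(default_factory=list)

    def correct(self, new_value: float, process_name: str, process_version: str) -> None:
        """Apply a correction, recording both the data change and its cause."""
        self.history.append(AuditEntry(
            instrument=self.instrument,
            old_value=self.value,
            new_value=new_value,
            process_name=process_name,
            process_version=process_version,
            timestamp=datetime.now(timezone.utc),
        ))
        self.value = new_value

    def value_after(self, n_corrections: int) -> float:
        """Reproduce the value as it stood after the first n corrections."""
        if n_corrections == 0:
            return self.history[0].old_value if self.history else self.value
        return self.history[n_corrections - 1].new_value

# Example: a spike filter (version "v2.1", names illustrative) corrects a price.
price = AuditedPrice("XYZ Corp", 101.5)
price.correct(100.2, "spike_filter", "v2.1")
```

After the correction, `price.value` is the cleansed figure, while `price.value_after(0)` still reproduces the original 101.5 and the audit entry records which rule version changed it.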
Brian Sentance, CEO of Xenomorph, said: "It seems as though regulators are thinking 'If it moves, audit it!'. This desire to control models, calculations, processes and data requires a change in mindset from both market practitioners and vendors over precisely what constitutes 'data' in the field of data management. This is particularly the case as trading, risk management and processing become increasingly automated and real-time, putting increasing pressure on institutions to automate and move away from manual data validation."