In the wake of the financial crisis, and particularly in Europe, where the second Markets in Financial Instruments Directive (Mifid II) aimed to encourage transparency in the markets, many financial institutions were caught under wider, more stringent banking regulations. Without the necessary infrastructure to handle the new compliance and reporting rules, many firms struggled and found their resources stretched.
Those firms caught by the new rules – including the larger, more established banks and trading houses, as well as the commodities and energy firms required to file Mifid II reports – have had to work tirelessly, as the directive has taken shape, to fall into line with the rules.
However, AxiomSL CEO Alex Tsigutkin says the new regulations have in many ways been a blessing for firms that had previously overlooked their data management processes. The new regulations, he says, put pressure on firms to reassess those systems and work out new and innovative ways to process the data at hand, while considering what they needed to do to comply with the swathes of new rules.
“With Mifid II, data quality issues have surfaced because it requires granular, transaction-level reporting, so firms realised as they tried to comply with transaction-level reporting that their data quality suffers,” he says. “[Mifid II] has in a way helped to flush out problems, because previously the problems were hidden – data was aggregated incorrectly, perhaps, or aggregated correctly, but if the underlying data quality was questionable then the aggregations would be wrong.
“With Mifid II, and the types of reporting that are required across other jurisdictions, people in the industry have learnt they cannot provide a holistic solution to the problem,” he says.
Therefore, as regulatory compliance has become more of an issue, firms have inspected their data management systems and realised that by improving those systems they can operate more competitively.
“Data controls and data management capabilities have become more important. That’s not just on the regulatory front, it’s also become more important for how firms attempt to keep competitive in financial markets.”
New data requirements have emerged in a range of jurisdictions across the world. In the US, FDIC 370 is putting pressure on large depository institutions to submit reports to the regulators with analysis of deposits and depositors. In Singapore, MAS610 requires thousands of data points to be aggregated and risk exposures to be presented to the regulator regularly, demanding sophisticated taxonomies and technological investment. In Australia, new data requirements for financial institutions are putting pressure on the industry, and of course the General Data Protection Regulation (GDPR) is set to test firms in many different industries on their data management capabilities.
“The regulatory burden on firms around their data management systems just keeps accelerating,” says Tsigutkin.
For international firms operating across jurisdictions, the fact that each regulator has outlined different requirements complicates matters.
“It’s different across different tiers of the organisation, and different enterprises,” says Tsigutkin. “Everyone has a different requirement and different data points at granular levels.”
“But sometimes smaller and medium-sized organisations are under greater pressure because, of course, they don’t have the same resources as tier 1 institutions to deal with these issues internally,” he says.
Last year, AxiomSL conducted a survey on how far firms trust their data. Astonishingly, 69% of Asia-Pacific respondents said they had grown more concerned in 2017 than in 2016 about their organisation’s ability to comply with regulations; in North America, 28% said the same.