Data volumes at financial firms are growing by more than 40% a year – at that rate, a firm's data more than doubles every two and a half years. What other valuable asset does a firm have that grows so quickly?
Data is the lifeblood of a firm: it must flow seamlessly through its products and lines of business, not only to support growth strategies but also to meet ever-expanding regulatory mandates. All too often, financial firms respond to increasing data volumes by creating more silos across business units. The resulting fragmentation makes it a major struggle to aggregate, enrich, analyse and validate vast amounts of information in order to report overall risk exposure and financial positions accurately.
The drive to improve data governance has set the stage for an expanded strategic role for chief data officers (CDOs). CDOs, who increasingly report to the CEO rather than the CIO, bring direction to data management strategies for bank data systems – whether for reporting to regulatory authorities or for providing analysis and support for trading, investment, risk management and loan decisions. Despite the importance and abundance of data, financial institutions still suffer from incomplete data and operational challenges around big data integration. Regulators expect comprehensive, granular reporting, signed off at board level. This requires both accurate reports and high-quality data, supported by effective data governance. Failure to meet these requirements, or to prove that controls are in place to identify discrepancies, can lead to fines and higher capital requirements that push up firms' costs.
Financial firms are now undertaking significant technical work and IT infrastructure investment to address the magnitude of the problems they face in risk data aggregation, data lineage and data governance. The importance of this work is underlined by the Basel Committee in BCBS 239 (Principles for Effective Risk Data Aggregation and Risk Reporting), which sets out criteria to ensure that aggregated risk data is accurate, comprehensive, consistent and reliable. By framing these overarching principles, BCBS 239 obliges banks to develop genuine data governance capabilities rather than merely hit a compliance date.
To address these requirements and enhance information oversight, financial reporting and analysis, firms should establish integrated data taxonomies and architecture across the enterprise, including information on the characteristics of the data (metadata) as well as unified naming conventions. AxiomSL's platform can create common translation tables to standardise clients' metadata (e.g. reference data for legal entities, counterparties, products, etc.), which may be dispersed across the bank. This integrated approach also consolidates financial, risk and reference data while managing large volumes of data. Robust relationship models can then be built for aggregated reporting results, with validation, reconciliation and variance reports to meet accuracy requirements for both normal and stress/crisis reporting.
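The idea of a common translation table can be illustrated in a few lines of code. The sketch below is purely illustrative – the system names, entity codes and function are hypothetical, not AxiomSL's actual API – but it shows the principle: each silo's local legal-entity code is mapped to one canonical identifier, so exposures from different business units can be aggregated, while unmapped codes are flagged as data-quality exceptions.

```python
from collections import defaultdict

# Hypothetical translation table: (source_system, local_code) -> canonical entity ID
TRANSLATION_TABLE = {
    ("trading", "ACME-UK"):   "LEI-549300ACME000000001",
    ("lending", "0047-ACME"): "LEI-549300ACME000000001",
    ("trading", "GLOBX"):     "LEI-549300GLOBX00000002",
}

def standardise(records):
    """Replace each record's local entity code with the canonical identifier,
    flagging records whose code has no mapping (a data-quality exception)."""
    standardised, exceptions = [], []
    for rec in records:
        canonical = TRANSLATION_TABLE.get((rec["source"], rec["entity_code"]))
        if canonical is None:
            exceptions.append(rec)   # route to data-governance review
        else:
            standardised.append({**rec, "entity_code": canonical})
    return standardised, exceptions

records = [
    {"source": "trading", "entity_code": "ACME-UK",   "exposure": 1_200_000},
    {"source": "lending", "entity_code": "0047-ACME", "exposure": 800_000},
    {"source": "lending", "entity_code": "UNKNOWN-1", "exposure": 50_000},
]
clean, flagged = standardise(records)

# Once codes are standardised, exposure aggregates cleanly across silos.
totals = defaultdict(int)
for rec in clean:
    totals[rec["entity_code"]] += rec["exposure"]
```

Here the same legal entity, booked under two different local codes, rolls up to a single exposure figure, while the unknown code is held out for review rather than silently dropped.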
Regulators now have access to so much data that they can more easily identify inconsistencies or a lack of data lineage information, including aggregation and transformation logic. As a result, an unstructured approach will, at best, be an inefficient way to meet regulatory requirements and, at worst, may not be enough to secure compliance. To avoid a siloed response, financial firms need to build a data governance environment capable of adapting to new and evolving regulatory requirements, including FDSF, CCAR, other stress testing regulations and the Basel liquidity rules. That environment should also provide data granularity, risk control standards and transparency across the management of the data lifecycle, as well as the validation and drill-down process.
AxiomSL’s high-performance platform enables financial firms to consolidate modern and legacy technologies by enriching data taxonomies and metadata across an organisation’s entire technology infrastructure. The enriched data serves as the foundation for aggregation, reporting and analytics, while retaining the ability to drill down and trace data back to its original source. Maintenance of the enriched data is also flexible and transparent, since AxiomSL retains the identity of each source, easing the transition to support ongoing infrastructure initiatives. Further, this strategic platform offers an automated or on-demand workflow capability that allows users to execute processes to retrieve data, perform calculations and create reports for any attributes being aggregated.
Ultimately, the absence of data quality and data governance results in slower response times, higher costs, loss of trust, increased regulatory scrutiny and missed business opportunities. Faced with these growing data challenges, major financial institutions around the world are now focusing on data lineage, data quality, integration and metadata to establish a solid foundation for data and process governance. Financial firms need an integrated technology platform that provides complete data lineage: one that shows and reports how all upstream and downstream objects have travelled through the process and how they are affected by any change to the data, with an audit trail and validation checks at the business-rule or report level.
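What complete data lineage means in practice can be sketched simply. The example below is an assumption-laden illustration, not a vendor implementation: it carries an audit trail alongside each derived value, so a reported figure can be traced back through every transformation to its source records – the drill-down capability described above.

```python
from dataclasses import dataclass, field

@dataclass
class LineageValue:
    """A value that carries its source identity and transformation history."""
    value: float
    source: str                                 # originating system/record
    steps: list = field(default_factory=list)   # audit trail of transformations

    def apply(self, description, fn):
        """Apply a transformation and record it in the audit trail."""
        new = fn(self.value)
        return LineageValue(new, self.source,
                            self.steps + [f"{description}: {self.value} -> {new}"])

# Two exposures from different (hypothetical) upstream systems.
a = LineageValue(100.0, "trading_db/trade_42")
b = LineageValue(250.0, "loan_system/facility_7")

# Enrichment step: convert to reporting currency (assumed FX rate).
a_usd = a.apply("fx GBP->USD @1.25", lambda v: v * 1.25)

# Aggregation preserves the lineage of both inputs.
total = LineageValue(a_usd.value + b.value, "aggregation",
                     a_usd.steps + b.steps + ["sum of trade_42 + facility_7"])

# Drill-down: the audit trail shows how the reported figure was produced.
for step in total.steps:
    print(step)
```

A production lineage system would of course track this in metadata stores rather than in-memory objects, but the design choice is the same: every derived number keeps a machine-readable record of where it came from and what was done to it.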
By Alex Tsigutkin, CEO, AxiomSL