Why collateral management has become the new buzzword
22 September 2011
Collateral sufficiency is a well-established risk management process, but it has become a growing concern for the financial community in recent years. Driven by regulatory initiatives, as well as the increasing complexity arising from the large and ever-growing volumes of collateralised deals, banks need a clear and immediate view of collateral availability to reduce counterparty risk and support their funding decisions.
Regulatory mandates, such as the European Market Infrastructure Regulation (EMIR) and its equivalent over the pond, the Dodd-Frank Wall Street Reform and Consumer Protection Act, have transformed this classic risk management process. Collateral management has typically been viewed as a compliance- and operations-based post-trade control process, reviewed operationally on a daily basis and, from a sufficiency perspective, monthly or even quarterly. Now, however, it is a process that needs to be brought into pre-trade execution decision-making across the organisation.
As a result of these changes, risk managers in investment banks must have a full and detailed view of their collateral portfolio, with up-to-date information on what is pledged and what is available. They also need insight into how that portfolio will evolve over time. In addition, it is important to understand how the portfolio will react under stress tests and other scenarios, to better anticipate and optimise pledging.
The regulatory overhaul
These factors mean that there is now a strong impetus for financial institutions to improve their collateral management. In the UK, the much-anticipated final report by Sir John Vickers has recently been published and, as expected, recommends the ring-fencing of banks. The report states that where ring-fenced banks settle payments on behalf of other banks or financial companies, prudential limits should be imposed on the resulting exposures to those companies, including collateral requirements and limits on the size of potential exposures (including intraday and overnight, secured and unsecured). What’s more, in the US, recent proposals for collateralisation under the new central counterparty (CCP) clearing model, rather than the incumbent credit support annexes (CSAs), mean that margins not only have to be posted on a dollar-for-dollar basis with the CCP through a clearing member-owned futures commission merchant (FCM), but also have to be marked to market daily, with the margin impact calculated on a pre-trade basis.
For the majority of financial institutions, existing and inherently batch-based collateral engines will struggle to meet these proposals for collateralisation. As it stands, banks are unable to calculate their risk exposure in a timely manner: doing so involves projecting thousands of scenarios and then aggregating them in a non-linear fashion. This means banks will have to design scenarios that cover the broad range of trade types, build the calculation capacity to run those scenarios, and then be in a position to undertake agile analysis of the resulting data sets. The move toward assessing collateral or margin requirements pre-trade is presenting a new set of challenges for banks.
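To see why this aggregation is non-linear, consider a toy sketch: netting trade values within a scenario is a simple sum, but credit exposure only arises when the net value is positive, so scenarios cannot be averaged trade-by-trade; each must be evaluated in full before aggregating. The scenario model, trade counts and the 95th-percentile metric below are illustrative assumptions, not any prescribed regulatory calculation.

```python
import random

random.seed(7)
N_SCENARIOS = 10_000   # thousands of projected market scenarios
N_TRADES = 200         # trades facing a single counterparty

# Simulate per-trade value changes under each scenario (toy model).
scenarios = [[random.gauss(0.0, 1000.0) for _ in range(N_TRADES)]
             for _ in range(N_SCENARIOS)]

# Netting is linear, but exposure is not: only a positive net value
# creates credit exposure, so the max(., 0) floor must be applied
# per scenario BEFORE aggregating across scenarios.
exposures = sorted(max(sum(s), 0.0) for s in scenarios)

pfe_95 = exposures[int(0.95 * N_SCENARIOS)]        # 95th-percentile PFE
expected_exposure = sum(exposures) / N_SCENARIOS   # average exposure
```

Running thousands of such scenarios for every counterparty, on every market move, is exactly the brute-force workload that batch engines struggle to deliver in a pre-trade time frame.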
In order to gain a clear view of collateral availability in real time, banks will need to enhance their existing technology frameworks. One way to do this is by combining continuous Complex Event Processing (CEP) with incremental Online Analytical Processing (OLAP). As the complexity of collateral calculations evolves towards a more scenario-based approach, the ability to incrementally assess the impact of each new trade and merge it into the existing counterparty exposure is fundamental; the classic brute-force approach simply takes too much computing time. Incremental OLAP engines have been designed to handle just such requirements. When combined with CEP capabilities, traders and risk managers are notified immediately when market changes move counterparty exposures beyond pre-defined thresholds. The result is not only an instantaneous pre-trade exposure impact, but continuous monitoring of the overall portfolio as rates move, even when no new transaction is made.
Yet despite the dire need for investment in collateral management, many financial institutions remain hamstrung and reluctant to invest in new technology at present. The industry remains under the pressures that have characterised the last few years, namely squeezed budgets and ongoing uncertainty. As a result, industry regulation is typically approached as a tick-box exercise rather than as a strategic project that can deliver business value, and collateral management is no exception to this rule.
While financial return will continue to be an important factor in any company’s decision-making process, the nature of the financial services industry means that financial institutions need to adapt and invest to remain competitive. New, innovative banking players coming onto the scene along with new directives mean that banks cannot afford to rest on their laurels.
Regulatory clampdown is likely to continue over the next twelve months, placing increasing scrutiny on banks’ ability to understand their collateral position in real time. There is no doubt that risk exposure will remain a key priority for banks in the short, medium and long term. If implemented and deployed appropriately, technology can assist with collateral management, avoiding duplication of analysis and helping risk managers make well-informed decisions quickly to support the business. The ability to analyse pre-trade impact in real time means banks can see the effect of a trade across their entire portfolio, thereby potentially reducing credit exposure. Not only does this reduce risk, it also means banks can price more aggressively, which will ultimately drive up profits - something that we know all banks have an interest in.