Banks must get their technology platforms and processes in place before the end of 2019 in order to allow two years for data management testing and model approval prior to the Fundamental Review of the Trading Book (FRTB) implementation deadline, according to Eugene Stern, Bloomberg’s head of market risk product.
“To decide what they are going to do, to be able to calibrate the impact of the new rules in terms of the standardized approach/internal model approach balance, and to make decisions about where they could go for IMA approval and how to structure their desks, the first half of the year is really going to be the tight period to do those benchmarking exercises and start to make decisions,” says Stern.
FRTB was developed by the Basel Committee on Banking Supervision (BCBS) as part of Basel III. It is a set of capital requirements intended to be applied to banks’ wholesale trading activities to create a more resilient market.
The revised framework was published in January this year and will take effect as of January 1, 2022. According to a statement by the Bank for International Settlements (BIS), the revised framework “is estimated to result in a weighted average increase of about 22% in total market risk capital requirements relative to the Basel 2.5 framework.”
As the industry continues to understand the full implications of FRTB, banks need to get ready for the “hard” implementation dates that are fast approaching, according to Stern.
“In the first paragraph, in the first sentence, there was a hard date of January 1, 2022, and that is the Basel Committee really putting a stake in the ground and saying this has to be the date no matter what,” says Stern.
“The immediate next thing, because there is a tight implementation timeline, is that banks need to benchmark what their capital is likely to be under different approaches, so that they can make informed decisions about what they are going to do, working backwards from the January 1, 2022 deadline,” he says.
Technology vendors building FRTB solutions also face difficulties, as the final technical standards have not yet been published, according to Jouni Aaltonen, managing director in the prudential regulation division of the Association for Financial Markets in Europe (AFME).
“As long as there is no regional or national implementation, or law, in the EU or the US, it is still in flux, because a lot of the very minute detail for a lot of the elements is not in the Basel text and needs to be implemented into regional law. Before those are nailed down, it is really hard to have a complete and concrete solution that will allow banks to comply with the FRTB rules,” says Aaltonen.
While a number of initiatives may be underway, “there is no solution that can be baked into a plan,” says Aditya Oak, principal consultant at Brickendon.
“Despite all the regulations that have gone before, many organisations are having to use FRTB to lift market data capabilities to where they should have been a decade ago. Unless this is planned with a view to the full operational impact (and not just FRTB) we’ll find reconciliation projects filling much of 2022,” he says.
For Aaltonen, the key challenge for banks, and an area where fintechs may be able to assist, will be the alignment of banks’ front and back office systems.
“Where there are banks with lots of legacy systems and risk functions using different cuts of the data, or even different sources of the data than P&L evaluation functions in the front office, it will be a more complicated project for those banks, and they will need to invest more heavily into technology to get the data aligned between front and back office systems,” says Aaltonen.
However, the exercise to map risk factors to external, or even internal, price observations is going to remain challenging, according to Daniel Percy-Hughes, principal consultant at Synechron.
“[Vendors] can only really offer a partial solution which is data that provides information in a very standardized way and that will only be of benefit to firms if they are equally able to group or identify their own risk factors in a similarly standardized way,” says Percy-Hughes.
“It is the types of vendors that are already aggregating data en masse, so data repositories, clearing houses and exchanges, that should have a broad enough coverage of price observations that they can market something of benefit back to their clients. But I don’t see there really being room for a tech solution to sort FRTB,” he says.
“It perhaps is the case that the smaller tier-two firms will still look to the vendor to do more of the heavy lifting, or the resolution of observation to risk factor, than they might do themselves internally.”
Bloomberg’s Stern sympathizes.
“Some of the smaller institutions are going to have to put in place systems and data infrastructures, and get the underlying data onto those platforms, which they have never had to do before. Some of the larger institutions have data and infrastructure already, but now they are going to have to do a sizeable amount of work to make sure that it is all aligned across different asset classes, and across front and middle office, for things like the P&L attribution test. That will definitely require rethinking, and in many cases rebuilding, your data architecture, and it is really data across the board,” says Stern.
There may be additional problems in aligning structures, particularly regarding the time point international desks use to measure their profit and loss (P&L).
“You have challenges around the front office: if they have a desk in Hong Kong, it may use a completely different time point for measuring its P&L compared to the timestamp of the risk systems, which may use the UK close of business. Getting the front office system to also provide data cuts for the risk system’s close of business, or vice versa, is something that needs thinking through, getting the systems aligned, and the desk structures aligned, to best accommodate similarities within the trading activities,” says AFME’s Aaltonen.
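Aaltonen’s point about mismatched close-of-business cuts can be made concrete. The sketch below is illustrative only: the desk names, close times and configuration are hypothetical, not any bank’s actual setup. It computes how far each desk’s own P&L cut sits from the risk system’s cut; a non-zero lag means front-office and risk P&L are measured on different market states and must be re-cut before they can be compared.

```python
from datetime import datetime, time, date
from zoneinfo import ZoneInfo

# Hypothetical desk close-of-business times; real configurations vary by bank.
DESK_CLOSES = {
    "HK_RATES": (time(17, 0), ZoneInfo("Asia/Hong_Kong")),
    "LDN_CREDIT": (time(17, 30), ZoneInfo("Europe/London")),
}
# Assumed risk-system cut: UK close of business, per the example in the text.
RISK_CLOSE = (time(17, 30), ZoneInfo("Europe/London"))

def close_utc(trade_date, close):
    """UTC timestamp of a close-of-business cut on a given trade date."""
    local_time, tz = close
    local_dt = datetime.combine(trade_date, local_time, tzinfo=tz)
    return local_dt.astimezone(ZoneInfo("UTC"))

def cut_lag_hours(desk, trade_date):
    """Hours between a desk's own P&L cut and the risk system's cut.

    A non-zero lag flags data that needs an extra cut (front office
    producing a risk-close snapshot, or vice versa) before front and
    back office P&L can be aligned."""
    desk_cut = close_utc(trade_date, DESK_CLOSES[desk])
    risk_cut = close_utc(trade_date, RISK_CLOSE)
    return (risk_cut - desk_cut).total_seconds() / 3600.0
```

On a January trade date, for example, the hypothetical Hong Kong desk’s 17:00 local cut sits 8.5 hours ahead of a 17:30 UK risk-system cut, while a London desk closing at the same time as the risk system has zero lag.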
“Unless you have the final rule, you can’t really develop something in the bank, because you don’t know if it will comply with the final regulation. So it is still a bit of guesswork, but it is certainly much better than it was before,” he says.
“At the moment nobody knows exactly what the data would look like in reality, with real portfolios and with the systems developed and actually taking a look at it, so it is difficult to say at this time.”
Internal model approach vs standardized approach
In March 2018, AFME said the European Banking Authority (EBA) was “fully aware of the fact that the Profit and Loss attribution (PLA) test is possibly the biggest challenge in operationalising the internal model approach.”
The PLA test is an assessment to determine whether a bank’s internal risk management models appropriately reflect the risks of individual trading desks.
In the final revisions to FRTB, new PLA test metrics were introduced to better differentiate between models that were performing well and those that weren’t. The consequences of failing the test also changed, from the previous pass-or-fail outcome to a traffic light system, to “reduce potential cliff effects in capital requirements.”
The new thresholds mean that a trading desk in the amber zone can continue to use internal models but will be subject to a capital surcharge. Those in the red zone must use the standardized approach.
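The traffic-light mechanics can be sketched as a simple classifier. Under the January 2019 framework, the PLA metrics are a Spearman correlation and a Kolmogorov-Smirnov statistic computed between a desk’s hypothetical P&L and its risk-theoretical P&L; the thresholds below are those of the published framework, but the function itself is an illustrative sketch, not a regulatory implementation.

```python
def pla_zone(spearman_corr, ks_stat):
    """Classify a trading desk under the revised PLA traffic-light test.

    Inputs are assumed to be the framework's two PLA metrics, computed
    on aligned daily series of hypothetical vs risk-theoretical P&L:
    a Spearman rank correlation and a Kolmogorov-Smirnov statistic.
    Thresholds per the January 2019 FRTB text: green requires
    correlation >= 0.80 and KS <= 0.09; red is correlation < 0.70
    or KS > 0.12; anything in between is amber."""
    if spearman_corr < 0.70 or ks_stat > 0.12:
        return "red"    # desk falls back to the standardized approach
    if spearman_corr >= 0.80 and ks_stat <= 0.09:
        return "green"  # desk stays on internal models, no surcharge
    return "amber"      # desk stays on internal models with a capital surcharge
```

A desk with high correlation and a small distributional gap lands in green; a modest degradation on either metric pushes it to amber and a surcharge, and a further breach to red and the standardized approach.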
The main theme of this last round of final changes is the Basel Committee “trying very hard” to make the internal models approach work and be palatable for banks within the risk constraints that they want to set, says Stern.
“The Basel committee has worked pretty hard to make the challenge fit for purpose. With the original version of the P&L attribution test, where they were looking at ratios, depending on what you were trading and how you were hedged, you stood a pretty good chance of failing those tests even if your front and middle office model alignment was very good. They’ve adjusted those tests to the statistical tests and thresholds that they have now, and in the last version they widened those green and amber zones to make the tests better capture what they are really trying to get at, which is front and middle office alignment.”
One of the considerations firms must make as the deadline looms is whether to use the standardized approach or a modelled approach, and each comes with a cost, according to Gregg Jones, director of risk and capital at the International Swaps and Derivatives Association (Isda).
“It is not necessarily just one capital cost being potentially more beneficial than the other, but also the cost of the implementation, and the continuous maintenance and validation of such a framework,” says Jones.
“Firms may find that the business-change and technology-change costs of looking to use their own models might not be outweighed by a benefit in reduced capital charge,” adds Percy-Hughes.
While many bigger banks will have mainly internal systems, it is possible that smaller banks may benefit more from outsourcing the development of systems for the standardized approach, according to AFME’s Aaltonen.
“For smaller organizations, the development of the standardized approach might be an area where fintechs can get involved, but obviously, in terms of the internal models, banks need to have the price observations in all their systems for all their trading activity, so they can’t really rely on external data,” says Aaltonen.