Innovations in Operational Risk Measurement - knowing your score

Effective risk management must begin with meaningful risk measurement. Loss history on its own, or even linked with high-level RAG assessments, fails to provide a fully meaningful basis for risk measurement, so banks may need an alternative. A small consulting firm is challenging the received wisdom of the current assessment approach and has developed a method for measuring operational risk that is both highly effective and remarkably simple.

We begin with the obvious premise that transactions drive operational risk. Peter Hughes, a veteran of JPMorgan Chase, explains: "Banks construct operating environments comprised of people, technology, premises, processes and controls. If a weak operating environment and a strong operating environment process zero transactions, the operational value-at-risk of both will be limited to the cost of constructing the environments. If, however, both environments process a high volume of complex transactions, the operational value-at-risk presented by the weak environment will be far greater than that presented by the strong one. The challenge we posed ourselves was how to measure this difference."

Thirty-Four Core Operational Activities for any Financial Transaction
Peter and his team of accredited risk professionals set out to find a way of measuring the robustness of operating environments and the risks inherent in the transactions they handle. For more than five years they studied operating environments around the globe and made an intriguing discovery: any financial transaction processing cycle, in any industry, can be reduced to one of, or a combination of, 34 core operational activities. For example, most transaction processing cycles involve a transfer of value, which can take the form of a payment, a settlement or a delivery of securities. The risk of financial loss presented by this activity is low if the transfer of value is to a guaranteed party, medium for market counterparties (financial loss is limited to compensation claims) and high for any other party (risk of loss of principal). The team catalogued all 34 operational activities and assigned a standardised risk factor to each.
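The activity catalogue can be pictured as a simple lookup table. The sketch below is purely illustrative: the activity names and factor values are assumptions, since the actual 34-activity table and its standardised risk factors are not published in the article; only the three-tier grading of value transfers is.

```python
# Hypothetical sketch of the activity catalogue: core operational
# activities mapped to standardised risk factors. Names and values
# are invented for illustration; the article only describes the
# low/medium/high grading of transfers of value.
ACTIVITY_RISK_FACTORS = {
    # Transfer of value, graded by counterparty type (per the article):
    "transfer_to_guaranteed_party": 1.0,    # low risk
    "transfer_to_market_counterparty": 2.0, # medium: loss limited to compensation claims
    "transfer_to_other_party": 3.0,         # high: risk of loss of principal
    # Other activities (entirely hypothetical examples):
    "trade_capture": 1.5,
    "reconciliation": 1.0,
}

def risk_factor(activity: str) -> float:
    """Look up the standardised risk factor for a core activity."""
    return ACTIVITY_RISK_FACTORS[activity]
```

In the full method this lookup would cover all 34 activities, so that any processing cycle decomposes into entries of this table.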

Volume matters
Peter and his team then studied how risk profiles are affected by changes in transaction volumes. Their conclusion was that, as volumes increase, the rate of relative change in risk decreases. In other words, the risk of financial loss changes more, in relative terms, in a band from zero to ten thousand dollars than in a band from ten billion to a hundred billion dollars. With this information they constructed a table that assigns transaction volumes to predetermined value bands, with a standardised risk weighting for each band.
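One way to picture the band table is as a step function whose weights rise by ever smaller increments across roughly logarithmic value bands. The boundaries and weightings below are invented assumptions; the article states only that the relative change in risk shrinks as volumes grow.

```python
import bisect

# Hypothetical volume-band table. Band boundaries (USD) and weightings
# are illustrative; the weights rise sub-linearly to reflect the
# finding that relative risk change decreases as volumes increase.
BAND_UPPER_BOUNDS = [1e4, 1e6, 1e8, 1e10, 1e11]
BAND_WEIGHTS      = [1.0, 2.0, 3.0, 4.0, 4.5]  # diminishing increments

def volume_band_weight(volume_usd: float) -> float:
    """Return the standardised risk weighting for a transaction volume."""
    i = bisect.bisect_left(BAND_UPPER_BOUNDS, volume_usd)
    i = min(i, len(BAND_WEIGHTS) - 1)  # clamp volumes beyond the last band
    return BAND_WEIGHTS[i]
```

Note that moving from the first band to the second doubles the weight, while moving from the fourth to the fifth adds only 12.5%, matching the diminishing-relative-change observation.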

Risk Concentration can be expressed in common units
The method uses the above data to provide a measure of operational risk concentration. This is done by identifying the operational activities that comprise a unit’s transaction processing cycles, extracting the relevant risk factors from the activity table and applying these to the volume band weightings. The result is a measure that blends the relative risk of transactions with processing volumes to produce a value that OpRAss calls "ORUs" (OpRAss Risk Units). In other words, OpRAss can quantify the operational risk in any financial transaction processing cycle, expressed in its own risk currency, the ORU.
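Taken together, the two tables suggest a concentration calculation along these lines. This is a hedged sketch, not OpRAss's actual formula: the article says risk factors are "applied to" band weightings without specifying how, so a simple product-and-sum is assumed here, with invented numbers.

```python
# Hypothetical sketch of risk concentration in ORUs: for each activity
# in a unit's processing cycle, multiply its standardised risk factor
# by the weighting of the volume band it falls into, then sum.
# The combination rule and all figures are illustrative assumptions.
def risk_concentration_orus(cycle):
    """cycle: list of (activity_risk_factor, volume_band_weighting) pairs."""
    return sum(factor * band_weight for factor, band_weight in cycle)

# A processing cycle with two activities:
cycle = [
    (3.0, 2.0),  # high-risk transfer of value in a mid-volume band
    (1.0, 4.0),  # low-risk reconciliation in a high-volume band
]
print(risk_concentration_orus(cycle))  # 10.0 ORUs
```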

Peter explains further, "Risk concentrations expressed in ORUs represent the absolute risk presented by a transaction processing cycle. Operations Managers manage this risk by constructing operating environments that incorporate, wherever possible, risk management best practices. We believed it was possible from our research to determine best practice benchmarks for each risk type and compare these to an operating environment’s actual risk management performance. Our aim was to create a methodology that would measure, on a scale of zero to 100, the effectiveness of an operating environment’s risk countermeasures, where 100 is best practice."

Theoretically, according to the team, an Operations Manager who achieves a Risk Mitigation Indicator (RMI) of 100 should be able to handle unlimited transaction volumes without risk of financial loss. They are quick to point out, however, that an RMI of 100 is unachievable: for a start, it would require an STP rate of 100%.

It’s the people who do the work who really know the score!
The risk mitigation benchmarks are organised into nine risk categories (Control, People, Execution, Business Recovery, Risk Culture, Management Oversight, Application Security, Physical Access, and Policies and Procedures) and 28 sub-categories. To achieve the best results, the assessment against the OpRAss benchmarks must be facilitated by one of their assessors working with the operations personnel who own the processes. "It’s the people who do the work who really know the score!" comments Peter.

The final step is to calculate the operational value-at-risk, expressed in ORUs, by blending the risk concentration and the RMI.
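The article does not disclose the blending formula, but one plausible reading is to scale the concentration by the unmitigated fraction of risk. The sketch below is an assumption made for illustration; it at least reproduces the stated limiting case that an RMI of 100 implies no risk of financial loss at any volume.

```python
# Hedged sketch of one possible "blend" of risk concentration and RMI.
# The scaling by (100 - RMI)/100 is an assumption, not OpRAss's
# published formula, chosen so that RMI = 100 yields zero value-at-risk.
def operational_var_orus(concentration_orus: float, rmi: float) -> float:
    """Operational value-at-risk in ORUs for a unit with the given RMI."""
    if not 0 <= rmi <= 100:
        raise ValueError("RMI must be on a scale of 0 to 100")
    return concentration_orus * (100.0 - rmi) / 100.0

print(operational_var_orus(10.0, 100))  # 0.0 -- best-practice environment
print(operational_var_orus(10.0, 60))   # 4.0 ORUs remain at risk
```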

Breakthrough Methodology
Peter is confident that the OpRAss Method represents a real breakthrough in the measurement of operational risk. "ORUs and RMIs are calculated from fully standardised tables, so results are directly comparable between banks, and between departments and teams within banks. The method measures risks in the same way that they are managed, so they are totally relevant and meaningful to the people who actually manage the risks."

When asked whether the Method could be used for capital adequacy purposes under Pillar 1 of Basel II, Peter is more philosophical: "We believe the Advanced Measurement Approaches (AMA) are in a blind alley. Loss history collection is random; it has no relevance in today’s world, where operating environments are constantly changing. And, unlike credit risk, operational risk has no concept of exposure: maximum credit loss is limited by credit exposure, whereas maximum operational risk loss is limited only by a bank’s equity." The team nevertheless believes its method has great potential for Basel II. Their view is that if a bank is using the basic or standardised approaches, the value of an ORU can be determined by dividing the total capital requirement by the total ORUs. If a bank can demonstrate to a regulator that total ORUs have been effectively reduced, that presents a strong argument for a reduced capital charge.
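The capital link-up described above is simple arithmetic, sketched below with invented figures for illustration:

```python
# Illustrative Pillar 1 arithmetic: under the basic/standardised
# approaches, price one ORU by dividing the bank's total
# operational-risk capital requirement by its total ORUs.
# All figures are hypothetical.
total_capital_requirement = 500_000_000  # USD
total_orus = 2_000_000

value_per_oru = total_capital_requirement / total_orus
print(value_per_oru)  # 250.0 USD per ORU

# A demonstrable reduction in total ORUs then implies a
# proportionally lower capital charge at the same price per ORU.
reduced_orus = 1_600_000
print(reduced_orus * value_per_oru)  # 400000000.0
```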

The team also sees some exciting possibilities for AMA users. Their conclusion is that loss history on its own is meaningless; but if the ORU and RMI of the unit where each loss event occurred were attached, meaningful statistical models become far more viable, because consistent analogues of default probability (the RMI) and exposure (the ORU), both vital components of credit risk models, can be incorporated.

Keeping it Simple, just Simple enough
Peter understands that the proving ground for the method will be the next two years, and several of the world’s largest financial organisations are already evaluating the approach. "We’ve kept the method simple and relevant to the business unit manager while still addressing the Basel requirements. On the basis of ‘you can better manage what you can measure’, anything that helps the Business and Operational Risk teams achieve this can only be good for the bank and the regulators."

Peter Hughes is a Director of OpRAss Limited, a specialist team of consultants focused purely on operational risk management.

The contact details are:
OpRAss Limited
Tel: + 44 (0) 1202
