
The application of the Pareto Principle in risk and compliance


  • Eric Sohn
  • February 13, 2017
  • 6 minutes

Eric A. Sohn, CAMS, Director of Business Product, Dow Jones Risk & Compliance

The Pareto Principle (sometimes called the 80/20 Rule) is simple: the vast majority of the benefit is derived from a relatively small percentage of the effort. Whether those figures are truly 80% and 20% is immaterial; the general principle holds true that effort does not translate uniformly into equivalent levels of results.

An interesting extension of the Principle is that it applies, again roughly, to the remainder. If 20% of the effort results in 80% of the results, one should expect that 20% of the remaining 80% of the effort will result in 80% of the remaining 20% of results. If you do that math, this second bite at the apple yields 16% of results for 16% of the total potential effort, which is a pretty fair trade. And if you add both sets of figures up, 36% of the effort results in 96% of results. While 80% may not be an acceptable success rate, 96% might be.
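The arithmetic above can be checked with a short sketch. The 80/20 figures are the article's illustrative numbers, not empirical data, and the function name is hypothetical:

```python
def iterated_pareto(effort_share=0.20, result_share=0.80, rounds=2):
    """Apply the Pareto split repeatedly to the remaining effort and results.

    Returns cumulative (effort, results) as fractions of the totals.
    """
    effort, results = 0.0, 0.0
    remaining_effort, remaining_results = 1.0, 1.0
    for _ in range(rounds):
        # Spend effort_share of the remaining effort to capture
        # result_share of the remaining results.
        effort += remaining_effort * effort_share
        results += remaining_results * result_share
        remaining_effort *= 1 - effort_share
        remaining_results *= 1 - result_share
    return effort, results

effort, results = iterated_pareto()
print(f"{effort:.0%} of effort yields {results:.0%} of results")
```

Two rounds give the 36% effort / 96% results figures quoted above; further rounds chase the stubborn remainder at ever-worsening exchange rates.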

We can recategorize these two sets of efforts (the 96% results achieved, and the 4% not achieved) as having differing levels of efficiency. The large majority of results are at least relatively efficient, in that effort is at least commensurate with the results. But that stubborn remainder is highly inefficient, to the point of being considered a matter of diminishing returns.

Risk and compliance and Pareto

Risk management and regulatory compliance, unfortunately, obey the Pareto Principle. This is partly due to data quality issues in the information being screened, whether static information like customer data or transactional information like insurance claims payments. Fortunately, addressing such issues is generally under one’s control, albeit often with significant effort.

The larger source of inefficiency, however, is the quality of the databases that identify those who may present undue risk. This data may be incomplete, inaccurate or outdated, and may also be irrelevant to an organization’s true goals. While any large set of data contains incorrect values, many of the gaps derive from an inability to identify the desired information. For example, the birthdate of the daughter of a city’s mayor (who must be identified as a Politically Exposed Person, or PEP, as part of an anti-money laundering, or AML, program), regardless of her age, may not be publicly available. Similarly, while the arrest of someone for fraud may make the local paper, the dismissal of charges, or an acquittal, may not appear anywhere other than the town clerk’s office. In the first case, time is wasted trying to ascertain whether the PEP is, in fact, the same person as in the data record; in the second, time is spent looking at an event that may not be of concern and that might have been systematically excluded.

How can compliance departments manage these quality gaps to become more efficient without unduly increasing risk? It starts with finding vendors of risk and compliance data, and investing in internal or third-party tools, that can separate the wheat from the chaff in line with your risk tolerance. In addition to gaining that technical flexibility, you need to promulgate policies and procedures that let you winnow the resulting alerts effectively. For example, except for terrorism, if the country associated with a negative news story does not match a client’s country of residence, an organization could choose not to generate an alert. Similarly, if a relative or close associate of certain types of PEPs (e.g. mayors, provincial officials) does not have a birth date listed, that record could be skipped as well. Lastly, alerts could also be avoided for those who have been removed from sanctions lists.
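The winnowing policies above can be sketched as a simple rule function. The field names, categories, and rule logic are illustrative assumptions for this sketch, not any vendor's actual schema; a real implementation would be driven by the organization's documented risk tolerance:

```python
def should_alert(match: dict, client: dict) -> bool:
    """Decide whether a screening hit should generate an alert.

    Applies the illustrative winnowing rules: terrorism always alerts;
    country-mismatched adverse media, birth-date-less relatives of
    lower-level PEPs, and delisted sanctions entries are suppressed.
    """
    category = match.get("category")

    # Terrorism-related matches always alert, per the stated exception.
    if category == "terrorism":
        return True

    # Suppress negative-news hits whose country does not match
    # the client's country of residence.
    if category == "adverse_media" and (
        match.get("country") != client.get("country_of_residence")
    ):
        return False

    # Suppress relatives/close associates of lower-level PEPs
    # (e.g. mayors, provincial officials) with no listed birth date.
    if category == "pep_rca" and (
        match.get("pep_level") == "local"
        and match.get("date_of_birth") is None
    ):
        return False

    # Suppress persons who have been removed from sanctions lists.
    if category == "sanctions" and match.get("delisted"):
        return False

    return True
```

For example, `should_alert({"category": "adverse_media", "country": "FR"}, {"country_of_residence": "US"})` would be suppressed under these rules, while the same story about a resident of France would alert.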

PEP goes the weasel

There is an additional data “quality” issue with these databases which should be considered: an extreme case of 'Politically Exposed Pareto'. According to the statistics in the OECD’s 2014 Foreign Bribery Report, a tiny fraction of the persons considered to be PEPs are significantly more likely to be bribed, or to demand bribes (the crime that overwhelmingly leads them to launder money), than the rest. Yet regulations require labeling them all equally as PEPs, and putting in some effort to identify every one of them.

PEPs are not necessarily bad actors; for the most part, they are simply people with the opportunity to abuse their position. They present a unique challenge, given how small a portion of the total population the people of real concern represent. There are a number of strategies that can be considered to reduce the screening and subsequent research workload while minimizing the additional risk.

First, one can require a higher degree of matching accuracy for those less likely to be of concern. That can be accomplished by requiring more exact text matches of the PEP’s name, requiring matching elements other than the name, or both, depending, for example, on the nature of the office held, or whether the office is currently being held.

More radically, but perhaps more to the point, firms can consider reducing the population of records to be screened as PEPs based on their financial profile. For example, if the value of the relationship is small, and the products the client uses are low-risk (e.g. standard deposit products in a bank, or life insurance with no cash value), perhaps their official position is irrelevant (for now) from the perspective of actual money laundering risk.

If that makes one (or one’s legal counsel) uncomfortable, one could find a middle ground that still makes operational costs less onerous. One could put lower-quality PEP matches (e.g. those with no listed date of birth or identification documents) on a shorter transaction monitoring leash. If a $5000 threshold is normally used for identifying suspicious transactions, why not set such clients to generate an alert at $3000? If the alert is never triggered, significant research effort has been saved with no apparent risk. And if the alert is triggered, one could “flip the switch” and treat the person as a PEP, regardless of the lack of certainty.
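The “shorter leash” idea above can be sketched in a few lines. The thresholds match the article’s $5,000/$3,000 example; the match-quality test (missing birth date and identification documents) and the field names are assumptions for illustration only:

```python
STANDARD_THRESHOLD = 5000  # normal suspicious-transaction alert level
REDUCED_THRESHOLD = 3000   # tighter leash for lower-quality PEP matches

def alert_threshold(match: dict) -> int:
    """Return the transaction-alert threshold for a given PEP match.

    Lower-quality matches (no listed date of birth and no identification
    documents) get the reduced threshold.
    """
    low_quality = (
        match.get("date_of_birth") is None
        and not match.get("id_documents")
    )
    return REDUCED_THRESHOLD if low_quality else STANDARD_THRESHOLD

def review_transaction(amount: float, match: dict, client: dict) -> bool:
    """If the amount trips the client's threshold, 'flip the switch'
    and treat the client as a PEP despite the uncertain match.
    Returns True if an alert was generated."""
    if amount >= alert_threshold(match):
        client["treat_as_pep"] = True
        return True
    return False
```

If the reduced threshold is never tripped, the full PEP research effort was avoided; if it is, the client is escalated to full PEP treatment going forward.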

Success, or Sisyphus?

Over time, regulatory expectations have increased, and will continue to increase. They require casting wider nets to identify bad actors committing less significant crimes (or potential crimes, in the case of the Panama Papers leak) across larger and larger sets of names. The percentages of the Pareto Principle, even if not precisely accurate today, will reflect reality in the future (if not become more extreme), unless firms recognize the limits of their data and systems, and take principled steps to manage the efficiency and risk exposure of their programs accordingly.