Discussing Risk Data Strategy: An Interview with Philip Chamberlain

August 29, 2014 | marcus evans

Risk data is an area that has been largely overlooked for many years. Today the situation is different, and the area faces increasing regulatory scrutiny as Systemically Important Financial Institutions (SIFIs) rush to comply with the BCBS 239 Principles for Effective Risk Data Aggregation and Risk Reporting. An enterprise’s success depends on its ability to analyse risk data efficiently and effectively, in ways that uncover both risks and opportunities. Extracting and escalating critical risk information is nearly impossible without a robust risk management framework supported by a strong technology infrastructure.

Philip Chamberlain, Vice President, Risk Governance and Risk Appetite at the Prudential Insurance Company of America, recently spoke with GFMI about key topics to be discussed at its upcoming Risk Data Aggregation, Governance and Reporting Conference, November 3-4, 2014, at the DoubleTree by Hilton Metropolitan in New York City.

Why is risk data such a key issue for financial institutions now?

Philip Chamberlain: Relevant information about an institution’s exposure to loss, and about the product and financial markets around us, has always been critical to managing a financial institution. Two facts, however, focus the issue more sharply for us now. First, we are well into a data capture and storage revolution that raises the bar on what a financial institution can and should infer from its risk data resources. Some financial institutions will fully seize the opportunity; those that do not will find themselves operating at a significant disadvantage going forward. Second, financial regulators have greatly expanded their requirements for analysis and reporting of risk data, in terms of volume, precision and timeliness. For institutions subject to the higher-level requirements, meeting regulatory risk data demands is a critical challenge in itself, and a costly one.

What are the key challenges institutions are facing in creating a holistic view of risk data?

PC: Like computers and computer programs, much financial reporting amounts to processing facts by established rules, as may be seen in external financial reporting statements, regulatory report filings and income tax returns. Risk is different, at least for management use. Risk is analysis, and analysis needs to be consistent over the enterprise in order to have comparable facts in your risk equation. This is a challenge of common concepts, vocabulary and measurement, all of it difficult in a larger financial institution. Doing a thorough job of capturing material risk data is also costly, and must be done with consistent leadership over years, not months. Finally, there is the trap of aiming for regulatory compliance as the objective. Difficult as it may be, each institution needs to define a risk data environment that best meets its risk management needs—a superset beyond regulatory requirements.

What is the importance of standardising terminology for risk data?

PC: High-quality risk analysis always depends on clear definitions, starting with terminology and extending to all risk data captured. If one has a problem involving apples, oranges and pomegranates, so to speak, it does no good to discuss the problem in terms of “fruit.” Managerial (non-regulatory) definitions are the most challenging, and the most critical. With a diverse group of analysts and managers, an institution needs to insist on the clear focus of consistent terminology.
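The normalisation Chamberlain describes can be made concrete with a small sketch. The snippet below (a hypothetical illustration, not Prudential's actual vocabulary or any specific institution's data dictionary) maps the many term variants business units use onto one canonical enterprise name, and fails loudly on unmapped terms so gaps in the dictionary surface early:

```python
# Hypothetical sketch: normalising risk-data terminology across business units.
# All canonical names and variants below are illustrative assumptions.

CANONICAL_TERMS = {
    "counterparty_exposure": {"cpty exposure", "counterparty exp", "CP exposure"},
    "probability_of_default": {"PD", "default probability", "prob of default"},
    "loss_given_default": {"LGD", "loss severity"},
}

# Reverse lookup: any known variant (case-insensitive) -> canonical term.
_VARIANT_TO_CANONICAL = {
    variant.lower(): canonical
    for canonical, variants in CANONICAL_TERMS.items()
    for variant in variants | {canonical}
}

def normalise(term: str) -> str:
    """Map a business-unit term to the enterprise canonical name.

    Raises KeyError on unknown terms, so missing dictionary entries
    are caught at ingestion rather than buried in aggregated reports.
    """
    key = term.strip().lower()
    if key not in _VARIANT_TO_CANONICAL:
        raise KeyError(f"Unmapped risk term: {term!r}")
    return _VARIANT_TO_CANONICAL[key]

print(normalise("PD"))             # probability_of_default
print(normalise("Cpty Exposure"))  # counterparty_exposure
```

The design choice worth noting is the rejection of unknown terms: silently passing unrecognised labels through is exactly how "apples, oranges and pomegranates" end up summed as "fruit" in an aggregated risk report.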

What do you think attendees will gain from attending this event?

PC: Attendees will benefit from the perspectives and priorities of professionals who have worked in organizing and reporting financial data on a grand scale. I have had the privilege of working alongside some of the other speakers, and can attest to their skills, experience and insight. The next best thing to acquiring experience personally is to borrow it from speakers and participants at programs like this one.
Philip Chamberlain will be leading the session “Normalizing Risk Data Terminology to Ensure an Enterprise-Wide Risk Data Strategy” on Monday, November 3, 2014 at the GFMI Risk Data Aggregation, Governance and Reporting Conference.

Mr. Chamberlain is a Financial Risk Engineer focused on the issues of large financial institutions. He is currently Vice President, Risk Governance and Risk Appetite in Enterprise Risk Management at Prudential Insurance. Until 2010, he was a Managing Director in the Bank of New York Mellon's Risk Management Sector, where his responsibilities included firm-wide Basel II implementation, functional leadership for BNY's Basel II credit implementation, stress testing, economic capital, the economic credit portfolio model, return on credit risk and internal ratings models, risk data warehousing, and credit risk data generally. A graduate of Yale and New York Universities, Mr. Chamberlain began his career with the Bank of New York in 1973, with the last ten years focused on Basel II implementation.

GFMI is a specialized provider of content-led conferences for the financial markets. Carefully researched with leading financial market experts, our focused quality events deliver key bottom-line value through targeted presentations, interactive discussions and high-level networking opportunities.
