THEMATIC PROGRAMS


March 26-27, 2010
Industrial-Academic Forum on Operational Risk

Talk Titles and Abstracts

Emre Balta
Office of the Comptroller of the Currency (OCC)
The Known, the Unknown, and the Unknowable: Challenges in Validating AMA Models

Basel II does not specify a particular approach or distributional assumptions for AMA-based models of the operational risk capital charge. The flexibility inherent in the AMA creates a broad range of practice that makes comparison and benchmarking of model results a significant challenge for validation teams and supervisors. Furthermore, the ultimate object of interest, the 99.9% quantile of the aggregate loss distribution over a one-year horizon, combined with the heavy-tailed nature of operational losses, makes the estimation particularly sensitive to the choice among alternative models and underlying assumptions. In this talk, we focus on the practical challenges faced in the validation of AMA models, particularly with respect to high-quantile estimation, tail dependence, and model uncertainty.
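For reference, the object of interest can be written in a generic LDA formulation (illustrative notation, not specific to any particular AMA model) as

    S = \sum_{i=1}^{N} X_i, \qquad
    \mathrm{OpVaR}_{0.999} = F_S^{-1}(0.999) = \inf\{\, s : \Pr(S \le s) \ge 0.999 \,\},

where N is the annual loss frequency (e.g. Poisson) and the X_i are severities, typically assumed i.i.d. and independent of N.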

________________________________________________________

Eric Cope
IBM Research, Zurich
Penalized Likelihood Estimators for Truncated Data

We investigate the performance of linearly penalized likelihood estimators for estimating distributional parameters in the presence of data truncation. Truncation distorts the likelihood surface, creating instabilities and high variance in the parameter estimates; the penalty terms help in many cases to decrease estimation error and increase robustness. Approximate methods are provided for choosing good penalty terms a priori, and these are shown to perform well in a series of simulation experiments. The robustness of the methods is explored heuristically using both simulated data and real data drawn from an operational risk context.
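As an illustration of the kind of estimator discussed (a sketch under simplified assumptions, not the exact formulation of the talk): a lognormal severity is fitted to losses reported only above a threshold H, with a penalty term linear in sigma added to the truncated negative log-likelihood to damp the instability that truncation induces.

    import numpy as np
    from scipy import optimize, stats

    def penalized_truncated_lognormal_fit(losses, H, lam=1.0, x0=(9.0, 1.5)):
        """Fit a lognormal severity to losses observed only above a reporting
        threshold H.  A penalty term, linear in sigma, is added to the truncated
        negative log-likelihood.  Illustrative sketch; the penalty used in the
        talk may differ."""
        x = np.asarray(losses, dtype=float)

        def objective(params):
            mu, log_sigma = params
            sigma = np.exp(log_sigma)
            # log-density of a lognormal, renormalized for left truncation at H
            logpdf = stats.norm.logpdf(np.log(x), loc=mu, scale=sigma) - np.log(x)
            log_surv_H = stats.norm.logsf((np.log(H) - mu) / sigma)
            nll = -(logpdf.sum() - x.size * log_surv_H)
            return nll + lam * sigma  # penalized objective

        res = optimize.minimize(objective, x0=[x0[0], np.log(x0[1])], method="Nelder-Mead")
        return res.x[0], np.exp(res.x[1])  # (mu_hat, sigma_hat)

    # Example: simulate lognormal losses, discard those below H, and fit.
    rng = np.random.default_rng(0)
    full = rng.lognormal(mean=10.0, sigma=2.0, size=2000)
    H = 25000.0
    print(penalized_truncated_lognormal_fit(full[full > H], H, lam=5.0))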

________________________________________________________
Mathias Degen
Postdoctoral Research Fellow at Cornell University, Ithaca, NY
Diversification benefits: a second-order approximation

The quantification of diversification benefits due to aggregation of risk plays a prominent role in the (regulatory) capital management of large firms within the financial industry. However, the complexity of today's risk landscape makes a quantifiable reduction of risk concentration a challenging task. We discuss some of the issues that may arise. The theory of second-order regular variation and second-order subexponentiality provides the ideal methodological framework to derive second-order approximations for diversification benefits. As a byproduct, this allows us to analyze the accuracy of the closed-form OpVaR approximation (Böcker-Klüppelberg).
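For context, the first-order closed-form (Böcker-Klüppelberg) OpVaR approximation referred to above reads, for a compound Poisson model with intensity \lambda and subexponential severity distribution F,

    \mathrm{VaR}_\alpha(S) \;\approx\; F^{-1}\!\Bigl(1 - \frac{1-\alpha}{\lambda}\Bigr), \qquad \alpha \uparrow 1;

the talk analyzes the accuracy of this approximation and derives second-order refinements.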

________________________________________________________
Kabir Dutta
Principal/Senior Consultant, Insurance Economics and Risk Management Practice, Charles River Associates International
On Using Scenario Analysis in the Measurement of Operational Risk: A Systematic Approach for Data Integration

While scenario analysis is an important tool for risk measurement, its use in the measurement of operational risk capital has been quite arbitrary and often inaccurate. Using a method based on the change-of-measure approach used in financial economics for asset pricing, we show how one can measure the operational risk exposure of an institution using scenario analysis together with internal loss event data. We also show that the proposed method can be applied in many different situations, such as the calculation of operational risk capital, stress testing, and what-if assessment of scenarios, among others. Using this method, one could also create a catastrophe bond on various segments of an institution's operational risk exposures.

Please note that the presentation is based on my latest paper (with David Babbel), available at the following:

http://fic.wharton.upenn.edu/fic/papers/10/p1010htm.htm
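The change-of-measure idea can be illustrated generically as follows: simulated annual losses from a baseline (internal-data) model are reweighted so that, under the new measure, the tail probability implied by a scenario assessment is matched exactly. This is a minimal sketch with hypothetical numbers, not the specific Dutta-Babbel construction.

    import numpy as np

    def reweight_to_scenario(losses, threshold, target_prob):
        """Piecewise-constant change of measure: reweight simulated annual losses
        so that, under the new measure, the probability of exceeding `threshold`
        equals `target_prob` (e.g. a scenario saying 'a loss above 50M once in
        20 years').  Generic illustration only."""
        losses = np.asarray(losses, dtype=float)
        above = losses > threshold
        p_model = above.mean()  # tail probability under the baseline model
        w = np.where(above, target_prob / p_model,              # inflate/deflate the tail ...
                     (1.0 - target_prob) / (1.0 - p_model))     # ... and rescale the body
        return w / w.sum()      # normalized probability weights

    def weighted_quantile(values, weights, q):
        """Quantile of a discrete weighted sample."""
        order = np.argsort(values)
        cum = np.cumsum(weights[order])
        return values[order][np.searchsorted(cum, q)]

    # Example with hypothetical numbers: baseline compound Poisson-lognormal losses.
    rng = np.random.default_rng(1)
    annual = np.array([rng.lognormal(10.0, 2.0, rng.poisson(25)).sum() for _ in range(100_000)])
    w = reweight_to_scenario(annual, threshold=5e7, target_prob=0.05)
    print("VaR 99.9% under baseline model:        ", np.quantile(annual, 0.999))
    print("VaR 99.9% under scenario-based measure:", weighted_quantile(annual, w, 0.999))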

________________________________________________________
Joerg Fritscher
Deutsche Bank
Stabilizing the calculation of expected shortfall contributions using conditional Monte Carlo methods

The computation of important risk measures such as Value-at-Risk (VaR) or expected shortfall (ESF) contributions using Monte Carlo (MC) simulation becomes a challenging task when heavy-tailed loss distributions are involved. In operational risk (OR) one is usually confronted with such distributions and is thus forced to use a large number of scenarios to obtain numerically stable estimates of aggregate risk capital (i.e. VaR). However, the computation of the ESF tail contributions required for the allocation of capital at divisional level is even more difficult to stabilize, which often makes straightforward MC simulation impracticable for this purpose.
Asmussen and Kroese have successfully employed a variance-reducing methodology for rare-event simulation with heavy tails: conditional Monte Carlo estimators.
This presentation describes the general technique of conditional MC simulation as well as its application within the LDA. Furthermore, we introduce its implementation in DB's AMA model for calculating contributory capital for the different cells of our business line/event type matrix via expected shortfall contributions. The performance of the Asmussen-Kroese algorithm is compared with that of plain MC to demonstrate the superiority of conditional MC.
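For illustration, a minimal sketch of the Asmussen-Kroese conditional MC estimator for the tail of a sum of a fixed number n of i.i.d. lognormal losses is given below (in an LDA setting one would additionally condition on the random frequency). Parameters are hypothetical, and the estimator is compared against plain MC.

    import numpy as np
    from scipy import stats

    def asmussen_kroese_tail(n, x, mu, sigma, n_sim=100_000, rng=None):
        """Asmussen-Kroese conditional MC estimator of P(X1+...+Xn > x) for i.i.d.
        lognormal(mu, sigma) severities.  Each replication draws only n-1 losses and
        replaces the largest one by its conditional tail probability, which removes
        most of the variance caused by the heavy tail."""
        rng = rng or np.random.default_rng()
        draws = rng.lognormal(mu, sigma, size=(n_sim, n - 1))
        M = draws.max(axis=1)              # maximum of the first n-1 losses
        S = draws.sum(axis=1)              # sum of the first n-1 losses
        z = np.maximum(M, x - S)           # level the remaining (largest) loss must exceed
        est = n * stats.lognorm.sf(z, s=sigma, scale=np.exp(mu))
        return est.mean(), est.std(ddof=1) / np.sqrt(n_sim)

    def crude_mc_tail(n, x, mu, sigma, n_sim=100_000, rng=None):
        """Plain MC estimator of the same tail probability, for comparison."""
        rng = rng or np.random.default_rng()
        S = rng.lognormal(mu, sigma, size=(n_sim, n)).sum(axis=1)
        hits = (S > x).astype(float)
        return hits.mean(), hits.std(ddof=1) / np.sqrt(n_sim)

    # Hypothetical cell: 20 losses, lognormal(10, 2) severity, tail level 100M.
    print(asmussen_kroese_tail(20, 1e8, 10.0, 2.0, rng=np.random.default_rng(0)))
    print(crude_mc_tail(20, 1e8, 10.0, 2.0, rng=np.random.default_rng(0)))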

________________________________________________________
Elise Gourier
Swiss Banking Institute, University of Zurich
Operational risk quantification using extreme value theory and copulas: from theory to practice

In this talk we present an empirical study pointing out several pitfalls of the standard methodologies for quantifying operational losses. Firstly, we use Extreme Value Theory to model real heavy-tailed data. We show that using Value-at-Risk as a risk measure may lead to a misestimation of capital requirements. In particular, we examine the issues of stability and coherence and relate them to the degree of heavy-tailedness of the data. Secondly, we introduce dependence between the business lines using copula theory. We show that standard economic thinking about risk diversification may be inappropriate when infinite-mean distributions are involved, as is often the case in operational risk.
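The infinite-mean effect mentioned above can be reproduced in a few lines; the following sketch (with hypothetical parameters) compares the 99.9% VaR of two pooled i.i.d. Pareto losses with tail index below one against the sum of their stand-alone VaRs.

    import numpy as np

    # Two i.i.d. Pareto losses with tail index alpha < 1 (infinite mean).  For such
    # distributions VaR is not subadditive: the VaR of the pooled position exceeds
    # the sum of stand-alone VaRs, so naive diversification reasoning breaks down.
    rng = np.random.default_rng(0)
    alpha, q, n = 0.8, 0.999, 2_000_000

    x1 = (1.0 - rng.uniform(size=n)) ** (-1.0 / alpha)   # Pareto(alpha), scale 1
    x2 = (1.0 - rng.uniform(size=n)) ** (-1.0 / alpha)

    var_standalone = np.quantile(x1, q) + np.quantile(x2, q)
    var_pooled = np.quantile(x1 + x2, q)
    print(f"sum of stand-alone VaR_{q}: {var_standalone:,.0f}")
    print(f"VaR_{q} of the pooled loss: {var_pooled:,.0f}")   # typically larger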

________________________________________________________
Giulio Mignola
Intesa Sanpaolo
Challenges in measuring operational risks from loss data

Under the Advanced Measurement Approach of the Basel II Accord, banks are required to measure their total annual operational risk exposure at the 99.9th percentile of the loss distribution. Meeting this measurement standard, given the amount of operational loss data currently available from either internal or external sources, is extremely challenging. Furthermore, difficulties arise in applying the Loss Distribution Approach to compute operational risk exposures, as well as in validating the capital models. Since many of these problems appear insurmountable, a possible way forward is to suggest changes to the regulatory framework that could, at least partially, circumvent these difficulties.

________________________________________________________
Martin Neil
Queen Mary University of London
Using Hybrid Dynamic Bayesian Networks to model Operational Risk in Finance

This paper presents new ideas on using cause-effect modeling, in the form of Hybrid Dynamic Bayesian Networks (HDBNs), to estimate extreme financial losses resulting from operational failures. The presentation focuses on a particularly important loss process, rogue trading, with the aim of demonstrating the advantage of explicit modeling of banking processes and risk culture over purely statistical models derived from actuarial loss data alone. Value at Risk is calculated by applying a new state-of-the-art HDBN algorithm that approximates continuous loss distributions and aggregates across loss types using a process called dynamic discretization. We conclude that the statistical properties of the model have the potential to explain recent large-scale loss events and offer improved means of loss prediction.

________________________________________________________
Tony Peccia
Citigroup
CRO for Citibank Canada
Rethinking Basel II for Operational Risk

The capital charge in most financial institutions is largely determined by actuarial models that provide little insight into the actual risk factors driving operational risk exposure, and it remains an abstraction for most business managers. Operational risk reporting largely consists of reporting losses after the fact, mostly to those who were actively involved in resolving the loss event, accompanied by an abstract capital amount and either aggregated RCSA/KRI information or pages and pages of detailed risk issues that are not actionable for the recipients of the report. So what is to be done?

________________________________________________________
Beatriz Santa Cruz Blanco
BBVA
Corporate Risk Methodologies
Issues in Modelling Tails in Operational Risk

Basel II establishes different methodologies for measuring operational risk under the advanced approaches, the most widely used in the industry being the one based on actuarial models (the Loss Distribution Approach). Under this method, the regulatory capital for operational risk is obtained from the aggregate loss distribution, by convolving the frequency and severity distributions of each business line/risk type pair and accounting for the correlations among pairs recognized by the financial institution. We focus on the modeling of the severity distribution, and in particular on outliers that distort the fit: (a) the empirical information of the institution is poorly captured; and (b) a large extrapolation is forced, producing figures for expected loss and for the high percentiles of the loss distribution that lie outside any economic rationale. We present an alternative to the traditional methodology for modeling outliers. The proposed method can serve as a first approach to a technical assessment of scenarios in which outliers are involved, and it is applicable not only to operational risk in the financial sector but also to other sectors such as energy and aerospace.

________________________________________________________
Anupam Sahay
KeyCorp
Director, Risk Models & Operational Risk, Risk Management
Analytic Approximations for Operational Risk Capital

In the loss distribution approach for operational risk capital modeling, severity and frequency are modeled separately and then combined to obtain the aggregate loss distribution and capital. From the point of view of analytic approximations, the realm of modeling can be divided into four quadrants, based on whether the tail of the severity distribution is light or heavy, and whether the expected frequency is low or high. We present asymptotic approximations for operational risk capital that are relevant in these quadrants. The accuracy of the approximations and their practical usage are discussed.
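A rough illustration of the quadrant idea (a sketch with hypothetical parameters; the approximations discussed in the talk may differ in detail): brute-force LDA simulation is compared with a CLT-style approximation in a light-tailed, high-frequency cell and with the single-loss approximation in a heavy-tailed, lower-frequency cell.

    import numpy as np
    from scipy import stats

    ALPHA = 0.999
    rng = np.random.default_rng(0)

    def simulate_var(freq, severity_sampler, n_years=50_000):
        """Brute-force LDA: simulate annual aggregate losses, read off the quantile."""
        counts = rng.poisson(freq, size=n_years)
        totals = np.array([severity_sampler(k).sum() for k in counts])
        return np.quantile(totals, ALPHA)

    # Light-tailed severity, high frequency (hypothetical cell): CLT-style
    # approximation VaR ~ lam*E[X] + z_alpha*sqrt(lam*E[X^2]), using the
    # compound Poisson moments E[S] = lam*E[X], Var[S] = lam*E[X^2].
    lam, mu, sd = 100.0, 10_000.0, 4_000.0
    ex2 = sd**2 + mu**2
    var_normal = lam * mu + stats.norm.ppf(ALPHA) * np.sqrt(lam * ex2)
    var_sim = simulate_var(lam, lambda k: rng.normal(mu, sd, k).clip(min=0.0))
    print(f"light tail / high frequency: approx {var_normal:,.0f}  simulated {var_sim:,.0f}")

    # Heavy-tailed (lognormal) severity, lower frequency (hypothetical cell):
    # single-loss approximation VaR ~ F^{-1}(1 - (1 - alpha)/lam).
    lam2, m, s = 20.0, 10.0, 2.0
    var_single_loss = stats.lognorm.ppf(1.0 - (1.0 - ALPHA) / lam2, s=s, scale=np.exp(m))
    var_sim2 = simulate_var(lam2, lambda k: rng.lognormal(m, s, k))
    print(f"heavy tail / low frequency: approx {var_single_loss:,.0f}  simulated {var_sim2:,.0f}")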

________________________________________________________
Alberto Suarez
Universidad Autónoma de Madrid
Robust quantification of the exposure to operational risk: Bringing economic sense to economic capital

Operational risk is commonly analyzed in terms of the distribution of aggregate yearly losses. Risk measures can then be defined as statistics of this distribution that focus on the region of extreme losses. Assuming independence among the operational risk events, and between the likelihood that they occur and their magnitude, separate models are built for the frequency and for the severity of the losses. These models are then combined to estimate the distribution of aggregate losses. While the detailed form of the frequency distribution does not significantly affect the risk analysis, the choice of model for the severity often has a significant impact on operational risk measures. For heavy-tailed data these measures are dominated by extreme losses, whose probability cannot be reliably extrapolated from the available data. With limited empirical evidence, it is difficult to distinguish among alternative models that produce very different values of the risk measures. Furthermore, the estimates obtained can be unstable and overly sensitive to the presence or absence of single extreme events. Setting a bound on the maximum amount that can be lost in a single event reduces the dependence on the distributional assumptions and improves the robustness and stability of the risk measures, while preserving their sensitivity to changes in the risk profile. This bound should be determined by an expert on the basis of economic arguments and validated by the regulator, so that it can be used as a control parameter in the risk analysis.
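A minimal sketch of the capping idea under simplified assumptions (a single Poisson-lognormal cell; the bound of 100M per event is hypothetical): each single-event loss is truncated at the bound before aggregation, and the 99.9% quantile is recomputed.

    import numpy as np

    def annual_var(freq, mu, sigma, cap=np.inf, alpha=0.999, n_years=100_000, seed=0):
        """99.9% quantile of simulated annual aggregate losses for a Poisson-lognormal
        cell, with each single-event loss capped at `cap` (np.inf = no bound)."""
        rng = np.random.default_rng(seed)
        counts = rng.poisson(freq, size=n_years)
        totals = np.array([np.minimum(rng.lognormal(mu, sigma, k), cap).sum() for k in counts])
        return np.quantile(totals, alpha)

    # Hypothetical cell and a hypothetical expert-validated bound of 100M per event.
    print("uncapped :", f"{annual_var(25, 10.0, 2.2):,.0f}")
    print("capped   :", f"{annual_var(25, 10.0, 2.2, cap=1e8):,.0f}")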
