Keynote Abstracts
John Birge (University of Chicago, Booth School of Business)
Managing risk with operational and financial instruments
Due to various market imperfections, firms have an incentive to manage
idiosyncratic risk. Operational flexibility and financial tools offer complementary
mechanisms for this process. This talk will discuss the motivation for different
forms of risk management, relative advantages of operational and financial
approaches, and a model for constructing an optimal risk management portfolio.
Endre Boros (Rutgers University)
How to mitigate the risk of blowing up and the cost of being too cautious?
Finding ways to intercept illicit nuclear materials and weapons destined
for the U.S. via the maritime transportation system is an exceedingly difficult
task. Today, only a small percentage of containers arriving at U.S. ports
are inspected. Current technology provides highly uncertain detection, with
a high risk both of failing to detect hidden nuclear material and of raising costly
false alarms. We present a new mathematical model for creating an optimal
inspection policy based on a set of noisy and possibly stochastically dependent
sensor technologies.
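The abstract leaves the model unstated, but the trade-off in the title can be illustrated with a small Bayesian sketch in Python: fuse noisy sensor readings into a posterior probability that a container is illicit, then inspect only when the expected cost of a miss outweighs the cost of an inspection. The prior, sensor error rates, and costs below are illustrative assumptions, and the conditional-independence simplification deliberately ignores the sensor dependence that the talk's model is designed to handle.

    import numpy as np

    # Illustrative assumptions (not from the talk): prior probability that a
    # container is illicit, per-sensor detection/false-alarm rates, and the
    # costs of a missed detection versus a needless inspection.
    PRIOR = 1e-4
    P_HIT = np.array([0.90, 0.80, 0.95])   # P(alarm | illicit) per sensor
    P_FA = np.array([0.05, 0.10, 0.02])    # P(alarm | benign) per sensor
    COST_MISS, COST_INSPECT = 1e9, 1e4

    def posterior(alarms):
        """P(illicit | alarms), assuming conditionally independent sensors."""
        like_bad = np.prod(np.where(alarms, P_HIT, 1 - P_HIT))
        like_good = np.prod(np.where(alarms, P_FA, 1 - P_FA))
        return PRIOR * like_bad / (PRIOR * like_bad + (1 - PRIOR) * like_good)

    def should_inspect(alarms):
        """Inspect when the expected miss cost exceeds the inspection cost."""
        p = posterior(alarms)
        return p * COST_MISS > COST_INSPECT, p

    for pattern in [(0, 0, 0), (1, 0, 0), (1, 1, 0), (1, 1, 1)]:
        inspect, p = should_inspect(np.array(pattern, dtype=bool))
        print(pattern, f"posterior={p:.2e}", "inspect" if inspect else "pass")

Raising either cost reshapes the policy directly: a larger COST_MISS lowers the evidence needed to trigger an inspection, while a larger COST_INSPECT raises it.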
Matt Davison (University of Western Ontario)
Energy Storage: A problem at the intersection of quant finance, optimization,
and energy policy
New renewable energy technologies are a major part of "green shift"
plans to decrease humanity's carbon footprint. While wind and solar power
hold the promise of clean energy, they pose a challenge to the operation of
modern electricity networks because their output fluctuates both dramatically
and unpredictably. Energy storage represents one solution to these fluctuations.
As with wind turbines and solar panels, the economics of storage require careful
thought.
This talk begins with a survey of the technological, economic, and regulatory
framework for renewable energies and storage technologies. Next, an instructive
toy problem illustrating the interaction of finance, optimization, and energy
policy advertised in the title is developed and analyzed. With our intuition
firmly established, we discuss more sophisticated extensions to this model
in the realms of pump storage hydro and natural gas storage. The talk concludes
with some remarks about the policy impact of this research and its extensions.
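To give a flavour of such a toy problem, the sketch below (every number an assumption, not taken from the talk) values a small storage device by dynamic programming: electricity prices follow a two-state Markov chain, and each period the store may buy a unit, sell a unit at round-trip efficiency, or hold.

    import numpy as np

    # Illustrative toy problem (assumed numbers): price is a two-state Markov
    # chain; a store of capacity CAP trades one unit per period, with
    # round-trip efficiency ETA applied on discharge.
    PRICES = np.array([20.0, 60.0])           # low / high price states
    P = np.array([[0.8, 0.2], [0.3, 0.7]])    # price-state transition matrix
    CAP, ETA, T = 4, 0.85, 48                 # capacity, efficiency, horizon

    # V[s, k] = optimal expected profit-to-go in price state s, k units stored.
    V = np.zeros((2, CAP + 1))
    for _ in range(T):
        V_new = np.empty_like(V)
        for s in range(2):
            cont = P[s] @ V                   # expected continuation by level
            for k in range(CAP + 1):
                hold = cont[k]
                buy = -PRICES[s] + cont[k + 1] if k < CAP else -np.inf
                sell = ETA * PRICES[s] + cont[k - 1] if k > 0 else -np.inf
                V_new[s, k] = max(hold, buy, sell)
        V = V_new

    print("value of an empty store, low-price state :", round(V[0, 0], 2))
    print("value of a full store, high-price state  :", round(V[1, CAP], 2))

The optimal policy that emerges is the intuitive one: charge in the low-price state while capacity remains, discharge in the high-price state, with the option value of stored energy falling as the horizon shortens.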
John Hull (Rotman School of Management, University
of Toronto)
CVA and wrong-way risk
This paper proposes a simple model for incorporating wrong-way and right-way
risk into CVA (credit value adjustment) calculations. These are the calculations
made by a dealer to determine the reduction in the value of its derivatives
portfolio because of the possibility of a counterparty default. The model
relates the hazard rate of the counterparty to the value of the transactions
outstanding between the dealer and the counterparty. Numerical results for
portfolios of 25 instruments dependent on five underlying market variables
are presented. The paper finds that wrong-way and right-way risk have a significant
effect on the Greek letters of CVA as well as on CVA itself. It also finds
that the nature of the effect depends on the collateral arrangements.
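The abstract does not reproduce the model's functional form, so the sketch below uses one common way of encoding wrong-way risk, letting the counterparty hazard rate grow with the dealer's exposure, h_t = exp(a + b V_t), and estimates CVA by Monte Carlo. The driftless Brownian exposure and all parameter values are assumptions for illustration, not the paper's specification or calibration.

    import numpy as np

    rng = np.random.default_rng(0)

    # Assumed setup: portfolio value V follows a driftless Brownian motion and
    # the hazard rate is h_t = exp(a + b*V_t); b > 0 creates wrong-way risk
    # because default becomes likelier exactly when exposure is large.
    T, steps, paths = 5.0, 60, 20_000
    dt = T / steps
    a, b, sigma, lgd = np.log(0.02), 0.05, 8.0, 0.6  # ~2%/yr hazard, 60% LGD

    def cva(b_coef):
        V = np.zeros(paths)                      # portfolio mark-to-market
        survival = np.ones(paths)                # survival probability so far
        total = np.zeros(paths)
        for _ in range(steps):
            V += sigma * np.sqrt(dt) * rng.standard_normal(paths)
            h = np.exp(a + b_coef * V)           # state-dependent hazard
            pd = survival * (1.0 - np.exp(-h * dt))  # default prob. this step
            total += lgd * np.maximum(V, 0.0) * pd   # discounting omitted
            survival -= pd
        return total.mean()

    print("CVA, independent default (b = 0):", round(cva(0.0), 4))
    print("CVA, wrong-way risk      (b > 0):", round(cva(b), 4))

Right-way risk is the mirror image: a negative b makes default least likely when exposure peaks, pulling CVA below the independent case.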
David Olson (University of Nebraska-Lincoln)
Broader Perspectives of Risk Management
The 21st century has seen tremendous investment uncertainty, with at least
three major disruptive episodes: the 2001 dot-com bubble, the 2008 banking
crisis, and the current debt crises in Iceland, Portugal, Ireland, Greece, Spain,
Italy, and the United States. Investors seek to be scientific and to base their
decisions on data. But the only data we have come from the past, and these three
episodes have produced radically different data regimes. Value-at-risk calculations
in the past decade have been based upon historical data, which demonstrated
apparent normality. Past data were also used to identify correlations, allowing
generation of combined portfolios with expected compensating risks. These
assumptions have proven to be problematic. This study briefly discusses economic
philosophies of risk management, reviews portfolio models reflecting tradeoffs
between expected return and risk, and discusses the risk of basing decisions
solely upon historical data.
This is a joint work with Desheng Wu (University of Toronto).
Thomas Salisbury (York University)
Planning for retirement: sustainability versus legacy
When planning for retirement, individuals face the risk of living longer
than expected, and therefore running out of the capital needed to sustain
their income. There are ways to insure against this, e.g. by purchasing annuities.
But these have a downside: a true annuity offers mortality credits but leaves
no legacy behind for one's heirs. In other words, retirees confront trading
off their own interests (i.e. sustainability) against their kids' interests (i.e. legacy).
I will discuss a pair of risk metrics that are useful in making decisions
about this tradeoff: the Retirement Sustainability Quotient (RSQ) and Expected
Financial Legacy (EFL). These are expectations, one focused on the current
generation and the other on the next. They can be analyzed in simple models
via PDEs. This allows retirement decisions to be viewed as selecting a point
on an RSQ/EFL efficient frontier. This talk reports on joint work with Huaxiong
Huang, Moshe Milevsky, and Faisal Habib.
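The talk analyzes these quantities via PDEs; for orientation, both metrics are simply expectations and can be estimated by a quick Monte Carlo, which is how the sketch below defines them: RSQ as the probability, over random market returns and a random lifetime, that the portfolio sustains the desired income, and EFL as the expected wealth remaining at death. The lognormal market, constant withdrawal, and exponential lifetime are illustrative assumptions standing in for proper market and mortality models.

    import numpy as np

    rng = np.random.default_rng(1)

    # Assumed inputs: lognormal annual real returns, a constant real
    # withdrawal, and an exponential remaining lifetime in place of a
    # mortality table.
    W0, spend = 1_000_000.0, 55_000.0      # initial wealth, annual withdrawal
    mu, sigma = 0.05, 0.12                 # real return parameters
    mean_life, paths = 22.0, 20_000        # expected retirement length, sims

    lifetimes = rng.exponential(mean_life, paths).astype(int) + 1
    legacy = np.zeros(paths)
    ruined = np.zeros(paths, dtype=bool)

    for i in range(paths):
        w = W0
        for _ in range(lifetimes[i]):
            ret = np.exp(mu - 0.5 * sigma**2 + sigma * rng.standard_normal())
            w = (w - spend) * ret          # withdraw, then let the market act
            if w <= 0.0:
                ruined[i] = True
                break
        legacy[i] = max(w, 0.0)

    rsq = 1.0 - ruined.mean()              # Retirement Sustainability Quotient
    efl = legacy.mean()                    # Expected Financial Legacy
    print(f"RSQ = {rsq:.3f}   EFL = {efl:,.0f}")

Sweeping the withdrawal rate (or an annuitized fraction of wealth) and plotting the resulting (RSQ, EFL) pairs traces out exactly the kind of efficient frontier the talk describes.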
David Simchi-Levi (Massachusetts Institute of
Technology)
Mitigating Business Risks from the Known-Unknown to the Unknown-Unknown
In the last few years we have seen an increase in the levels of risk and volatility
faced by enterprises. Some recent examples include the unrest in the Middle
East, inflation in China, the Japanese tsunami disruption, the Iceland volcano
eruption, oil price volatility, product recalls and huge fluctuations in financial
markets. This requires business executives to systematically address business
risks, both the known-unknown operational risks and the unknown-unknown
extreme risks. Unfortunately, there is very little that can be done after
a disaster has occurred. Companies therefore need to devote more attention
to planning their operations so they can better respond to mega disasters
as well as more mundane operational problems. Luckily, there are proven ways
to analyze the different sources of risks, assess the impact on the business
and build various mitigation measures into the business strategy.
Rudi Zagst (TUM, Germany)
The Crash-NIG Copula Model - Pricing of CDOs under changing market conditions
It is well known that one-factor copula models are very useful for risk
management and measurement applications involving the generation of scenarios
for the complete universe of risk factors and the inclusion of CDO structures
in a portfolio context. For this objective, it is necessary to have a simple
and fast model that is also consistent with the scenario simulation framework.
We present three extensions of the NIG one-factor copula model which jointly
have not been considered so far: (i) tranches with different maturities modeled
in a consistent way, (ii) a portfolio with different rating buckets, relaxing
the assumption of a large homogeneous portfolio, and (iii) different correlation
regimes. The regime-switching component of the proposed Crash-NIG copula model
is especially important in view of the recent credit crisis. We also introduce
liquidity premiums into the Crash-NIG copula model and show that the credit
crisis was substantially driven by liquidity effects.
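The Crash-NIG machinery is beyond a short sketch, but the effect of a correlation regime switch on the loss tail can be seen with a deliberately simplified stand-in: a one-factor Gaussian (rather than NIG) copula evaluated under a calm and a crash correlation. The portfolio size, default probability, and regime correlations below are assumptions.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(2)

    # Simplified stand-in (assumed numbers): one-factor Gaussian copula on a
    # homogeneous portfolio, with correlation depending on the regime. The
    # paper's model instead uses NIG factors and richer structure.
    n_names, pd_1y, sims = 125, 0.02, 20_000
    thresh = norm.ppf(pd_1y)                        # default threshold

    def loss_fractions(rho):
        Z = rng.standard_normal((sims, 1))          # common factor
        eps = rng.standard_normal((sims, n_names))  # idiosyncratic factors
        X = np.sqrt(rho) * Z + np.sqrt(1.0 - rho) * eps
        return (X < thresh).mean(axis=1)            # default fraction per sim

    calm, crash = loss_fractions(0.10), loss_fractions(0.60)
    for q in (0.95, 0.99, 0.999):
        print(f"q={q}:  calm VaR={np.quantile(calm, q):.3f}   "
              f"crash VaR={np.quantile(crash, q):.3f}")

Even with identical marginal default probabilities, the crash regime concentrates losses, so senior-tranche risk that is negligible in the calm regime becomes material; this is the effect the regime-switching component is built to capture.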
-------------------------------------------------------------------------
Talk Abstracts
Yuri Lawryshyn (University of Toronto)
Valuing Risky Projects Based on Managerial Cash Flow Estimates: A Real
Options Approach
Standard methods for valuing project alternatives are based on the Discounted
Cash Flow (DCF) approach, where the weighted average cost of capital (WACC)
is most commonly used as the discount factor. The DCF approach, by its nature,
assumes that there is no managerial flexibility or optionality embedded in
the project, and that the financial risk profile of the cash-flows matches
that of the company's average project or investment. Real options analysis
(ROA) has been recognized as a superior method for valuing managerial flexibility;
it can be utilized both to estimate the value of that flexibility
and to account for the risk profiles associated with cash-flow estimates.
However, the survey literature has shown that the adoption of ROA as a practitioner's
tool has stagnated at a usage rate of approximately 10%, mostly because of
the difficulty associated with practical implementation. We propose an approach
which utilizes cash-flow estimates from managers as key inputs and results
in project value cash-flows that exactly match arbitrary estimates. We achieve
this through the introduction of an observable, but not tradable, market sector
indicator process which drives the project's cash-flow, rather than modeling
the project value directly. Our framework can be used to value managerial
flexibilities and obtain hedges in an easy-to-implement manner for a variety
of real options such as entry/exit, multistage, abandonment, etc. Moreover,
our approach to ROA provides co-dependence between cash-flows, is consistent
with financial theory, requires minimal subjective input of model parameters,
and bridges the gap between theoretical ROA frameworks and practice.
This is a joint work with Sebastian Jaimungal (University of Toronto).
Alexander Melnikov (University of Alberta)
Quantile risk management of equity-linked life insurance contracts
with stochastic interest rate
The talk studies the problem of pricing equity-linked life insurance contracts
in a two-factor jump-diffusion financial market with stochastic interest rate,
and focuses on the valuation of insurance contracts with stochastic guarantee.
The contracts under consideration are based on two risky assets satisfying
a two-factor jump-diffusion model: one asset is responsible for future gains,
while the other serves as a stochastic guarantee. As most life insurance products
are long-term contracts, it is more practical to consider them in a stochastic
interest rate environment rather than with a constant interest rate. In our setting,
the stochastic interest rate is also described by a jump-diffusion model. The quantile
hedging technique is exploited to price such finance/insurance contracts under
initial capital constraints. Explicit formulas for both the price of the contracts
and the survival probability are obtained. Our results are illustrated by a
numerical example based on the Russell 2000 and S&P 500 indexes.
Pablo Olivares (Ryerson University)
Computing OR-risk measures: recent techniques and open problems
We discuss some recent advances and open problems in the Loss Distribution
Approach in the context of operational risk and credit risk. Among these
topics, we review the modeling of severity, frequency, and dependence between
different business lines and types of events. We discuss asymptotic methods
to compute probabilities far in the tail, and hence Value-at-Risk and expected
shortfall. We also address the use of an Extreme Value approach in modeling
the loss distribution and how to combine external and internal data.
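As a concrete anchor for the Loss Distribution Approach discussed here, the sketch below builds a compound Poisson annual loss for a single business line (Poisson frequency, lognormal severity) and reads Value-at-Risk and expected shortfall off the simulated tail. Frequency and severity parameters are illustrative assumptions; the methods in the talk replace this brute-force simulation in the far tail.

    import numpy as np

    rng = np.random.default_rng(3)

    # Single-cell LDA (assumed parameters): annual loss is a sum of N
    # lognormal severities with N ~ Poisson(lam).
    lam, mu, sigma, years = 25.0, 10.0, 2.0, 100_000

    counts = rng.poisson(lam, years)
    annual = np.array([rng.lognormal(mu, sigma, n).sum() for n in counts])

    for q in (0.95, 0.99, 0.999):
        var = np.quantile(annual, q)
        es = annual[annual > var].mean()   # expected shortfall beyond VaR
        print(f"q={q}:  VaR={var:,.0f}   ES={es:,.0f}")

The instability of the 99.9% figures across reruns is precisely why the asymptotic and Extreme Value techniques surveyed in the talk matter.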
Liyan Yang (Rotman School of Management, University
of Toronto)
Differential Access to Price Information in Financial Markets
Recently, exchanges have been supplementing their tape revenue by directly
selling trade and quote data to some traders. We analyze how this practice
affects the cost of capital, market liquidity and welfare by studying a two-period
economy in which rational traders can purchase information about past transactions
from the exchanges. In an economy in which traders are endowed with private
signals about asset value, allowing the exchange to sell price data increases
the cost of capital and worsens market liquidity relative to a world in which
all traders freely observe previous prices. However, selling price data reduces
the cost of capital and increases liquidity relative to an economy in which
no traders can observe price information. If traders have to decide whether
to purchase private signals, as well as whether to purchase price data, selling
price data can cause traders to reduce their effort to gather information
on the underlying asset. This secondary effect may increase the equilibrium
cost of capital, but paradoxically it results in greater liquidity. Our welfare
analysis also shows that as more previous price information is present in
the market, noise traders are made better-off and speculative rational traders
are made worse-off. In our view, allowing exchanges to sell price information
is undesirable because it generally reduces efficiency and market quality.
We believe that the practice should be restricted.
-------------------------------------------------------------------------
Session Abstracts
Nabeel Butt (University of Western Ontario)
A tree-based approximation for a multidimensional transaction cost
model
The problem of optimizing portfolios in the presence of transaction costs
has attracted significant interest. In this talk we consider a discrete-time
formulation of a fixed-cost transaction cost model. We examine the applicability
of numerical tree approximation as a simple alternative approach to solving
transaction cost problems. The approach can handle models in arbitrary
dimensions and non-standard geometries. Using the exact probability model
in the context of dynamic programming can be computationally intensive, since it
may involve root-finding or the optimization of complicated integrals. We provide
a computational study of the tree-based method on a simple fixed transaction cost
model and highlight its many advantages.
This is a joint work with Matt Davison (University of Western Ontario).
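As a minimal illustration of the tree idea, under simplifying assumptions of our own rather than the authors' model: a single asset on a recombining binomial tree, a discrete grid of admissible holdings, a quadratic penalty for deviating from an assumed price-dependent target, and a fixed cost per trade. The dynamic program below runs backward through the tree, and a no-trade region emerges from comparing the cost of staying put with the best rebalance plus the fixed cost.

    import numpy as np

    # All numbers assumed for illustration.
    T, u, p = 20, 1.05, 0.5              # tree depth, up-factor, up-probability
    S0, K, penalty = 100.0, 0.30, 0.02   # spot, fixed trade cost, miss penalty
    grid = np.arange(11, dtype=float)    # admissible holdings: 0..10 shares

    def price(t, j):                     # price after j up-moves at step t
        return S0 * u ** (2 * j - t)

    def target(s):                       # assumed target: increases with price
        return np.clip((s - 90.0) / 4.0, 0.0, 10.0)

    V = np.zeros((T + 2, grid.size))     # V[j, k] = cost-to-go, node j, holding k
    for t in range(T - 1, -1, -1):
        V_new = np.zeros((T + 2, grid.size))
        for j in range(t + 1):
            cont = p * V[j + 1] + (1 - p) * V[j]          # expected cost-to-go
            miss = penalty * (grid - target(price(t, j))) ** 2
            stay = miss + cont                            # keep current holding
            trade = (miss + cont).min() + K               # jump to best holding
            V_new[j] = np.minimum(stay, trade)
        V = V_new

    print("cost-to-go at t=0 by initial holding:")
    print(np.round(V[0], 3))

Because every node stores a full vector of holdings, the same backward pass extends directly to several assets or irregular holding grids, which is the flexibility the talk emphasizes.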
Ryan Donnelly (University of Toronto)
Pricing Guaranteed Withdrawal Benefits under Local and Stochastic Volatility
We investigate a particular class of guaranteed withdrawal benefit products
where the underlying fund is a mixed fund driven by two classes of diffusive
processes: local volatility and stochastic volatility. By rewriting the guarantee
as an Asian option and applying dimensional reduction techniques, the problem
is written in terms of a two-dimensional PDE. We then provide an efficient
ADI algorithm, whereby the correlation terms are treated explicitly while
other operators are split, to solve the PDE, and we demonstrate how the various
model parameters affect the valuation of this complex product. Finally, since
these guarantees are very long-term, we demonstrate how stochastic interest
rates can easily be incorporated.
This is a joint work with Sebastian Jaimungal (University of Toronto) and
Dmitri Rubisov (BMO Capital Markets).
Kevin D. Ferreira (University of Toronto)
The Marketing of Innovative Products with Adoption Network Effects
Throughout history there have been many examples of innovations that "for
some reason" were incredibly successful. On the flip side, there are also numerous
examples of innovations that may be deemed incredible failures. New products are
an important source of sales and profit for a firm; furthermore, new product
developments typically involve large financial commitments. As a result, forecasting
the acceptance of a new product would be a significant advantage for any firm.
However, this task can often be difficult, as is evident from the number
of failed products that have been introduced to market. We present a word-of-mouth
diffusion model that aims to shed some light on these conditions
and to provide a tool to aid in planning the launch of a new innovation.
The model captures the process by which customers become aware of a new
technology via a diffusion function, as well as the adoption decision at the micro level.
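The abstract leaves the functional form unspecified; as a reference point, the sketch below simulates the classic Bass word-of-mouth diffusion, in which the adoption hazard mixes external influence p with internal influence q proportional to the installed base. The talk's model layers an awareness stage and micro-level adoption decisions on top of this kind of dynamic; p, q, and the market size are illustrative assumptions.

    import numpy as np

    # Classic Bass diffusion (assumed parameters): the hazard of adopting at
    # time t is p + q * F(t), where F is the fraction already adopted.
    p, q, M, T = 0.03, 0.38, 10_000, 30   # innovation, imitation, market, years

    adopted = np.zeros(T + 1)
    for t in range(T):
        F = adopted[t] / M
        adopted[t + 1] = adopted[t] + (p + q * F) * (M - adopted[t])

    peak = int(np.argmax(np.diff(adopted)))
    print(f"peak adoption in year {peak}; adopters by year 10: {adopted[10]:,.0f}")

With q much larger than p, early sales badly understate eventual acceptance, which is one reason naive extrapolation from launch data misclassifies both future hits and future failures.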
Meng Han (University of Toronto)
Approximations to Loss Probabilities in Credit Portfolios
Credit risk analysis and management at the portfolio level are challenging
problems for financial institutions due to their portfolios' large size, heterogeneity
and complex correlation structure. The conditional-independence framework
is widely used to calculate loss probabilities for credit portfolios. The
existing computational approaches within this framework fall into two categories:
(1) simulation-based approximations and (2) asymptotic approximations. The
simulation-based approximations often involve a two-level Monte Carlo method,
which is extremely time-consuming, while the asymptotic approximations, which
are typically based on the Law of Large Numbers (LLN), are not accurate enough
for tail probabilities, especially for heterogeneous portfolios. We give a
more accurate asymptotic approximation based on the Central Limit Theorem
(CLT), and we discuss when it can be applied. To further increase accuracy,
we also propose a hybrid approximation, which combines the simulation-based
approximation and the asymptotic approximation. We test our approximations
with some artificial and real portfolios. Numerical examples show that, for
a similar computational cost, the CLT approximation is more accurate than
the LLN approximation for both homogeneous and heterogeneous portfolios, while
the hybrid approximation is even more accurate than the CLT approximation.
Moreover, the hybrid approximation significantly reduces the computing time
for comparable accuracy compared to simulation-based approximations.
This is a joint work with Alex Kreinin (Algorithmics) and Ken Jackson
(University of Toronto).
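In a one-factor Gaussian conditional-independence model, the LLN and CLT approximations can be written down in a few lines: conditional on the systematic factor the defaults are independent Bernoullis, so the conditional loss is approximately normal, and the unconditional tail probability is the factor-average of a normal tail. The heterogeneous portfolio below is randomly generated for illustration, and the paper's hybrid method is not reproduced.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(4)

    # Assumed one-factor model: name i defaults when
    # sqrt(rho)*Z + sqrt(1-rho)*eps_i < norm.ppf(pd_i).
    n, rho = 200, 0.2
    pd = rng.uniform(0.005, 0.05, n)            # heterogeneous default probs
    w = rng.uniform(0.5, 1.5, n); w /= w.sum()  # heterogeneous weights
    c = norm.ppf(pd)

    z = np.linspace(-6, 6, 2001)                # quadrature grid for the factor
    wt = norm.pdf(z); wt /= wt.sum()            # normalized factor weights
    pz = norm.cdf((c[:, None] - np.sqrt(rho) * z) / np.sqrt(1 - rho))
    m = w @ pz                                  # conditional mean loss
    v = (w**2) @ (pz * (1 - pz))                # conditional loss variance

    for x in (0.05, 0.08, 0.12):
        lln = wt @ (m > x)                          # loss == conditional mean
        clt = wt @ norm.sf((x - m) / np.sqrt(v))    # normal around the mean
        print(f"P(L > {x:.2f}):  LLN = {lln:.4f}   CLT = {clt:.4f}")

The LLN answer is a step function of the factor, so it misstates tail probabilities whenever the conditional variance matters, exactly the heterogeneous-portfolio effect the CLT correction addresses.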
Sean Jewell (University of Toronto)
Stochastic Pairs Trading through Cointegration
Historically, simple pairs trading strategies have been a prevalent contrarian
indicator, and many practitioners have applied both fundamental and basic statistical
approaches. Because correlation is inadequate as a stable indicator,
we borrow the notion of cointegration from econometrics to assemble stable
baskets of securities. We retain the essential concept of pairs trading, mean
reversion, and rebuild a strategy suggested by Kim et al. [1]: from a basket of
cointegrated securities we fit a mean-reverting Ornstein-Uhlenbeck process and
implement dynamic-allocation methods to maximize an expected utility function.
[1] Kim, S.-J., Primbs, J., and Boyd, S. Dynamic spread trading. Stanford
University, June 2008.
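A minimal sketch of the estimation step, assuming the cointegrating weights are already in hand: a discretely sampled Ornstein-Uhlenbeck spread satisfies an exact AR(1) recursion, so ordinary least squares recovers the speed, level, and noise parameters. The simulated spread below stands in for a real cointegrated basket.

    import numpy as np

    rng = np.random.default_rng(5)

    # Simulate a synthetic OU spread: dX = kappa*(theta - X) dt + sigma dW.
    kappa, theta, sigma, dt, n = 4.0, 0.0, 0.3, 1 / 252, 5000
    x = np.empty(n); x[0] = 0.5
    for t in range(n - 1):
        x[t + 1] = (x[t] + kappa * (theta - x[t]) * dt
                    + sigma * np.sqrt(dt) * rng.standard_normal())

    # Exact discretization: X_{t+1} = a + b*X_t + eps, with b = exp(-kappa*dt).
    X, y = x[:-1], x[1:]
    b, a = np.polyfit(X, y, 1)                  # OLS slope and intercept
    resid = y - (a + b * X)
    kappa_hat = -np.log(b) / dt
    theta_hat = a / (1 - b)
    sigma_hat = resid.std() * np.sqrt(-2 * np.log(b) / (dt * (1 - b**2)))
    print(f"kappa={kappa_hat:.2f}  theta={theta_hat:.3f}  sigma={sigma_hat:.3f}")

The fitted kappa sets the expected holding period of a trade (the mean-reversion half-life is ln(2)/kappa), which is what the dynamic-allocation layer then optimizes against.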
Michael Jong Kim (University of Toronto)
A Two-State Regime Switching Model with General Sojourn Time Distributions
Regime-switching models in financial time series typically follow an underlying
Markov process. As a result, the sojourn times in each regime follow an exponential
(in continuous time) or geometric (in discrete time) distribution. In real
financial time series, this restrictive class of distributions may be unrealistic.
In this paper we propose a two-state, continuous-time regime switching model
with general sojourn time distributions. Multivariate return data that are stochastically
related to the regime process are available at discrete time points. Using
the EM algorithm, it is shown that the maximizers of the pseudo-likelihood function
have explicit closed-form expressions. Furthermore, forecasting and value
at risk (VaR) evaluation can be done analytically. The results developed in
the paper are illustrated on real carbon emission financial data.
This is a joint work with Desheng Dash Wu (University of Toronto) and
Luis Seco (University of Toronto).
Zheng Li (University of Toronto)
On Estimating Large Covariance Matrices
In statistical finance, data sets are becoming much larger, with more parameters.
Old statistical methods of handling data sets are no longer reasonable in many
cases, as the increased number of parameters makes standard covariance estimates
converge far too slowly. This talk concerns itself with certain methods of
estimating the covariance matrix more efficiently, and with the finance
applications of these methods.
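One standard remedy in this regime is Ledoit-Wolf shrinkage, which pulls the sample covariance toward a structured target with a data-driven weight; the comparison below on synthetic returns, with the number of assets close to the number of observations, is a minimal sketch of this family of methods (the talk's own estimators may differ).

    import numpy as np
    from sklearn.covariance import LedoitWolf, empirical_covariance

    rng = np.random.default_rng(6)

    # Synthetic returns: p assets and only n observations, the setting in
    # which the sample covariance converges too slowly. Ground truth assumed.
    p, n = 100, 150
    true_cov = np.diag(rng.uniform(0.5, 2.0, p))
    X = rng.multivariate_normal(np.zeros(p), true_cov, n)

    sample = empirical_covariance(X)
    lw = LedoitWolf().fit(X)

    def err(est):                              # Frobenius-norm estimation error
        return np.linalg.norm(est - true_cov)

    print(f"sample covariance error  : {err(sample):.2f}")
    print(f"Ledoit-Wolf error        : {err(lw.covariance_):.2f} "
          f"(shrinkage weight {lw.shrinkage_:.2f})")

The gain matters downstream: portfolio weights involve the inverse covariance, and inverting a poorly conditioned sample estimate amplifies its errors dramatically.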
Melissa Mielkie (University of Western Ontario)
Dynamic Hedging in a Market Driven by Regime-Switching Volatility
Much work on pricing and hedging options has used either simple constant
volatility stock price models or stochastic volatility models where volatility
can take any positive value. Observation of real market data suggests that
volatility, while stochastic, is well modeled by moves between a finite number
(often just two) states. We propose that the transitional probabilities of
volatility are given by an N-state Markov model, and that the actual jumps
between volatility regimes are driven by independent Poisson jump processes.
Heston's stochastic volatility technique of using an additional "hedging"
derivative is employed. Practical problems with this approach for the two
volatility state case lead us to examine several different hedging strategies
to determine their profitability. We consider the effects of an option going
too far in- or out-of-the-money on our hedging strategies, and show how the
market price of volatility risk is necessary to price an option in a market
driven by regime-switching volatility.
This is a joint work with Matt Davison (University of Western Ontario).
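The two-state case the talk focuses on is straightforward to simulate: volatility jumps between a calm and a turbulent level at independent Poisson times, and the stock follows a geometric Brownian motion with whichever volatility is active. The parameters below are illustrative assumptions; the talk's hedging analysis builds on paths of exactly this kind.

    import numpy as np

    rng = np.random.default_rng(7)

    # Assumed two-state regime-switching volatility.
    sig = np.array([0.15, 0.45])     # volatility in each regime
    lam = np.array([1.0, 4.0])       # switching intensity out of each regime
    mu, S0, T, steps = 0.05, 100.0, 1.0, 252
    dt = T / steps

    def realized_vol():
        s, regime, logs = S0, 0, [np.log(S0)]
        for _ in range(steps):
            if rng.random() < lam[regime] * dt:   # Poisson switch this step
                regime = 1 - regime
            z = rng.standard_normal()
            s *= np.exp((mu - 0.5 * sig[regime]**2) * dt
                        + sig[regime] * np.sqrt(dt) * z)
            logs.append(np.log(s))
        r = np.diff(logs)
        return r.std() * np.sqrt(steps / T)       # annualized realized vol

    vols = np.array([realized_vol() for _ in range(2000)])
    print(f"realized vol: mean={vols.mean():.3f}, 5-95% range="
          f"[{np.quantile(vols, 0.05):.3f}, {np.quantile(vols, 0.95):.3f}]")

A Black-Scholes hedge run at either single volatility is systematically wrong on such paths, which is what motivates both the extra hedging derivative and the market price of volatility risk in the talk.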
Mike Pavlin (University of Toronto)
Corporate Payout Policy, Cash Savings, and The Cost of Consistency: Evidence
from a Structural Estimation
We develop a dynamic model in which firms choose their optimal financing,
investment, dividends, and cash holdings while facing costly equity issuance,
debt and capital adjustment costs, and taxed interest on cash balances. We
extend this base-case model to capture the effect of a manager, who perceives
a cost to cutting payout. Applying simulated method of moments (SMM) to the
dynamic model we infer that the magnitude of this downward adjustment cost
accounts for an equity value loss of approximately 7% in US firms. Results
include payout smoothing, which leads to increased accumulation of excess cash,
and larger estimated payout-consistency costs for firms that have more dispersed
analyst forecasts, compensate their CEOs with low pay-performance packages,
and have larger institutional holdings.
Jason Ricci (University of Toronto)
Self-Exciting Marked Point Processes for Algorithmic Trading
Algorithmic Trading (AT) and High Frequency (HF) trading, which are responsible
for over 70% of US equity trading volume, have greatly changed the microstructure
dynamics of tick-by-tick stock data.
Recently, self-exciting processes have been used to model trading activity
at high frequencies. Such processes can account for the clustering of intensity
of trades and the feedback effect which trading induces. Here, we use a multi-factor
Hawkes process to model the limit-order book dynamics and study the optimal
control problem for a trader who places limit buy and sell orders in a limit
order book with a stochastic fill-rate function. Asymptotic expansions in
the level of risk aversion lead to closed-form and intuitive results that
are also adapted to the state of the market.
This is a joint work with Alvaro Cartea (University Carlos III) and Sebastian
Jaimungal (University of Toronto).
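The self-exciting ingredient can be reproduced in a few lines with Ogata's thinning algorithm, shown below for a one-dimensional exponential-kernel Hawkes process; the talk's model is multi-factor and embedded in a limit order book, and the baseline, excitation, and decay values here are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(8)

    # Exponential-kernel Hawkes process (assumed parameters; stationary when
    # alpha < beta): lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta*(t - t_i)).
    mu, alpha, beta, T = 1.0, 0.8, 1.2, 1000.0

    def simulate_hawkes():
        """Ogata thinning: between events the intensity only decays, so its
        current value bounds the intensity until the next proposal."""
        t, lam, events = 0.0, mu, []
        while True:
            M = lam
            w = rng.exponential(1.0 / M)                 # proposed waiting time
            lam = mu + (lam - mu) * np.exp(-beta * w)    # decayed intensity
            t += w
            if t > T:
                return np.array(events)
            if rng.random() * M <= lam:                  # accept w.p. lam / M
                events.append(t)
                lam += alpha                             # each event self-excites

    ev = simulate_hawkes()
    print(f"{ev.size} events; empirical rate {ev.size / T:.2f} "
          f"vs. stationary rate {mu / (1 - alpha / beta):.2f}")

The clustering is visible in the inter-arrival times: bursts follow each accepted event while the excited intensity decays, mimicking the trade-begets-trade feedback described above.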
Johnny Tam (University of Toronto)
A Branch-and-Price Algorithm for Solving an Order Cutoff Assignment
Problem
We define an order cutoff for a retailer as a time in the day such that orders
sent to the depot before this point will be delivered by tomorrow, and orders
submitted after will be delivered by the day after tomorrow. The later a retailer's
cutoff, the sooner it receives its orders, which helps it maintain ideal
inventory levels. Given a choice of cutoffs, not all retailers in a supply
chain can have the latest one since transportation takes a significant amount
of time. This paper tries to assign optimal order cutoffs to retailers. We
call this an order cutoff assignment problem and we solve it using a mathematical
programming approach called branch-and-price. Sixty sample problems were solved,
and the results showed that branch-and-price becomes more effective as the number
of vehicles increases but much less effective as the number of retailers increases.
Jue Wang (University of Toronto)
Prediction of fractal physiological signals based on self-similarity
Many physiological signals (e.g. heart rate, arterial blood pressure)
exhibit complex fractal dynamics across multiple time scales, which can be
difficult to model in a traditional time-series framework. A common feature
of these fractal signals is statistical self-similarity: small parts of the
signal resemble the whole in their statistical property. In this paper, we
present a new prediction scheme by exploiting this self-similarity. The prediction
is implemented in the wavelet domain: using Haar wavelet, the original prediction
along the time axis is converted into the prediction of wavelet coefficients
on a dyadic tree. We apply the new prediction method to mean arterial
pressure (MAP) signals and find that incorporating short-term variations can
significantly improve long-term prediction. The results suggest that
self-similarity-based prediction is a promising tool for clinical practice.
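The self-similarity the method exploits is measurable, and the sketch below shows the wavelet-domain step on which it rests: in a Haar pyramid, the detail-coefficient variance of a self-similar signal grows log-linearly across scales, and the slope yields the Hurst exponent that a coefficient-level predictor relies on. The random-walk input and the variance-slope estimator are illustrative stand-ins, not the paper's data or its prediction rule.

    import numpy as np

    rng = np.random.default_rng(9)

    def haar_details(x):
        """Haar pyramid: detail coefficients for each dyadic scale."""
        out = []
        while x.size > 1:
            out.append((x[0::2] - x[1::2]) / np.sqrt(2))
            x = (x[0::2] + x[1::2]) / np.sqrt(2)
        return out

    # Synthetic self-similar stand-in (a random walk, Hurst H = 0.5); a real
    # application would use sampled heart-rate or blood-pressure data.
    x = np.cumsum(rng.standard_normal(2 ** 14))

    # Statistical self-similarity: Var(detail at scale j) ~ 2^{j(2H+1)},
    # i.e. a straight line in log2 variance against scale.
    details = haar_details(x)[:8]
    logvar = [np.log2(d.var()) for d in details]
    for j, lv in enumerate(logvar):
        print(f"scale {j}: log2 detail variance = {lv:6.2f}")

    slope = np.polyfit(range(len(logvar)), logvar, 1)[0]
    print(f"estimated Hurst exponent: {(slope - 1) / 2:.2f}")

Once the coefficients behave in this structured way across scales, forecasting along the time axis reduces to forecasting a few coefficients on the dyadic tree, which is the reformulation the paper exploits.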
Jean Xi (University of Western Ontario)
An inverse Stieltjes moment-based method in model parameter estimation
under a Markov-modulated market
We consider a Markov-modulated Black-Scholes-type market consisting of a
riskless asset and a risky asset whose dynamics depend on an unobservable
continuous-time Markov chain. The coupled system of Dupire-type partial differential
equations satisfied by the option price in such a market is derived. Using
an inverse Stieltjes moment approach, we recover the model parameters, which
include the volatilities and the intensity rates of the Markov chain. We show
the applicability and accuracy of our proposed method by providing numerical
demonstrations. Sensitivity analyses are also carried out to examine the behaviour
of the estimated results when model parameters are varied.
This is a joint work with Rogemar Mamon (University of Western Ontario)
and Marianito Rodrigo (University of Wollongong).
Pei Jun Zhao (University of Toronto)
Drug Development - A Tale of Financial Risks and Gains
Drug development is an expensive and risky process for every pharmaceutical
company. From market research to laboratory experimentation, companies often
face numerous failures before gaining an insightful or serendipitous drug
discovery. This is then followed by years of testing and awaiting government
approval. Currently, the price tag for releasing a new drug onto the market
is, on average, anywhere from $300 to $800 million. Thus, to cover the cost
of research, development, and accreditation, the drug must stand the test
of time and sell successfully for years to come. With rapid advancement in
biotechnology, cutting-edge pharmaceutical research may lead to novel drugs,
but often at the expense of increased risk. So it is very important to take
a balanced approach. First, companies should wait until more research has
confirmed the safety and efficacy of recent biotechnology. Second, companies
must not lose out on future market share. Finally, as the cost of genetic
testing decreases, and as the technology attains popularity in society, the
field of genetic screening offers a new frontier in the design of custom-made
drugs that fit individual needs while minimizing side-effects - all in an
attempt to lower risks and increase gains.