Instructor
Prof. Shige Peng, Shandong University and Fields Distinguished Scientist in Residence
Scientific Background
Albert Einstein’s universe was deterministic. His well-known remark that “He does not play dice” reflected this deterministic point of view, which for a long time played the dominant role in our scientific community, especially following the great discoveries of Newton’s laws of mechanics and of gravitation. Today, our views have changed. Since the second half of the last century, it has become increasingly evident that our universe is essentially governed by laws of uncertainty, in which probability theory plays the principal role in both qualitative and quantitative analysis.
The history of probability theory can be traced back to the well-known correspondence between Pierre de Fermat and Blaise Pascal around 1654 on gambling problems. Since then, many mathematicians have contributed to the development of the notion of probability, for example C. Huygens, J. Bernoulli, A. de Moivre, P. S. Laplace, S. Poisson, C. F. Gauss, and others.
L. Bachelier introduced the notion of Brownian motion in 1900 in order to study the problem of option pricing. But it was not until 1933 that the axiomatic foundation of probability theory was finally laid down by A. N. Kolmogorov in his Grundbegriffe der Wahrscheinlichkeitsrechnung. The stochastic integral and the corresponding stochastic analysis introduced by Kiyoshi Itô (1942) are powerful, beautiful, and fundamental, and have been widely applied in finance.
The economist Frank Knight challenged the feasibility of using probability to treat uncertainty. He stated that the “mathematical, or a priori, type of probability is practically never met with in business” and that “the conception of an objectively measurable probability or chance is simply inapplicable”. The term “Knightian uncertainty”, or “ambiguity”, is now widely accepted, especially among economists, to refer to situations in which no objective probability distribution is available for making decisions.
This notion of Knightian uncertainty has deeply influenced the further development of the well-known expected utility theory established by von Neumann and Morgenstern (1944) based on objective probabilities, and that of Savage (1954) using subjective probabilities. In fact, the former was seriously challenged by the Allais paradox (1953) and later by the Ellsberg paradox (1961). In order to resolve Ellsberg’s paradox, Gilboa and Schmeidler (1989) established the theory of maxmin expected utility (MEU). A dynamic version of their theory was provided by Epstein and Schneider (2003). Inspired by methods of robust control, Hansen and Sargent (2000, 2001) introduced the multiplier preference (MP) theory. A related development is the prospect theory of Tversky and Kahneman and the corresponding mathematical formulation of Zhou. These developments of expected utility theory can be viewed as nonlinear expected utility, i.e., a utility in which the (linear) expectation is replaced by a nonlinear one. On the other hand, in his book Robust Statistics, Huber (1981) proposed the notion of an upper expectation, which is a sublinear expectation.
A more direct motivation for nonlinear expectations is the development of monetary risk measures for risky positions in finance. Artzner et al. (1997, 1999) axiomatically proposed coherent risk measures, which challenged the widely used VaR risk measure. Föllmer and Schied (2002) and Frittelli and Rosazza Gianin (2002) went further and introduced the notion of convex risk measures. All of these risk measures fall into the category of nonlinear expectations. Their corresponding representation theorems indicate an aversion to the uncertainty of probabilities, or Knightian uncertainty. From both theoretical and practical points of view, an important, fundamental, and challenging question is: can a new theoretical framework of nonlinear expectation be developed that is comparable with the beautiful and powerful structure of modern probability theory?
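The representation theorem for a coherent risk measure expresses it as a worst-case expected loss over a family of probability measures, i.e., a sublinear expectation. A minimal numerical sketch of this idea (the loss outcomes and the scenario measures below are hypothetical, chosen only for illustration):

```python
import numpy as np

# Hypothetical losses of a risky position over four scenarios.
loss = np.array([-2.0, -0.5, 1.0, 3.0])

# A family of probability measures over the scenarios (Knightian
# uncertainty: no single measure is singled out as the "true" one).
measures = np.array([
    [0.25, 0.25, 0.25, 0.25],   # uniform
    [0.10, 0.20, 0.30, 0.40],   # weights the bad outcomes
    [0.40, 0.30, 0.20, 0.10],   # weights the good outcomes
])

def coherent_risk(loss, measures):
    """rho(X) = sup_P E_P[loss]: worst-case expected loss.  As a
    supremum of linear expectations it is monotone, cash-additive,
    positively homogeneous, and subadditive, i.e. coherent."""
    return float(max(m @ loss for m in measures))

rho = coherent_risk(loss, measures)   # the pessimistic measure attains the sup
```

The maximum over measures is exactly what makes the functional sublinear rather than linear; VaR, by contrast, fails subadditivity in general.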
In the continuous-time framework, a new dynamically consistent nonlinear expectation, called the 𝑔-expectation, was introduced (Peng, 1997) via a BSDE (backward stochastic differential equation) in which the driver 𝑔 = 𝑔(𝑦, 𝑧) is specified. Most martingales with respect to a Brownian filtration have nonlinear counterparts. It has been conjectured that any dynamically consistent nonlinear expectation with respect to the filtration of a Brownian motion, which is absolutely continuous with respect to the corresponding Wiener measure, can be expressed as a 𝑔-expectation with a suitably chosen driver 𝑔. This provides a sufficiently large pool of dynamic risk measures and dynamic utilities for financial risk control as well as for economic theory.
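As a rough illustration of how a 𝑔-expectation can be evaluated, the sketch below runs a backward Euler recursion for the defining BSDE on a recombining binomial approximation of Brownian motion. The discretization and the two sample drivers are illustrative choices, not a prescribed method of the course:

```python
import numpy as np

def g_expectation(phi, g, T=1.0, n=100):
    """E_g[phi(W_T)] via backward Euler for the BSDE
        Y_t = phi(W_T) + int_t^T g(Y_s, Z_s) ds - int_t^T Z_s dW_s
    on a recombining binomial lattice for W (illustrative only)."""
    dt = T / n
    dw = np.sqrt(dt)
    w = dw * (2.0 * np.arange(n + 1) - n)     # lattice nodes at time T
    y = phi(w)                                # terminal condition
    for _ in range(n):
        z = (y[1:] - y[:-1]) / (2.0 * dw)     # discrete Z (difference quotient)
        e = 0.5 * (y[1:] + y[:-1])            # conditional expectation of Y
        y = e + g(e, z) * dt                  # driver correction, one step back
    return y[0]

# g = 0 recovers the classical linear expectation: E[W_1] = 0.
lin = g_expectation(lambda x: x, lambda y, z: 0.0)
# g(y, z) = 0.1*|z| yields a sublinear expectation: E_g[W_1] = 0.1.
sub = g_expectation(lambda x: x, lambda y, z: 0.1 * np.abs(z))
```

The second driver shows the nonlinearity at work: under the 𝑔-expectation with 𝑔 = 0.1|𝑧|, the Brownian motion itself acquires a positive “expected value”, which a linear expectation cannot produce.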
An important case in finance is when the unknown probability measures of the uncertainty cannot be dominated by a single probability measure. The well-known volatility uncertainty is a typical example. This situation can be handled by introducing a fully nonlinear, dynamically consistent expectation called the 𝐺-expectation, cf. Peng (2004, 2005, 2007). It is also theoretically interesting that under such a 𝐺-expectation, the canonical process has stationary and independent increments, i.e., it is a 𝐺-Brownian motion. Much of basic stochastic analysis can be re-established in this new framework, including stochastic calculus of Itô’s type and nonlinear martingale theory.
From the point of view of statistics, obtaining parametric distribution models under Knightian uncertainty is another crucially important point for practical implementations. A typical example is the so-called 𝐺-normal distribution 𝑁(𝜇, [𝜎₁², 𝜎₂²]), which is parameterized by its mean 𝜇, upper variance 𝜎₂², and lower variance 𝜎₁². As in classical probability theory, such new types of distributions under sublinear expectations are derived from a new central limit theorem under Knightian uncertainty. Correspondingly, another typical distribution under a sublinear expectation is the so-called maximal distribution 𝑀([𝜇₁, 𝜇₂]), derived from a new type of law of large numbers under Knightian uncertainty.
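The upper expectation under a 𝐺-normal distribution can be computed by dynamic programming: at each time step the “adversary” chooses the volatility from the interval [𝜎₁, 𝜎₂]. A minimal sketch of this idea as an explicit lattice scheme for the 𝐺-heat equation (the grid construction and the integer-ratio restriction are simplifying assumptions of this illustration):

```python
import numpy as np

def g_normal_expectation(phi, s1, s2, T=1.0, n=100):
    """Upper expectation E[phi(X)] for X ~ N(0, [s1^2, s2^2]) by
    dynamic programming over the volatility chosen at each time step
    (an explicit scheme for the G-heat equation).  Illustrative sketch;
    assumes s2/s1 is an integer so both moves land on lattice nodes."""
    dt = T / n
    dx = s1 * np.sqrt(dt)            # lattice spacing set by the smaller vol
    k = int(round(s2 / s1))          # node shift for the larger vol
    m = k * (n + 1)                  # half-width keeps wrap-around away from x=0
    v = phi(dx * np.arange(-m, m + 1))
    for _ in range(n):
        # one symmetric binomial step per candidate volatility, then maximize
        v1 = 0.5 * (np.roll(v, -1) + np.roll(v, 1))   # sigma = s1
        v2 = 0.5 * (np.roll(v, -k) + np.roll(v, k))   # sigma = s2
        v = np.maximum(v1, v2)
    return v[m]                      # value at x = 0

# For convex phi the maximization selects sigma = s2 throughout,
# for concave phi it selects sigma = s1:
upper_var = g_normal_expectation(lambda x: x**2, 0.5, 1.0)     # ~ s2^2 = 1.0
lower_var = -g_normal_expectation(lambda x: -x**2, 0.5, 1.0)   # ~ s1^2 = 0.25
```

The two evaluations recover exactly the upper and lower variances of the 𝐺-normal distribution, reflecting the fact that for convex test functions the worst case is the largest volatility, and for concave ones the smallest.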
Course Outline
∙ Expectation-based probability theory: the law of large numbers (LLN) and the central limit theorem (CLT);
∙ Basic introduction to stochastic analysis through the heat equation (normal distribution);
∙ Brownian motion, Itô’s stochastic analysis and SDE;
∙ Basic consideration of Knightian uncertainty (ambiguity);
∙ Basic properties of BSDEs: existence, uniqueness, and comparison theorems;
∙ BSDE and nonlinear PDE, the nonlinear Feynman-Kac formula;
∙ Data sampling, parameter estimation, and nonlinear Monte Carlo approaches;
∙ Forward-backward SDEs and the related stochastic Hamiltonian systems;
∙ Stochastic optimization and differential games;
∙ Risk measures in finance (𝐺-RM, 𝐺-VaR);
∙ BSDE and path dependent partial differential equations;
∙ BSDE and nonlinear martingales.
Prerequisites
Basic knowledge of analysis, probability theory, statistics, and partial differential equations.
Course Length
10 total hours of lectures, 2 hours per day for five days, 1-3 pm, May 15-19, 2017.