"Liquidity and The True Simulation of Dynamic Portfolio Risk"
Liquidity risk is a killer; it killed the Hunt Brothers, Metallgesellschaft,
Barings, Long-Term Capital Management and more. There is nothing like
stumbling on a scenario where the cost of a trade is prohibitive, where
a hedge is no longer a hedge because one side cannot trade.
Today's risk standards do not allow for the measurement of liquidity
risk. They are based on measures that assume a static portfolio. The
reality is that liquidity risk is manifested when a portfolio must be
rebalanced or a market event requires added margin payments. To measure
liquidity risk, true dynamic portfolio risk measurement is essential. A liquidity
crisis often unfolds over a period of time, not at a single point.
It is the cumulative erosion of capital that can bring a trader to her
knees. Liquidity risk must therefore be measured over time, not at a single
horizon. The dynamics of the portfolio must be accounted for, or else
the simulation will be misleading. Proper accounting for portfolio aging
and scenario-dependent valuation is a must.
This talk describes the framework we have implemented to accomplish
this on large-scale industrial portfolios.
Darrell Duffie, Stanford University
"Correlated Default Timing and Valuation"
This talk will suggest simple models and illustrative calculations
for the valuation and simulation of contingent claims that depend on
the times and identity of correlated credit events, such as defaults.
Examples include credit derivatives with a first-to-default feature,
credit derivatives signed with a defaultable counterparty, credit-enhancement
or guarantees, and collateralized debt obligations.
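As an illustration of the kind of calculation involved (the talk's own models are not reproduced here), the following Python sketch simulates correlated default times under a common-shock (Marshall-Olkin) construction and values a claim paying 1 at the first default; all intensities and the discount rate are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    n_paths = 100_000

    # Illustrative intensities: idiosyncratic defaults for two names, plus a
    # common shock that defaults both at once (Marshall-Olkin construction).
    lam1, lam2, lam_c = 0.02, 0.03, 0.01
    T, r = 5.0, 0.05  # horizon and flat discount rate (assumed)

    E1 = rng.exponential(1 / lam1, n_paths)
    E2 = rng.exponential(1 / lam2, n_paths)
    Ec = rng.exponential(1 / lam_c, n_paths)
    tau1, tau2 = np.minimum(E1, Ec), np.minimum(E2, Ec)
    tau_first = np.minimum(tau1, tau2)

    # Value of a claim paying 1 at the first default, if before T.
    price = np.mean(np.exp(-r * tau_first) * (tau_first <= T))

    # Closed-form check: tau_first is exponential with rate lam1+lam2+lam_c.
    lam = lam1 + lam2 + lam_c
    exact = lam / (lam + r) * (1 - np.exp(-(lam + r) * T))
    print(price, exact)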
Daniel Dufresne, University of Melbourne
"Laguerre Series for Asian Options"
We consider the problem of pricing continuously averaged Asian options
when the risky asset is modelled as a geometric Brownian motion. This
problem is equivalent to finding the distribution of the integral of
geometric Brownian motion over a finite interval (denoted A in the sequel).
(1) Some new results are derived: the law of 1/A is determined by
its moments, and expressions for the moments of 1/A are given. (2) It is shown
how Laguerre series apply to density functions and option values; the
formulas are simpler if expressed in terms of the ladder height distribution.
(3) Series expressions are obtained for the density function of A(t)
and also for Asian options. Numerical illustrations show perfect fit
with simulation results.
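As a point of comparison for such series expressions, a minimal Monte Carlo estimate of a continuously averaged Asian call can be computed by discretising the average; the parameters below are illustrative, not those of the paper.

    import numpy as np

    rng = np.random.default_rng(1)

    S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.3, 1.0   # illustrative
    n_steps, n_paths = 252, 100_000
    dt = T / n_steps

    # Simulate GBM paths and the time average of S over [0, T],
    # a discretisation of (1/T) * integral of S_t dt (the variable A above).
    z = rng.standard_normal((n_paths, n_steps))
    log_s = np.log(S0) + np.cumsum((r - 0.5 * sigma**2) * dt
                                   + sigma * np.sqrt(dt) * z, axis=1)
    avg = np.exp(log_s).mean(axis=1)

    price = np.exp(-r * T) * np.maximum(avg - K, 0).mean()
    print(f"Asian call (Monte Carlo): {price:.4f}")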
Robert J. Elliott, University of Alberta
"Affine Bond Prices and Stochastic Flows"
Stochastic flows and their Jacobians are used to show why, when the
short rate is Gaussian (as in the Vasicek or Hull-White models), or
square root (as in the Cox-Ingersoll-Ross, or Duffie-Kan models), the
bond price is an exponential affine function.
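For the Gaussian case the affine form can be checked directly against the standard Vasicek closed form; a minimal sketch with illustrative parameters:

    import numpy as np

    def vasicek_bond_price(r0, t, T, a, b, sigma):
        """P(t,T) = exp(A(t,T) - B(t,T) * r_t) for the Vasicek short rate
        dr = a(b - r) dt + sigma dW (standard closed form)."""
        tau = T - t
        B = (1 - np.exp(-a * tau)) / a
        A = (B - tau) * (b - sigma**2 / (2 * a**2)) - sigma**2 * B**2 / (4 * a)
        return np.exp(A - B * r0)

    # Illustrative parameters: a 5-year zero-coupon bond at a 5% short rate.
    print(vasicek_bond_price(r0=0.05, t=0.0, T=5.0, a=0.5, b=0.06, sigma=0.02))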
Paul Embrechts, ETH Zurich
"What Financial Risk Managers can learn from Actuaries"
The flow of methodological input in the financial industry has very
much been one from banking towards insurance. Recently, various fields
of finance have witnessed a reversed flow, so much so that the banking
industry is well advised to take notice of so-called insurance analytics
(a term coined by Till Guldimann, Infinity). Examples of this flow are
to be found in:
- risk management (going beyond VaR)
- credit risk management (actuarial reserving)
- credit derivatives (using survival analytic methods).
Some examples of the above will be discussed, both from a theoretical
as well as applied point of view.
This is joint work with C. Klueppelberg and T. Mikosch.
Helyette Geman, University Paris Dauphine and ESSEC
"Stochastic Time Changes and Asset Price Modeling"
Despite its pivotal role in the theoretical financial literature (Capital
Asset Pricing Model, Black-Scholes-Merton formula), the normality of
asset returns has been consistently refuted in empirical research. The
first part of the talk establishes, on a high-frequency database of
S&P 500 returns, that a remarkable quasi-perfect normality can be
recovered using a stochastic clock driven by the number of trades (where
no a priori distribution is assumed for the transaction time). This
result is consistent with the beautiful theorem established by Monroe
(1978) on "Processes that can be embedded in Brownian motion".
The second part of the talk argues, in full agreement with recent market
moves observed around the world, that price processes should be represented
as pure jump processes. Continuity and normality may be obtained after
a time change related to the order flow. Different types of Levy processes
for the modeling of asset prices are analyzed, as well as the relationship
between the corresponding Levy measure and the intensity of the economic
activity.
The talk is based on two articles by Ane and Geman (1997) and Geman-Madan-Yor
(1998).
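A minimal simulation of the first part's idea (the Gamma trade-count law below is purely an assumption for illustration; the paper imposes no a priori law): returns in calendar time are a normal variance mixture and hence fat-tailed, while returns per fixed number of trades are Gaussian.

    import numpy as np
    from scipy.stats import kurtosis

    rng = np.random.default_rng(2)
    n_days = 10_000

    # Illustrative stochastic clock: random number of trades per day.
    trades = rng.gamma(shape=2.0, scale=500.0, size=n_days)
    var_per_trade = 1e-6

    # Calendar-time returns: Brownian motion run on the trade clock,
    # i.e. a normal variance mixture, hence fat tails.
    calendar = rng.standard_normal(n_days) * np.sqrt(var_per_trade * trades)

    # Transaction-time returns: a fixed number of trades per observation,
    # Gaussian by construction.
    trade_time = rng.standard_normal(n_days) * np.sqrt(var_per_trade * 1000)

    print("excess kurtosis, calendar time:", kurtosis(calendar))
    print("excess kurtosis, trade time:   ", kurtosis(trade_time))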
David C. Heath, Cornell University / CMU
"Futures-based Term Structure Models"
There are currently two paradigms for term structure modelling: modelling
the spot rate, and modelling the term structure of forward rates. Each
has advantages and disadvantages: For spot rate modelling the question
of model choice is unclear, while for most HJM models computations are
difficult. We present a new class of term structure models essentially
as general as either of the above and for which differences between
models are easy to understand and, for a class of interesting models,
computations are easy.
Bjarne Hojgaard, Aalborg University
"Optimal Risk Controland Dividend Distribution Policies for Insurance
Corporations"
We consider a model of a financial corporation which has to find an
optimal policy balancing its risk and expected profits. The example
treated is related to an insurance company with the risk control method
being reinsurance. Under this scheme the insurance company diverts part
of its premium stream to another company in exchange for an obligation
to pick up the amount of each claim that exceeds a certain level 'a'
(the retention level). This reduces the risk but it also reduces the potential
profit. The objective is to make a dynamic choice of the retention level
and find the dividend distribution policy, which maximizes the cumulative
expected discounted dividend pay-outs.
Consider the classical Cramer-Lundberg model: when no dividends
are distributed, the reserve R(t) of the company is assumed to be given
by R(t) = x + p(a,l)t - (U(a,1)+...+U(a,N(t))), where N(t) is a Poisson process
with intensity b>0, the U(a,i) are i.i.d. claim sizes under retention
level a, and p(a,l) = (1+l)bE(U(a,1)), where l>0 is the relative
safety loading. We then have that the process lR(t/l^2) converges in
distribution to a BM(m(a),s(a)), where m(a) = bE(U(a,1)) and s^2(a) = bE([U(a,1)]^2).
Hence we consider the following problem: A policy is a pair (a(t),L(t)),
where a(t) denotes the retention level at time t and L(t) denotes the
total amount of dividend distributed until time t. The reserve r(t)
is assumed to be governed by dr(t)=m(a(t))dt+s(a(t))dW(t)-dL(t) and
the objective is to maximize the present value of dividend pay-outs until
eventual ruin.
We consider two different reinsurance strategies: 1. Proportional reinsurance,
where the retention level a is between 0 and 1 and U(a,i)=aU(i). 2.
Excess-of-loss reinsurance, where the retention level a is non-negative
and U(a,i)=min(U(i),a).
Mathematically this becomes a mixed singular-regular control problem
for diffusion processes. Its analytical part is related to a free boundary
(Stefan) problem for a linear second order differential equation and
closed form solutions are found in both cases.
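As a numerical companion to the closed-form solutions, a minimal Euler-scheme sketch of the controlled diffusion under a fixed proportional retention level and a fixed dividend barrier (barrier, horizon and all parameters are illustrative assumptions, not the optimal policy of the talk):

    import numpy as np

    rng = np.random.default_rng(3)

    # Proportional reinsurance in the diffusion limit: m(a)=a*m, s(a)=a*s.
    m, s, a = 1.0, 2.0, 0.7
    c = 0.05                 # discount rate for dividends
    barrier = 3.0            # pay out all surplus above this level
    x0, dt, n_steps, n_paths = 2.0, 0.01, 5_000, 20_000

    r = np.full(n_paths, x0)
    alive = np.ones(n_paths, dtype=bool)
    pv = np.zeros(n_paths)
    for k in range(n_steps):
        dw = rng.standard_normal(n_paths)
        r[alive] += a * m * dt + a * s * np.sqrt(dt) * dw[alive]
        over = alive & (r > barrier)
        pv[over] += np.exp(-c * k * dt) * (r[over] - barrier)  # dividend
        r[over] = barrier
        alive &= r > 0       # ruined paths stop paying dividends

    print("estimated discounted dividends:", pv.mean())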
Ioannis Karatzas, Columbia University
"Dynamic Measures of Market-Risk"
Suppose that we operate in the framework of a standard financial market
over a finite time-horizon [0,T], at the end of which we face a certain
liability C -- a random quantity representing a payment that has to
be made at time t=T . Suppose also that (due to market incompleteness,
or insufficient initial funds, or both) we find it impossible to hedge
at t=T this liability perfectly, that is, with probability one.
How can we then quantify, at the outset t=0, the risk associated with
the hedging of the liability C at time t=T? One way is to try to maximize
the probability of a perfect hedge (cf. Karatzas (1997), or Foellmer and
Leukert (1998)). This is, in a sense, equivalent to a dynamic version
of the familiar "value at risk" concept.
Another approach is to try to minimize the "expected shortfall" E[max(C - X(T;
x, p(.)), 0)] over admissible portfolios p(.), and then to maximize
the resulting quantity over all risk-neutral probability measures P.
Here x is the initial capital, X(T) = X(T; x, p(.)) is the terminal wealth
corresponding to x and to the portfolio p(.), and E denotes expectation
with respect to the probability measure P. The resulting max-min quantity
can then be used as a measure of risk associated with the liability
C; as such, it satisfies a number of desirable "coherence" properties
postulated by Artzner, Delbaen, Eber and Heath (1996).
In the case of a complete market, when there is only one risk-neutral
probability measure P, we present a fully-developed theory for this
problem -- along with specific examples of contingent claims C for which
explicit computations of risk are possible. The classes of admissible
portfolios p(.) and "scenarios" (probability measures) P are rich enough
to accommodate margin requirements and uncertainty about the actual
values of stock appreciation rates, respectively. We also survey recent
work on this problem in the context of incomplete markets, and point
out connections with the generalized Neyman-Pearson lemma for testing
a simple hypothesis against a composite alternative.
This is joint work with J. Cvitanic.
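A minimal sketch of the quantity being controlled, with all optimization stripped away: the expected shortfall E[max(C - X(T), 0)] is estimated for one fixed, naive strategy (hold the initial capital x at the risk-free rate) in a Black-Scholes market, under its unique risk-neutral measure; parameters are illustrative.

    import numpy as np

    rng = np.random.default_rng(4)

    # Liability C = (S_T - K)^+ in a Black-Scholes market (illustrative).
    S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0
    x = 5.0        # initial capital, deliberately below the perfect-hedge cost
    n_paths = 200_000

    z = rng.standard_normal(n_paths)
    S_T = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
    C = np.maximum(S_T - K, 0)
    X_T = x * np.exp(r * T)    # wealth of the do-nothing strategy

    print("expected shortfall:", np.maximum(C - X_T, 0).mean())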
Alexander Levin and Alexander Tchernitser, Bank of Montreal
"Multifactor Stochastic Variance Value-at-Risk Model"
A standard Value-at-Risk (VaR) model corresponds to stable market conditions
and assumes a multivariate normal distribution for risk factors with
known constant volatilities and correlations. However, the actual risk
factor distributions exhibit significant deviations from normality.
Excess kurtosis, skewness, and volatility fluctuations are typical for
many market variables. Fat-tailed and skewed distributions result in
the underestimation of actual VaR by the standard model.
The Stochastic Variance VaR model developed by the Bank of Montreal
accounts for uncertainty and instability of the risk factor volatilities.
The model naturally describes the dynamics of underlying asset returns
for short holding periods typical for VaR calculations. The SV-VaR model
fits the actual historical distributions of the risk factors better
than the traditional VaR model. Higher moments (skewness, kurtosis)
are more accurately captured with the SV-VaR model, which also incorporates
correlations between risk factors, as well as correlations between risk
factors and their volatilities.
The one-period exponential distribution for the stochastic variance
is derived from the Maximum Entropy Principle. This model is extended
to the Gamma SV Model that gives the Bessel distribution for the probability
density of the risk factor. Corresponding stochastic processes with
closed form solutions for the stochastic variance and risk factor dynamics
are obtained. The derived simple volatility term structure differs from
the term structure of well-known diffusion SV models in the case of
short holding periods, and better describes the empirical term structure
of the risk factor kurtosis.
A general calibration procedure for the class of multifactor SV-VaR
models is developed. A closed form solution for the VaR of one-factor
linear portfolios is obtained. For the multifactor nonlinear portfolios,
a simple two-step Monte Carlo simulation procedure is proposed. Numerical
results for equity, commodity, interest rate, and foreign exchange rate
risk are presented.
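A minimal sketch of the two-step procedure for a single linear risk factor, assuming a Gamma variance law (shape 1 recovers the exponential special case); the mean variance and the quantile level are illustrative.

    import numpy as np

    rng = np.random.default_rng(5)
    n_sims = 500_000

    # Step 1: draw the variance from a Gamma law; step 2: draw the return.
    mean_var, shape = 0.01**2, 1.0   # shape=1: exponential stochastic variance
    V = rng.gamma(shape, scale=mean_var / shape, size=n_sims)
    returns = rng.standard_normal(n_sims) * np.sqrt(V)

    # 99% one-day VaR of a unit linear position in the factor.
    var_sv = -np.quantile(returns, 0.01)
    var_normal = 2.326 * np.sqrt(mean_var)   # constant-volatility benchmark
    print(f"SV VaR: {var_sv:.5f}   normal VaR: {var_normal:.5f}")

The fat-tailed stochastic-variance quantile comes out visibly larger than the normal benchmark, which is the underestimation effect described above.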
Andrew W. Lo, MIT
"When is Time Continuous?"
Continuous-time stochastic processes have become central to many disciplines,
yet the fact that they are approximations to physically realizable phenomena
is often overlooked. We quantify one aspect of the approximation errors
of continuous-time models by investigating the replication errors that
arise from delta-hedging derivative securities in discrete time. We
characterize the asymptotic distribution of these replication errors
and its joint distribution with other assets as the number of discrete
time periods increases. We introduce the notion of temporal granularity
of a continuous-time stochastic process, which allows us to characterize
the degree to which discrete-time approximations of continuous-time
models can track the payoff of a derivative security. We derive closed
form expressions for the temporal granularity of geometric Brownian
motion and an Ornstein-Uhlenbeck process using call options. We also
introduce alternative measures of the replication error and analyze
their properties.
This is joint work with D. Bertsimas and L. Kogan.
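A minimal sketch of the setup whose asymptotics the talk characterizes: discrete-time delta-hedging of a call under geometric Brownian motion, with the root-mean-square replication error shrinking as the number of rebalancing dates grows (all parameters illustrative).

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(6)
    S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0
    n_paths = 20_000

    def bs_d1(S, tau):
        return (np.log(S / K) + (r + 0.5 * sigma**2) * tau) / (sigma * np.sqrt(tau))

    def bs_price(S, tau):
        d1 = bs_d1(S, tau)
        return S * norm.cdf(d1) - K * np.exp(-r * tau) * norm.cdf(d1 - sigma * np.sqrt(tau))

    for N in (10, 50, 250):
        dt = T / N
        S = np.full(n_paths, S0)
        delta = norm.cdf(bs_d1(S, T))
        cash = bs_price(S, T) - delta * S          # start from the BS price
        for k in range(1, N):
            S *= np.exp((r - 0.5 * sigma**2) * dt
                        + sigma * np.sqrt(dt) * rng.standard_normal(n_paths))
            cash *= np.exp(r * dt)
            new_delta = norm.cdf(bs_d1(S, T - k * dt))
            cash -= (new_delta - delta) * S        # self-financing rebalance
            delta = new_delta
        S *= np.exp((r - 0.5 * sigma**2) * dt
                    + sigma * np.sqrt(dt) * rng.standard_normal(n_paths))
        cash *= np.exp(r * dt)
        error = delta * S + cash - np.maximum(S - K, 0)
        print(f"N={N:4d}  RMS replication error: {np.sqrt((error**2).mean()):.4f}")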
Ludger Overbeck, Deutsche Bank AG
"Credit Portfolio Risk Management Based on Coherent Risk Measures"
A financial institution uses the economic capital for credit risk as
a protection against severe losses in the entire credit portfolio. Mathematically,
it is usually defined as a quantile of the distribution of future losses,
or even simpler as a multiplier of the standard deviation of this distribution.
Classical portfolio theory then explains how to distribute the capital
across the whole portfolio.
Since the fundamental work of Artzner et al. on coherent risk measures,
other risk measures, like the conditional expectation of the losses
given that the loss has already exceeded a given threshold, have been analyzed
in research as well as in applications. Some features of these risk
measures are exploited. In particular we present a capital allocation
process in the spirit of the exceedance over threshold measures. These
measures are compared with classical portfolio theory, i.e. with the
risk contributions based on a variance/covariance approach.
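A minimal sketch of such an exceedance-based allocation on a toy three-block portfolio (factor loadings and loss scales are illustrative): capital is read off as a loss quantile, and each block is charged its expected loss conditional on the portfolio loss exceeding that threshold, so the contributions add up to the portfolio-level measure.

    import numpy as np

    rng = np.random.default_rng(7)
    n_sims = 200_000

    # Toy portfolio: three blocks sharing one systematic factor.
    factor = rng.standard_normal(n_sims)
    losses = np.empty((n_sims, 3))
    for j, (beta, scale) in enumerate([(0.8, 1.0), (0.5, 2.0), (0.3, 0.5)]):
        idio = rng.standard_normal(n_sims)
        losses[:, j] = scale * np.exp(beta * factor + np.sqrt(1 - beta**2) * idio)
    total = losses.sum(axis=1)

    q = np.quantile(total, 0.999)        # capital as a loss quantile
    tail = total >= q
    es = total[tail].mean()              # expected loss beyond the threshold
    contrib = losses[tail].mean(axis=0)  # per-block conditional expectations

    print("VaR:", q, "ES:", es)
    print("contributions:", contrib, "sum:", contrib.sum())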
L.C.G. Rogers, University of Bath, England
"Designing and Estimating Models of High-Frequency Data"
Most financial houses have access to high-frequency data, which typically
gives the time, price and amount of every trade (or quote) in a particular
asset. Such detailed information should be more revealing than a single
price per day, but it will be hard to extract the additional value if
one tries to use a model which supposes that the observed prices are
a diffusion process! In this talk, we present a class of models for
such data which treat the data as intrinsically discrete, and we show
how easily-updated estimation procedures can recover parameter values
from a range of simulated examples.
Stephen Ross, MIT
"Topics in Finance"
In finance, as in pathology, we can learn more from failure than from
success. This paper exhumes three famous financial failures: the Hunt
Brothers' silver ventures, Metallgesellschaft's oil futures losses, and
the recent LTCM and related hedge fund failures. We do a post mortem
on each and see what we can learn. Not surprisingly, the cause of death
was similar in each case, or, to put it more familiarly, those who pay
no heed to history are doomed to repeat it.
Hiroshi Shirakawa, Tokyo Institute of Technology
"Evaluation of Yield Spread for Credit Risk"
We study the rational evaluation of yield spread for defaultable credit
with fixed maturity. The default occurs when the asset value hits a
given fraction of the nominal credit value. The yield spread is continuously
accumulated to the initial credit as an insurance fee for future default.
Through rational credit pricing, we prove the existence of a unique equilibrium
yield spread that satisfies the arbitrage-free property. Furthermore,
we show that this yield spread is independent of the choice of interest
rate process. For a quantitative study of the rational yield spread, we
derive an explicit analytic formula for the equilibrium and show numerical
examples for various parameters.
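The equilibrium formula itself is in the paper; as a building block of such first-passage models, the probability that a geometric Brownian motion asset value hits the default barrier before maturity has the standard closed form below (all numbers illustrative).

    import numpy as np
    from scipy.stats import norm

    def first_passage_prob(V0, barrier, mu, sigma, T):
        """P(min_{t<=T} V_t <= barrier) for GBM dV = mu*V dt + sigma*V dW,
        by the reflection formula for Brownian motion with drift."""
        b = np.log(barrier / V0)       # b < 0: log-distance to the barrier
        nu = mu - 0.5 * sigma**2
        st = sigma * np.sqrt(T)
        return (norm.cdf((b - nu * T) / st)
                + np.exp(2 * nu * b / sigma**2) * norm.cdf((b + nu * T) / st))

    # Assets at 100, default barrier at 60% of a nominal credit of 100.
    print(first_passage_prob(V0=100.0, barrier=60.0, mu=0.05, sigma=0.25, T=5.0))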
Steven E. Shreve, CMU
"Pricing and Hedging Dangerous Exotic Options"
The Black-Scholes "delta-hedging" approach cannot be implemented for
exotic options which exhibit large "gamma" values (e.g., options which
knock out in the money), because this would require frequent large changes
in the hedge position. For such options, one can build a "margin of
safety" into the price, and use this margin to avoid large changes of
position in the underlying. A general methodology for this, based on
the idea of pricing and hedging under portfolio constraints, will be
presented.
Stuart Turnbull, CIBC
"The Intersection of Market and Credit Risk"
Economic theory tells us that market risk and credit risk are intrinsically
related and not separable. We start by describing
the two main approaches to pricing credit risky instruments: the structural
approach and the reduced form approach. It is argued that the standard
approaches to credit risk management - Credit Metrics, Credit Risk Plus
and KMV - are of limited value, if applied to portfolios of interest
rate sensitive instruments.
Empirically it is observed that returns on high yield bonds have a
higher correlation with the return on an equity index and a lower correlation
with the return on a Treasury bond index than do low yield bonds. The
KMV and Credit Metrics methodologies cannot reproduce these empirical
observations given their assumptions of constant interest rates. Altman
(1983) and Wilson (1997) have shown that macro economic variables appear
to influence the aggregate rate of business failures. We show how to
incorporate empirical observations into the reduced form Jarrow-Turnbull
(1995) model. The volatility of the credit spread can be used to determine
the sensitivities of the credit spread to the different factors. Correlation
plays an important role in existing methodologies. Here default probabilities
are correlated due to their common dependence on the same economic factors.
We discuss the implications for pricing, given different assumptions
about a bond holder's claim in the event of default. We compare the
Duffie-Singleton (1997) assumption to the legal claim approach, where
a bond holder's claim is assumed to be accrued interest plus principal.
Default risk and the uncertainty associated with the recovery rate may
not be the sole determinants of the credit spread. We show how to incorporate
a convenience yield as one of the determinants of the credit spread.
Incorporating market and credit risk implies that it is necessary to
use the martingale distribution for pricing and the natural distribution
to describe the value of the portfolio in order to calculate the value-at-risk.
We show how to generalize the Credit Metrics methodology to incorporate
stochastic interest rates.
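A minimal sketch of the correlation mechanism described above, in which default probabilities become correlated through a common economic factor; the loadings, base intensities and doubly stochastic construction are illustrative assumptions, not the fitted Jarrow-Turnbull model.

    import numpy as np

    rng = np.random.default_rng(8)
    n_sims, T = 200_000, 1.0

    # Two obligors whose default intensities load on one macro factor Z;
    # the exp(...) corrections keep the mean intensities at 0.02 and 0.05.
    Z = rng.standard_normal(n_sims)
    lam1 = 0.02 * np.exp(0.5 * Z - 0.125)
    lam2 = 0.05 * np.exp(0.8 * Z - 0.320)

    # Doubly stochastic defaults: independent given the factor.
    d1 = rng.random(n_sims) < 1 - np.exp(-lam1 * T)
    d2 = rng.random(n_sims) < 1 - np.exp(-lam2 * T)

    p1, p2, p12 = d1.mean(), d2.mean(), (d1 & d2).mean()
    corr = (p12 - p1 * p2) / np.sqrt(p1 * (1 - p1) * p2 * (1 - p2))
    print("default correlation:", corr)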
KOLMOGOROV LECTURER
Hans Foellmer, Humboldt Universitaet - Berlin
"Probabilistic Problems arising from Finance"
We review some recent developments in Probability which are motivated
by problems of hedging derivatives in incomplete financial markets.
This will include new variants of decomposition theorems for semimartingales,
the construction of efficient hedges which minimize the shortfall risk
under some cost constraint, and some results on Brownian motion related
to the heterogeneity of information among financial agents.