Financial institutions face the task of estimating and controlling their exposure to market risk, which arises from changes in the prices of equities, commodities, exchange rates and interest rates. A new chapter of risk management was opened when the Basel Committee on Banking Supervision proposed that banks may use internal models for estimating their market risk (Basel Committee on Banking Supervision, 1995). Its implementation into national laws around 1998 allowed banks to compete not only in the innovation of financial products but also in the innovation of risk management methodology. Measurement of market risk has focused on a metric called Value at Risk (VaR). VaR quantifies the maximal amount that may be lost in a portfolio over a given period of time, at a certain confidence level. Statistically speaking, the VaR of a portfolio is the quantile of the distribution of that portfolio's loss over a specified time interval, at a given probability level.
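In symbols (introduced here only to make the verbal definition precise; the loss variable $L$, the horizon $h$ and the level $\alpha$ are illustrative notation, not taken from later sections), the VaR at level $\alpha$ is the $\alpha$-quantile of the loss distribution over the holding period:
\[
  \mathrm{VaR}_{\alpha} \;=\; \inf\bigl\{\, x \in \mathbb{R} \;:\; P\bigl(L_{[t,\,t+h]} \le x\bigr) \ge \alpha \,\bigr\}.
\]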
The implementation of a firm-wide risk management system is a tremendous job. The biggest challenge for many institutions is to implement interfaces to all the different front-office systems, back-office systems and databases (potentially running on different operating systems and distributed all over the world), in order to get the portfolio positions and historical market data into a centralized risk management framework. This is a software engineering problem. The second challenge is to use the computed VaR numbers to actually control risk and to build an atmosphere in which the risk management system is accepted by all participants. This is an organizational and social problem. The methodological question of how risk should be modeled and approximated is, in terms of implementation cost, a smaller one. In terms of importance, however, it is crucial: an inadequate VaR methodology can jeopardize all the other efforts to build a risk management system. See Jorion (2000) for more on the general aspects of risk management in financial institutions.
VaR methodologies can be classified in terms of statistical modeling decisions and approximation decisions. Once the statistical model and the estimation procedure are specified, it is a purely numerical problem to compute or approximate the Value at Risk. The modeling decisions are:
While there is a plethora of analyses of alternative statistical models for market risks (see Barry Schachter's Gloriamundi web site), mainly two classes of models for market risk have been used in practice:
In this paper we consider certain approximations of VaR in the conditional Gaussian class of models. We assume that the conditional expectation of the risk-factor changes $X_t$, given the information up to time $t-1$, is zero and that their conditional covariance matrix $\Sigma_t$ is estimated and given at time $t-1$. The change in the portfolio value over the time interval $[t-1, t]$ is then $\Delta V_t = V(X_t) - V(0)$, where $V$ denotes the portfolio value as a function of the risk-factor changes.
The only general method to compute quantiles of the distribution of $\Delta V_t$ is Monte Carlo simulation. From discussions with practitioners, ``full valuation Monte Carlo'' appears to be practically infeasible for portfolios containing securities whose mapping functions are, first, extremely costly to compute (as for certain path-dependent options whose valuation itself relies on Monte Carlo simulation) and, second, computed inside complex closed-source front-office systems, which cannot easily be substituted or adapted in their accuracy/speed trade-offs. Quadratic approximations to the portfolio's value as a function of the risk factors,
\[
  \Delta V_t \;\approx\; \delta^\top X_t + \tfrac{1}{2}\, X_t^\top \Gamma X_t, \qquad (1.1)
\]
with $\delta$ the vector of first and $\Gamma$ the matrix of second derivatives of the value function with respect to the risk factors, have become the industry standard; combined with the conditional Gaussian assumption on $X_t$, this is known as the Delta-Gamma-Normal approach.
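As a preview of how a quadratic approximation of the form (1.1) is typically evaluated (what is later called partial Monte Carlo), the following sketch draws conditionally Gaussian risk-factor changes and applies the delta-gamma mapping. It is written in Python rather than XploRe, with made-up toy inputs, and is not the implementation presented later in this chapter.

    import numpy as np

    # Illustrative sketch only: simulate X_t ~ N(0, Sigma) and evaluate the
    # delta-gamma approximation (1.1); all inputs below are toy numbers.
    rng = np.random.default_rng(0)

    def delta_gamma_pnl(X, delta, Gamma):
        """delta'X + 0.5 * X'Gamma X, evaluated for each scenario (row) of X."""
        return X @ delta + 0.5 * np.einsum("ij,jk,ik->i", X, Gamma, X)

    delta = np.array([1.0, -0.5, 0.2])            # aggregated deltas
    Gamma = np.array([[0.20, 0.00, 0.00],         # aggregated gammas (symmetric)
                      [0.00, -0.10, 0.05],
                      [0.00, 0.05, 0.30]])
    Sigma = np.diag([0.01, 0.04, 0.02])           # conditional covariance of X_t

    X = rng.multivariate_normal(np.zeros(3), Sigma, size=100_000)
    pnl = delta_gamma_pnl(X, delta, Gamma)

    # 99% VaR of the approximated distribution (loss = -pnl).
    print(np.quantile(-pnl, 0.99))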
Both assumptions of the Delta-Gamma-Normal approach, Gaussian innovations and a reasonably good quadratic approximation of the value function, have been questioned. Simple examples of portfolios with options can be constructed which show that quadratic approximations to the value function can lead to very large errors in the computation of VaR (Britten-Jones and Schaefer, 1999). The Taylor approximation (1.1) holds only locally and is questionable from the outset for the purpose of modeling extreme events. Moreover, the conditional Gaussian framework does not allow one to model joint extreme events of the kind described by Embrechts et al. (1999). The Gaussian dependence structure, i.e. the Gaussian copula, assigns probabilities to joint extreme events that are too small compared to some empirical observations.
Despite these valid critiques of the Delta-Gamma-Normal model, there are good reasons for banks to implement it alongside other models. (1) The statistical assumption of conditionally Gaussian risk factors can explain a wide range of ``stylized facts'' about asset returns, such as unconditional fat tails and autocorrelation in realized volatility. Parsimonious multivariate conditional Gaussian models for dimensions like 500-2000 are challenging enough to be the subject of ongoing statistical research (Engle, 2000). (2) First and second derivatives of financial products w.r.t. underlying market variables (deltas and gammas) and other ``sensitivities'' are widely implemented in front-office systems and routinely used by traders. Derivatives w.r.t. the possibly different risk factors used by central risk management are easily computed by applying the chain rule of differentiation (spelled out after this paragraph). So it is tempting to stay in the framework and language of the trading desks and express portfolio value changes in terms of deltas and gammas. (3) For many actual portfolios the delta-gamma approximation may serve as a good control variate within variance-reduced Monte Carlo methods, even if it is not a sufficiently good approximation itself. Finally (4), it is extremely risky for a senior risk manager to ignore delta-gamma models if his friendly consultant tells him that 99% of the competitors have them implemented.
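To spell out the chain-rule step in point (2) above (the notation $S = g(X)$ is chosen here only for illustration): if the front-office sensitivities refer to market variables $S$ and central risk management uses risk factors $X$ with $S = g(X)$, then
\[
  \frac{\partial V}{\partial X_i} = \sum_k \frac{\partial V}{\partial S_k}\,\frac{\partial g_k}{\partial X_i},
  \qquad
  \frac{\partial^2 V}{\partial X_i\,\partial X_j}
   = \sum_{k,l} \frac{\partial^2 V}{\partial S_k\,\partial S_l}\,
     \frac{\partial g_k}{\partial X_i}\,\frac{\partial g_l}{\partial X_j}
   + \sum_k \frac{\partial V}{\partial S_k}\,
     \frac{\partial^2 g_k}{\partial X_i\,\partial X_j},
\]
so the deltas and gammas delivered by the trading systems can be aggregated into the $\delta$ and $\Gamma$ of (1.1) without re-valuing the portfolio.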
Several methods have been proposed to compute a quantile of the distribution defined by the model (1.1), among them Monte Carlo simulation (Pritsker, 1996), Johnson transformations (Zangari, 1996a; Longerstaey, 1996), Cornish-Fisher expansions (Zangari, 1996b; Fallon, 1996), the Solomon-Stephens approximation (Britten-Jones and Schaefer, 1999), moment-based approximations motivated by the theory of estimating functions (Li, 1999), saddle-point approximations (Rogers and Zane, 1999), and Fourier inversion (Albanese et al., 2000; Rouvinez, 1997). Pichler and Selitsch (1999) compare five different VaR methods: Johnson transformations, Delta-Normal, and Cornish-Fisher approximations up to the second, fourth and sixth moment. The sixth-order Cornish-Fisher approximation compares well against the other techniques and is their final recommendation. Mina and Ulmer (1999) also compare Johnson transformations, Fourier inversion, Cornish-Fisher approximations, and partial Monte Carlo. (If the true value function $V$ is used in the Monte Carlo simulation, this is called ``full Monte Carlo''; if its quadratic approximation (1.1) is used, it is called ``partial Monte Carlo''.) They conclude that Johnson transformations are ``not a robust choice''. Cornish-Fisher is ``extremely fast'' compared to partial Monte Carlo and Fourier inversion, but not as robust, as it gives ``unacceptable results'' in one of the four sample portfolios.
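To illustrate why Cornish-Fisher approximations are so fast: once a few moments of the delta-gamma distribution are available, the quantile is a closed-form adjustment of the Gaussian quantile. The following sketch (in Python rather than XploRe, all names hypothetical) shows the common fourth-order variant based on skewness and excess kurtosis; the sixth-order expansion recommended by Pichler and Selitsch (1999) adds further terms.

    from scipy.stats import norm

    def cornish_fisher_quantile(alpha, mean, std, skew, exkurt):
        """Fourth-order Cornish-Fisher approximation of the alpha-quantile.

        Illustrative sketch only; mean, std, skew (skewness) and exkurt
        (excess kurtosis) would be computed from the moments of the
        quadratic form (1.1).
        """
        z = norm.ppf(alpha)                       # Gaussian quantile
        z_cf = (z
                + (z**2 - 1) * skew / 6           # skewness correction
                + (z**3 - 3 * z) * exkurt / 24    # kurtosis correction
                - (2 * z**3 - 5 * z) * skew**2 / 36)
        return mean + std * z_cf

    # Example: 1% quantile of a slightly left-skewed, fat-tailed P&L distribution.
    print(cornish_fisher_quantile(0.01, mean=0.0, std=1.0, skew=-0.5, exkurt=1.2))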
The three main methods used in practice seem to be Cornish-Fisher expansions, Fourier inversion, and partial Monte Carlo, whose implementation in XploRe will be presented in this paper. What makes the Delta-Gamma-Normal model especially tractable is that the characteristic function of the probability distribution of the quadratic form (1.1), i.e. the Fourier transform of its probability density, is known analytically. Such general properties are presented in Section 1.2. Sections 1.3, 1.4, and 1.5 discuss the Cornish-Fisher, Fourier inversion, and partial Monte Carlo techniques, respectively.
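As a preview of the kind of analytic tractability meant here (the precise form actually used is derived in Section 1.2; the diagonalizing notation below is introduced only for illustration): writing $X_t = C Y$ with $C C^\top = \Sigma_t$ and $Y \sim N(0, I)$, and diagonalizing $C^\top \Gamma C$ with eigenvalues $\lambda_j$, the quadratic form (1.1) becomes a sum of independent one-dimensional terms, and its characteristic function factorizes as
\[
  \mathrm{E}\, e^{\mathrm{i} t \,\Delta V_t}
  \;=\; \prod_j \,(1 - \mathrm{i} t \lambda_j)^{-1/2}
        \exp\!\Bigl( -\frac{t^2 \tilde{\delta}_j^{\,2}}{2\,(1 - \mathrm{i} t \lambda_j)} \Bigr),
\]
where $\tilde{\delta}_j$ are the components of the delta vector transformed into the same eigenbasis.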