Value at risk (VaR) is one of the most
widely used models in risk management. It is based on probability
and statistics. 1 VaR can be characterized as a
maximum expected loss over a given time horizon at a given confidence level. Its utility lies in providing a single measure that summarizes the risk inherent in a portfolio with multiple
risk factors, such as portfolios held by large banks, which are
diversified across many risk factors and product types. VaR is used
to estimate the boundaries of risk for a portfolio over a given
time period, for an assumed probability distribution of market
performance. The purpose is to diagnose risk exposure.
Definition
Value at risk describes the probability
distribution for the value (earnings or losses) of an investment
(firm, portfolio, etc.). The mean is a point estimate of a
statistic, showing historical central tendency. Value at risk is
also a point estimate, but offset from the mean. It requires
specification of a probability level, and then provides the point
estimate of the return that will be met or exceeded with that
probability. For instance, Fig. 6.1 gives the normal
distribution for a statistic with a mean of 10 and a standard
deviation of 4 (Crystal Ball was used, with 10,000 replications).
Fig. 6.1 Normal distribution (10, 4). ©Oracle, used with permission
This indicates a 0.95 probability (for
all practical purposes) of a return of at least 3.42. The precise
calculation can be made in Excel, using the NormInv function for a
probability of 0.05, a mean of 10, and a standard deviation of 4,
yielding a return of 3.420585, which is practically the same as the
simulation result shown in Fig. 6.1. Thus the value of the investment at the
specified risk level of 0.05 is 3.42. The interpretation is that
there is a 0.05 probability that things would be worse than the
value at this risk level. Thus the greater the degree of assurance,
the lower the value at risk return. The value at the risk level of
0.01 would only be 0.694609.
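The same point estimates can be reproduced outside of Excel. The following minimal Python sketch (the variable names are ours, not from the chapter) computes the analytic quantiles with scipy and checks them against a Monte Carlo sample analogous to the Crystal Ball run:

```python
import numpy as np
from scipy.stats import norm

mean, sd = 10.0, 4.0          # distribution from Fig. 6.1

# Analytic values, equivalent to Excel's NORMINV(p, 10, 4)
var_05 = norm.ppf(0.05, loc=mean, scale=sd)   # ~3.4206
var_01 = norm.ppf(0.01, loc=mean, scale=sd)   # ~0.6946

# Monte Carlo check, analogous to a 10,000-replication Crystal Ball run
rng = np.random.default_rng(seed=42)
sample = rng.normal(mean, sd, size=10_000)
mc_var_05 = np.percentile(sample, 5)

print(f"5% value at risk (analytic):  {var_05:.4f}")
print(f"1% value at risk (analytic):  {var_01:.4f}")
print(f"5% value at risk (simulated): {mc_var_05:.4f}")
```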
The Basel Accords
VaR is globally accepted by regulatory
bodies responsible for supervision of banking activities. These
regulatory bodies, in broad terms, enforce regulatory practices as
outlined by the Basel Committee on Banking Supervision of the Bank
for International Settlements (BIS). The regulator that has
responsibility for financial institutions in Canada is the Office
of the Superintendent of Financial Institutions (OSFI), and OSFI
typically follows practices and criteria as proposed by the Basel
Committee.
Basel I
Basel I was promulgated in 1988,
focusing on credit risk. A key agreement of the Basel Committee is
the Basel Capital Accord (generally referred to as “Basel” or the
“Basel Accord”), which has been updated several times since 1988.
In the 1996 Amendment to the Basel Accord (updated in 1998), banks
were encouraged to use internal models to measure Value at Risk,
and the numbers produced by these internal models support capital
charges to ensure the capital adequacy, or liquidity, of the bank.
Some elements of the minimum standard established by Basel are:
-
VaR should be computed daily, using a 99th percentile, one-tailed confidence interval.
-
A minimum price shock equivalent to ten trading days should be used. This is called the “holding period” and simulates a 10-day period of liquidating assets in a period of market crisis.
-
The model should incorporate a historical observation period of at least 1 year.
-
The capital charge is set at a minimum of three times the average of the daily value-at-risk of the preceding 60 business days.
In 2001 the Basel Committee on Banking
Supervision published principles for management and supervision of
operational risks for banks and domestic authorities supervising
them.
Basel II
Basel II was initially published in 2004, with enhancements in 2009, to deal
with operational risk management of banking. Banks and financial
institutions were bound to use internal and external data, scenario
analysis, and qualitative criteria. Banks were required to compute
capital charges on a yearly basis and to calculate 99.9 %
confidence levels (one in one thousand events as opposed to the
earlier one in one hundred events). Basel II included standards in
the form of three pillars:
- 1.
Minimum capital requirements.
- 2.
Supervisory review, to include categorization of risks as systemic, pension related, concentration, strategic, reputation, liquidity, and legal.
- 3.
Market discipline, to include enhancements to strengthen disclosure requirements for securitizations, off-balance sheet exposures, and trading activities.
Basel III
Basel III was a comprehensive set of
reform measures published in 2011 with phased implementation dates.
The aim was to strengthen regulation, supervision, and risk
management of the banking sectors.
Pillar 1 dealt with capital, risk
coverage, and containing leverage:
-
Capital requirements to improve bank ability to absorb shocks from financial and economic stress: Common equity ≥ 0.045 of risk-weighted assets
-
Leverage requirements to improve risk management and governance: Tier 1 capital ≥ 0.03 of total exposure
-
Liquidity requirements to strengthen bank transparency and disclosure: High-quality liquid assets ≥ total net liquidity outflows over 30 days
Pillar 2 dealt with risk management
and supervision.
Pillar 3 dealt with market discipline
through disclosure requirements.
The Use of Value at Risk
In practice, these minimum standards
mean that the VaR that is produced by the Market Risk Operations
area is multiplied first by the square root of 10 (to simulate
10 days holding) and then multiplied by a minimum capital
multiplier of 3 to establish capital held against regulatory
requirements.
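As a rough illustration of this arithmetic (the daily VaR figure below is invented for the example), a 1-day 99 % VaR of $1 million would translate into regulatory capital of about 3 × √10 × $1 million ≈ $9.49 million:

```python
import math

daily_var = 1_000_000          # hypothetical 1-day 99% VaR in dollars
holding_period = 10            # Basel 10-day holding period
capital_multiplier = 3         # minimum Basel capital multiplier

regulatory_capital = capital_multiplier * math.sqrt(holding_period) * daily_var
print(f"Capital held against regulatory requirements: ${regulatory_capital:,.0f}")
# -> about $9,486,833
```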
In summary, VaR provides the worst
expected loss at the 99 % confidence level. That is, a
99 % confidence interval produces a measure of loss that will
be exceeded only 1 % of the time. But this does mean there
will likely be a larger loss than the VaR calculation two or three
times in a year. This is compensated for by the inclusion of the
multiplicative factors, above, and the implementation of Stress
Testing, which falls outside the scope of the activities of Market
Risk Operations.
Various approaches can be used to
compute VaR, of which three are widely used: Historical Simulation,
Variance-covariance approach, and Monte Carlo simulation.
The variance-covariance approach is used for investment portfolios, but
it does not usually work well for portfolios involving options that
are close to delta neutral. Monte Carlo simulation solves the
problem of non-linearity approximation if model error is not
significant, but it suffers some technical difficulties such as how
to deal with time-varying parameters and how to generate maturation
values for instruments that mature before the VaR horizon. We
present Historical Simulation and Variance-covariance approach in
the following two sections. We will demonstrate Monte Carlo
Simulation in a later section of this chapter.
Historical Simulation
Historical simulation is a widely used tool for estimating VaR in most banks. Observations of day-over-day changes
in market conditions are captured. These market conditions are
represented using upwards of 100,000 points daily of observed and
implied Market Data. This historical market data is captured and
used to generate historical ‘shocks’ to current spot market data.
This shocked market data is used to reprice the Bank’s trading
positions under the changed market conditions, and these revalued
positions are then compared against the base case (priced with spot
data). The difference is a theoretical profit or loss. Each day of
historically observed data produces a theoretical profit/loss
number in this way, and together these theoretical P&L numbers
form a distribution of theoretical profits/losses. The (1-day)
VaR can then be read as the 99th percentile of the losses in this
distribution (equivalently, the 1st percentile of the profit/loss
distribution).
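A minimal sketch of this last step is shown below, assuming we already hold a vector of theoretical daily P&L values produced by repricing the portfolio under each historical shock (the repricing step itself, which depends on the Bank’s pricing models, is represented here by simulated numbers):

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Placeholder for the theoretical P&L distribution: one revaluation result
# per historical day (roughly one year of observations). In practice each
# entry comes from repricing the portfolio under that day's historical shock.
theoretical_pnl = rng.normal(loc=0.0, scale=250_000, size=260)

# 1-day VaR at the 99% level: the loss exceeded on only 1% of days,
# i.e. the 1st percentile of the P&L distribution (99th percentile of losses).
var_99 = -np.percentile(theoretical_pnl, 1)
print(f"1-day 99% historical-simulation VaR: ${var_99:,.0f}")
```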
The primary advantage of historical
simulation is ease of use and implementation. In Market Risk
Operations, historical data is collected and reviewed on a regular
basis, before it is added to the historical data set. Since this
data corresponds to historical events, it can be reviewed in a
straightforward manner. Also, the historical nature of the data
allows for some clarity of explanation of VaR numbers. For
instance, the Bank’s VaR may be driven by widening credit spreads,
or by decreasing equity volatilities, or both, and this will be
visible in actual historical data. Additionally, historical data
implicitly contains correlations and non-linear effects (e.g.
gamma, vega and cross-effects).
The most obvious disadvantage of
historical simulation is the assumption that the past presents a
reasonable simulation of future events. Additionally, a large bank
usually holds a large portfolio, and there can be considerable
operational overhead involved in producing a VaR against a large
portfolio with dependencies on a large and varied number of model
inputs. All the same, other VaR methods, such as
variance-covariance (VCV) and Monte Carlo simulation, are subject to
essentially the same objections. The main alternative to historical
simulation is to make assumptions about the probability
distributions of the returns on the market variables and calculate
the probability distribution of the change in the value of the
portfolio analytically. This is known as the variance-covariance
approach. VCV is a parametric approach that assumes both normality
of returns and stability of correlations. Monte Carlo simulation
provides an alternative to these two methods. Monte Carlo methods are dependent on
decisions regarding model calibration, which have effectively the
same problems. No VaR methodology is without simplifying
assumptions, and several different methods are in use at
institutions worldwide. The literature on volatility estimation is
large and seemingly subject to unending growth, especially in
acronyms. 2
Variance-Covariance Approach
VCV models portfolio returns as a
multivariate normal distribution. A position vector x containing
cash flow present values can be used to represent all components of
the portfolio. The VCV approach is concerned chiefly with the expected
return and the covariance matrix (Q) representing the risk attributes of
the portfolio over the chosen horizon. The standard deviation of
portfolio value (σ), also
called volatility, is computed:

(1) σ = √(xᵀQx)

The volatility (σ) is then scaled to find the desired
centile of portfolio value, which is the predicted maximum loss for
the portfolio, or VaR:

(2) VaR = f(Y) · σ

where f(Y) is the number of standard deviations corresponding to the chosen left-tail percentile Y.
For example, for
a multivariate normal return distribution, f(Y) = 2.33 for
Y = 1 %.
It is then easy to calculate VaR from
the standard deviation (1-day VaR = 2.33σ). The
simplest assumption is that daily gains/losses are normally
distributed and independent. The N-day VaR then equals √N
times the one-day VaR. When successive daily changes have autocorrelation ρ, the
quantity N under the square root is increased to
N + 2(N − 1)ρ + 2(N − 2)ρ² + ⋯ + 2ρ^(N−1).
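A minimal numerical sketch of Eqs. (1) and (2), with an invented two-asset position vector and daily covariance matrix, might look as follows:

```python
import numpy as np

# Hypothetical position vector (present values) and daily return covariance matrix Q
x = np.array([1_000_000.0, 500_000.0])
Q = np.array([[0.0001, 0.00002],
              [0.00002, 0.0004]])

sigma = np.sqrt(x @ Q @ x)             # Eq. (1): portfolio volatility (in dollars)
var_1day = 2.33 * sigma                # Eq. (2): 1-day VaR at the 1% level
var_10day = np.sqrt(10) * var_1day     # N-day scaling under independence

print(f"1-day VaR:  {var_1day:,.0f}")
print(f"10-day VaR: {var_10day:,.0f}")
```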
Besides being easy to compute, VCV
also lends itself readily to the calculation of marginal risk
(Marginal VaR), Incremental VaR, and Component VaR
of candidate trades. For a portfolio in which an amount xᵢ is invested in the ith component, these
three VaR measures are defined as follows (a minimal numerical sketch follows the list):
-
Marginal VaR: ∂VaR/∂xᵢ, the change in portfolio VaR per unit change in the amount invested in the ith component
-
Incremental VaR: the incremental effect of the ith component on VaR, i.e. the change in portfolio VaR when the component (or candidate trade) is added or removed
-
Component VaR: xᵢ · ∂VaR/∂xᵢ, the portion of portfolio VaR attributable to the ith component; the component VaRs sum to the portfolio VaR
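Under the same parametric assumptions, marginal and component VaR have simple closed forms. The sketch below follows the standard parametric (Jorion-style) definitions and continues the invented two-asset example above:

```python
import numpy as np

# Same hypothetical positions and daily covariance matrix as above
x = np.array([1_000_000.0, 500_000.0])
Q = np.array([[0.0001, 0.00002],
              [0.00002, 0.0004]])
z = 2.33                                   # 1% normal deviate

sigma = np.sqrt(x @ Q @ x)
var_total = z * sigma

marginal_var = z * (Q @ x) / sigma         # dVaR/dx_i for each component
component_var = x * marginal_var           # x_i * dVaR/dx_i; sums to var_total

print("Marginal VaR:  ", np.round(marginal_var, 6))
print("Component VaR: ", np.round(component_var, 0))
print("Sum of components vs portfolio VaR:",
      round(component_var.sum()), "vs", round(var_total))
```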
VCV uses a delta approximation, which
means the representative cash flow vector is a linear approximation
of positions. In some cases, a second-order term in the cash flow
representation is included to improve this approximation.
3
However, this does not always improve the risk estimate, and it
comes at the cost of some computational
efficiency. In general, VCV works well for linear
instruments such as forwards and interest rate swaps, but quite
badly for non-linear instruments such as options.
Monte Carlo Simulation of VaR
Simulation models are sets of
assumptions concerning the relationship among model components.
Simulations can be time-oriented (for instance, involving the
number of events such as demands in a day) or process-oriented (for
instance, involving queuing systems of arrivals and services).
Uncertainty can be included by using probabilistic inputs for
elements such as demands, inter-arrival times, or service times.
These probabilistic inputs need to be described by probability
distributions with specified parameters. Probability distributions
can include normal distributions (with parameters for mean and
variance), exponential distributions (with parameter for a mean),
lognormal (parameters mean and variance), or any of a number of
other distributions. A simulation run is a sample from an infinite
population of possible results for a given model. After a
simulation model is built, a selected number of trials is
established. Statistical methods are used to validate simulation
models and design simulation experiments.
Many financial simulation models can
be accomplished on spreadsheets, such as Excel. There are a number
of commercial add-ins for Excel, such as
@Risk or Crystal Ball, that vastly extend the simulation power of
spreadsheet models. 4 These add-ins make it very easy to
replicate simulation runs, and include the ability to correlate
variables, expeditiously select from standard distributions,
aggregate and display output, and other useful functions.
The Simulation Process
Using simulation effectively requires
careful attention to the modeling and implementation process. The
simulation process consists of five essential steps:
- 1.
Develop a conceptual model of the system or problem under study. This step begins with
understanding and defining the problem, identifying the goals and
objectives of the study, determining the important input variables,
and defining output measures. It might also include a detailed
logical description of the system that is being studied. Simulation
models should be made as simple as possible to focus on critical
factors that make a difference in the decision. The cardinal rule
of modeling is to build simple models first, then embellish and
enrich them as necessary.
- 2.
Build the simulation model . This includes developing appropriate formulas or equations, collecting any necessary data, determining the probability distributions of uncertain variables, and constructing a format for recording the results. This might entail designing a spreadsheet, developing a computer program, or formulating the model according to the syntax of a special computer simulation language (which we discuss further in Chap. 7).
- 3.
Verify and validate the model . Verification refers to the process of ensuring that the model is free from logical errors; that is, that it does what it is intended to do. Validation ensures that it is a reasonable representation of the actual system or problem. These are important steps to lend credibility to simulation models and gain acceptance from managers and other users. These approaches are described further in the next section.
- 4.
Design experiments using the model . This step entails determining the values of the controllable variables to be studied or the questions to be answered in order to address the decision maker’s objectives.
- 5.
Perform the experiments and analyze the results . Run the appropriate simulations to obtain the information required to make an informed decision.
As with any modeling effort, this
approach is not necessarily serial. Often, you must return to
previous steps as new information arises or as results suggest
modifications to the model. Therefore, simulation is an
evolutionary process that must involve not only analysts and model
developers, but also the users of the results.
Demonstration of VaR Simulation
We use an example Monte Carlo
simulation model published by Beneda 5 to
demonstrate simulation of VaR and other forms of risk. Beneda
considered four risk categories, each with different
characteristics of data availability:
-
Financial risk—controllable (interest rates, commodity prices, currency exchange)
-
Pure risk—controllable (property loss and liability)
-
Operational—uncontrollable (costs, input shortages)
-
Strategic—uncontrollable (product obsolescence, competition)
Beneda’s model involved a forward sale
(45 days forward) of an investment (CD) with a price
expected to follow a uniform distribution ranging from 90 to 110.
Half of these sales (20,000 units) were in Canada, which
involved an exchange rate variation that was probabilistic
(uniformly distributed from −0.008 to −0.004). The expected price
of the CD was normally distributed with mean 0.8139, standard
deviation 0.13139. Operating expenses associated with the Canadian
operation were normally distributed with mean $1,925,000 and
standard deviation $192,500. The other half of sales were in the
US. There was risk of customer liability lawsuits (Poisson
distributed with a mean of 2), with severity per lawsuit
lognormally distributed with mean $320,000 and standard deviation
$700,000. Operational risks associated with US operations were
normally distributed with mean $1,275,000, standard deviation
$127,500. The Excel spreadsheet model for this is given in Table
6.1.
Table 6.1 Excel model of investment

|    | A | B | C |
|---|---|---|---|
| 1 | Financial risk | Formulas | Distribution |
| 2 | Expected basis | −0.006 | Uniform(−0.008, −0.004) |
| 3 | Expected price per CD | 0.8139 | Normal(0.8139, 0.13139) |
| 4 | March futures price | 0.8149 | |
| 5 | Expected basis 45 days | =B2 | |
| 6 | Expected CD futures | 0.8125 | |
| 7 | Operating expenses | 1.925 | Normal(1,925,000, 192,500) |
| 8 | Sales | 20,000 | |
| 9 | | | |
| 10 | Price $US | 100 | Uniform(90, 110) |
| 11 | Sales | 20,000 | |
| 12 | Current | 0.8121 | |
| 13 | Receipts | =B10 * B11/B12 | |
| 14 | Expected exchange rate | =B3 | |
| 15 | Revenues | =B13 * B14 | |
| 16 | COGS | =B7 * 1,000,000 | |
| 17 | Operating income | =B15 − B16 | |
| 18 | | | |
| 19 | Local sales | 20,000 | |
| 20 | Local revenues | =B10 * B19 | |
| 21 | Lawsuit frequency | 2 | Poisson(2) |
| 22 | Lawsuit severity | 320,000 | Lognormal(320,000, 700,000) |
| 23 | Operational risk | 1,275,000 | Normal(1,275,000, 127,500) |
| 24 | Losses | =B21 * B22 + B23 | |
| 25 | Local income | =B20 − B24 | |
| 26 | | | |
| 27 | Total income | =B17 + B25 | |
| 28 | Taxes | =0.35 * B27 | |
| 29 | After Tax Income | =B27 − B28 | |
In Crystal Ball, entries in cells B2,
B3, B7, B10, B21, B22 and B23 were entered as assumptions with the
parameters given in column C. Forecast cells were defined for
cells B17 (Canadian operating income) and B29 (total net income after
tax). Results for cell B17 are given in Fig. 6.2, with a probability of
0.9 prescribed in Crystal Ball so that we can identify the VaR at
the 0.05 level.
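For readers without Crystal Ball, the same experiment can be approximated with a short Monte Carlo sketch in Python. This is our own replication of the spreadsheet logic in Table 6.1, so results will differ somewhat from the published figures (and from Fig. 6.2) because of sampling variation and the moment-matched lognormal parameterization noted in the comments:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n = 10_000  # trials (the Crystal Ball run used 500)

# Assumption cells (Table 6.1, column C)
basis   = rng.uniform(-0.008, -0.004, n)            # B2 (not used downstream)
cd_rate = rng.normal(0.8139, 0.13139, n)            # B3: expected price per CD
op_exp  = rng.normal(1_925_000, 192_500, n)         # B7 (expressed in dollars)
price   = rng.uniform(90, 110, n)                   # B10: price $US
suits   = rng.poisson(2, n)                         # B21: lawsuit frequency
# B22: lognormal with mean 320,000 and std dev 700,000 (moment-matched)
m, s = 320_000.0, 700_000.0
mu = np.log(m**2 / np.sqrt(s**2 + m**2))
sg = np.sqrt(np.log(1 + s**2 / m**2))
severity = rng.lognormal(mu, sg, n)                 # B22
op_risk  = rng.normal(1_275_000, 127_500, n)        # B23

sales, current = 20_000, 0.8121                     # B8/B11/B19 and B12

# Canadian operation (cells B13-B17)
receipts = price * sales / current
revenues = receipts * cd_rate
canadian_income = revenues - op_exp

# US operation (cells B20-B25)
local_income = price * sales - (suits * severity + op_risk)

after_tax = 0.65 * (canadian_income + local_income)  # B27-B29

print(f"Mean after-tax income: {after_tax.mean():,.0f}")
print(f"5% value at risk:      {np.percentile(after_tax, 5):,.0f}")
```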
Fig. 6.2 Output for Canadian investment. ©Oracle, used with permission
Statistics are given in Table
6.2.
Table 6.2 Output statistics for operating income

Forecast: Operating income

| Statistic | Forecast values |
|---|---|
| Trials | 500 |
| Mean | 78,413.99 |
| Median | 67,861.89 |
| Mode | – |
| Standard Deviation | 385,962.44 |
| Variance | 148,967,005,823.21 |
| Skewness | −0.0627 |
| Kurtosis | 2.99 |
| Coefficient of variability | 4.92 |
| Minimum | −1,183,572.09 |
| Maximum | 1,286,217.07 |
| Mean standard error | 17,260.77 |
The value at risk at the 0.95 confidence level
for this investment was −540,245.40, meaning that there was a 0.05
probability of doing worse than losing $540,245.40 in US dollars.
The overall investment outcome is shown in Fig. 6.3.
Fig. 6.3 Output for after tax income. ©Oracle, used with permission
Statistics are given in Table
6.3.
Table 6.3 Output statistics for after tax income

Forecast: After tax income

| Statistic | Forecast values |
|---|---|
| Trials | 500 |
| Mean | 96,022.98 |
| Median | 304,091.58 |
| Mode | – |
| Standard Deviation | 1,124,864.11 |
| Variance | 1,265,319,275,756.19 |
| Skewness | −7.92 |
| Kurtosis | 90.69 |
| Coefficient of variability | 11.71 |
| Minimum | −14,706,919.79 |
| Maximum | 1,265,421.71 |
| Mean standard error | 50,305.45 |
On average, the investment paid off,
with a positive value of $96,022.98. However, the worst case of the 500
trials was a loss of over $14 million. (The best was a gain of over $1.265
million.) The value at risk at the 0.05 level shows a loss of $1.14 million, and Fig.
6.3 shows that
the distribution of this result is highly skewed (note the skewness
measures for Figs. 6.2 and 6.3).
Beneda proposed a model reflecting
hedging with futures contracts, and insurance for customer
liability lawsuits. Using the hedged price in cell B4, and
insurance against customer suits of $640,000, the after-tax profit
is shown in Fig. 6.4.
Fig. 6.4 After-tax profit with hedging and insurance. ©Oracle, used with permission
Mean profit dropped to $84,656
(standard deviation $170,720), with minimum −$393,977 (maximum gain
$582,837). The value at risk at the 0.05 level was a loss of
$205,301. Thus there was an expected cost of hedging (mean profit
dropped from $96,022 to $84,656), but the worst case was much
improved (from a loss of over $14 million to a loss of $393,977), and value
at risk improved from a loss of over $1.14 million to a loss of about
$205 thousand.
Conclusions
Value at risk is a useful concept in
terms of assessing probabilities of investment alternatives. It is
a point estimator, like the mean (which could be viewed as the
value at risk for a probability of 0.5). It is only as valid as the
assumptions made, which include the distributions used in the model
and the parameter estimates. This is true of any simulation.
However, value at risk provides a useful tool for financial
investment. Monte Carlo simulation provides a flexible mechanism to
measure it, for any given assumption.
However, value at risk has undesirable
properties, especially for gain and loss data with non-elliptical
distributions. It satisfies the well-accepted principle of
diversification under the assumption of normally distributed data.
However, it can violate the widely accepted subadditivity rule; i.e.,
the portfolio VaR may be larger than the sum of the component VaRs. The
reason is that VaR only considers a single extreme percentile of the
gain/loss distribution without considering the magnitude of losses
beyond that percentile. As a consequence, a variant of VaR, usually labeled
Conditional Value-at-Risk
(or CVaR), has been used. With respect to computational issues,
optimization of CVaR can be very simple, which is another reason for
the adoption of CVaR. This pioneering work was initiated by Rockafellar
and Uryasev, 6 who showed that CVaR constraints in
optimization problems can be formulated as linear constraints. CVaR
represents a weighted average of the value at risk and the losses
exceeding the value at risk. CVaR is a risk assessment approach
used to reduce the probability that a portfolio will incur large losses,
evaluated at a specified confidence level. CVaR has been applied to
financial trading portfolios, 7 implemented through scenario
analysis, 8 and applied via system dynamics.
9
A popular refinement is to use copulas, multivariate distributions
that permit the linkage of many marginal distributions.
10
Copulas have been implemented through simulation modeling
11
as well as through analytic modeling. 12
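As a brief illustration of the difference between the two measures, the following sketch (with simulated losses, not real portfolio data) computes VaR as a percentile of the loss distribution and CVaR as the average of the losses beyond that percentile:

```python
import numpy as np

rng = np.random.default_rng(seed=7)
# Hypothetical heavy-tailed loss distribution (positive numbers are losses)
losses = rng.standard_t(df=3, size=100_000) * 100_000

alpha = 0.99
var = np.percentile(losses, 100 * alpha)        # 99% VaR
cvar = losses[losses >= var].mean()             # expected loss beyond VaR

print(f"99% VaR:  {var:,.0f}")
print(f"99% CVaR: {cvar:,.0f}   (always >= VaR)")
```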
We will show how specified confidence
levels can be modeled through chance constraints in the next
chapter. It is possible to maximize portfolio return subject to
constraints including Conditional Value-at-Risk (CVaR) and other
downside risk measures, both absolute and relative to a benchmark
(market and liability-based). Simulation CVaR based optimization
models can also be developed.
Notes
- 1.
Jorion, P. (1997). Value at Risk: The New Benchmark for Controlling Market Risk. New York: McGraw-Hill.
- 2.
Danielson, J. and de Vries, C.G. (1997). Extreme returns, tail estimation, and value-at-risk. Working Paper, University of Iceland (http://www.hag.hi.is/~jond/research); Fallon, W. (1996). Calculating value-at-risk. Working Paper, Columbia University (bfallon@groucho.gsb.columbia.edu); Garman, M.B. (1996). Improving on VaR. Risk 9, No. 5.
- 3.
JP Morgan (1996). RiskMetrics™-technical document, 4th ed.
- 4.
Evans, J.R. and Olson, D.L. (2002). Introduction to Simulation and Risk Analysis 2nd ed. Upper Saddle River, NJ: Prentice Hall.
- 5.
Beneda, N. (2004). Managing an asset management firm’s risk portfolio, Journal of Asset Management 5:5, 327–337.
- 6.
Rockafellar, R.T. and Uryasev, S. (2002). Conditional value-at-risk for general loss distributions. Journal of Banking & Finance 26:7, 1443–1471.
- 7.
Al Janabi, M.A.M. (2009). Corporate treasury market price risk management: A practical approach for strategic decision-making. Journal of Corporate Treasury Management 3(1), 55–63.
- 8.
Sawik, T. (2011). Selection of a dynamic supply portfolio in make-to-order environment with risks. Computers & Operations Research 38(4), 782–796.
- 9.
Mehrjoo, M. and Pasek, Z.J. (2016). Risk assessment for the supply chain of fast fashion apparel industry: A system dynamics framework. International Journal of Production Research 54(1), 28–48.
- 10.
Guégan, D. and Hassani, B.K. (2012). Operational risk: A Basel II++ step before Basel III. Journal of Risk Management in Financial Institutions 6(1), 37–53.
- 11.
Hsu, C.-P., Huang, C.-W. and Chiou, W.-J. (2012). Effectiveness of copula-extreme value theory in estimating value-at-risk: Empirical evidence from Asian emerging markets. Review of Quantitative Finance & Accounting 39(4), 447–468.
- 12.
Kaki, A., Salo, A. and Talluri, S. (2014). Scenario-based modeling of interdependent demand and supply uncertainties. IEEE Transactions on Engineering Management 61(1), 101–113.