
8. Data Envelopment Analysis in Enterprise Risk Management

David L. Olson and Desheng Dash Wu
(1) Department of Management, University of Nebraska, Lincoln, Nebraska, USA
(2) Stockholm Business School, Stockholm University, Stockholm, Sweden
(3) Economics and Management School, University of Chinese Academy of Sciences, Beijing, China
Charnes, Cooper and Rhodes (CCR) 1 first introduced DEA for efficiency analysis of decision-making units (DMUs). DEA can be used to model operational processes, and its empirical orientation and absence of a priori assumptions have led to its use in many studies involving efficient frontier estimation in both the nonprofit and private sectors. DEA is widely applied in banking 2 and insurance. 3 It has become a leading approach for efficiency analysis in many fields, such as supply chain management, 4 petroleum distribution system design, 5 and government services. 6 DEA and multicriteria decision-making models have been compared and extended. 7
Moskowitz et al. 8 presented a vendor selection scenario involving nine vendors with stochastic measures given over 12 criteria. This model was used by Wu and Olson 9 in comparing DEA with multiple criteria analysis. We start with a discussion of an advanced ERM technology, value-at-risk (VaR), and view it as a tool for conducting risk management in enterprises.
While risk needs to be managed, taking risks is fundamental to doing business. Profit by necessity requires accepting some risk. 10 ERM provides tools to rationally manage these risks. We will demonstrate multiple criteria and DEA models in the enterprise risk management context with a hypothetical nuclear waste repository site location problem.

Basic Data

For a data set involving a supply chain that needs to select a waste repository site, we have 12 alternatives with four criteria: cost, expected lives lost, risk of catastrophe, and civic improvement. Expected lives lost reflects workers as well as expected local (civilian bystander) lives lost. The hierarchy of objectives is:
[Figure: hierarchy of objectives]
The alternatives available, with measures on each criterion (including two categorical measures) are given in Table 8.1:
Table 8.1
Dump site data

Alternatives      Cost (billions)  Expected lives lost  Risk       Civic improvement
Nome AK           40               60                   Very high  Low
Newark NJ         100              140                  Very low   Very high
Rock Springs WY   60               40                   Low        High
Duquesne PA       60               40                   Medium     Medium
Gary IN           70               80                   Low        Very high
Yakima Flats WA   70               80                   High       Medium
Turkey TX         60               50                   High       High
Wells NE          50               30                   Medium     Medium
Anaheim CA        90               130                  Very high  Very low
Epcot Center FL   80               120                  Very low   Very low
Duckwater NV      80               70                   Medium     Low
Santa Cruz CA     90               100                  Very high  Very low
Models require numerical data, and it is easier to keep things straight if higher scores are better. So we adjust the Cost and Expected Lives Lost scores by subtracting them from their maximums, and we assign consistent scores on a 0–100 scale for the qualitative ratings given for Risk and Civic Improvement, yielding Table 8.2:
Table 8.2
Scores used

Alternatives      Cost  Expected lives lost  Risk  Civic improvement
Nome AK           60    80                   0     25
Newark NJ         0     0                    100   100
Rock Springs WY   40    100                  80    80
Duquesne PA       40    100                  50    50
Gary IN           30    60                   80    100
Yakima Flats WA   30    60                   30    50
Turkey TX         40    90                   30    80
Wells NE          50    110                  50    50
Anaheim CA        10    10                   0     0
Epcot Center FL   20    20                   100   0
Duckwater NV      20    70                   50    25
Santa Cruz CA     10    40                   0     0
Nondominated solutions can be identified by inspection. For instance, Nome AK has the lowest estimated cost, so it is by definition nondominated. Similarly, Wells NE has the best expected lives lost. There is a tie for risk of catastrophe (Newark NJ and Epcot Center FL have the best ratings), with a tradeoff in that Epcot Center FL has better cost and lives lost estimates while Newark NJ has a better civic improvement rating; both are nondominated. There is also a tie for best civic improvement (Newark NJ and Gary IN), with a tradeoff in that Gary IN has better cost and lives lost estimates while Newark NJ has a better risk of catastrophe rating; again, both are nondominated. There is one other nondominated solution (Rock Springs WY), which can be compared to each of the other 11 alternatives and shown to be better on at least one criterion.

Multiple Criteria Models

Nondominance can also be established by a linear programming model. We create a weight variable for each criterion (held strictly greater than 0 and required to sum to 1). The objective function maximizes the sum-product of measure values and weights for the alternative site under evaluation, subject to this function being strictly greater than the corresponding sum-product for each of the other sites. For the first alternative, the formulation of the linear programming model is:
 $$ \mathrm{Max}{\sum}_{i=1}^4{w}_i{y}_{i1} $$ 
s.t.  $$ {\sum}_{i=1}^4{w}_i=1 $$ 
For each j from 2 to 12:  $$ {\sum}_{i=1}^4{w}_i{y}_{i1}\ge {\sum}_{i=1}^4{w}_i{y}_{ij}+0.0001 $$ 
 $$ {w}_i\ge 0.0001 $$ 
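This LP is straightforward to set up in any solver. Below is a minimal sketch using scipy.optimize.linprog with the Table 8.2 scores; the array and function names are ours, introduced only for illustration. Feasibility of the LP certifies the evaluated site as nondominated.

```python
import numpy as np
from scipy.optimize import linprog

# Scores from Table 8.2 (higher is better): cost, lives, risk, improvement.
sites = ["Nome AK", "Newark NJ", "Rock Springs WY", "Duquesne PA",
         "Gary IN", "Yakima Flats WA", "Turkey TX", "Wells NE",
         "Anaheim CA", "Epcot Center FL", "Duckwater NV", "Santa Cruz CA"]
Y = np.array([[60,  80,   0,  25],
              [ 0,   0, 100, 100],
              [40, 100,  80,  80],
              [40, 100,  50,  50],
              [30,  60,  80, 100],
              [30,  60,  30,  50],
              [40,  90,  30,  80],
              [50, 110,  50,  50],
              [10,  10,   0,   0],
              [20,  20, 100,   0],
              [20,  70,  50,  25],
              [10,  40,   0,   0]], dtype=float)

def nondominance_weights(Y, k, eps=1e-4):
    """Return weights proving alternative k nondominated, or None if infeasible."""
    c = -Y[k]                                  # maximize w @ Y[k]
    others = np.delete(np.arange(len(Y)), k)
    A_ub = Y[others] - Y[k]                    # w @ Y[j] + eps <= w @ Y[k]
    b_ub = np.full(len(others), -eps)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  A_eq=np.ones((1, Y.shape[1])), b_eq=[1.0],
                  bounds=[(eps, None)] * Y.shape[1])   # w_i >= 0.0001
    return res.x if res.success else None

for k, name in enumerate(sites):
    w = nondominance_weights(Y, k)
    print(f"{name:16s}", "nondominated, w =" if w is not None else "dominated",
          np.round(w, 4) if w is not None else "")
```

Feasibility here corresponds to nondominance; the six feasible sites match those reported in the text below.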
This model was run for each of the 12 available sites. A site is nondominated (no other alternative is at least as good on all criteria and strictly better on at least one) exactly when this model is feasible. The 0.0001 terms are added to some of the constraints because strict dominance might not be identified otherwise (the model would allow ties). The solution for the Newark NJ alternative is shown in Table 8.3:
Table 8.3
MCDM LP solution for Newark NJ

Criteria          Cost    Lives   Risk    Improve  Object
Newark NJ         0       0       100     100      99.9801
Weights           0.0001  0.0001  0.4975  0.5023   1.0000
Nome AK           60      80      0       25       12.5708
Rock Springs WY   40      100     80      80       79.9980
Duquesne PA       40      100     50      50       50.0040
Gary IN           30      60      80      100      90.0385
Yakima Flats WA   30      60      30      50       40.0485
Turkey TX         40      90      30      80       55.1207
Wells NE          50      110     50      50       50.0060
Anaheim CA        10      10      0       0        0.0020
Epcot Center FL   20      20      100     0        49.7567
Duckwater NV      20      70      50      25       37.4422
Santa Cruz CA     10      40      0       0        0.0050
The weights were at their minimum for the criteria of Cost and Expected Lives Lost, with roughly equal weights on Risk of Catastrophe and Civic Improvement. That makes sense, because Newark NJ had the best scores for Risk of Catastrophe and Civic Improvement and low scores on the other two criteria.
Running all 12 linear programming models, six were feasible, indicating that the corresponding sites were not dominated: Nome AK, Newark NJ, Rock Springs WY, Gary IN, Wells NE, and Epcot Center FL. The weights identified are not unique (many different weight combinations might have yielded these alternatives as feasible). These weights also reflect scale: here the range for Cost was 60 and for Lives Lost 110, while the ranges for the other two criteria were 100. In this case the differences are slight, but the scales do not need to be similar, and the more dissimilar they are, the more warped the weights become. For the other six, dominated solutions, no set of weights yields a feasible model. For instance, Table 8.4 shows the infeasible solution for Duquesne PA:
Table 8.4
LP solution for Duquesne PA

Criteria          Cost    Lives   Risk    Improve  Object
Duquesne PA       40      100     50      50       99.9840
Weights           0.0001  0.9997  0.0001  0.0001   1.0000
Nome AK           60      80      0       25       79.9845
Newark NJ         0       0       100     100      0.0200
Rock Springs WY   40      100     80      80       99.9900
Gary IN           30      60      80      100      60.0030
Yakima Flats WA   30      60      30      50       59.9930
Turkey TX         40      90      30      80       89.9880
Wells NE          50      110     50      50       109.9820
Anaheim CA        10      10      0       0        9.9980
Epcot Center FL   20      20      100     0        20.0060
Duckwater NV      20      70      50      25       69.9885
Santa Cruz CA     10      40      0       0        39.9890
Here Rock Springs WY and Wells NE had higher functional values than Duquesne PA. This is clear by looking at criteria attainments. Rock Springs WY is equal to Duquesne PA on Cost and Lives Lost, and better on Risk and Civic Improvement.

Scales

The above analysis used input data with different scales. Cost scores ranged from 0 to 60, Lives Lost scores from 0 to 110, and the two subjective criteria (Risk, Civic Improvement) from 0 to 100: similar, but slightly different, ranges. The resulting weights are one possible set that would yield the analyzed alternative as non-dominated. If we proportioned the ranges to be equal (dividing the Cost scores in Table 8.2 by 0.6 and the Expected Lives Lost scores by 1.1), the resulting weights would represent the implied relative importance of each criterion yielding a non-dominated solution. The non-dominated set is the same; only the weights vary. Results are given in Table 8.5.
Table 8.5
Results using scaled weights

Alternative       Cost    Lives   Risk    Improve  Dominated by
Nome AK           0.9997  0.0001  0.0001  0.0001
Newark NJ         0.0001  0.0001  0.4979  0.5019
Rock Springs WY   0.0001  0.7673  0.0001  0.2325
Gary IN           0.0001  0.0001  0.0001  0.9997
Wells NE          0.0001  0.9997  0.0001  0.0001
Epcot Center FL   0.0002  0.0001  0.9996  0.0001
Duquesne PA                                        Rock Springs WY, Wells NE
Yakima Flats WA                                    Six alternatives
Turkey TX                                          Rock Springs WY
Anaheim CA                                         All but Newark NJ
Duckwater NV                                       Five alternatives
Santa Cruz CA                                      Eight alternatives
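This rescaling can be checked with the earlier LP sketch. The fragment below reuses Y, sites, and nondominance_weights() from that listing (so it is not self-contained on its own); dividing the Cost column by 0.6 and the Lives Lost column by 1.1 puts all four criteria on a 0–100 range.

```python
# Range-normalize so every criterion spans 0-100, then re-test each site.
# Reuses Y, sites, and nondominance_weights() from the earlier sketch.
Y_scaled = Y / np.array([0.6, 1.1, 1.0, 1.0])

for k, name in enumerate(sites):
    w = nondominance_weights(Y_scaled, k)
    if w is not None:
        print(f"{name:16s} nondominated, implied importance = {np.round(w, 4)}")
    else:
        print(f"{name:16s} dominated")
```

Because nondominance is invariant to positive rescaling of a criterion column, the same six sites come back feasible; only the weight vectors change.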

Stochastic Mathematical Formulation

Value-at-risk (VaR) methods are popular in financial risk management. 11 VaR models were motivated in part by several major financial disasters of the late 1980s and 1990s, including the fall of Barings Bank and the bankruptcy of Orange County. In both instances, large amounts of capital were invested in volatile markets when traders concealed their risk exposure. VaR models allow managers to quantify their risk exposure at the portfolio level, and can be used as a benchmark to compare risk positions across different markets. Value-at-risk can be defined as the maximum loss expected for an investment or portfolio at a given confidence level over a stated time horizon. If we define the risk exposure of the investment as L, we can express VaR as:
 $$ \Pr\left\{L\le \mathrm{VaR}\right\}=1-\alpha $$ 
A rational investor will minimize expected losses, or the loss level at the stated probability (1 − α). This statement of risk exposure can also be used as a constraint in a chance-constrained programming model, imposing a restriction that the probability of a loss greater than some stated value should be no more than α.
The standard deviation or volatility of asset returns, σ, is widely used in financial models such as VaR. In the VaR framework, volatility σ represents the variation of asset returns over some time horizon. This measure will be employed in our approach. Monte Carlo simulation techniques are often applied to measure the variability of asset risk factors. 12 We will employ Monte Carlo simulation to benchmark our proposed method.
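As a small illustration of the definition above, VaR under a normal loss assumption can be estimated by Monte Carlo simulation. The loss parameters below are hypothetical, chosen only for the example.

```python
import numpy as np

rng = np.random.default_rng(7)
alpha = 0.05                            # modeler's risk level; confidence = 1 - alpha
# Hypothetical loss distribution: mean 10, standard deviation 4 (illustrative only).
losses = rng.normal(10.0, 4.0, size=100_000)
VaR = np.quantile(losses, 1 - alpha)    # Prob{L <= VaR} = 1 - alpha
print(f"95% VaR = {VaR:.2f}")           # analytic value: 10 + 1.645 * 4 = 16.58
```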
Stochastic models construct production frontiers that incorporate both inefficiency and stochastic error. The stochastic frontier associates extreme outliers with the stochastic error term and this has the effect of moving the frontier closer to the bulk of the producing units. As a result, the measured technical efficiency of every DMU is raised relative to the deterministic model. In some realizations, some DMUs will have a super-efficiency larger than unity. 13
Now we consider the stochastic vendor selection model. Consider N suppliers to be evaluated, each with s random variables. Note that all input variables are transformed to output variables, as was done in Moskowitz et al. 14 The variables of supplier j (j = 1, 2, …, N) exhibit random behavior represented by  $$ {\tilde{y}}_j $$  =  $$ \left({\tilde{y}}_{1j},\cdots, {\tilde{y}}_{sj}\right) $$ , where each  $$ {\tilde{y}}_{rj} $$  (r = 1 , 2 ,  …  , s) has a known probability distribution. By maximizing the expected efficiency of the vendor under evaluation, subject to VaR being restricted to be no worse than some limit, the following model (1) is developed:
 $$ \mathrm{Max}\ E\left[{\sum}_{i=1}^4{w}_i{\tilde{y}}_{i1}\right] $$ 
s.t.  $$ {\sum}_{i=1}^4{w}_i=1 $$ 
For each j from 2 to 12:  $$ \Pr\left\{{\sum}_{i=1}^4{w}_i{\tilde{y}}_{i1}\ge {\sum}_{i=1}^4{w}_i{\tilde{y}}_{ij}+0.0001\right\}\ge 1-\alpha $$ 
 $$ {w}_i\ge 0.0001 $$ 
Because each  $$ {\tilde{y}}_j $$  is potentially a random variable, it has a distribution rather than a constant value. The objective function is now an expectation; since the expectation of a linear combination is the combination of the means, this function remains linear, using means rather than constant parameters. The constraints requiring each location's performance to be at least that of all other locations are now probabilistic, and their deterministic equivalents are nonlinear. The weights w i are still the decision variables to be solved for, as in the deterministic version used above.
The scalar α is the modeler's risk level, indicating the probability measure of the extent to which Pareto efficiency violation is admitted, at most an α proportion of the time. The α j (0 ≤ α j  ≤ 1) in the constraints are predetermined scalars standing for an allowable risk of violating the associated constraints, where 1 − α j indicates the probability of attaining the requirement. The higher the value of α, the higher the modeler's risk and the lower the modeler's confidence about the evaluated vendor's Pareto efficiency, and vice versa. At the (1 − α) confidence level, the evaluated supplier is stochastic efficient only if the optimal objective value equals one.
To transform the stochastic model (1) into a deterministic DEA model, Charnes and Cooper 15 employed chance-constrained programming. 16 The transformation steps presented in this study follow this technique and can be considered a special case of their stochastic DEA, 17 where both stochastic inputs and outputs are used. This yields a non-linear programming problem in the variables w i , which presents computational difficulties because the variance-covariance terms yield quadratic expressions in the constraints. We assume that  $$ {\tilde{y}}_j $$  follows a normal distribution N( $$ {\overline{y}}_j $$ , B jk ), where  $$ {\overline{y}}_j $$  is its vector of expected values and B jk indicates the variance-covariance matrix of the jth alternative with the kth alternative. The development of stochastic DEA is given in Wu and Olson (2008). 18
We adjust the data set used in the nuclear waste siting problem by making cost a stochastic variable (following an assumed normal distribution, thus requiring a variance). The mathematical programming model decision variables are the weights on each criterion, which are not stochastic. What is stochastic is the parameter on costs. Thus the adjustment is in the constraints. For each evaluated alternative y j compared to alternative y k :
 $$ {w}_{cost}\left({y}_j^{cost}-z\sqrt{\mathrm{Var}\left[{y}_j^{cost}\right]}\right)+{w}_{lives}{y}_j^{lives}+{w}_{risk}{y}_j^{risk}+{w}_{imp}{y}_j^{imp} $$ 
 $$ \ge {w}_{cost}\left({y}_k^{cost}+z\sqrt{\mathrm{Var}\left[{y}_k^{cost}\right]+2\mathrm{Cov}\left[{y}_j^{cost},{y}_k^{cost}\right]+\mathrm{Var}\left[{y}_j^{cost}\right]}\right)+{w}_{lives}{y}_k^{lives}+{w}_{risk}{y}_k^{risk}+{w}_{imp}{y}_k^{imp} $$ 
These functions need to include the covariance term for costs between alternative y j and alternative y k .
Table 8.6 shows the stochastic cost data in billions of dollars, and the converted cost scores (transformed, as in Table 8.2, as $100 billion minus the mean cost for that site). The cost variances remain as they were, since the relative scale did not change.
Table 8.6
Stochastic data

Alternative           Cost measure  Cost score  Cost variance  Expected lives lost  Risk  Civic improvement
S1 Nome AK            N(40,6)       60          6              80                   0     25
S2 Newark NJ          N(100,20)     0           20             0                    100   100
S3 Rock Springs WY    N(60,5)       40          5              100                  80    80
S4 Duquesne PA        N(60,30)      40          30             100                  50    50
S5 Gary IN            N(70,35)      30          35             60                   80    100
S6 Yakima Flats WA    N(70,20)      30          20             60                   30    50
S7 Turkey TX          N(60,10)      40          10             90                   30    80
S8 Wells NE           N(50,8)       50          8              110                  50    50
S9 Anaheim CA         N(90,40)      10          40             10                   0     0
S10 Epcot Center FL   N(80,50)      20          50             20                   100   0
S11 Duckwater NV      N(80,20)      20          20             70                   50    25
S12 Santa Cruz CA     N(90,40)      10          40             40                   0     0
The variance-covariance matrix of costs is required (Table 8.7):
Table 8.7
Site covariances

      S1  S2  S3  S4  S5  S6  S7  S8  S9  S10  S11  S12
S1     6   2   4   2   2   3   3   3   2    1    3    2
S2        20   3  10   9   5   2   1   4    5    1    4
S3             5   2   1   2   3   3   2    1    3    2
S4                30  10   8   2   2   6    5    1    4
S5                    35   9   3   2   5    6    1    4
S6                        20   3   2  10    8    2   12
S7                            10   3   2    1    3    2
S8                                 8   2    1    3    2
S9                                    40    5    1   12
S10                                        50    2    8
S11                                             20    2
S12                                                  40

(The matrix is symmetric; only the upper triangle is shown.)
The confidence level used (1 − α) is 0.95, corresponding to a z-value of 1.645 for a one-sided distribution. The adjustment lowers the cost parameter for the evaluated alternative in proportion to its standard deviation, and inflates it for the other alternatives. Thus the stochastic model requires 0.95 assurance that the cost for the evaluated alternative is superior to that of each of the other 11 alternatives, a more difficult standard. The models were run for each of the 12 alternatives. Only two of the six alternatives found to be nondominated with deterministic data above were still nondominated: Rock Springs WY and Wells NE. Table 8.8 shows the results for Rock Springs WY, with one set of weights, approximately {0, 0.75, 0.25, 0}, yielding Rock Springs a greater functional value than any of the other 11 alternatives. The weights yielding Wells NE as nondominated placed all the weight on Lives Lost.
Table 8.8
Output for stochastic model for Rock Springs WY

Criteria          Cost    Lives   Risk     Improve  Object
Rock Springs WY   36.322  100     80       80       94.99304
Weights           0.0001  0.7499  0.24993  0.0001   1
Nome AK           67.170  80      0        25       59.999
Newark NJ         9.158   0       100      100      25.004
Duquesne PA       50.272  100     50       50       87.494
Gary IN           40.660  60      80       100      64.999
Yakima Flats WA   38.858  60      30       50       52.497
Turkey TX         47.538  90      30       80       74.994
Wells NE          57.170  110     50       50       94.993
Anaheim CA        21.514  10      0        0        7.501
Epcot Center FL   32.418  20      100      0        40.004
Duckwater NV      29.158  70      50       25       64.995
Santa Cruz CA     21.514  40      0        0        29.997
One of the alternatives that was nondominated with deterministic data, Nome AK, was found to be dominated with stochastic data. Table 8.9 shows the results of the original deterministic model for Nome AK.
Table 8.9
Nome AK alternative results with original model

Criteria          Cost    Lives   Risk    Improve  Object
Nome AK           60      80      0       25       64.9857
Weights           0.7500  0.2498  0.0001  0.0001   1
Newark NJ         0       0       100     100      0.020
Rock Springs WY   40      100     80      80       54.994
Duquesne PA       40      100     50      50       54.988
Gary IN           30      60      80      100      37.505
Yakima Flats WA   30      60      30      50       37.495
Turkey TX         40      90      30      80       52.491
Wells NE          50      110     50      50       64.986
Anaheim CA        10      10      0       0        9.998
Epcot Center FL   20      20      100     0        20.006
Duckwater NV      20      70      50      25       32.492
Santa Cruz CA     10      40      0       0        17.491
The stochastic results are shown in Table 8.10:
Table 8.10
Nome AK alternative results with stochastic model

Criteria          Cost    Lives   Risk    Improve  Object
Nome AK           55.97   80      0       25       55.965
Weights           0.9997  0.0001  0.0001  0.0001   1
Newark NJ         9.009   0       100     100      9.027
Rock Springs WY   47.170  100     80      80       47.182
Duquesne PA       50.403  100     50      50       50.408
Gary IN           41.034  60      80      100      41.046
Yakima Flats WA   39.305  60      30      50       39.307
Turkey TX         47.715  90      30      80       47.721
Wells NE          57.356  110     50      50       57.360
Anaheim CA        21.631  10      0       0        21.625
Epcot Center FL   32.527  20      100     0        32.529
Duckwater NV      29.305  70      50      25       29.310
Santa Cruz CA     21.631  40      0       0        21.628
Wells NE is shown to be superior to Nome AK at the last set of weights the Excel Solver attempted. Looking at the stochastically adjusted cost scores, Wells NE now has a superior cost value to Nome AK: the evaluated alternative's cost score is penalized downward, while the cost scores for Wells NE and the other alternatives are penalized upward, making a harder standard to meet.
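The cost adjustments in Tables 8.8 and 8.10 can be reproduced directly from the data in Tables 8.6 and 8.7. A sketch for evaluated alternative Nome AK follows (array names are ours; z = 1.645 for the one-sided 95% level):

```python
import numpy as np

# Cost scores and cost variances (Table 8.6), and covariances of each
# site's cost with Nome AK (row S1 of Table 8.7), in site order S1..S12.
cost = np.array([60,  0, 40, 40, 30, 30, 40, 50, 10, 20, 20, 10], dtype=float)
var  = np.array([ 6, 20,  5, 30, 35, 20, 10,  8, 40, 50, 20, 40], dtype=float)
cov1 = np.array([ 6,  2,  4,  2,  2,  3,  3,  3,  2,  1,  3,  2], dtype=float)

z, j = 1.645, 0                 # one-sided 95% factor; Nome AK is site index 0
# Competitors' cost scores are inflated; the evaluated site's is deflated.
adj = cost + z * np.sqrt(var + 2.0 * cov1 + var[j])
adj[j] = cost[j] - z * np.sqrt(var[j])
print(np.round(adj, 3))  # Nome 55.971, Newark 9.010, Rock Springs 47.171, ...
```

These adjusted values match the Cost column of Table 8.10 to rounding, confirming the form of the chance-constraint adjustment given earlier.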

DEA Models

DEA evaluates alternatives by seeking to maximize the efficiency ratio of output attainments to inputs, considering the relative performance of each alternative. The mathematical programming model creates a variable for each output (output weights designated u i ) and each input (input weights designated v j ). Each alternative k has performance coefficients for each output (y ik ) and input (x jk ). The classic Charnes, Cooper and Rhodes (CCR) 19 DEA model is:
 $$ Max\ {efficiency}_k=\frac{\sum_{i=1}^2{u}_i{y}_{ik}}{\sum_{j=1}^2{v}_j{x}_{jk}} $$
s.t. For each l from 1 to 12:  $$ \frac{\sum_{i=1}^2{u}_i{y}_{il}}{\sum_{j=1}^2{v}_j{x}_{jl}}\le 1 $$ 
 $$ {\mathrm{u}}_{\mathrm{i}},{\mathrm{v}}_{\mathrm{j}}\ge 0 $$
The Banker, Charnes and Cooper (BCC) DEA model includes a scale parameter γ to allow for economies of scale; γ is unrestricted in sign.
 $$ Max\ {efficiency}_k=\frac{\sum_{i=1}^2{u}_i{y}_{ik}+\gamma }{\sum_{j=1}^2{v}_j{x}_{jk}} $$
s.t. For each l from 1 to 12:  $$ \frac{\sum_{i=1}^2{u}_i{y}_{il}+\gamma }{\sum_{j=1}^2{v}_j{x}_{jl}}\le 1 $$ 
u i  , v j  ≥ 0, γ unrestricted in sign
A third DEA model allows for super-efficiency. It is the CCR model without the efficiency restriction on the alternative being evaluated:
 $$ Max\ {efficiency}_k=\frac{\sum_{i=1}^2{u}_i{y}_{ik}}{\sum_{j=1}^2{v}_j{x}_{jk}} $$
s.t. For each l from 1 to 12 with l ≠ k:  $$ \frac{\sum_{i=1}^2{u}_i{y}_{il}}{\sum_{j=1}^2{v}_j{x}_{jl}}\le 1 $$ 
 $$ {\mathrm{u}}_{\mathrm{i}},{\mathrm{v}}_{\mathrm{j}}\ge 0 $$
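A sketch of the CCR multiplier model via the standard Charnes-Cooper linearization (maximize u·y k subject to v·x k = 1 and u·y l ≤ v·x l for all l), with a flag for the super-efficiency variant that drops alternative k's own constraint. We assume, consistent with the scores reported below in Table 8.11, that Cost and Expected lives lost (Table 8.1) serve as the two inputs and the Risk and Civic improvement scores (Table 8.2) as the two outputs; the array and function names are ours.

```python
import numpy as np
from scipy.optimize import linprog

X = np.array([[ 40,  60], [100, 140], [ 60,  40], [ 60,  40],    # inputs:
              [ 70,  80], [ 70,  80], [ 60,  50], [ 50,  30],    # cost, lives lost
              [ 90, 130], [ 80, 120], [ 80,  70], [ 90, 100]], dtype=float)
Yout = np.array([[  0,  25], [100, 100], [ 80,  80], [ 50,  50], # outputs:
                 [ 80, 100], [ 30,  50], [ 30,  80], [ 50,  50], # risk, improvement
                 [  0,   0], [100,   0], [ 50,  25], [  0,   0]], dtype=float)

def ccr(k, super_eff=False):
    """CCR efficiency of DMU k; super_eff drops k's own ratio constraint."""
    s, m = Yout.shape[1], X.shape[1]
    c = np.concatenate([-Yout[k], np.zeros(m)])           # maximize u @ Yout[k]
    A_eq = np.concatenate([np.zeros(s), X[k]])[None, :]   # v @ X[k] = 1
    keep = [l for l in range(len(X)) if not (super_eff and l == k)]
    A_ub = np.hstack([Yout[keep], -X[keep]])              # u @ Yout[l] - v @ X[l] <= 0
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(len(keep)),
                  A_eq=A_eq, b_eq=[1.0], bounds=[(0, None)] * (s + m))
    return -res.fun if res.success else float("nan")

for k in range(len(X)):
    print(f"DMU {k+1:2d}: CCR = {ccr(k):.5f}  super-CCR = {ccr(k, True):.5f}")
```

The BCC variant adds the unrestricted scale variable γ to the numerator terms. Under these input/output assumptions the sketch should reproduce the CCR scores of Table 8.11 (for example, 0.4375 for Nome AK and 0.75 for Newark NJ).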
The traditional DEA models were run on the dump site selection model, yielding results shown in Table 8.11:
Table 8.11
Traditional DEA model results

                  CCR DEA          BCC DEA          Super-CCR
Alternative       Score    Rank    Score    Rank    Score    Rank
Nome AK           0.43750  10      1        1       0.43750  10
Newark NJ         0.75000  6       1        1       0.75000  6
Rock Springs WY   1        1       1        1       1.31000  1
Duquesne PA       0.62500  7       0.83333  8       0.62500  7
Gary IN           1        1       1        1       1.07143  2
Yakima Flats WA   0.5      8       0.70129  9       0.5      8
Turkey TX         0.97561  3       1        1       0.97561  3
Wells NE          0.83333  5       1        1       0.83333  5
Anaheim CA        0        11      0.45000  12      0        11
Epcot Center FL   0.93750  4       1        1       0.93750  4
Duckwater NV      0.46875  9       0.62500  10      0.46875  9
Santa Cruz CA     0        11      0.48648  11      0        11
These approaches provide rankings. The CCR DEA ranking includes some ties (for first place and for 11th place). The nondominated Nome AK alternative was ranked tenth, behind the dominated solutions Turkey TX, Duquesne PA, Yakima Flats WA, and Duckwater NV. Nome AK dominates Anaheim CA and Santa Cruz CA, but does not dominate any other alternative. Its tenth-place ranking is probably due to the smaller scale of the Cost criterion, on which Nome AK has the best score. BCC DEA has all six nondominated solutions tied for first, along with the dominated Turkey TX; its rankings from 7th through 12th reflect more of an average performance across all criteria and are affected by the criteria scales. Super-CCR provides a nearly unique ranking (a tie for 11th place).

Conclusion

The importance of risk management has vastly increased in the past decade. Value-at-risk techniques have become a leading technology for conducting enterprise risk management. One ERM area of global business involving high levels of risk is global supply chain management.
Selection in supply chains by its nature involves trading off multiple criteria in the presence of uncertain data. When these conditions exist, stochastic dominance can be applied if the uncertain data are normally distributed. If not, simulation modeling applies (and it can also be applied when data are normally distributed).
When data are presented with uncertainty, stochastic DEA provides a good tool for efficiency analysis, handling both inefficiency and stochastic error. We must point out that the main difference between implementing investment VaR in financial markets such as the banking industry and our DEA VaR for supplier selection is that, in the latter, the underlying asset volatility or standard deviation is typically a managerial assumption, owing to the lack of sufficient historical data to calibrate the risk measure.

Notes

1. Charnes, A., Cooper, W.W. and Rhodes, E. (1978). Measuring the efficiency of decision-making units, European Journal of Operational Research 2, 429–444.
2. Banker, R.D., Chang, H. and Lee, S.-Y. (2010). Differential impact of Korean banking system reforms on bank productivity. Journal of Banking & Finance 34(7), 1450–1460; Gunay, E.N.O. (2012). Risk incorporation and efficiency in emerging market banks during the global crisis: Evidence from Turkey, 2002–2009. Emerging Markets Finance & Trade 48(supp5), 91–102; Yang, C.-C. (2014). An enhanced DEA model for decomposition of technical efficiency in banking. Annals of Operations Research 214(1), 167–185.
3. Segovia-Gonzalez, M.M., Contreras, I. and Mar-Molinero, C. (2009). A DEA analysis of risk, cost, and revenues in insurance. Journal of the Operational Research Society 60(11), 1483–1494.
4. Ross, A. and Droge, C. (2002). An integrated benchmarking approach to distribution center performance using DEA modeling, Journal of Operations Management 20, 19–32; Wu, D.D. and Olson, D. (2010). Enterprise risk management: A DEA VaR approach in vendor selection. International Journal of Production Research 48(16), 4919–4932.
5. Ross, A. and Droge, C. (2004). An analysis of operations efficiency in large-scale distribution systems, Journal of Operations Management 21, 673–688.
6. Narasimhan, R., Talluri, S., Sarkis, J. and Ross, A. (2005). Efficient service location design in government services: A decision support system framework, Journal of Operations Management 23(2), 163–176.
7. Lahdelma, R. and Salminen, P. (2006). Stochastic multicriteria acceptability analysis using the data envelopment model, European Journal of Operational Research 170, 241–252; Olson, D.L. and Wu, D.D. (2011). Multiple criteria analysis for evaluation of information system risk. Asia-Pacific Journal of Operational Research 28(1), 25–39.
8. Moskowitz, H., Tang, J. and Lam, P. (2000). Distribution of aggregate utility using stochastic elements of additive multiattribute utility models, Decision Sciences 31, 327–360.
9. Wu, D. and Olson, D.L. (2008). A comparison of stochastic dominance and stochastic DEA for vendor evaluation, International Journal of Production Research 46(8), 2313–2327.
10. Alquier, A.M.B. and Tignol, M.H.L. (2006). Risk management in small- and medium-sized enterprises, Production Planning & Control 17, 273–282.
11. Duffie, D. and Pan, J. (2001). Analytical value-at-risk with jumps and credit risk, Finance & Stochastics 5(2), 155–180; Jorion, P. (2007). Value-at-Risk: The New Benchmark for Controlling Market Risk. New York: Irwin.
12. Crouhy, M., Galai, D. and Mark, R.M. (2001). Risk Management. New York, NY: McGraw Hill.
13. Olesen, O.B. and Petersen, N.C. (1995). Comment on assessing marginal impact of investment on the performance of organizational units, International Journal of Production Economics 39, 162–163; Cooper, W.W., Hemphill, H., Huang, Z., Li, S., Lelas, V. and Sullivan, D.W. (1996). Survey of mathematical programming models in air pollution management, European Journal of Operational Research 96, 1–35; Cooper, W.W., Deng, H., Huang, Z.M. and Li, S.X. (2002). A one-model approach to congestion in data envelopment analysis, Socio-Economic Planning Sciences 36, 231–238.
14. Moskowitz et al. (2000), op. cit.
15. Charnes, A. and Cooper, W.W. (1959). Chance-constrained programming, Management Science 6(1), 73–79; see also Huang, Z. and Li, S.X. (2001). Co-op advertising models in manufacturer-retailer supply chains: A game theory approach, European Journal of Operational Research 135(3), 527–544.
16. Charnes, A., Cooper, W.W. and Symonds, G.H. (1958). Cost horizons and certainty equivalents: An approach to stochastic programming of heating oil, Management Science 4(3), 235–263.
17. Cooper, W.W., Park, K.S. and Yu, G. (1999). IDEA and AR-IDEA: Models for dealing with imprecise data in DEA, Management Science 45, 597–607.
18. Wu and Olson (2008), op. cit.
19. Charnes, A., Cooper, W.W. and Rhodes, E. (1978), op. cit.