Seminar paper, 2004
70 pages, grade: 1.7
List of Figures
List of Tables
List of Abbreviations
List of Symbols
1 Introduction
1.1 Purpose of the Study
1.2 Methodology
2 Calculation of VaR
2.1 Risk Management Framework
2.2 VaR Concept
2.3 Calculation Methods
2.3.1 Historical Method
2.3.2 Monte Carlo Simulation Method
2.3.3 Analytical Method
2.4 Adaptation of the Principal Methods
2.4.1 Weighting of Past Observations
2.4.2 Backtesting
2.4.3 Scenario Analysis and Stress Testing
2.4.4 Extreme Value Theory
3 Evaluation of VaR
3.1 Method Comparison
3.2 Chances of VaR
3.2.1 Practicability as a Risk Measure
3.2.2 Realistic View of Risk
3.2.3 Effective Risk Monitoring
3.2.4 Flexibility of VaR models
3.2.5 Support by Regulatory Body
3.3 Limitations of VaR
3.3.1 Confidence Interval
3.3.2 Comparability of Results
3.3.3 Sub-additivity
3.3.4 Historical Data
3.3.5 Application Difficulties
3.3.6 Estimation Error and Fat-Tails
3.3.7 Estimation Bias and Manipulation
4 Conclusions
Appendices
References
Figure 1: Risk Management Overview
Figure 2: Firm-wide risks
Figure 3: Probability Distribution
Figure 4: Monthly VaR Distribution for FTSE 100 (1/88 – 1/95)
Figure 5: Comparison of Stress Testing and Scenario Analysis
Figure 6: Example for Incoherence of VaR
Figure 7: Comparison of different VaR methodologies
Figure 8: Changes in the distributions of the NASDAQ Composite Index
Figure 9: Standard Errors in Simulation Methods and Analytical Methods
Table 1: The Basel Penalty Zones
The risk-and-return framework has been generally accepted and discussed by scientists at least since Markowitz introduced his Portfolio Theory in 1952. Subsequently, models were developed to evaluate investments under consideration of both risk and return.^{[1]} Traditionally, practitioners primarily focused on past earnings as a measure of the profitability of an investment, without adequately considering potential risks. The development of professional risk management systems was therefore often neglected, and the possibility of high losses was not appropriately incorporated into investment strategies.^{[2]}
The consequences of such mistreatment became evident in the mid-1990s, when some of the world’s largest companies faced huge losses and sometimes even insolvency. Most of these failures were a direct result of the inappropriate use of financial instruments and insufficient internal control mechanisms. The most spectacular debacles even resulted in losses of more than one billion dollars for each affected institution.^{[3]}
In the case of Barings Bank, a single trader ruined the 233-year-old British financial institution in 1995 through inappropriate investments in high-risk futures. The consequent loss of $1.3 billion, realized within a very short period, could not be absorbed and forced the downfall of Barings. At Daiwa Bank, it was also a single trader who caused a $1.1 billion deficit; in that case, however, the losses accumulated over eleven years, beginning in 1984. Another well-publicized bankruptcy was declared in 1994 by Orange County, California, after losses of $1.8 billion. Such evidence of poor risk management and control shows that proper financial risk management is crucial for all kinds of institutions in order to guarantee stability and continuity.^{[4]}
Therefore, it is necessary to establish adequate risk management processes and to develop appropriate tools, which quantify risk exposures of both entire institutions and single financial instruments. This risk quantification should alert management early enough to prevent exceptional losses. One of the key concepts addressing these problems of modern risk management was introduced in 1993 with the Value-at-Risk (VaR) models.
In 1993, about 30% of professional dealers were already using VaR-like tools to measure risk.^{[5]} Within one year, this number increased to 43%, and an additional 37% of dealers intended to implement VaR by the end of 1995.^{[6]}
However, non-financial companies did not implement VaR as quickly as professional dealers did. In 1995, about 35% of non-financial institutions in the United States indicated they were using VaR^{[7]}, and it took another three years for this share to reach 44%.^{[8]} Nowadays, a variety of standard VaR software packages is available and almost every publication about risk management covers VaR.^{[9]}
As VaR has become one of the most prevalent tools for risk measurement in recent years, and because of the importance of risk management to institutions, the purpose of this seminar paper is to examine the chances and limitations of VaR models and to evaluate their usefulness as part of an early warning system.
To assess the chances and limitations of VaR models, it is first described how VaR fits into the risk management framework (2.1). Afterwards, the conceptual aspects of VaR and its calculation are introduced (2.2), followed by a detailed explanation of three principal methods for VaR determination (2.3). Additionally, a variety of common adaptations to the calculation methods is presented (2.4). The third chapter begins with a comparison of the assumptions and characteristics of the various methods as well as an indication of the resulting differences (3.1). Then, the chances, opportunities and advantages of VaR models are elaborated and evaluated (3.2). Subsequently, the limitations and problems that VaR is confronted with are discussed, its validity is examined and triggers for manipulation are identified (3.3). Finally, the applicability of VaR models in the context of financial risk management is pointed out and the findings of this seminar paper are summarized.
Managing risk successfully is indispensable for any corporation, as the efficiency of its risk management is a crucial factor for business continuity. A company should therefore implement a risk management process comprising the steps of establishing a risk management context; identifying, assessing (i.e. analyzing and evaluating) and treating risks; and monitoring and reviewing this process. This enables companies to control the risks they are exposed to in their day-to-day business.^{[10]}
Figure 1: Risk Management Overview
illustration not visible in this excerpt
Source: Joint Standards Australia/Standards New Zealand Committee OB/7 on Risk Management (1999), p. 8.
Proper identification of all risk exposures is fundamental for successful risk management, because only identified risks can be managed. In order to structure risk identification, it is possible to divide risks into different categories and then focus on the sources of risk in each category. One possible categorization distinguishes business risks, non-business risks and financial risks. Within this structure, business risks are those resulting directly from the company’s operations.^{[11]} Non-business risks, in contrast, cannot be controlled directly by the company; fundamental changes in the economy or the political environment, for example, are strategic risks that cannot be influenced.^{[12]} Financial risks arise from a company’s activities in financial markets and may result in financial obligations.^{[13]}
The first step of the financial risk management process defined by the Basel Committee is the identification of internal and external factors that could adversely influence the achievement of an institution’s objectives. It is also recommended to classify risks into controllable and non-controllable ones.^{[14]}
Figure 2: Firm-wide risks
illustration not visible in this excerpt
Source: Jorion, Philippe (2001), p. 468.
Jorion, however, divides financial risks into more detailed categories: credit risk, liquidity risk, operational risk, legal risk and market risk. Credit risk is the risk of debtors not being able to settle their debts. Liquidity risk arises where assets cannot be liquidated at market price or where obligations cannot be met. Operational risk arises from errors and accidents of either human or technical nature and can result in credit or market risk. Legal risk is generally related to credit risk and occurs where transactions are not enforceable by law. Finally, market risk arises from movements in the level or volatility of market prices.^{[15]}
The second step of the financial risk management process is the assessment of risk by means of a risk model and relevant data.^{[16]} However, assessing risk may be a problem, as risk, understood as the volatility of unexpected outcomes, can hardly be measured or predicted without appropriate tools or methods. One attempt to quantify and express market risk was made by introducing the VaR model in 1993.^{[17]}
The final steps of financial risk management, as defined by the Basel Committee, comprise monitoring and reporting the risk assessments on a regular basis as well as control of these risks by senior management.^{[18]} Having placed VaR within the risk management framework, the concept of VaR will now be introduced.
The VaR concept was developed to satisfy the need for a risk measure that is able to summarize a company’s exposure to normal market movements and the corresponding potential losses. The methodology to aggregate several market positions to a single consistent measure of risk has become known as VaR.^{[19]}
This aggregation is subject to various simplifying assumptions made in its calculation.^{[20]} Jorion defines VaR as a summary of the worst loss that can be expected over a holding period with a given confidence level. In other words, a distribution of gains and losses over a holding period is projected, and the VaR is the loss at a pre-defined percentile of that distribution.^{[21]}
Greater losses than predicted by the VaR measure only occur with a small, pre-defined probability. From a financial perspective, VaR is a measure that summarizes the exposure of a portfolio (or an asset) to market risk based on the distribution of price changes over a given period.^{[22]}
For a given holding period t and confidence level p, the VaR is the loss in market value that is only exceeded with probability 1 - p. The parameter t depends on the entity’s time horizon. Actively trading companies (typically financial firms) commonly use one day, while non-financial firms and passively trading companies normally use longer periods. The confidence level p is primarily determined by how the designer or user of the risk measure wants to interpret the VaR. Theory provides little guidance about that choice.^{[23]}
For example, if the selected confidence level is 99%, the VaR will be the dollar loss that is expected to occur no more than 1 per cent of the time over the holding period. That means the VaR is the 0.01 critical value of the probability distribution. This distribution is called the VaR distribution and it represents possible portfolio returns or future values. An example of a VaR distribution is presented in Figure 3. The range of VaR possibilities corresponds to the confidence interval; the left-hand tail of the distribution shows the possibilities that are not included in the VaR.^{[24]}
Figure 3: Probability Distribution
illustration not visible in this excerpt
Source: Smithson, Charles /Minton, Lyle (1997), p. 27.
Thus, the most important step in calculating a VaR measure for a portfolio is the determination of an adequate distribution of changes in the portfolio value. Generally, the portfolio is first valued with a current price list. All components that affect the value of the portfolio are called market factors, for example foreign currency exchange rates, key interest rates or simply stock prices. Then, the portfolio is revalued using a number of alternative price lists. These alternative lists can be generated with several different methods, for example by using market risk scenarios, which will be explained in the subsequent sections. With these price lists, possible changes in the value of the portfolio are estimated. The changes are ordered from lowest to highest value, and the result is a distribution of changes.^{[25]}
Alternatively, a normal distribution for the changes in the value of the portfolio can be assumed. In this case, parameters like the standard deviation have to be estimated, for example by using the underlying market factors.
Another important step in the computation of a quantitative measure of market risk like the VaR is the selection of adequate market factors that affect the portfolio value. That means economic or financial indicators and instruments that may cause changes in the portfolio value should be identified. After suitable market factors are found, it may be necessary to limit their number according to the available computing capacity; otherwise, the calculation could become unmanageable.^{[26]}
As mentioned above, several methods to calculate or estimate the VaR distribution (respectively the alternative price lists) have been developed. They range from very simplistic to extremely complex. The simpler ones make assumptions in order to reduce the required data and computing capacity, while the more realistic and complex methods revalue the portfolio under the most appropriate market risk scenarios possible.^{[27]}
There is no consensus about which method is best, because the methods have different scopes of application. If several methods are found to be adequate for a specific purpose, the choice of the most appropriate one is mainly based on the user’s aversion to oversimplistic and unrealistic assumptions.^{[28]}
The three most popular methods are the historical method, the Monte Carlo simulation and the analytical method. Each is a simplification of reality and makes several assumptions.
The historical method is a simplistic approach based on the assumption that the future behaves like the past. That means the VaR distribution can be estimated from a frequency distribution of returns over a historic observation period. The frequency of past returns is taken as the probability that similar returns will be realized in the future.^{[29]} Such a frequency distribution for the FTSE-100 index is shown in Figure 4. The VaR is determined at the pre-defined confidence level over a certain holding period. If the confidence level is 95% over a one-month risk horizon, the VaR quantifies the maximum portfolio loss that will be exceeded in only five of the next 100 months.
Figure 4: Monthly VaR Distribution for FTSE 100 (1/88 – 1/95)
illustration not visible in this excerpt
Source: Culp, Christopher L. /Mensink, Ron /Neves, Andrea M. P. (1998), p. 24.
It is calculated by multiplying the current portfolio value by the percentile historical return of the frequency distribution. In this case it is the 5th percentile, which was -6.87% in the FTSE-100 sample. If a one-million-dollar portfolio consists of all FTSE-100 securities in proportions similar to the index, the VaR will be $68,700.^{[30]}
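The arithmetic of this example can be restated directly; the function below is a minimal sketch of the multiplication described in the text, using the 5th-percentile return of -6.87% and the one-million-dollar portfolio quoted above.

```python
# Historical-method VaR as described above: current portfolio value
# multiplied by the (negated) percentile historical return.

def historical_var(portfolio_value, percentile_return):
    # percentile_return is the return at the chosen percentile,
    # e.g. -0.0687 for the 5th percentile of the FTSE-100 sample.
    return portfolio_value * -percentile_return

print(historical_var(1_000_000, -0.0687))  # about 68,700 dollars
```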
A more sophisticated variation of the historical method is the historical simulation approach. Instead of using past portfolio returns, a historical simulation uses past data of a set of market factors that determine the value of the portfolio. With the current portfolio composition and the historical market data, it is possible to determine what the portfolio value would have been in the past. That means the current portfolio is exposed to actual (historically observed) changes in the market factors, and hypothetical profits and losses are generated. These profits and losses are hypothetical because historical returns are simulated under the assumption of a constant portfolio composition. With that simulated history of returns, an empirical distribution can be constructed. The fact that the profits and losses of the portfolio are only hypothetical distinguishes the historical simulation from the “standard” historical approach.^{[31]}
In the first step of the calculation, adequate market factors have to be identified. Then, it is necessary to collect sufficient historical data about all market factors of the portfolio. This data is gathered by observing the values of the market factors over a particular time horizon and computing the corresponding changes. A VaR calculation over 100 trading days, for example, will result in a vector containing 99 observed changes for each market factor. An alternative price list can be derived by adding each of the observed changes to the current value of the market factor. With the current and alternative values for the market factors, the portfolio value and the corresponding changes can be calculated. After sorting these hypothetical profits and losses from lowest to highest value and determining an empirical distribution, the associated VaR can be determined according to the pre-defined confidence interval.^{[32]}
In the case of multi-instrument portfolios, some extension of the methodology is required. Besides the fact that additional market factors must be obtained, it is crucial that correlations between the several instruments are considered. In order to do so, the daily^{[33]} profits and losses for each instrument are calculated and then summed up for each day. This is done before they are ordered from lowest to highest value, to account for the fact that profits and losses on different instruments may offset each other. By netting the profits against the losses for each day in the sampling period, possible correlations are captured.^{[34]}
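The netting step can be illustrated with a hypothetical two-instrument portfolio. The positions and observed factor changes below are invented for the sketch and do not come from this paper.

```python
# Historical simulation for two instruments: apply historically observed
# factor changes to the CURRENT positions, then net the per-day profits
# and losses across instruments BEFORE sorting, so that offsetting moves
# (i.e. correlation) are captured. All numbers are invented.

positions = {"equity": 600_000.0, "fx": 400_000.0}   # current market values

# Hypothetical observed daily percentage changes of the two market factors:
factor_changes = [
    {"equity": -0.020, "fx":  0.010},
    {"equity":  0.015, "fx": -0.005},
    {"equity": -0.010, "fx": -0.015},
    {"equity":  0.005, "fx":  0.020},
    {"equity": -0.025, "fx":  0.030},
]

# Net daily P&L across instruments, then order from worst to best.
daily_pnl = sorted(
    sum(positions[f] * day[f] for f in positions) for day in factor_changes
)
print(daily_pnl)  # worst day first
```

Note that on the last day the equity loss is partly offset by the FX gain; summing per day before sorting is precisely what preserves such correlation effects.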
The Monte Carlo simulation, also referred to as stochastic simulation approach, tries to generate a more comprehensive risk measure.
First, it is necessary to identify adequate market factors. The random behavior of these basic market variables is simulated by a sampling methodology. A pseudo-random number generator is used to compute thousands (or tens of thousands) of hypothetical changes in the market factors in order to simulate reality. The usually very complex interaction between the various risk factors is considered by simulating a large number of potential paths.^{[35]} With the hypothetical changes in the market factors, a large number of hypothetical portfolio profits and losses can be constructed. This results in a distribution of possible portfolio profits and losses, from which the VaR can be determined. With a large number of sampled paths, many combinations and permutations of random market events can be captured.^{[36]}
For this approach, a statistical distribution (including its parameters) is chosen that approximates possible market factor changes. The changes in the market factors are then drawn randomly from that distribution. This does not mean that a distribution of portfolio returns is assumed; rather, that distribution is simulated.^{[37]}
Which distribution and parameters are chosen can be decided by the designers of the risk measurement tool according to what they think adequately describes possible future changes in the market factors. However, because beliefs about future changes are normally based on observations from the past, designers in practice choose a distribution that most adequately approximates past changes.^{[38]}
As with historical simulation, some extension of the methodology is required in the case of multi-instrument portfolios. First, it is likely that many more market factors exist. These factors must be identified and their relation to the instruments of the portfolio has to be expressed. Second, a joint distribution of possible market factor changes has to be found. Finally, similar to historical simulation, possible correlations between instruments have to be considered by netting profits against losses for each day in the sampling period.^{[39]}
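A minimal Monte Carlo sketch for a two-factor portfolio follows. The joint distribution is assumed here to be bivariate normal with invented volatilities and correlation; the correlation is imposed with a simple two-dimensional Cholesky step on independent standard normal draws. None of the parameters come from this paper.

```python
import math
import random

# Monte Carlo simulation sketch for a two-factor portfolio. A bivariate
# normal distribution of daily factor changes is ASSUMED; the volatilities
# and the correlation rho are invented for illustration.

random.seed(7)  # reproducible pseudo-random numbers

positions = {"a": 500_000.0, "b": 500_000.0}  # current market values
sigma = {"a": 0.01, "b": 0.02}                # assumed daily volatilities
rho = 0.3                                     # assumed factor correlation

n_trials = 10_000
pnl = []
for _ in range(n_trials):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    e_a = z1                                       # shock to factor a
    e_b = rho * z1 + math.sqrt(1 - rho ** 2) * z2  # correlated shock to b
    pnl.append(positions["a"] * sigma["a"] * e_a
               + positions["b"] * sigma["b"] * e_b)

pnl.sort()
var_99 = -pnl[int(0.01 * n_trials)]  # 99% one-day VaR estimate
print(round(var_99))
```

With 10,000 paths the estimate of the 1% quantile still carries noticeable sampling error, which is why variance-reduction techniques such as those referenced in the footnotes were developed.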
The analytical method, also described as the variance-covariance method or delta-normal approach, assumes that the underlying market factors have a multivariate normal distribution.^{[40]} In order to construct a distribution of portfolio profits and losses, which is also assumed to follow the chosen statistical standard pattern, the standard deviation of changes in the portfolio value is needed. As in the previous section, the choice of an adequate distribution is usually based on experience, i.e. historical data. The determination of that standard deviation involves the calculation of the standard deviations of all instruments in the portfolio. In addition, all correlations among them have to be considered in order to get the cumulative effects on the portfolio. After obtaining the portfolio distribution, the VaR is determined with standard mathematical properties of the assumed statistical standard distribution.^{[41]}
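The variance-covariance computation can be sketched as follows. The market values, volatilities, correlation and the 99% z-score below are illustrative assumptions, not figures from this paper.

```python
import math

# Analytical (variance-covariance) VaR sketch: assume normally distributed
# changes, build the portfolio standard deviation from instrument
# volatilities and correlations, and scale by the z-score of the
# confidence level. All inputs are invented for illustration.

z_99 = 2.326  # one-sided 99% quantile of the standard normal distribution

values = [600_000.0, 400_000.0]      # instrument market values
vols = [0.015, 0.025]                # assumed daily return volatilities
corr = [[1.0, 0.4], [0.4, 1.0]]      # assumed correlation matrix

# Portfolio variance: sum over all pairs of dollar-volatility products.
dollar_vol = [v * s for v, s in zip(values, vols)]
variance = sum(dollar_vol[i] * dollar_vol[j] * corr[i][j]
               for i in range(2) for j in range(2))
var_99 = z_99 * math.sqrt(variance)
print(round(var_99))
```

Because the correlation of 0.4 is below one, the portfolio VaR here is smaller than the sum of the two stand-alone VaRs, which is the diversification effect the covariance term captures.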
However, the estimation of all standard deviations and correlations has extensive data and calculation requirements, especially if a large number of market factors is involved. The data may either not exist or may be too costly and complex to acquire. To solve that problem, the analytical approach replaces the portfolio with a simpler portfolio that has comparable risk. In order to find a simple portfolio that approximates the original one, the set of market factors is reduced to a set of basic market factors that represent most of the portfolio changes. Then, a standardized financial instrument is allocated to every basic market factor, reflecting only the risk of that particular factor. The result is a portfolio that contains standardized positions and is exposed to the same risk as the “real” portfolio, because it has the same relation to the basic market factors. This methodology is known as risk mapping or decomposition. It reduces the data requirements and simplifies the computation because the number of standard deviations and correlations that have to be considered is reduced. Risk mapping allows for considerable flexibility concerning the selection of adequate market factors and standardized positions.^{[42]}
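The mapping step can be illustrated with a hypothetical three-instrument portfolio decomposed onto two basic market factors. All sensitivities and market values below are invented; the point is only that the aggregation shrinks the required correlation matrix.

```python
# Risk-mapping sketch: three instruments are decomposed into standardized
# exposures to two basic market factors, so subsequent variance-covariance
# calculations need only a 2x2 correlation matrix instead of a 3x3 one.
# All sensitivities and market values are invented for illustration.

# Assumed sensitivity of each instrument to each basic market factor:
mapping = {
    "stock_a":    {"equity_index": 1.1, "fx_rate": 0.0},
    "stock_b":    {"equity_index": 0.9, "fx_rate": 0.0},
    "fx_forward": {"equity_index": 0.0, "fx_rate": 1.0},
}
values = {"stock_a": 300_000.0, "stock_b": 300_000.0, "fx_forward": 400_000.0}

# Aggregate the instruments into one standardized position per basic factor.
factors = ["equity_index", "fx_rate"]
position = {f: sum(values[i] * mapping[i][f] for i in mapping)
            for f in factors}
print(position)
```

The standardized positions can then be fed into the variance-covariance calculation of the previous section, with correlations needed only between the basic factors rather than between every pair of instruments.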
[...]
^{[1]} The evolution of analytical risk management tools is illustrated in Appendix A.
^{[2]} See: Leong, K. (1997), p. 41.
^{[3]} Appendices B and C give an overview of the cost of financial insolvencies for the economy as well as a list of the ten worst losses attributed to derivatives.
^{[4]} Additional examples are Metallgesellschaft (1994, -$1.3 billion), Procter & Gamble (1994, -$157 million) and the savings & loans bail-out. See Jorion, P. (2001), pp. 36-42; Leong, K. (1997), p.41.
^{[5]} See: Group of Thirty (1993).
^{[6]} See: Group of Thirty (1994).
^{[7]} See: Bodnar, G. M. (1996), p. 124.
^{[8]} See: Bodnar, G. M., et al. (1998), p.84.
^{[9]} See: Reed, N. (1997), p. 23.
^{[10]} See: Jorion, P. (2001), p. 3; Joint Standards Australia/Standards New Zealand Committee OB/7 on Risk Management (1999), pp. 7-20.
^{[11]} See: Appendix D for information on categories of operational risk.
^{[12]} Those risks can only be avoided by an institution by not performing the operations, which would cause the risk exposure.
^{[13]} See: Jorion, P. (2001), p. 4.
^{[14]} See: Basel Committee on Banking Supervision (2001), pp. 7-8.
^{[15]} See: Jorion, P. (2001), pp. 4-20.
^{[16]} See: Basel Committee on Banking Supervision (2001), pp. 7-8.
^{[17]} See: Jorion, P. (2001), pp. 3-11; Hoffman, D. /Johnson, M. (1997), pp. 55-58.
^{[18]} See: Basel Committee on Banking Supervision (2001), pp. 7-8.
^{[19]} See: Reed, N. (1997), p. 23.
^{[20]} See: Linsmeier, T. J. /Pearson, N. D. (2000), p. 48.
^{[21]} See: Jorion, P. (2001), p. 22.
^{[22]} See: Reed, N. (1997), p. 24.
^{[23]} See: Linsmeier, T. J. /Pearson, N. D. (2000), pp. 48-49; Duffie, D. /Pan, J. (1997), p. 3; Dowd, K., et al. (2004), pp. 52-55.
^{[24]} See: Culp, C. L., et al. (1998), p. 3; Duffie, D. /Pan, J. (1997), p. 3.
^{[25]} See: Smithson, C. /Minton, L. (1997), p. 27.
^{[26]} See: Linsmeier, T. J. /Pearson, N. D. (2000), p. 49.
^{[27]} See: Culp, C. L., et al. (1998), p. 5.
^{[28]} See: Reed, N. (1997), p. 24.
^{[29]} See: Hendricks, D. (1996), pp. 39-40.
^{[30]} See: Culp, C. L., et al. (1998), p. 6.
^{[31]} See: Leong, K. (1997), pp. 44-45; Linsmeier, T. J. /Pearson, N. D. (2000), p. 50.
^{[32]} See: Smithson, C. /Minton, L. (1997), p. 28.
^{[33]} Assuming the sampling interval is one day; otherwise, the interval needs to be adapted accordingly.
^{[34]} See: Linsmeier, T. J. /Pearson, N. D. (2000), p. 50; Hendricks, D. (1996), pp. 43-45.
^{[35]} As the calculation of this high number of paths is very time-consuming and needs immense computational power, techniques were developed to reduce that cost; for a description see: Glasserman, P., et al. (2000).
^{[36]} See: Leong, K. (1997), Linsmeier, T. J. /Pearson, N. D. (2000), pp. 56-57.
^{[37]} See: Simons, K. (1996), p. 133; De Brouwer, P. (2001), p.319.
^{[38]} See: Linsmeier, T. J. /Pearson, N. D. (2000), pp. 56-57.
^{[39]} See: Ibid., p. 57.
^{[40]} See: De Brouwer, P. (2001), pp. 310-311.
^{[41]} See: Linsmeier, T. J. /Pearson, N. D. (2000), p. 53; Reed, N. (1997), p. 24.
^{[42]} See: Linsmeier, T. J. /Pearson, N. D. (2000), p. 54.