
Showing papers in "Journal of Risk in 2000"


Journal ArticleDOI
TL;DR: In this paper, a new approach to optimizing or hedging a portfolio of financial instruments to reduce risk is presented and tested on applications. It focuses on minimizing Conditional Value-at-Risk (CVaR) rather than Value-at-Risk (VaR), although portfolios with low CVaR necessarily have low VaR as well.
Abstract: A new approach to optimizing or hedging a portfolio of financial instruments to reduce risk is presented and tested on applications. It focuses on minimizing Conditional Value-at-Risk (CVaR) rather than minimizing Value-at-Risk (VaR), but portfolios with low CVaR necessarily have low VaR as well. CVaR, also called Mean Excess Loss, Mean Shortfall, or Tail VaR, is in any case considered to be a more consistent measure of risk than VaR. Central to the new approach is a technique for portfolio optimization which calculates VaR and optimizes CVaR simultaneously. This technique is suitable for use by investment companies, brokerage firms, mutual funds, and any business that evaluates risks. It can be combined with analytical or scenario-based methods to optimize portfolios with large numbers of instruments, in which case the calculations often come down to linear programming or nonsmooth programming. The methodology can also be applied to the optimization of percentiles in contexts outside of finance.
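In the scenario-based case the abstract mentions, the joint VaR/CVaR calculation reduces to a linear program. A minimal sketch of that formulation, assuming NumPy and SciPy are available (the function name and scenario data are illustrative):

```python
import numpy as np
from scipy.optimize import linprog

def min_cvar_portfolio(scenario_returns, beta=0.95):
    """Minimize portfolio CVaR at confidence level beta over long-only
    weights via the linear-programming formulation: the auxiliary
    variable alpha converges to VaR and the optimal objective to CVaR."""
    N, n = scenario_returns.shape
    k = 1.0 / ((1.0 - beta) * N)
    # decision vector z = [x (n weights), alpha (1), u (N excess losses)]
    c = np.concatenate([np.zeros(n), [1.0], np.full(N, k)])
    # constraints u_i >= -r_i.x - alpha  <=>  -r_i.x - alpha - u_i <= 0
    A_ub = np.hstack([-scenario_returns, -np.ones((N, 1)), -np.eye(N)])
    b_ub = np.zeros(N)
    A_eq = np.concatenate([np.ones(n), [0.0], np.zeros(N)])[None, :]  # sum(x) = 1
    b_eq = [1.0]
    bounds = [(0, None)] * n + [(None, None)] + [(0, None)] * N
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=bounds, method="highs")
    weights, var = res.x[:n], res.x[n]
    return weights, var, res.fun  # weights, VaR estimate, minimized CVaR
```

Because the excess-loss variables u are nonnegative, the minimized objective (CVaR) is always at least the fitted alpha (VaR), matching the abstract's observation that low CVaR implies low VaR.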

5,622 citations


Journal ArticleDOI
TL;DR: The problem of how to specify a correlation matrix arises in several important areas of finance and risk management. For applications where the most important desideratum is the recovery of the real-world correlation matrix, the problem is in principle well defined and readily solvable by means of well-established statistical techniques.
Abstract: The problem of how to specify a correlation matrix occurs in several important areas of finance and of risk management. A few of the important applications are, for instance, the specification of a (possibly time-dependent) instantaneous correlation matrix in the context of the BGM interest-rate option models, stress-testing and scenario analysis for market risk management purposes, or the specification of a correlation matrix amongst a large number of obligors for credit-derivative pricing or credit risk management. For those applications where the most important desideratum is the recovery of the real-world correlation matrix, the problem is in principle well defined and readily solvable by means of well-established statistical techniques. In practice, however, the estimation problems can be severe: a small number of outliers, for instance, can seriously “pollute” a sample; non-synchronous data can easily destroy or hide correlation patterns; and the discontinuities in the correlation surface amongst forward rates when moving from deposit rates to the futures strip, and from the latter to the swap market, are well known to practitioners. In all these cases, the user often resorts to best-fitting the “noisy” elements of the sample correlation matrix by means of a plausible parametric function. This is, for instance, the route taken by Rebonato [8] for his calibration of the BGM model.
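The best-fitting step the abstract describes can be sketched with a simple two-parameter exponential form, rho_ij = rho_inf + (1 - rho_inf) * exp(-beta * |T_i - T_j|), a commonly used parametric family for forward-rate correlations (not necessarily the exact form in the paper; all names and numbers are illustrative):

```python
import numpy as np
from scipy.optimize import least_squares

def fit_correlation_matrix(sample_corr, tenors):
    """Best-fit a noisy sample correlation matrix with the two-parameter
    form rho_ij = rho_inf + (1 - rho_inf) * exp(-beta * |T_i - T_j|)."""
    tenors = np.asarray(tenors, dtype=float)
    i, j = np.triu_indices(len(tenors), k=1)
    gaps = np.abs(tenors[i] - tenors[j])        # tenor gaps, upper triangle
    target = sample_corr[i, j]                  # noisy observed correlations

    def residuals(p):
        beta, rho_inf = p
        return rho_inf + (1.0 - rho_inf) * np.exp(-beta * gaps) - target

    sol = least_squares(residuals, x0=[0.1, 0.3],
                        bounds=([0.0, -0.99], [10.0, 0.99]))
    beta, rho_inf = sol.x
    all_gaps = np.abs(tenors[:, None] - tenors[None, :])
    fitted = rho_inf + (1.0 - rho_inf) * np.exp(-beta * all_gaps)
    return fitted, beta, rho_inf  # fitted matrix has unit diagonal by construction
```

The fitted matrix is symmetric with unit diagonal by construction, which is part of the appeal of fitting a parametric function rather than using the raw sample matrix.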

236 citations


Journal ArticleDOI
TL;DR: In this paper, the author holds the standard approach to stress-testing, in which stress scenarios are conducted outside the risk model and never assigned probabilities, up to a critical light, and proposes instead folding the stress tests into the risk model, thereby requiring all scenarios to be assigned probabilities.
Abstract: In recent months and years the idea of supplementing VaR estimates with "stress- testing" has been met with lavish praise and has worked its way into all sorts of regulatory documents. The call for more and better stress-testing has become a mantra for risk-managers and regulators. In the present paper, we hold the standard approach to stress-testing up to a critical light. The current practice is to stress-test outside the basic risk model. Such an approach yields two sets of forecasts -- one from the stress-tests and one from the basic model. The stress scenarios, conducted outside the model, are never explicitly assigned probabilities. As such, there is no guidance as to the importance or relevance of the results of stress-tests. Moreover, how to combine the two forecasts into a usable risk metric is not known. Instead, we suggest folding the stress-tests into the risk model, thereby requiring all scenarios to be assigned probabilities. Acknowledgements: I gratefully acknowledge helpful input from Jim O'Brien, Matt Pritsker, Pat Parkinson and Pat White. Any remaining errors and inaccuracies are mine. The opinions expressed do not necessarily represent those of the Federal Reserve Board or its staff.
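Folding a stress test into the risk model, as the paper proposes, means treating the stress scenario as one component of a mixture with an explicit probability, so that a single usable risk metric comes out. A toy sketch under that interpretation (all probabilities and loss figures are hypothetical):

```python
import numpy as np

def mixture_var(base_losses, stress_losses, p_stress, level=0.99, seed=0):
    """VaR of an explicit mixture model: each scenario draw comes from
    the stress-scenario loss distribution with probability p_stress and
    from the basic risk model otherwise."""
    rng = np.random.default_rng(seed)
    n = len(base_losses)
    from_stress = rng.random(n) < p_stress
    stress_draw = rng.choice(np.asarray(stress_losses, dtype=float), size=n)
    losses = np.where(from_stress, stress_draw, base_losses)
    return np.quantile(losses, level)
```

If the stress probability exceeds the tail probability of interest (e.g. 5% stress weight against a 99% VaR), the stress loss itself becomes the reported VaR; with zero stress weight, the metric collapses back to the basic model's VaR. This is exactly the guidance the outside-the-model approach lacks.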

135 citations



Journal ArticleDOI
TL;DR: In this article, an empirical analysis of electric power prices using data from twelve regional markets is presented, where the authors show that price behavior changes with each regional market so that a firm that seek to value or hedge power based contracts must use instruments from the region in which it operates.
Abstract: This paper contains an empirical analysis of electric power prices using data from twelve regional markets. A central feature of the paper is the explicit recognition that it is not possible to store power or carry a negative inventory. Our objective is to characterize and explain the high degree of autocorrelation and seasonality in power prices and to address salient issues that are pertinent for the valuation and hedging of power-based financial contracts. We show that price behavior varies across regional markets, so that a firm that seeks to value or hedge power-based contracts must use instruments from the region in which it operates.
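The stylized features the paper documents, strong autocorrelation and seasonality, can be reproduced in a minimal simulation combining an annual seasonal cycle with mean-reverting noise in log prices (the parameters and base price level here are illustrative, not the paper's estimates):

```python
import numpy as np

def simulate_power_prices(n_days=730, kappa=0.3, sigma=0.2, seed=0):
    """Simulate daily spot power prices as exp(seasonal + mean-reverting
    noise): a stylized model exhibiting the high autocorrelation and
    seasonality typical of (non-storable) power prices."""
    rng = np.random.default_rng(seed)
    t = np.arange(n_days)
    seasonal = 0.4 * np.sin(2.0 * np.pi * t / 365.0)   # annual cycle in log price
    x = np.zeros(n_days)                               # mean-reverting deviation
    for i in range(1, n_days):
        x[i] = (1.0 - kappa) * x[i - 1] + sigma * rng.standard_normal()
    return 30.0 * np.exp(seasonal + x)                 # hypothetical $30 base level
```

Because prices enter through an exponential, they stay positive, and the combination of slow seasonal drift and mean reversion yields the strongly autocorrelated paths observed in the regional data.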

70 citations





Journal ArticleDOI
TL;DR: In this article, the authors propose a new model for evaluating the credit risk of a portfolio of interest-rate-sensitive assets, distinguished from existing risk valuation models such as CreditMetrics™ or CreditRisk+ in that (1) the dynamics of the default-free interest rate, as well as the hazard-rate processes of defaultable assets, are described by stochastic differential equations; and (2) prices of individual assets are evaluated within a single risk-neutral valuation framework.
Abstract: This paper proposes a new model for evaluating the credit risk of a portfolio consisting of interest-rate-sensitive assets. Our model is distinguished from existing risk valuation models such as CreditMetrics™ or CreditRisk+ in that (1) the dynamics of the default-free interest rate as well as the hazard-rate processes of defaultable assets are described by stochastic differential equations; and (2) prices of individual assets are evaluated within a single risk-neutral valuation framework. It is then possible to evaluate not only the credit risk but also the market risk of the portfolio in a synthetic manner. It is shown that the value-at-risk (VaR) of the portfolio can be approximately evaluated as a closed-form solution.

37 citations


Journal ArticleDOI
TL;DR: In this paper, the authors show that the correlation between correlation and volatility can lead to a downward bias in the estimated value-at-risk, and propose a number of pragmatic approaches that risk managers might adopt for dealing with this issue.
Abstract: Many popular techniques for determining a securities firm's value-at-risk are based upon the calculation of the historical volatility of returns to the assets that comprise the portfolio and of the correlations between them. One such approach is the JP Morgan RiskMetrics methodology using Markowitz portfolio theory. An implicit assumption underlying this methodology is that the volatilities and correlations are constant throughout the sample period and, in particular, that they are not systematically related to one another. However, it has been suggested in a number of studies that the correlation between markets increases when the individual volatilities are high. This paper demonstrates that this type of relationship between correlation and volatility can lead to a downward bias in the estimated value-at-risk, and proposes a number of pragmatic approaches that risk managers might adopt for dealing with this issue.
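The downward bias the paper demonstrates can be illustrated with a toy two-regime simulation in which correlation rises with volatility: a VaR computed from the unconditional volatility under a normality assumption understates the true loss quantile (all regime parameters here are hypothetical):

```python
import numpy as np

def var_bias_demo(n=200_000, level=0.99, seed=0):
    """Two-regime market: calm (vol 1%, corr 0.2) vs turbulent (vol 3%,
    corr 0.9). Because correlation is high exactly when volatility is
    high, the constant-correlation normal VaR understates the empirical
    loss quantile of an equally weighted two-asset portfolio."""
    rng = np.random.default_rng(seed)

    def correlated(vol, corr, m):
        z = rng.standard_normal((m, 2))
        z[:, 1] = corr * z[:, 0] + np.sqrt(1.0 - corr**2) * z[:, 1]
        return vol * z

    calm = rng.random(n) < 0.8                  # 80% of days are calm
    r = np.empty((n, 2))
    r[calm] = correlated(0.01, 0.2, calm.sum())
    r[~calm] = correlated(0.03, 0.9, (~calm).sum())
    port = r.mean(axis=1)                       # equally weighted portfolio
    var_normal = 2.326 * port.std()             # normal VaR, unconditional vol
    var_true = -np.quantile(port, 1.0 - level)  # empirical 99% loss quantile
    return var_normal, var_true
```

With these illustrative numbers the normal approximation sits well below the empirical quantile, reproducing the bias the paper warns about.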


Journal ArticleDOI
TL;DR: The authors reevaluate the mathematical and economic meaning of no arbitrage in frictionless markets and show that no arbitrage alone is not equivalent to the existence of an equivalent martingale measure; rather, no arbitrage together with the absence of private monetary value components is.
Abstract: This paper reevaluates the mathematical and economic meaning of no arbitrage in frictionless markets. Contrary to the traditional view, no arbitrage is not generally equivalent to the existence of an equivalent martingale measure. Departures from this equivalence allow asset prices to contain a monetary component. The refined view is that no arbitrage and no private monetary value components are equivalent to the existence of an equivalent martingale measure. The implications of prices having a monetary value component for option pricing are discussed.


Journal ArticleDOI
TL;DR: Properties and characteristics of the cashflow-mapping approaches found in the literature are presented, and two new approaches are introduced, in order to study the quality of the cashflow maps used in the computation of value-at-risk (VaR).
Abstract: This article is devoted to the study of the cashflow maps used in the computation of value-at-risk (VaR). Properties and characteristics of the approaches found in the literature are presented and two new approaches are introduced. The goal of this paper is to study the quality of these maps. This is done by calculating the risk induced by the difference between the mapped cashflows and the original ones.
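A cashflow map of the kind studied here splits each cashflow's present value between the two adjacent vertices of a fixed maturity grid. The sketch below preserves total PV and PV-weighted maturity (a simple duration-preserving map; the variance-preserving RiskMetrics map differs slightly, and the function name is illustrative):

```python
import numpy as np

def map_cashflow(pv, t, vertices):
    """Split the present value of a single cashflow at maturity t between
    the two adjacent grid vertices so that total PV and PV-weighted
    maturity (duration) are preserved."""
    vertices = np.asarray(vertices, dtype=float)
    mapped = np.zeros(len(vertices))
    if t <= vertices[0]:
        mapped[0] = pv                          # before the grid: first vertex
    elif t >= vertices[-1]:
        mapped[-1] = pv                         # beyond the grid: last vertex
    else:
        j = np.searchsorted(vertices, t)        # vertices[j-1] < t <= vertices[j]
        w = (vertices[j] - t) / (vertices[j] - vertices[j - 1])
        mapped[j - 1] = w * pv
        mapped[j] = (1.0 - w) * pv
    return mapped
```

The "risk induced by the difference" studied in the paper is then the risk of the residual position: mapped cashflows minus the original one.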