Author

Jean-Marc Eber

Bio: Jean-Marc Eber is an academic researcher. The author has contributed to research in topics: Coherent risk measure & Dynamic risk measure. The author has an h-index of 4 and has co-authored 4 publications receiving 8,519 citations.

Papers
Journal ArticleDOI
TL;DR: In this paper, the authors present and justify a set of four desirable properties for measures of risk, call the measures satisfying these properties "coherent", and demonstrate the universality of scenario-based methods for providing coherent measures.
Abstract: In this paper we study both market risks and nonmarket risks, without complete markets assumption, and discuss methods of measurement of these risks. We present and justify a set of four desirable properties for measures of risk, and call the measures satisfying these properties “coherent.” We examine the measures of risk provided and the related actions required by SPAN, by the SEC/NASD rules, and by quantile-based methods. We demonstrate the universality of scenario-based methods for providing coherent measures. We offer suggestions concerning the SEC method. We also suggest a method to repair the failure of subadditivity of quantile-based methods.
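As an illustration of the scenario-based construction the abstract refers to, the following minimal sketch (the state space, payoffs and test probabilities are invented for illustration, not taken from the paper) computes a coherent risk measure as the negative of the worst expected future net worth over a set of test probabilities:

```python
import numpy as np

# Minimal sketch of a scenario-based coherent risk measure: the risk of a
# position is minus the worst expected future net worth over a set of
# "generalized scenarios" (test probabilities) on a finite state space.
# All numbers below are illustrative, not from the paper.

def coherent_risk(payoff, test_probabilities):
    """rho(X) = -min over P of E_P[X]; rho(X) <= 0 means X is acceptable."""
    worst_expected_value = min(p @ payoff for p in test_probabilities)
    return -worst_expected_value

# Three states of nature; future net worth of the position in each state.
payoff = np.array([10.0, 2.0, -8.0])

# Two test probabilities (each a distribution over the three states).
scenarios = [np.array([0.5, 0.3, 0.2]),
             np.array([0.2, 0.3, 0.5])]

rho = coherent_risk(payoff, scenarios)
print(rho)  # positive: extra capital needed to make the position acceptable
```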

8,651 citations

Journal ArticleDOI
TL;DR: In this article, a time-0 coherent risk measure is defined for value processes, and two further constructions of risk measurement processes are given in terms of sets of test probabilities; the latter constructions coincide and relate to the former when the sets fulfill a stability condition also met in the multiperiod treatment of ambiguity in decision-making.
Abstract: Starting with a time-0 coherent risk measure defined for “value processes”, we also define risk measurement processes. Two other constructions of measurement processes are given in terms of sets of test probabilities. These latter constructions are identical and are related to the former construction when the sets fulfill a stability condition also met in multiperiod treatment of ambiguity as in decision-making. We finally deduce risk measurements for the final value of locked-in positions and repeat a warning concerning Tail-Value-at-Risk.

450 citations

01 Jan 2002
TL;DR: In this paper, a multiperiod extension of Tail VaR is presented, which takes into account intermediate monitoring of a locked-in position by supervisors or shareholders, the possibility of intermediate actions, and the availability of extraneous cash flows and of possible capital in- or outflows, all of which require handling sequences of unknown future values.
Abstract: We explain why and how to deal with the definition, acceptability, computation and management of risk in a genuinely multitemporal way. Coherence axioms provide a representation of a risk-adjusted valuation. Some special cases of practical interest allowing for easy recursive computations are presented. The multiperiod extension of Tail VaR is discussed.

1. NEW QUESTIONS WITH MULTIPERIOD RISK

Risk evolving over several periods of uncertainty differs from one-period risk in many ways. An analysis of multiperiod risk requires consideration of new issues: the availability of information may require taking into account intermediate monitoring of a locked-in position by supervisors or shareholders, as well as the possibility of intermediate actions; and the availability of extraneous cash flows and of possible capital in- or outflows requires handling sequences of unknown future "values".

A portfolio or a strategy built over several periods should be analyzed with respect to these issues. We attempt to:
- distinguish models of future worth at the end of a holding period from models in which successive values or cash flows are examined and are subject to some investment/financing strategy;
- give some information about the necessity and/or availability of remedial funding at some intermediate date, either in the case of sudden loss or in the case of insolvency of the firm, as urged for example in [Be] (notice that one-period models considered neither the source of (extra-) capital at the beginning of the holding period nor the actual consequences of a "bad event" at the end of the same period);
- take into account the actual time evolution of risk and of available capital;
- study whether a relevant risk-adjusted measurement should consider more than the distribution of final net worth of a strategy, to decide upon its acceptability at the initial date;
- distinguish between the opinion of a risk manager on some strategy and the attitude of a supervisor/regulator who, at any date, considers only the current portfolio, refusing to take into account future possible changes in the composition of the portfolio (see the example in Section 8).

Remark. With one period of uncertainty, capital appeared both as a buffer at the initial date and as wealth at the final date. Intermediate dates raise the question of the nature of capital (valued in a market or accounting way) at such dates.

2. REVIEW OF ONE-PERIOD COHERENT ACCEPTABILITY

The theory of coherent one-period risk-adjusted values is best approached (see [ADEH1], p. 69, [ADEH2], Section 2.2, as well as [He]) by taking the primitive object to be an "acceptance set", that is, a set of acceptable future net worths, also called simply "values". This set is supposed to satisfy some "coherence" requirements. Assuming here (as well as in the following sections) a zero interest rate for simplicity, the representation result states the following: for any acceptance set, there exists a set P of probability distributions (called generalised scenarios or test probabilities) on the space Ω of states of nature, such that a given position, with future (random) value denoted by X, is acceptable if and only if, for each test probability P ∈ P, the expected value of the future net worth under P, i.e. E_P[X], is non-negative.

The risk-adjusted value π(X) of a future net worth X is defined as follows: compute, under each test probability P ∈ P, the average of the future net worth X of the position, in formula E_P[X], then take the minimum of all the numbers so found, which corresponds to the formula π(X) = inf_{P ∈ P} E_P[X]. The axioms of coherent risk measures, well known by now (see [ADEH1]), translate for coherent risk-adjusted values into:
- monotonicity: if X ≥ Y then π(X) ≥ π(Y);
- translation invariance: if a is a constant then π(a·1 + X) = a + π(X);
- positive homogeneity: if λ ≥ 0 then π(λ·X) = λ·π(X);
- superadditivity: π(X + Y) ≥ π(X) + π(Y).

Remark. The risk measure ρ(X) for X studied in [ADEH1] and [ADEH2] is simply the negative of the risk-adjusted value π(X) for X. The change of sign will simplify the treatment of measures of successive risks.

3. COHERENT MULTIPERIOD RISK-ADJUSTED VALUE

The case of T periods of uncertainty will be described here in the language of trees. As noted already by one of the authors, trees allow for some things "more easily done than said", and we first need to define a few terms. We represent the availability of information over time by the set Ω of "states of nature" at date T and, for each date t = 0, ..., T, the partition N_t of Ω consisting of the set of smallest events which by date t are declared to obtain or not. These events are "tagged" by the date t and are called the nodes of the tree at date t. We use for such a node n the notation (n, t(n)) or n × {t(n)}. The partition N_{t+1} is a refinement of the partition N_t, and this provides the ancestorship relation of (m, t) to (n, t+1) by means of the inclusion n ⊂ m.

For example, the "three-period (four-date) binomial tree" can be described in two ways (see Figure 1) by Ω = N_3 = {[uuu], [uud], [udu], [udd], [duu], [dud], [ddu], [ddd]}, N_2 = {[uu], [ud], [du], [dd]}, N_1 = {[u], [d]}, N_0 = {[]}. The ancestorship relation amounts to suppressing the right-hand letter in each word based on u and d, and the tagging amounts to counting the number of letters within the brackets. From now on we shall most of the time neglect to write the brackets [ and ].

[Figure 1: the three-period binomial tree in its two descriptions, with nodes such as {ω1, ..., ω8} × {0}, {ω1, ω2, ω3, ω4} × {1}, {ω1, ω2} × {2} and {ω1} × {3}.]

(This research has benefited from support by PriceWaterhouseCoopers and by RiskLab, ETH Zurich.)
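The recursive computations the abstract mentions can be illustrated on a binomial tree of the kind just defined. The sketch below is a minimal illustration, not the paper's own code: it assumes a two-period tree, a single set of candidate up-move probabilities reused at every node (a product structure of the sort that yields the stability condition), and invented final net worths.

```python
# Minimal sketch (not the paper's code) of a backward recursive computation
# of a multiperiod risk-adjusted value on a binomial tree: at each node take
# the worst one-step expectation over a set of test probabilities for the
# up-move. All numbers are illustrative.

# Final net worths at the four date-2 nodes uu, ud, du, dd.
final_values = {"uu": 9.0, "ud": 1.0, "du": 2.0, "dd": -6.0}

# Candidate probabilities of an up-move, the same set at every node.
up_probs = [0.4, 0.6]

def node_value(prefix, depth, horizon):
    """Backward recursion: pi(n) = min over p of p*pi(nu) + (1-p)*pi(nd)."""
    if depth == horizon:
        return final_values[prefix]
    up = node_value(prefix + "u", depth + 1, horizon)
    down = node_value(prefix + "d", depth + 1, horizon)
    return min(p * up + (1 - p) * down for p in up_probs)

print(node_value("", 0, 2))  # time-0 risk-adjusted value pi(X)
```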

91 citations


Cited by
Journal ArticleDOI
TL;DR: In this paper, a new approach to optimizing or hedging a portfolio of financial instruments to reduce risk is presented and tested on applications; it focuses on minimizing Conditional Value-at-Risk (CVaR) rather than Value-at-Risk (VaR), but portfolios with low CVaR necessarily have low VaR as well.
Abstract: A new approach to optimizing or hedging a portfolio of financial instruments to reduce risk is presented and tested on applications. It focuses on minimizing Conditional Value-at-Risk (CVaR) rather than minimizing Value-at-Risk (VaR), but portfolios with low CVaR necessarily have low VaR as well. CVaR, also called Mean Excess Loss, Mean Shortfall, or Tail VaR, is anyway considered to be a more consistent measure of risk than VaR. Central to the new approach is a technique for portfolio optimization which calculates VaR and optimizes CVaR simultaneously. This technique is suitable for use by investment companies, brokerage firms, mutual funds, and any business that evaluates risks. It can be combined with analytical or scenario-based methods to optimize portfolios with large numbers of instruments, in which case the calculations often come down to linear programming or nonsmooth programming. The methodology can be applied also to the optimization of percentiles in contexts outside of finance.
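The key fact behind the approach described above can be checked numerically. The following sketch uses synthetic normally distributed losses (an assumption for illustration only): the beta-quantile of loss gives VaR, and evaluating the Rockafellar-Uryasev auxiliary function at that point gives CVaR, the quantity the paper's optimization technique minimizes.

```python
import numpy as np

# Sketch of the sample-based CVaR calculation: for losses L_1..L_N and level
# beta, VaR is the beta-quantile of loss, and CVaR is obtained by minimizing
#   F(a) = a + mean(max(L - a, 0)) / (1 - beta)
# over a; the minimizer is a VaR and the minimum is CVaR. Data is synthetic.

rng = np.random.default_rng(0)
losses = rng.normal(loc=0.0, scale=1.0, size=100_000)
beta = 0.95

var = np.quantile(losses, beta)                                 # Value-at-Risk
cvar = var + np.mean(np.maximum(losses - var, 0)) / (1 - beta)  # CVaR

print(var, cvar)  # CVaR >= VaR always holds
```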

5,622 citations

Journal ArticleDOI
TL;DR: Fundamental properties of conditional value-at-risk (CVaR) are derived for loss distributions in finance that can involve discreteness; CVaR provides optimization shortcuts which, through linear programming techniques, make practical many large-scale calculations that could otherwise be out of reach.
Abstract: Fundamental properties of conditional value-at-risk (CVaR), as a measure of risk with significant advantages over value-at-risk (VaR), are derived for loss distributions in finance that can involve discreteness. Such distributions are of particular importance in applications because of the prevalence of models based on scenarios and finite sampling. CVaR is able to quantify dangers beyond VaR and moreover it is coherent. It provides optimization shortcuts which, through linear programming techniques, make practical many large-scale calculations that could otherwise be out of reach. The numerical efficiency and stability of such calculations, shown in several case studies, are illustrated further with an example of index tracking.
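The linear-programming shortcut mentioned in the abstract can be sketched as follows. This is an illustration on assumed synthetic data, not the paper's case studies: minimizing portfolio CVaR over N return scenarios becomes a linear program in the weights w, an auxiliary variable a (the VaR candidate), and per-scenario slacks u_i.

```python
import numpy as np
from scipy.optimize import linprog

# Sketch of CVaR minimization as a linear program for a long-only portfolio.
# Objective: a + (1/((1-beta)*N)) * sum_i u_i, with u_i >= loss_i - a, u_i >= 0.
# Scenario returns are synthetic, purely for illustration.

rng = np.random.default_rng(1)
n_assets, n_scen, beta = 3, 500, 0.95
returns = rng.normal(0.001, 0.02, size=(n_scen, n_assets))  # scenario returns

# Decision vector: [w_1..w_n, a, u_1..u_N].
c = np.concatenate([np.zeros(n_assets), [1.0],
                    np.full(n_scen, 1.0 / ((1 - beta) * n_scen))])

# Constraints u_i >= loss_i - a with loss_i = -r_i.w, i.e. -r_i.w - a - u_i <= 0.
A_ub = np.hstack([-returns, -np.ones((n_scen, 1)), -np.eye(n_scen)])
b_ub = np.zeros(n_scen)

# Fully invested: weights sum to 1.
A_eq = np.concatenate([np.ones(n_assets), np.zeros(n_scen + 1)])[None, :]
b_eq = np.array([1.0])

bounds = [(0, None)] * n_assets + [(None, None)] + [(0, None)] * n_scen
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
weights, cvar = res.x[:n_assets], res.fun
print(weights, cvar)  # minimum-CVaR weights and the attained CVaR
```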

3,010 citations

Posted Content
TL;DR: In this article, the authors provide a general framework for integration of high-frequency intraday data into the measurement, modeling and forecasting of daily and lower frequency volatility and return distributions.
Abstract: This paper provides a general framework for integration of high-frequency intraday data into the measurement, modeling and forecasting of daily and lower frequency volatility and return distributions. Most procedures for modeling and forecasting financial asset return volatilities, correlations, and distributions rely on restrictive and complicated parametric multivariate ARCH or stochastic volatility models, which often perform poorly at intraday frequencies. Use of realized volatility constructed from high-frequency intraday returns, in contrast, permits the use of traditional time series procedures for modeling and forecasting. Building on the theory of continuous-time arbitrage-free price processes and the theory of quadratic variation, we formally develop the links between the conditional covariance matrix and the concept of realized volatility. Next, using continuously recorded observations for the Deutschemark/Dollar and Yen/Dollar spot exchange rates covering more than a decade, we find that forecasts from a simple long-memory Gaussian vector autoregression for the logarithmic daily realized volatilities perform admirably compared to popular daily ARCH and related models. Moreover, the vector autoregressive volatility forecast, coupled with a parametric lognormal-normal mixture distribution implied by the theoretically and empirically grounded assumption of normally distributed standardized returns, gives rise to well-calibrated density forecasts of future returns, and correspondingly accurate quantile estimates. Our results hold promise for practical modeling and forecasting of the large covariance matrices relevant in asset pricing, asset allocation and financial risk management applications.
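The central construction, measuring daily volatility model-free from intraday returns, is simple to state in code. The sketch below uses a simulated 5-minute log-price series, purely for illustration (the paper itself works with Deutschemark/Dollar and Yen/Dollar quotes).

```python
import numpy as np

# Minimal sketch of the realized-volatility construction: daily realized
# variance is the sum of squared intraday returns, so no parametric ARCH or
# stochastic-volatility model is needed to measure volatility. The 5-minute
# log-price series below is simulated, purely for illustration.

rng = np.random.default_rng(2)
n_days, bars_per_day = 20, 288            # 288 five-minute bars per 24h day
log_prices = np.cumsum(rng.normal(0.0, 0.0005, size=n_days * bars_per_day))
log_prices = log_prices.reshape(n_days, bars_per_day)

intraday_returns = np.diff(log_prices, axis=1)        # 5-minute log returns
realized_variance = np.sum(intraday_returns**2, axis=1)
realized_volatility = np.sqrt(realized_variance)      # one value per day

print(realized_volatility[:5])  # a daily series usable in standard time-series models
```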

2,898 citations

Journal ArticleDOI
TL;DR: In this article, the authors provide a general framework for integration of high-frequency intraday data into the measurement, modeling, and forecasting of daily and lower frequency volatility and return distributions.
Abstract: This paper provides a general framework for integration of high-frequency intraday data into the measurement, modeling, and forecasting of daily and lower frequency volatility and return distributions. Most procedures for modeling and forecasting financial asset return volatilities, correlations, and distributions rely on restrictive and complicated parametric multivariate ARCH or stochastic volatility models, which often perform poorly at intraday frequencies. Use of realized volatility constructed from high-frequency intraday returns, in contrast, permits the use of traditional time series procedures for modeling and forecasting. Building on the theory of continuous-time arbitrage-free price processes and the theory of quadratic variation, we formally develop the links between the conditional covariance matrix and the concept of realized volatility. Next, using continuously recorded observations for the Deutschemark/Dollar and Yen/Dollar spot exchange rates covering more than a decade, we find that forecasts from a simple long-memory Gaussian vector autoregression for the logarithmic daily realized volatilities perform admirably compared to popular daily ARCH and related models. Moreover, the vector autoregressive volatility forecast, coupled with a parametric lognormal-normal mixture distribution implied by the theoretically and empirically grounded assumption of normally distributed standardized returns, gives rise to well-calibrated density forecasts of future returns, and correspondingly accurate quantile estimates. Our results hold promise for practical modeling and forecasting of the large covariance matrices relevant in asset pricing, asset allocation and financial risk management applications.

2,823 citations

Book
16 Oct 2005
TL;DR: The most comprehensive treatment of the theoretical concepts and modelling techniques of quantitative risk management can be found in this book, where the authors describe the latest advances in the field, including market, credit and operational risk modelling.
Abstract: This book provides the most comprehensive treatment of the theoretical concepts and modelling techniques of quantitative risk management. Whether you are a financial risk analyst, actuary, regulator or student of quantitative finance, Quantitative Risk Management gives you the practical tools you need to solve real-world problems. Describing the latest advances in the field, Quantitative Risk Management covers the methods for market, credit and operational risk modelling. It places standard industry approaches on a more formal footing and explores key concepts such as loss distributions, risk measures and risk aggregation and allocation principles. The book's methodology draws on diverse quantitative disciplines, from mathematical finance and statistics to econometrics and actuarial mathematics. A primary theme throughout is the need to satisfactorily address extreme outcomes and the dependence of key risk drivers. Proven in the classroom, the book also covers advanced topics like credit derivatives.
- Fully revised and expanded to reflect developments in the field since the financial crisis
- Features shorter chapters to facilitate teaching and learning
- Provides enhanced coverage of Solvency II and insurance risk management and extended treatment of credit risk, including counterparty credit risk and CDO pricing
- Includes a new chapter on market risk and new material on risk measures and risk aggregation

2,580 citations