Author

Michael B. Gordy

Bio: Michael B. Gordy is an academic researcher from the Federal Reserve System. The author has contributed to research in the topics of credit risk and common value auctions. The author has an h-index of 20 and has co-authored 46 publications receiving 2,931 citations.

Papers
Journal ArticleDOI
TL;DR: This paper showed that ratings-based capital rules, including both the current Basel Accord and its proposed revision, can be reconciled with the general class of credit value-at-risk models.

728 citations

Journal ArticleDOI
TL;DR: In this article, a comparative anatomy of two especially influential benchmarks for credit risk models, the RiskMetrics Group's CreditMetrics and Credit Suisse Financial Products' CreditRisk+, is presented.
Abstract: Within the past two years, important advances have been made in modeling credit risk at the portfolio level. Practitioners and policy makers have invested in implementing and exploring a variety of new models individually. Less progress has been made, however, with comparative analyses. Direct comparison often is not straightforward, because the different models may be presented within rather different mathematical frameworks. This paper offers a comparative anatomy of two especially influential benchmarks for credit risk models, the RiskMetrics Group's CreditMetrics and Credit Suisse Financial Products' CreditRisk+. We show that, despite differences on the surface, the underlying mathematical structures are similar. The structural parallels provide intuition for the relationship between the two models and allow us to describe quite precisely where the models differ in functional form, distributional assumptions, and reliance on approximation formulae. We then design simulation exercises which evaluate the effect of each of these differences individually. © 2000 Elsevier Science B.V. All rights reserved.
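To make the structural parallel concrete, the sketch below simulates portfolio default losses under a CreditMetrics-style one-factor threshold model and a CreditRisk+-style Poisson model with a gamma-distributed systematic factor. It is a minimal illustration, not the paper's own simulation design; the portfolio size, default probability, asset correlation and factor variance are invented for the example.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Hypothetical homogeneous portfolio: these parameters are illustrative only.
n_obligors = 1000
pd_uncond = 0.01          # unconditional default probability
exposure = 1.0            # unit exposure, zero recovery for simplicity
n_trials = 50_000

# --- CreditMetrics-style one-factor threshold (default-mode) model ---
# Obligor i defaults when sqrt(rho)*Z + sqrt(1-rho)*eps_i < Phi^{-1}(PD).
rho = 0.15                                  # assumed asset correlation
threshold = norm.ppf(pd_uncond)
Z = rng.standard_normal(n_trials)           # systematic factor draws
# conditional default probability given Z (conditional independence)
p_cond = norm.cdf((threshold - np.sqrt(rho) * Z) / np.sqrt(1 - rho))
losses_cm = rng.binomial(n_obligors, p_cond) * exposure

# --- CreditRisk+-style Poisson mixture with a gamma-distributed factor ---
# Conditional on the factor S (mean 1), defaults are Poisson with intensity
# n * PD * S; the gamma mixing plays the same role as the systematic factor
# in the threshold model.
sigma2 = 1.0                                # assumed factor variance
S = rng.gamma(shape=1.0 / sigma2, scale=sigma2, size=n_trials)
losses_crp = rng.poisson(n_obligors * pd_uncond * S) * exposure

for name, losses in [("CreditMetrics-style", losses_cm),
                     ("CreditRisk+-style", losses_crp)]:
    print(f"{name}: mean={losses.mean():.1f}, "
          f"99.5% VaR={np.quantile(losses, 0.995):.0f}")
```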

639 citations

Journal ArticleDOI
TL;DR: The authors showed that the marginal impact of introducing Basel II depends strongly on the extent to which market discipline leads banks to vary lending standards procyclically in the absence of binding regulation.

460 citations

Journal ArticleDOI
TL;DR: In this paper, the authors show that a relatively small number of trials in the inner step can yield accurate estimates, and they analyze how a fixed computational budget may be allocated to the inner and the outer step to minimize the mean square error of the resultant estimator.
Abstract: Risk measurement for derivative portfolios almost invariably calls for nested simulation. In the outer step, one draws realizations of all risk factors up to the horizon, and in the inner step, one reprices each instrument in the portfolio at the horizon conditional on the drawn risk factors. Practitioners may perceive the computational burden of such nested schemes to be unacceptable and adopt a variety of second-best pricing techniques to avoid the inner simulation. In this paper, we question whether such short cuts are necessary. We show that a relatively small number of trials in the inner step can yield accurate estimates, and we analyze how a fixed computational budget may be allocated to the inner and the outer step to minimize the mean square error of the resultant estimator. Finally, we introduce a jackknife procedure for bias reduction.
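The following is a minimal sketch of such a nested scheme for a single toy position (a call option under lognormal dynamics). The horizon model, strike, volatility and trial counts are all illustrative assumptions, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_risk_factor(n_outer):
    """Outer step: draw the risk factor at the horizon (toy lognormal model)."""
    return 100.0 * np.exp(-0.5 * 0.04 + 0.2 * rng.standard_normal(n_outer))

def inner_repricing(s_horizon, n_inner):
    """Inner step: Monte Carlo re-pricing of a call option conditional on the
    drawn horizon value of the underlying (toy dynamics, r = 0, sigma = 20%)."""
    z = rng.standard_normal(n_inner)
    s_T = s_horizon * np.exp(-0.5 * 0.04 + 0.2 * z)   # one year to maturity
    payoff = np.maximum(s_T - 100.0, 0.0)             # strike = 100 (illustrative)
    return payoff.mean()                              # conditional price estimate

def nested_loss_distribution(n_outer=10_000, n_inner=32):
    """Estimate the horizon loss distribution of a long option position."""
    price_today = inner_repricing(100.0, 200_000)     # baseline price
    s_h = simulate_risk_factor(n_outer)
    # a relatively small n_inner already gives usable loss estimates
    horizon_prices = np.array([inner_repricing(s, n_inner) for s in s_h])
    return price_today - horizon_prices               # losses at the horizon

losses = nested_loss_distribution()
print("99% VaR estimate:", np.quantile(losses, 0.99))
```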

134 citations

Journal ArticleDOI
TL;DR: The authors show that a relatively small number of trials in the inner step can yield accurate estimates, and analyze how a fixed computational budget may be allocated between the inner and outer steps to minimize the mean square error of the resultant estimator.
Abstract: Risk measurement for derivative portfolios almost invariably calls for nested simulation. In the outer step one draws realizations of all risk factors up to the horizon, and in the inner step one re-prices each instrument in the portfolio at the horizon conditional on the drawn risk factors. Practitioners may perceive the computational burden of such nested schemes to be unacceptable, and adopt a variety of second-best pricing techniques to avoid the inner simulation. In this paper, we question whether such short cuts are necessary. We show that a relatively small number of trials in the inner step can yield accurate estimates, and analyze how a fixed computational budget may be allocated to the inner and the outer step to minimize the mean square error of the resultant estimator. Finally, we introduce a jackknife procedure for bias reduction and a dynamic allocation scheme for improved efficiency.
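Both versions of the abstract mention a jackknife procedure for bias reduction. The sketch below shows the generic delete-one-section jackknife applied to a deliberately simple biased plug-in estimator (the exponential of a sample mean); it illustrates how the leading O(1/m) bias term cancels, and is a stand-in rather than a reproduction of the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(2)

def plug_in_estimate(inner_samples):
    """Toy nonlinear functional: estimate exp(E[X]) by exp(sample mean).
    By Jensen's inequality the plug-in estimator is biased upward, with a
    leading bias term of order 1/m in the number m of inner trials."""
    return np.exp(inner_samples.mean())

def jackknife_estimate(inner_samples, n_sections=4):
    """Delete-one-section jackknife: recompute the plug-in estimate with one
    section of the inner sample left out at a time, then combine so that the
    leading O(1/m) bias term cancels (generic textbook construction, assumed
    here for illustration)."""
    full = plug_in_estimate(inner_samples)
    sections = np.array_split(inner_samples, n_sections)
    loo = [plug_in_estimate(np.concatenate(
               [s for j, s in enumerate(sections) if j != k]))
           for k in range(n_sections)]
    return n_sections * full - (n_sections - 1) * np.mean(loo)

true_value = 1.0                     # exp(E[X]) with E[X] = 0
plug, jack = [], []
for _ in range(50_000):
    x = rng.standard_normal(32)      # m = 32 inner trials
    plug.append(plug_in_estimate(x))
    jack.append(jackknife_estimate(x))
print("plug-in bias:  ", np.mean(plug) - true_value)
print("jackknife bias:", np.mean(jack) - true_value)
```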

123 citations


Cited by
Book
01 Jan 2001
TL;DR: This is the essential companion to Jeffrey Wooldridge's widely-used graduate text Econometric Analysis of Cross Section and Panel Data (MIT Press, 2001).
Abstract: The second edition of this acclaimed graduate text provides a unified treatment of two methods used in contemporary econometric research, cross section and panel data methods. By focusing on assumptions that can be given behavioral content, the book maintains an appropriate level of rigor while emphasizing intuitive thinking. The analysis covers both linear and nonlinear models, including models with dynamics and/or individual heterogeneity. In addition to general estimation frameworks (in particular, methods of moments and maximum likelihood), specific linear and nonlinear methods are covered in detail, including probit and logit models and their multivariate extensions, Tobit models, models for count data, censored and missing data schemes, causal (or treatment) effects, and duration analysis. Econometric Analysis of Cross Section and Panel Data was the first graduate econometrics text to focus on microeconomic data structures, allowing assumptions to be separated into population and sampling assumptions. This second edition has been substantially updated and revised. Improvements include a broader class of models for missing data problems; more detailed treatment of cluster problems, an important topic for empirical researchers; expanded discussion of "generalized instrumental variables" (GIV) estimation; new coverage (based on the author's own recent research) of inverse probability weighting; a more complete framework for estimating treatment effects with panel data; and a firmly established link between econometric approaches to nonlinear panel data and the "generalized estimating equation" literature popular in statistics and other fields. New attention is given to explaining when particular econometric methods can be applied; the goal is not only to tell readers what does work, but why certain "obvious" procedures do not. The numerous included exercises, both theoretical and computer-based, allow the reader to extend methods covered in the text and discover new insights.

28,298 citations

Book
16 Oct 2005
TL;DR: The most comprehensive treatment of the theoretical concepts and modelling techniques of quantitative risk management can be found in this book, which describes the latest advances in the field, including market, credit and operational risk modelling.
Abstract: This book provides the most comprehensive treatment of the theoretical concepts and modelling techniques of quantitative risk management. Whether you are a financial risk analyst, actuary, regulator or student of quantitative finance, Quantitative Risk Management gives you the practical tools you need to solve real-world problems. Describing the latest advances in the field, Quantitative Risk Management covers the methods for market, credit and operational risk modelling. It places standard industry approaches on a more formal footing and explores key concepts such as loss distributions, risk measures and risk aggregation and allocation principles. The book's methodology draws on diverse quantitative disciplines, from mathematical finance and statistics to econometrics and actuarial mathematics. A primary theme throughout is the need to satisfactorily address extreme outcomes and the dependence of key risk drivers. Proven in the classroom, the book also covers advanced topics like credit derivatives. Fully revised and expanded to reflect developments in the field since the financial crisis, this edition features shorter chapters to facilitate teaching and learning, provides enhanced coverage of Solvency II and insurance risk management, offers an extended treatment of credit risk (including counterparty credit risk and CDO pricing), and includes a new chapter on market risk along with new material on risk measures and risk aggregation.

2,580 citations

Book ChapterDOI
01 Jan 1998

1,532 citations

Journal ArticleDOI
TL;DR: In this paper, the authors argue that insufficient attention has so far been paid to the link between monetary policy and the perception and pricing of risk by economic agents - what might be termed the "risk-taking channel" of monetary policy.
Abstract: Few areas of monetary economics have been studied as extensively as the transmission mechanism. The literature on this topic has evolved substantially over the years, following the waxing and waning of conceptual frameworks and the changing characteristics of the financial system. In this paper, taking as a starting point a brief overview of the extant work on the interaction between capital regulation, the business cycle and the transmission mechanism, we offer some broader reflections on the characteristics of the transmission mechanism in light of the evolution of the financial system. We argue that insufficient attention has so far been paid to the link between monetary policy and the perception and pricing of risk by economic agents - what might be termed the "risk-taking channel" of monetary policy. We develop the concept, compare it with current views of the transmission mechanism, explore its mutually reinforcing link with "liquidity" and analyse its interaction with monetary policy reaction functions. We argue that changes in the financial system and prudential regulation may have increased the importance of the risk-taking channel and that prevailing macroeconomic paradigms and associated models are not well suited to capturing it, thereby also reducing their effectiveness as guides to monetary policy.

1,365 citations

Journal ArticleDOI
TL;DR: In this article, the authors proposed a new approach to sparsity called the horseshoe estimator, which is a member of the same family of multivariate scale mixtures of normals.
Abstract: This paper proposes a new approach to sparsity called the horseshoe estimator. The horseshoe is a close cousin of other widely used Bayes rules arising from, for example, double-exponential and Cauchy priors, in that it is a member of the same family of multivariate scale mixtures of normals. But the horseshoe enjoys a number of advantages over existing approaches, including its robustness, its adaptivity to different sparsity patterns, and its analytical tractability. We prove two theorems that formally characterize both the horseshoe's adeptness at handling large outlying signals, and its super-efficient rate of convergence to the correct estimate of the sampling density in sparse situations. Finally, using a combination of real and simulated data, we show that the horseshoe estimator corresponds quite closely to the answers one would get by pursuing a full Bayesian model-averaging approach using a discrete mixture prior to model signals and noise.
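As a rough illustration of the scale-mixture representation described above, the sketch below draws from a horseshoe prior (half-Cauchy local scales) and from a double-exponential prior, and compares the mass each places near zero and far out in the tails. The global scale tau is fixed at 1 purely for convenience; in the full Bayesian treatment it would receive its own prior.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
tau = 1.0    # global shrinkage parameter, fixed here for illustration only

# Horseshoe prior as a scale mixture of normals:
#   lambda_i ~ half-Cauchy(0, 1),  beta_i | lambda_i ~ N(0, tau^2 * lambda_i^2)
lam = np.abs(rng.standard_cauchy(n))
beta_hs = tau * lam * rng.standard_normal(n)

# Double-exponential (Laplace) prior for comparison; it is also a scale
# mixture of normals, with exponential mixing on the variance.
beta_de = rng.laplace(scale=1.0, size=n)

for name, draws in [("horseshoe", beta_hs), ("double-exponential", beta_de)]:
    near_zero = np.mean(np.abs(draws) < 0.1)     # spike near the origin
    heavy_tail = np.mean(np.abs(draws) > 10.0)   # mass far in the tails
    print(f"{name:>18}: P(|b|<0.1) = {near_zero:.3f}, P(|b|>10) = {heavy_tail:.4f}")
```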

1,260 citations