
Showing papers by "Kenneth L. Judd published in 2006"


Posted Content
TL;DR: In this article, the authors examine optimal tax models where individuals are differentiated by two or more characteristics, study the numerical challenges posed by these problems, and show that the extra dimensionality produces substantively different results.
Abstract: Beginning with Mirrlees, the optimal taxation literature has generally focused on economies where individuals are differentiated by only their productivity. In this paper we examine models where individuals are differentiated by two or more characteristics, and the numerical challenges posed by these problems. We examine cases where individuals differ in productivity, elasticity of labor supply, and their "basic needs". We find that the extra dimensionality produces substantively different results. In particular, we find cases of negative marginal tax rates for some high-productivity taxpayers. In our examples, income becomes a fuzzy signal of who should receive a subsidy under the planner's objective, and the planner chooses less redistribution than it would in more homogeneous societies. We also examine optimal taxation in an OLG model, and find that there is much less redistribution if the planner, as most governments do, does not discriminate on the basis of age. Multidimensional optimal tax problems are difficult nonlinear optimization problems because the linear independence constraint qualification does not hold at all feasible points and often fails to hold at the solution. To robustly solve these nonlinear programs, we use SNOPT with an elastic mode, which has been shown to be effective for degenerate nonlinear programs.
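A minimal sketch of the elastic-mode idea on a toy problem (this is not the authors' SNOPT setup; the degenerate example, penalty weight GAMMA, and use of SciPy's SLSQP solver are illustrative assumptions):

```python
from scipy.optimize import minimize

# Toy degenerate NLP: min (x-1)^2  s.t.  x^2 = 0.
# The constraint gradient 2x vanishes at the solution x = 0, so the linear
# independence constraint qualification (LICQ) fails there.
GAMMA = 100.0  # elastic penalty weight (illustrative choice)

def objective(z):
    x, v, w = z
    # Elastic mode: penalize the nonnegative slacks v, w that absorb
    # any violation of the original constraint.
    return (x - 1.0) ** 2 + GAMMA * (v + w)

def elastic_constraint(z):
    x, v, w = z
    return x ** 2 - (v - w)  # original constraint relaxed by elastic slacks

res = minimize(objective, [0.5, 0.0, 0.0],
               method="SLSQP",
               bounds=[(None, None), (0.0, None), (0.0, None)],
               constraints=[{"type": "eq", "fun": elastic_constraint}])
# With the slacks, the constraint Jacobian [2x, -1, 1] has full rank
# everywhere, so the relaxed problem is well behaved even where LICQ
# fails for the original one.
```

The relaxed problem's solution approaches the original one as the penalty weight grows, which is why the elastic mode is effective for degenerate programs like these.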

52 citations


Book ChapterDOI
TL;DR: This essay discusses the methodological challenges that computer technology poses for economic researchers and proposes some solutions.
Abstract: Computer technology presents economists with new tools, but also raises novel methodological issues. This essay discusses the challenges faced by computational researchers, and proposes some solutions.

41 citations


Journal ArticleDOI
TL;DR: In this article, the incentive equilibrium in differentiated product oligopoly was investigated and it was shown that managers are overcompensated at the margin for profits under linear demand and cost functions.
Abstract: Proposition 5 in our paper dealt with the incentive equilibrium in differentiated product oligopoly. The proposition states that under linear demand and cost functions, the incentive equilibrium is such that managers are overcompensated at the margin for profits. Given the correction of equation (24) above, we need to add that this proposition holds only when A (ba)c. Note, however, that this condition has little impact on our analysis, since in our setting it is a necessary condition for having a duopolistic market in which both firms produce positive quantities while having nonnegative profits.

26 citations


Journal ArticleDOI
TL;DR: In this article, the authors clarify their assumptions and contrast them with the examples in Bossaerts and Zame, showing that the no-trade theorem for the dynamic Lucas infinite-horizon economy with heterogeneous agents is an artifact of the assumption that asset dividends and individual endowments follow the same stationary finite-state Markov process.

16 citations


Posted Content
TL;DR: A direct optimization approach to the problem of maximum likelihood estimation of games is presented and it is shown that it is significantly faster than the NFXP approach when applied to the canonical Zurcher bus repair model.
Abstract: Maximum likelihood estimation of structural models is regarded as computationally difficult by many who want to apply the Nested Fixed-Point (NFXP) approach. We present a direct optimization approach to the problem and show that it is significantly faster than the NFXP approach when applied to the canonical Zurcher bus repair model. The NFXP approach is inappropriate for estimating games since it requires finding all Nash equilibria of a game for each parameter vector considered, a generally intractable computational problem. We reformulate the problem of maximum likelihood estimation of games as an optimization problem qualitatively no more difficult to solve than standard maximum likelihood estimation problems. The direct optimization approach is also applicable to other structural estimation problems such as auctions and RBC models, and also to other estimation strategies, such as the method of moments. It is also easily implemented on standard software implementing state-of-the-art nonlinear programming algorithms.
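A minimal sketch of the direct approach on a toy model (the linear fixed-point equation, Gaussian likelihood, synthetic data, and SciPy solver here are illustrative assumptions, not the paper's estimation problem): instead of re-solving the structural fixed point inside every likelihood evaluation, as NFXP does, treat the structural unknowns as optimization variables and impose the fixed-point equation as a constraint.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
beta = 0.9
theta_true = 0.5
v_true = theta_true / (1 - beta)       # fixed point of v = theta + beta * v
y = rng.normal(v_true, 1.0, size=500)  # synthetic observations (assumption)

# Direct (constrained) formulation: optimize jointly over (theta, v) and
# impose the structural equation v = theta + beta * v as a constraint,
# rather than solving it anew at each trial theta.
def negloglik(z):
    theta, v = z
    return 0.5 * np.sum((y - v) ** 2)  # Gaussian log-likelihood, up to constants

res = minimize(negloglik, [0.0, 0.0],
               method="SLSQP",
               constraints=[{"type": "eq",
                             "fun": lambda z: z[1] - (z[0] + beta * z[1])}])
theta_hat, v_hat = res.x  # theta_hat should recover roughly theta_true
```

The payoff is that the fixed-point equation only needs to hold at the solution, not at every intermediate parameter guess the optimizer visits.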

3 citations


Posted Content
TL;DR: In this article, the authors show that if all bonds have finite maturity and do not span the consol, then equilibrium will deviate, often significantly, from two-fund separation even under classical preference assumptions.
Abstract: The two-fund separation theorem from static portfolio analysis generalizes to the dynamic Lucas-style asset model only when a consol is present. If all bonds have finite maturity and do not span the consol, then equilibrium will deviate, often significantly, from two-fund separation even with the classical preference assumptions. Furthermore, equilibrium bond trading volume is unrealistically large, particularly for long-term bonds, and would be very costly in the presence of transaction costs. We demonstrate that investors choosing two-fund portfolios with bond ladders that approximately replicate consols do almost as well as traders with equilibrium investment strategies. This result is enhanced by adding bonds to the collection of assets even if they are not necessary for spanning. In light of these results, we argue that transaction cost considerations make portfolios using two-fund separation and bond laddering nearly optimal investment strategies.
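A back-of-the-envelope sketch of how a finite bond ladder approximates a consol (the flat interest rate, unit coupons, and maturities are illustrative assumptions, not the paper's calibration):

```python
# A consol paying 1 per period forever is worth 1/r at a flat rate r.
# A ladder holding one zero-coupon bond of face value 1 at each maturity
# 1, 2, ..., N replicates the consol's cash flows for the first N periods.
def ladder_price(N, r):
    return sum((1 + r) ** -t for t in range(1, N + 1))

r = 0.05
consol_price = 1 / r                         # = 20.0
gap_30 = consol_price - ladder_price(30, r)  # truncation error, 30-period ladder
gap_60 = consol_price - ladder_price(60, r)  # error shrinks geometrically in N
```

Because the truncation error decays geometrically with the longest maturity, a modest ladder already comes close to spanning the consol, which is the sense in which two-fund-plus-laddering portfolios can be nearly optimal.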

2 citations


Posted Content
TL;DR: Methods from approximation theory, numerical quadrature, and symbolic computation that have helped economists tackle high-dimensional problems, and current work that will further reduce the computational cost of multidimensional problems are surveyed.
Abstract: Economic analysis often leads to multidimensional numerical problems. The Curse of Dimensionality often leads researchers to adopt methods designed for very high-dimensional problems but inefficient for problems of intermediate dimension. However, a little mathematics can greatly help in dealing with the Curse. We survey methods from approximation theory, numerical quadrature, and symbolic computation that have helped economists tackle high-dimensional problems, and current work that will further reduce the computational cost of multidimensional problems.
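As a small illustration of the quadrature side of this survey (the tensor-product Gauss-Legendre rule and the test integrand are generic textbook choices, not the paper's specific methods), a product rule costs m**d nodes — the Curse of Dimensionality — yet for smooth integrands of intermediate dimension it is extremely accurate per node:

```python
import numpy as np

# Tensor-product Gauss-Legendre quadrature on [-1, 1]^d.
def product_gauss(f, m, d):
    x, w = np.polynomial.legendre.leggauss(m)        # 1-D nodes and weights
    grids = np.meshgrid(*([x] * d), indexing="ij")
    nodes = np.stack(grids, axis=-1).reshape(-1, d)  # all m**d node combinations
    wgrids = np.meshgrid(*([w] * d), indexing="ij")
    weights = np.prod(np.stack(wgrids, axis=-1).reshape(-1, d), axis=1)
    return float(sum(wi * f(ni) for ni, wi in zip(nodes, weights)))

# The rule is exact for polynomials of degree <= 2m - 1 in each dimension:
# the integral of (x0 * x1 * x2)^2 over [-1, 1]^3 equals (2/3)^3, and the
# m = 2 rule recovers it from only 2**3 = 8 nodes.
val = product_gauss(lambda x: np.prod(x ** 2), m=2, d=3)
```

The m**d cost is exactly what sparse-grid and monomial rules attack: they keep most of the accuracy of the product rule while using far fewer nodes as d grows.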

2 citations