
Showing papers in "Research Papers in Economics in 1986"


Posted Content
TL;DR: In this article, a linearization of a rational expectations present value model for corporate stock prices produces a simple relation between the log dividend-price ratio and mathematical expectations of future log real dividend changes and future real discount rates.
Abstract: A linearization of a rational expectations present value model for corporate stock prices produces a simple relation between the log dividend-price ratio and mathematical expectations of future log real dividend changes and future real discount rates. This relation can be tested using vector autoregressive methods. Three versions of the linearized model, differing in the measure of discount rates, are tested on United States time series for 1871-1986; one version uses real interest rate data. The results yield a metric to judge the relative importance of real dividend growth, measured real discount rates and unexplained factors in determining the dividend-price ratio.
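For context, the loglinear present-value relation the abstract refers to can be sketched as follows (standard Campbell-Shiller notation; the paper's exact specification may differ):

\[
d_t - p_t \;\approx\; -\frac{k}{1-\rho} \;+\; \mathbb{E}_t \sum_{j=0}^{\infty} \rho^{j}\,\bigl(r_{t+1+j} - \Delta d_{t+1+j}\bigr),
\]

where d_t - p_t is the log dividend-price ratio, \Delta d_{t+1+j} is future log real dividend growth, r_{t+1+j} is the future real discount rate, and k and \rho are linearization constants with \rho slightly below one. A high dividend-price ratio thus signals expectations of low dividend growth, high discount rates, or both, which is what the vector autoregressive tests quantify.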

3,367 citations


Posted Content
TL;DR: In this article, conditions for obtaining consistency and asymptotic normality of a very general class of estimators (extremum estimators) are presented, and the results are also extended to two-step estimators.
Abstract: Asymptotic distribution theory is the primary method used to examine the properties of econometric estimators and tests. We present conditions for obtaining consistency and asymptotic normality of a very general class of estimators (extremum estimators). Consistent asymptotic variance estimators are given to enable approximation of the asymptotic distribution. Asymptotic efficiency is another desirable property then considered. Throughout the chapter, the general results are also specialized to common econometric estimators (e.g. MLE and GMM), and in specific examples we work through the conditions for the various results in detail. The results are also extended to two-step estimators (with finite-dimensional parameter estimation in the first step), estimators derived from nonsmooth objective functions, and semiparametric two-step estimators (with nonparametric estimation of an infinite-dimensional parameter in the first step). Finally, the trinity of test statistics is considered within the quite general setting of GMM estimation, and numerous examples are given.
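To fix ideas, the generic asymptotic normality result for extremum estimators has the form (a sketch with regularity conditions omitted; notation assumed here rather than taken from the chapter):

\[
\sqrt{n}\,(\hat{\theta} - \theta_0) \;\xrightarrow{d}\; N\!\bigl(0,\; H^{-1}\,\Sigma\,H^{-1}\bigr),
\]

where H is the limit of the second derivative of the objective function (or the Jacobian of the moment conditions) and \Sigma is the asymptotic variance of its score, both evaluated at \theta_0. Consistent estimators of the "sandwich" H^{-1}\Sigma H^{-1} are what make large-sample confidence intervals and the trinity of tests operational for MLE, GMM, and other extremum estimators.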

2,145 citations


Book ChapterDOI
TL;DR: This chapter is based on a paper presented at the World Congress of the Econometric Society, Cambridge, Massachusetts, 1985.
Abstract: This paper was presented at the World Congress of the Econometric Society, Cambridge, Massachusetts, 1985.

1,454 citations


Posted Content
TL;DR: In this article, an examination of data on labor input and the quantity of output reveals that most U.S. industries have marginal costs far below their prices, based on the empirical finding that cyclical variations in labor input are small compared to variations in output.
Abstract: An examination of data on labor input and the quantity of output reveals that most U.S. industries have marginal costs far below their prices. The conclusion rests on the empirical finding that cyclical variations in labor input are small compared to variations in output. In booms, firms produce substantially more output and sell it for a price that exceeds the costs of the added inputs. The paper documents the disparity between price and marginal cost, where marginal cost is estimated from variations in cost from one year to the next. It considers a wide variety of explanations of the findings that are consistent with competition, but none is found to be plausible.
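A rough sketch of the calculation described, under assumed notation (not the paper's exact estimator): marginal cost is inferred from the year-to-year change in labor cost relative to the change in output, and then compared with the product price,

\[
\widehat{MC}_t \;\approx\; \frac{w_t\,\Delta N_t}{\Delta Q_t}, \qquad \mu_t \;=\; \frac{P_t}{\widehat{MC}_t},
\]

where w_t is the wage, \Delta N_t the change in labor input, \Delta Q_t the change in output, and \mu_t the implied markup. Because labor input varies little over the cycle while output varies a great deal, the estimated marginal cost comes out low relative to price, which is the disparity the paper documents.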

1,371 citations


Posted Content
TL;DR: This paper showed that large exchange rate shocks may shift historical relationships between exchange rates and trade flows, such as the rise of the dollar from 1980 to 1985, and that a large capital inflow, which leads to an initial appreciation, can result in a persistent reduction in the exchange rate consistent with trade balance.
Abstract: This paper presents a theoretical basis for the argument that large exchange rate shocks - such as the rise of the dollar from 1980 to 1985 - may shift historical relationships between exchange rates and trade flows. We begin with partial models in which large exchange rate fluctuations lead to entry or exit decisions that are not reversed when the currency returns to its previous level. We then develop a simple model of the feedback from "hysteresis" in trade to the exchange rate itself. Here we see that a large capital inflow, which leads to an initial appreciation, can result in a persistent reduction in the exchange rate consistent with trade balance.

799 citations


Posted ContentDOI
TL;DR: In this paper, the authors characterize preference relations over acts which have a numerical representation by the functional J(f) = min { ∫ u∘f dP : P ∈ C }, where f is an act, u is a von Neumann-Morgenstern utility over outcomes, and C is a closed and convex set of finitely additive probability measures on the states of nature.
Abstract: Acts are functions from states of nature into finite-support distributions over a set of 'deterministic outcomes'. We characterize preference relations over acts which have a numerical representation by the functional J(f) = min { ∫ u∘f dP : P ∈ C }, where f is an act, u is a von Neumann-Morgenstern utility over outcomes, and C is a closed and convex set of finitely additive probability measures on the states of nature. In addition to the usual assumptions on the preference relation such as transitivity, completeness, continuity and monotonicity, we assume uncertainty aversion and certainty-independence. The last condition is a new one and is a weakening of the classical independence axiom: it requires that an act f is preferred to an act g if and only if the mixture of f and any constant act h is preferred to the same mixture of g and h. If non-degeneracy of the preference relation is also assumed, the convex set of priors C is uniquely determined. Finally, a concept of independence in case of a non-unique prior is introduced. (This abstract was borrowed from another version of this item.)

742 citations


Posted Content
TL;DR: In this paper, the authors present the developments in time series analysis and forecasting theory and practice and discuss the application of time series procedures in mainstream economic theory and econometric model building.
Abstract: Economic Theory, Econometrics, and Mathematical Economics, Second Edition: Forecasting Economic Time Series presents the developments in time series analysis and forecasting theory and practice. This book discusses the application of time series procedures in mainstream economic theory and econometric model building. Organized into 10 chapters, this edition begins with an overview of the problem of dealing with time series possessing a deterministic seasonal component. This text then provides a description of time series in terms of models, known as the time-domain approach. Other chapters consider an alternative approach, known as spectral or frequency-domain analysis, that often provides useful insights into the properties of a series. The book also discusses a unified approach to the fitting of linear models to a given time series. The final chapter deals with the main advantage of having a Gaussian series, namely that the optimal single-series least-squares forecast is a linear forecast. This book is a valuable resource for economists.

701 citations


Posted Content
TL;DR: In this paper, the authors study temporal volatility patterns in seven nominal dollar spot exchange rates, all of which display strong evidence of autoregressive conditional heteroskedasticity (ARCH).
Abstract: We study temporal volatility patterns in seven nominal dollar spot exchange rates, all of which display strong evidence of autoregressive conditional heteroskedasticity (ARCH). We first formulate and estimate univariate models, the results of which are subsequently used to guide specification of a multivariate model. The key element of our multivariate approach is exploitation of factor structure, which facilitates tractable estimation via a substantial reduction in the number of parameters to be estimated. Such a latent-variable model is shown to provide a good description of multivariate exchange rate movements: the ARCH effects capture volatility clustering, and the factor structure captures commonality in volatility movements across exchange rates.
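A minimal sketch of a latent one-factor ARCH structure of the kind the abstract describes (notation assumed here; the paper's specification may differ):

\[
y_{it} = \lambda_i f_t + \varepsilon_{it}, \qquad f_t \mid \Omega_{t-1} \sim N(0, h_t), \qquad h_t = \omega + \alpha f_{t-1}^{2},
\]

where y_{it} is the return on exchange rate i, f_t is a common latent factor whose conditional variance h_t follows an ARCH process, and \varepsilon_{it} is idiosyncratic noise. The loadings \lambda_i carry the commonality in volatility across currencies, and a single conditional variance process drastically reduces the number of parameters relative to an unrestricted multivariate ARCH model.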

647 citations


Posted Content
TL;DR: In this article, sixteen essays are drawn from a body of work strongly influenced by the thought of Joseph A. Schumpeter, and each essay tests hypotheses derived from the Schumpeterian propositions that technological innovation gives capitalist economies their peculiar dynamics through a process of "creative destruction," that technological progress has radically increased real income per capita in Western industrialized nations, and that monopoly market structures and their pursuit are a powerful engine of technological progress.
Abstract: These sixteen essays are drawn from a body of work strongly influenced by the thought of Joseph A. Schumpeter. They are particularly appropriate in a time when low rates of growth have become the norm in the Western world and much of the economic debate focuses on prescriptions for industrial regeneration. Each essay tests hypotheses derived from the Schumpeterian propositions that technological innovation gives capitalist economies their peculiar dynamics through a process of "creative destruction," that technological progress has radically increased real income per capita in Western industrialized nations, and that monopoly market structures and their pursuit are a powerful engine of technological progress.

562 citations



Posted Content
TL;DR: The recent European experience of high persistent unemployment has led to the development of theories of unemployment hysteresis embodying the idea that the equilibrium unemployment rate depends on the history of the actual unemployment rate as discussed by the authors.
Abstract: The recent European experience of high persistent unemployment has led to the development of theories of unemployment hysteresis embodying the idea that the equilibrium unemployment rate depends on the history of the actual unemployment rate. This paper summarizes two directions of research on hysteresis that appear especially promising. Membership theories are based on the distinction between insiders and outsiders and explore the idea that wage setting is largely determined by firms' incumbent workers rather than by the unemployed. Duration theories explore the idea that the long-term unemployed exert much less downward pressure on wages than do the short-term unemployed.

Posted Content
TL;DR: In this paper, a collection of items is to be distributed among several bidders, and each bidder is to receive at most one item; assuming that the bidders place some monetary value on each of the items, it has been shown that there is a unique vector of equilibrium prices that is optimal, in a suitable sense, for the bidders.
Abstract: A collection of items is to be distributed among several bidders, and each bidder is to receive at most one item. Assuming that the bidders place some monetary value on each of the items, it has been shown that there is a unique vector of equilibrium prices that is optimal, in a suitable sense, for the bidders. In this paper we describe two dynamic auction mechanisms: one achieves this equilibrium and the other approximates it to any desired degree of accuracy.
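As a concrete illustration, here is a toy ascending-price version of an approximate mechanism in Python: prices of over-demanded items rise by a fixed increment until each item is demanded by at most one bidder. The function name, the tie-breaking rule, and the increment are illustrative assumptions rather than the paper's exact procedure.

def ascending_auction(values, increment=1.0):
    # values[i][j] = bidder i's monetary value for item j; each bidder wants at most one item.
    n_items = len(values[0])
    prices = [0.0] * n_items
    while True:
        # Each bidder demands the item with the largest non-negative surplus, or opts out.
        demand = {}
        for i, v in enumerate(values):
            surplus = [v[j] - prices[j] for j in range(n_items)]
            best = max(range(n_items), key=lambda j: surplus[j])
            if surplus[best] >= 0:
                demand.setdefault(best, []).append(i)
        over = [j for j, bidders in demand.items() if len(bidders) > 1]
        if not over:
            # No item is over-demanded: assign each demanded item to its unique demander.
            return prices, {bidders[0]: j for j, bidders in demand.items()}
        for j in over:
            prices[j] += increment

# Example: three bidders, two items.
prices, assignment = ascending_auction([[4, 2], [3, 5], [6, 1]])
print(prices, assignment)

With a small increment the terminal prices approximate the equilibrium price vector; the exact mechanism described in the paper reaches it without overshooting, roughly by raising prices only on carefully chosen sets of over-demanded items.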


Book ChapterDOI
TL;DR: In this paper, the authors show that, generically, the set of competitive equilibria is finite, that equilibrium prices and allocations in the commodity spot markets are uniquely determined by the asset allocation, and that the equilibrium allocation is generically constrained suboptimal.
Abstract: Let assets be denominated in an a priori specified numeraire. Whether or not the asset market is complete, a competitive equilibrium exists as long as arbitrage is possible when assets are free. Generically, the set of competitive equilibria is finite, and the equilibrium prices and allocations in the commodity spot markets are uniquely determined by the asset allocation. The equilibrium allocation is generically constrained suboptimal: there exists an arbitrarily small reallocation of the existing assets which leads to a Pareto improvement in welfare when prices and allocations in the commodity spot markets adjust to maintain equilibrium.

Posted Content
TL;DR: In this article, the authors investigate empirically a model of aggregate consumption and leisure decisions in which goods and leisure provide services over time and find substantial evidence against the overidentifying restrictions.
Abstract: This paper investigates empirically a model of aggregate consumption and leisure decisions in which goods and leisure provide services over time. The implied time non-separability of preferences introduces an endogenous source of dynamics which affects both the co-movements in aggregate compensation and hours worked and the cross-relations between prices and quantities. These cross-relations are examined empirically using post-war monthly U.S. data on quantities, real wages and the real return on the one-month Treasury bill. We find substantial evidence against the overidentifying restrictions. The test results suggest that the orthogonality conditions associated with the representative consumer's intratemporal Euler equation underlie the failure of the model. Additionally, the estimated values of key parameters differ significantly from the values assumed in several studies of real business cycle models. Several possible reasons for these discrepancies are discussed.
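For orientation, the overidentifying-restrictions test referred to has the generic GMM form (a sketch in standard notation, not the paper's exact statistic):

\[
J_n \;=\; n\,\bar{g}_n(\hat{\theta})'\,\hat{W}\,\bar{g}_n(\hat{\theta}) \;\xrightarrow{d}\; \chi^{2}_{q-p},
\]

where \bar{g}_n(\theta) is the sample average of the q orthogonality conditions implied by the consumer's Euler equations, \hat{W} is an optimal weighting matrix, and p is the number of estimated parameters. A large J_n, as reported here, indicates that the model's orthogonality conditions are inconsistent with the data.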

Posted Content
TL;DR: In this article, the authors present a review of inference about large autoregressive or moving average roots in univariate time series, and structural change in multivariate time-series regression.
Abstract: This chapter reviews inference about large autoregressive or moving average roots in univariate time series, and structural change in multivariate time series regression. The "problem" of unit roots is cast more broadly as determining the order of integration of a series; estimation, inference, and confidence intervals are discussed. The discussion of structural change focuses on tests for parameter stability. Much emphasis is on asymptotic distributions in these nonstandard settings, and one theme is the general applicability of functional central limit theory. The quality of the asymptotic approximations to finite-sample distributions and implications for empirical work are critically reviewed.
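As a concrete instance of the unit-root problem the chapter reviews, consider the familiar testing regression (an illustrative textbook example, not quoted from the chapter):

\[
\Delta y_t = \mu + (\rho - 1)\,y_{t-1} + \varepsilon_t,
\]

with the unit-root null \rho = 1. Under this null the t-statistic on y_{t-1} does not have a standard normal limit; its nonstandard (Dickey-Fuller) distribution is derived from functional central limit theory, which is the general tool the chapter emphasizes for these settings.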

Posted Content
TL;DR: This paper examined the deterrent effect of formal sanctions on criminal behavior and found that the results support the reward component of the rational-choice model but fail to support the cost or deterrent component, as measured by perceived risks of formal sanctions.
Abstract: This study examines the deterrent effect of formal sanctions on criminal behavior. While most research on deterrence assumes a rational-choice model of criminal decision-making, few studies consider all of the major elements of the model. In particular, three critical limitations characterize the empirical literature on deterrence: the failure to establish a causal ordering of sanctions and crime consistent with their temporal ordering; the focus on conventional populations and nonserious criminal acts, which are of less interest to the question of how society controls its members; and the inattention to the return or reward component of the decision-making process. To address these issues, we specify, estimate, and test a rational-choice model of crime on data that were collected on individuals, gathered within a longitudinal design, and derived from three distinct populations of persons at high risk of formal sanction. The results support the reward component of the rational-choice model, but fail to support the cost or deterrent component, as measured by perceived risks of formal sanctions. (Abstract adapted from American Sociological Review, 1986. Copyright © 1986 by the American Sociological Association.)

Posted Content
TL;DR: This textbook provides an introduction to econometrics through a grounding in probability theory and statistical inference, and encourages the mastering of fundamental concepts and theoretical perspectives which guide the formulation and solution of problems in econometric modelling.
Abstract: This textbook provides an introduction to econometrics through a grounding in probability theory and statistical inference. The emphasis is on the concepts and ideas underlying probability theory and statistical inference, and on motivating the learning of them both at a formal and an intuitive level. It encourages the mastering of fundamental concepts and theoretical perspectives which guide the formulation and solution of problems in econometric modelling. This makes it an ideal introduction to empirical econometric modelling and the more advanced econometric literature. It is recommended for use on courses giving students a thorough grounding in econometrics at undergraduate or graduate level.

Posted Content
TL;DR: This is the only comprehensive introduction that Leontief has written to his model of input-output economics, for which he was awarded the Nobel Prize in Economic Science in 1973, as discussed by the authors.
Abstract: The only comprehensive introduction which Leontief has written to his model of Input-Output Economics, for which he was awarded the Nobel Prize in Economic Science in 1973. Many of the chapters have already appeared as articles in journals, but Leontief's writings have a range and consistency that gives this collection a sense of coherence. The book begins with non-technical articles on the theory of Input-Output Economics and progresses to more technical essays, and then to specific applications of the theory. This edition has been thoroughly revised, with at least one third of the material being new.
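For readers new to the model, a minimal numerical sketch of the open input-output system the book introduces (the coefficient matrix and demand vector below are made-up illustrative numbers): gross outputs x satisfy x = Ax + d, so x = (I - A)^{-1} d.

import numpy as np

# Open Leontief model: solve (I - A) x = d for gross outputs x.
A = np.array([[0.2, 0.3],   # units of good 1 used per unit of goods 1 and 2
              [0.4, 0.1]])  # units of good 2 used per unit of goods 1 and 2
d = np.array([10.0, 20.0])  # final demand for each good

x = np.linalg.solve(np.eye(2) - A, d)
print(x)  # gross outputs needed to meet final demand plus intermediate use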

Posted Content
TL;DR: A survey of recent developments in the literature on efficiency wage theories of unemployment can be found in this paper, where a wide variety of evidence on inter-industry wage differences is analyzed.
Abstract: This paper surveys recent developments in the literature on efficiency wage theories of unemployment. Efficiency wage models have in common the property that in equilibrium firms may find it profitable to pay wages in excess of market clearing. High wages can help reduce turnover, elicit worker effort, prevent worker collective action, and attract higher quality employees. Simple versions of efficiency wage models can explain normal involuntary unemployment, segmented labor markets, and wage differentials across firms and industries for workers with similar productive characteristics. Deferred payment schemes and other labor market bonding mechanisms appear to be able to solve some efficiency wage problems without resultant job rationing and involuntary unemployment. A wide variety of evidence on inter-industry wage differences is analyzed. Efficiency wage models appear useful in explaining the observed pattern of wage differentials. The models also provide several potential mechanisms for cyclical fluctuations in response to aggregate demand shocks.
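One textbook way to see the common logic of these models (a standard sketch, not a result quoted from the survey): if worker effort e(w) increases with the wage, the firm minimizes labor cost per efficiency unit by setting the wage where the elasticity of effort with respect to the wage equals one,

\[
\frac{w\,e'(w)}{e(w)} = 1,
\]

which can hold at a wage above the market-clearing level. Firms then have no incentive to cut wages even when unemployed workers would accept less, so involuntary unemployment can persist in equilibrium.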

Posted Content
TL;DR: In this paper, the authors argue that the axiomatic formulation offers the surest path to a solution that is as objective as possible, minimally distorted by the unwitting imposition of personal values.
Abstract: This study comes to grips with the industrial outranking problem, one of the major outstanding problems of current operations research and managerial decision-making. The problem, simply stated, is this: given a large but finite set of criteria, and a large but finite number of alternatives, how can the criteria be ranked in priority order, and how should the alternatives be ranked from best to worst consistent with the ordering of criteria that may be conflicting or incommensurable? There have been many proposed solutions to the problem. Numerous empirical recipes—among them the majority method—have been submitted, based in large part on the subjective judgments and biases of various observers. The authors argue that the axiomatic formulation offers the surest path to a solution that is as objective as possible, minimally distorted by the unwitting imposition of personal values. They then develop a system of consistent and appealing axioms, confront the paradoxes that put axiomatic systems in general at risk, and demonstrate the applicability of their system to realistic industrial outranking problems. Even within the axiomatic framework, however, some leeway remains for subjective choice and conscious value decisions. One ad hoc criterion of choice the authors selected was that their method should be neither so flexible and open that personal biases might easily slip in, nor so artificially rigid that the play of intuition and creativity was systematically excluded. The book also takes a hard look at the theoretical and practical defects of the majority method, the favored proposed solution, and at such associated issues as committee decision techniques, strategic majority voting, and restriction conditions.

Posted Content
TL;DR: The authors bring the practical world of trade policy and of government and business strategy together with the world of academic trade theory, focusing on the impact of changes in the international trade environment and on how new developments and theory can guide our trade policy.
Abstract: This volume of original essays brings the practical world of trade policy and of government and business strategy together with the world of academic trade theory. It focuses in particular on the impact of changes in the international trade environment and on how new developments and theory can guide our trade policy.

Posted Content
TL;DR: In this article, the authors used data for the 1000 largest US manufacturing firms in 1950 and 1972 to find that there are persistent differences in profitability and market power across large US companies and that companies with persistently high profits have high market shares and sell differentiated products.
Abstract: Profits in the Long Run asks two questions: Are there persistent differences in profitability across firms? If so, what accounts for them? This book answers these questions using data for the 1000 largest US manufacturing firms in 1950 and 1972. It finds that there are persistent differences in profitability and market power across large US companies. Companies with persistently high profits are found to have high market shares and sell differentiated products. Mergers do not result in synergistic increases in profitability, but they do have an averaging effect: companies with above-normal profits have their profits lowered by mergers, while companies with initially below-normal profits have them raised. In addition, the influence of other variables on long-run profitability, including risk, sales, diversification, growth and managerial control, is explored. The implications for antitrust policy are likewise addressed.


Posted Content
TL;DR: This article showed that the existing estimates of prewar gross national product exaggerate the size of cyclical fluctuations, and that the original Kuznets estimates are based on the assumption that GNP moves one-for-one with commodity output valued at producer prices.
Abstract: This paper shows that the existing estimates of prewar gross national product exaggerate the size of cyclical fluctuations. The source of the exaggeration is that the original Kuznets estimates are based on the assumption that GNP moves one-for-one with commodity output valued at producer prices. New estimates of GNP for 1869-1918 are derived using the estimated aggregate relationship between GNP and commodity output for the interwar and postwar eras. The new estimates of GNP indicate that the business cycle is only slightly more severe in the pre-World War I era than in the post-World War II era.

Posted Content
TL;DR: Semi-parametric models as mentioned in this paper combine a parametric form for some component of the data generating process (usually the behavioral relation between the dependent and explanatory variables) with weak nonparametric restrictions on the remainder of the model, usually the distribution of the unobservable errors.
Abstract: A semiparametric model for observational data combines a parametric form for some component of the data generating process (usually the behavioral relation between the dependent and explanatory variables) with weak nonparametric restrictions on the remainder of the model (usually the distribution of the unobservable errors). This chapter surveys some of the recent literature on semiparametric methods, emphasizing microeconometric applications using limited dependent variable models. An introductory section defines semiparametric models more precisely and reviews the techniques used to derive the large-sample properties of the corresponding estimation methods. The next section describes a number of weak restrictions on error distributions -- conditional mean, conditional quantile, conditional symmetry, independence, and index restrictions -- and show how they can be used to derive identifying restrictions on the distributions of observables. This general discussion is followed by a survey of a number of specific estimators proposed for particular econometric models, and the chapter concludes with a brief account of applications of these methods in practice.
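To illustrate the kind of weak restriction on error distributions the chapter catalogues (an illustrative textbook example, not taken from the chapter), consider a conditional median restriction in a linear model:

\[
y_i = x_i'\beta_0 + \varepsilon_i, \qquad \operatorname{med}(\varepsilon_i \mid x_i) = 0 .
\]

This identifies \beta_0 without specifying the error distribution and leads to the least-absolute-deviations estimator \hat{\beta} = \arg\min_{\beta} \sum_i |y_i - x_i'\beta|, whose large-sample properties are established with exactly the techniques the introductory section reviews.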

Posted Content
TL;DR: This article summarizes a number of studies that use patent data to examine different aspects of technological change, including the relationship between R&D expenditures and the level of patenting and the way R&D spills over from one firm to another.
Abstract: This paper summarizes a number of studies which use patent data to examine different aspects of technological change. It describes our firm level data set construction effort; reports on the relationship between R&D expenditures and the level of patenting; analyzes the relationship between patents and R&D; reports on the estimation of the value of patent rights based on European patent renewal data; and describes the use of patent data to estimate the importance of R&D spillovers. It concludes that patent data represent a valuable resource for the analysis of technological change. They can be used to study longer-run interfirm differences in inventive activity and as a substitute for R&D data where the latter are not available in the desired detail. It is possible also to use a firm's distribution of patenting by field to infer its position in "technological space" and use it in turn to study how R&D spills over from one firm to another. Moreover, patent renewal data, which are also becoming available in the U.S., allow one to construct more relevant "quality weighted" inventive "output" measures.

Posted Content
TL;DR: Game-Theoretic Models of Bargaining as discussed by the authors provides a comprehensive picture of the new developments in bargaining theory and especially shows the way the use of axiomatic models has been complemented by the new results derived from strategic models.
Abstract: Game-Theoretic Models of Bargaining provides a comprehensive picture of the new developments in bargaining theory. It especially shows the way the use of axiomatic models has been complemented by the new results derived from strategic models. The papers in this volume are edited versions of those given at a conference on Game Theoretic Models of Bargaining held at the University of Pittsburgh. There are two distinct reasons why the study of bargaining is of fundamental importance in economics. The first is that many aspects of economic activity are directly influenced by bargaining between and among individuals, firms, and nations. The second is that bargaining occupies an important place in economic theory, since the 'pure bargaining problem' is at the opposite pole of economic phenomena from the case of 'perfect competition'. This volume is an outgrowth of the renewed interest in the strategic approach to the theory of bargaining and to the general theory of non-cooperative games.
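As a point of reference for the axiomatic strand the volume discusses (a standard textbook statement, not quoted from the book), the Nash bargaining solution selects the feasible agreement that maximizes the product of the players' gains over their disagreement payoffs:

\[
(u_1^{*}, u_2^{*}) \;=\; \arg\max_{(u_1,u_2)\in S,\ u_i \ge d_i} \;(u_1 - d_1)(u_2 - d_2).
\]

The strategic strand instead studies explicit offer-counteroffer games, such as alternating-offers bargaining, whose equilibrium outcomes can approximate axiomatic solutions of this kind as frictions vanish, which is one sense in which the strategic results complement the axiomatic ones.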

Posted Content
TL;DR: The authors examine the interactions between tax policy, international capital mobility, and international competitiveness and conclude that tax policies that stimulate national investment without affecting national savings must inevitably lead to a deterioration in a country's trade balance in the short and intermediate run.
Abstract: This paper examines the interactions between tax policy, international capital mobility, and international competitiveness. It demonstrates that tax policies which stimulate national investment without affecting national savings must inevitably lead to a deterioration in a country's trade balance in the short and intermediate run. This conclusion, which contradicts a great deal of popular rhetoric, highlights the importance of considering the macroeconomic as well as the microeconomic aspects of tax changes. More generally, the effects of tax policies depend critically on the extent of the international capital flows which they generate. The paper examines the issue of international capital mobility both theoretically and empirically. A variety of considerations suggest that while tax policies could generate large capital flows, governments pursue policies which tend to inhibit capital flows following tax changes. This makes the analysis of tax policies difficult.