
Showing papers in "Research Papers in Economics in 1987"


Posted Content
TL;DR: In this paper, the random walk model is strongly rejected for the entire sample period (1962-1985) and for all sub-periods for a variety of aggregate returns indexes and size-sorted portfolios.
Abstract: In this paper, we test the random walk hypothesis for weekly stock market returns by comparing variance estimators derived from data sampled at different frequencies. The random walk model is strongly rejected for the entire sample period (1962-1985) and for all sub-periods for a variety of aggregate returns indexes and size-sorted portfolios. Although the rejections are largely due to the behavior of small stocks, they cannot be ascribed to either the effects of infrequent trading or time-varying volatilities. Moreover, the rejection of the random walk cannot be interpreted as supporting a mean-reverting stationary model of asset prices, but is more consistent with a specific nonstationary alternative hypothesis.
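
A rough sense of the variance comparison behind this test can be given with a short sketch: under a random walk, the variance of q-week returns should be q times the variance of one-week returns, so their ratio should be near one. The sketch below uses simulated data and a simplified estimator; it is not the authors' CRSP sample or their bias- and heteroskedasticity-corrected statistics.

import numpy as np

def variance_ratio(log_prices, q):
    # Variance of q-period returns divided by q times the variance of
    # 1-period returns; under the random walk hypothesis the ratio is near 1.
    r1 = np.diff(log_prices)                 # one-period (e.g., weekly) returns
    rq = log_prices[q:] - log_prices[:-q]    # overlapping q-period returns
    return rq.var(ddof=1) / (q * r1.var(ddof=1))

# Illustrative use with simulated random-walk log prices.
rng = np.random.default_rng(0)
p = np.cumsum(rng.normal(0.0, 0.02, 1250))
print(variance_ratio(p, 4))                  # should be close to 1.0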

2,920 citations


Posted Content
TL;DR: The authors discuss inter-industry studies of the relations among various measures of market structure, conduct, and performance, and argue that this tradition has indeed uncovered many stable, robust empirical regularities.
Abstract: This chapter discusses inter-industry studies of the relations among various measures of market structure, conduct, and performance. It argues that this tradition has indeed uncovered many stable, robust empirical regularities. Inter-industry research has taught much about the way markets look, especially within the manufacturing sector in developed economies, even if it has not shown exactly the way markets work. Work in some areas has produced no clear picture of the important patterns in the data, and non-manufacturing industries have not received attention commensurate with their importance. But cross-section studies are limited by serious problems of interpretation and measurement. Future inter-industry research should adopt a modest, descriptive orientation and aim to complement case studies by uncovering robust empirical regularities that can be used to evaluate and develop theoretical tools. Much of the most persuasive recent work relies on nonstandard data sources, particularly panel data that can be used to deal with disequilibrium problems and industry-specific data, which mitigate the problem of unobservable industry-specific variables.

1,106 citations


Posted Content
TL;DR: In this paper, the authors present the chronological development of the concept of excess burden and the related study of optimal tax theory, and uncover the interrelationships among various apparently distinct results, bringing out the basic structure of the entire problem.
Abstract: The purpose of this paper is to present the chronological development of the concept of excess burden and the related study of optimal tax theory. A main objective of this exercise is to uncover the interrelationships among various apparently distinct results, so as to bring out the basic structure of the entire problem. The paper includes a discussion of various measures of excess burden, focusing on issues of approximation, informational requirements, aggregation over individuals, and the effects of technology. Included in the presentation of optimal tax theory is a section on tax reform, as well as an application of the theory to the case where uncertainty is present.
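
As a point of reference for the approximation issues mentioned above, one common textbook (Harberger-style) second-order approximation of the excess burden of a small ad valorem tax \tau on a single good, with price p, quantity x, and compensated demand elasticity \varepsilon^{c}, is

\mathrm{EB} \approx \tfrac{1}{2}\, \varepsilon^{c}\, p\, x\, \tau^{2} .

The notation here is generic rather than the paper's own; the quadratic dependence on \tau is one reason approximation and measurement choices matter so much in excess-burden calculations.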

902 citations


Posted Content
TL;DR: In this article, the authors present a collection of the important articles on efficiency wage theory, which explain why there are labor market equilibria in which employers prefer to pay wages in excess of the market-clearing wage, thereby explaining involuntary unemployment.
Abstract: One of the more troubling aspects of the ferment in macroeconomics that followed the demise of the Keynesian dominance in the late 1960s has been the inability of many of the new ideas to account for unemployment. Unemployment remains unexplained because equilibrium in most economic models occurs with supply equal to demand: if this equality holds in the labor market, there is no involuntary unemployment. Efficiency Wage Models of the Labor Market explores the reasons why there are labor market equilibria with employers preferring to pay wages in excess of the market-clearing wage and thereby explains involuntary unemployment. This volume brings together a number of the important articles on efficiency wage theory. The collection is preceded by a strong, integrative introduction, written by the editors, in which the hypothesis is set out and the variations, as described in subsequent chapters, are discussed.

788 citations


Posted Content
TL;DR: In this paper, the authors examine whether an unexpected change in output today should substantially change one's forecast of output in, say, five or ten years, and find that it should: the data suggest that an unexpected change in real GNP of 1 percent changes the long-horizon forecast by over 1 percent.
Abstract: According to the conventional view of the business cycle, fluctuations in output represent temporary deviations from trend. The purpose of this paper is to question this conventional view. If fluctuations in output are dominated by temporary deviations from the natural rate of output, then an unexpected change in output today should not substantially change one's forecast of output in, say, five or ten years. Our examination of quarterly postwar United States data leads us to be skeptical about this implication. The data suggest that an unexpected change in real GNP of 1 percent should change one's forecast by over 1 percent over a long horizon.
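
The forecast-revision calculation described here can be mimicked with a small sketch: if output growth follows a persistent low-order process, a 1 percent innovation changes the long-horizon forecast of the level of output by the cumulative impulse response. The AR(1) specification and the simulated series below are illustrative assumptions, not the authors' model of postwar GNP.

import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(1)
# Simulated quarterly output growth with modest persistence (placeholder data).
g = np.zeros(200)
for t in range(1, 200):
    g[t] = 0.005 + 0.4 * g[t - 1] + rng.normal(0.0, 0.01)

phi = AutoReg(g, lags=1).fit().params[1]   # estimated AR(1) coefficient
# With AR(1) growth, a 1% innovation eventually changes the forecast of the
# level of output by 1 / (1 - phi) percent (the cumulative impulse response).
print("long-horizon level revision per 1% shock:", 1.0 / (1.0 - phi))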

747 citations





Posted Content
TL;DR: In this paper, the literature on the term structure is consolidated and interpreted, and attached tables provide term structure data for U.S. government securities for 1946-1987; the data relate to the concepts in the paper more precisely than does any previously published data series.
Abstract: This paper consolidates and interprets the literature on the term structure, as it stands today. Definitions of rates of return, forward rates and holding returns for all time intervals are treated here in a uniform manner and their interrelations, exact or approximate, delineated. The concept of duration is used throughout to simplify mathematical expressions. Continuous compounding is used where possible, to avoid arbitrary distinctions based on compounding assumptions. Both the theoretical and the empirical literature are treated. The attached tables by J. Huston McCulloch give term structure data for U. S. government securities 1946-1987. The tables give discount bond yields, forward rates and par bond yields as defined in the paper. The data relate to the concepts in the paper more precisely than does any previously published data series.
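
For readers unused to the continuously compounded conventions adopted in the paper, the basic link between discount-bond yields and forward rates can be stated as follows; the symbols are generic textbook notation rather than the paper's exact definitions. If P(t,T) is the price at time t of a discount (zero-coupon) bond maturing at T, its continuously compounded yield is

y(t,T) = -\frac{\ln P(t,T)}{T - t},

and the forward rate applying between maturities T_1 < T_2 is

f(t,T_1,T_2) = \frac{y(t,T_2)\,(T_2 - t) - y(t,T_1)\,(T_1 - t)}{T_2 - T_1}.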

569 citations


Posted Content
TL;DR: The Fundamental Privatization Theorem (analogous to The Fundamental Theorem of Welfare Economics) is presented, providing conditions under which government production cannot improve upon private production as discussed by the authors.
Abstract: In this paper, the choice between public and private provision of goods and services is considered. In practice, both modes of operation involve significant delegation of authority, and thus appear quite similar in some respects. The argument here is that the main difference between the two modes concerns the transaction costs faced by the government when attempting to intervene in the delegated production activities. Such intervention is generally less costly under public ownership than under private ownership. The greater ease of intervention under public ownership can have its advantages; but the fact that a promise not to intervene is more credible under private production can also have beneficial incentive effects. The Fundamental Privatization Theorem (analogous to The Fundamental Theorem of Welfare Economics) is presented, providing conditions under which government production cannot improve upon private production. The restrictiveness of these conditions is evaluated.

552 citations


Posted Content
TL;DR: In this article, the authors discuss three common mechanisms for achieving coordination, with particular reference to the choice of compatibility standards, and show that the committee is more likely to achieve coordination than the market mechanism.
Abstract: We discuss three common mechanisms for achieving coordination, with particular reference to the choice of compatibility standards. The first involves explicit communication and negotiation before irrevocable choices are made: It represents what standardization committees do. The second mechanism, by contrast, involves no explicit communication and depends on unilateral irrevocable choices: It succeeds if one agent chooses first and the other(s) follow(s). This is a simple version of "market leadership." We analyze these two mechanisms in a simple model and show that the committee is more likely to achieve coordination. Moreover, although the committee is slower, it outperforms the market mechanism, even when we allow for the value of speed. Third, we examine a hybrid of the first two mechanisms, in which both communication and unilateral preemptive actions are allowed. We show that, far from worsening its performance, unilateral actions improve the committee system. This hybrid system more closely resembles the committee system, the more important coordination is relative to conflict. (This abstract was borrowed from another version of this item.)

Posted Content
TL;DR: This paper showed that innovations in M1 have statistically significant marginal predictive value for industrial production, both in a bivariate model and in a multivariate setting including a price index and an interest rate.
Abstract: Previous authors have reached puzzlingly different conclusions about the usefulness of money for forecasting real output based on closely related regression-based tests. An examination of this and additional new evidence reveals that innovations in M1 have statistically significant marginal predictive value for industrial production, both in a bivariate model and in a multivariate setting including a price index and an interest rate. This conclusion follows from focusing on the trend properties of the data, both stochastic and deterministic, and from drawing inferences using asymptotic theory that explicitly addresses the implications of these trends for the distributions of the various test statistics.
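
The flavor of the regression-based predictive tests at issue can be sketched as follows: do lags of money growth help predict output growth once output's own lags are included? The simulated series, variable names, and lag length below are placeholders, and the sketch omits the paper's treatment of stochastic and deterministic trends and the associated asymptotic theory.

import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

# Placeholder series standing in for industrial production growth and M1 growth.
rng = np.random.default_rng(2)
m1 = rng.normal(0.0, 1.0, 300)
ip = 0.3 * np.roll(m1, 1) + rng.normal(0.0, 1.0, 300)
df = pd.DataFrame({"ip_growth": ip[1:], "m1_growth": m1[1:]})

# Tests whether lagged m1_growth has marginal predictive value for ip_growth.
grangercausalitytests(df[["ip_growth", "m1_growth"]], maxlag=4)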


Posted Content
TL;DR: This UNICEF study calls for a more humanitarian approach to economic policy, arguing that alternatives must be found that regain economic growth and development while simultaneously adopting people-oriented policies directed at the most vulnerable groups, and that developing countries need to introduce policies and measures that integrate the human dimension as part of structural adjustment policies (SAPs) rather than as an additional welfare component.
Abstract: This study produced by UNICEF documents the need to introduce policies and measures in developing countries that integrate the human dimension as part of structural adjustment policies (SAPs) and not as an additional welfare component to them. The study calls for a more humanitarian approach to economic policy: alternatives must be found that regain economic growth and development while simultaneously adopting people-oriented policies directed at the most vulnerable groups. This book is divided into 2 parts. Part I includes 5 chapters: 1) economic decline and human welfare in the first half of the 1980s; 2) adjustment policies 1980-85: effects on child welfare; 3) the impact on government expenditure; 4) adjustment at the household level: potentials and limitations of survival strategies; 5) country experience with adjustment (10 case studies). Part II includes 11 chapters: 6) an overview of the alternative approach; 7) alternative macro policies, meso policies, and vulnerable groups; 8) social policymaking: restructuring, targeting, efficiency; 9) policy approaches towards small farmers; 10) supporting productive employment among vulnerable groups; 11) health policy and program options: compensating for the negative effects of economic adjustment; 12) education; 13) nutrition interventions; 14) monitoring and statistics for adjustment with a human face; 15) the international system and the protection of the vulnerable; and 16) summary and conclusions.


Posted Content
TL;DR: In this paper, the authors examined the effect of tariffs and exchange rates on U.S. prices of Japanese cars, trucks and motorcycles, and found that the pass-through relation varies across products, ranging from about 0.6 for trucks to unity for motorcycles.
Abstract: This paper examines the effect of tariffs and exchange rates on U.S. prices of Japanese cars, trucks and motorcycles. In particular, we test whether the long run pass-through of tariffs and exchange rates are identical: the symmetry hypothesis. We find that this hypothesis is easily accepted in our sample. We also find that the pass-through relation varies across products, ranging from about 0.6 for trucks to unity for motorcycles. These coefficients have very different implications for trade policy. We explain the results based on demand, cost and institutional conditions in each industry. We also find weak evidence that the pass-through of exchange rates has fallen in more recent years.
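
A minimal version of the symmetry test described here is a log-linear price regression with separate tariff and exchange-rate terms, followed by a test that the two pass-through coefficients are equal. The variable names and simulated data below are hypothetical, not the authors' series.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: log U.S. price, log(1 + tariff), log exchange rate.
rng = np.random.default_rng(3)
n = 120
ltariff = rng.normal(0.05, 0.01, n)
lfx = rng.normal(0.0, 0.10, n)
lprice = 0.6 * ltariff + 0.6 * lfx + rng.normal(0.0, 0.02, n)
df = pd.DataFrame({"lprice": lprice, "ltariff": ltariff, "lfx": lfx})

fit = smf.ols("lprice ~ ltariff + lfx", data=df).fit()
# Symmetry hypothesis: tariff and exchange-rate pass-through are identical.
print(fit.t_test("ltariff = lfx"))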

Journal Article
TL;DR: This article reviews an emerging economic literature on the effects and determinants of student effort and cooperativeness, and on how putting student motivation and behavior at the center of one's theoretical framework changes one's view of how schools operate and how they might be made more effective.
Abstract: Students face four decision margins: (a) How many years to spend in school, (b) What to study, (c) How much effort to devote to learning per year and (d) Whether to disrupt or assist the learning of classmates. The thousands of studies that have applied human capital theory to the first two questions are reviewed elsewhere in this volume and the Handbook series. This chapter reviews an emerging economic literature on the effects and determinants of student effort and cooperativeness and how putting student motivation and behavior at the center of one's theoretical framework changes one's view of how schools operate and how they might be made more effective. In this new framework, students have a dual role. They are both (a) investors/consumers who choose which goals (outputs) to focus on and how much effort to put into each goal and (b) workers getting instruction and guidance from their first-line supervisors, the teachers. A simple model is presented in which the behavior of students, teachers and administrators depends on the incentives facing them and the actions of the other actors in the system. The incentives, in turn, depend upon the cost and reliability of the information (signals) that is generated about the various inputs and outputs of the system. Our review of empirical research supports many of the predictions of the model. Student effort, engagement and discipline vary a lot within schools, across schools and across nations and have significant effects on learning. Higher extrinsic rewards for learning are associated with the taking of more rigorous courses, teachers setting higher standards and more time devoted to homework. Taking more rigorous courses and studying harder increase student achievement. Post-World War II trends in study effort and course rigor, for example, are positively correlated with achievement trends. Even though greater rigor and higher standards improve learning, parents and students prefer easy teachers. They pressure tough teachers to lower standards and sign up for courses taught by easy graders. Curriculum-based external exit examinations (CBEEES) improve the signaling of academic achievement to colleges and the labor market, and this increases extrinsic rewards for learning. Cross-section studies suggest that CBEEES result in greater focus on academics, more tutoring of lagging students, and higher levels of achievement. Minimum competency examinations (MCE) do not have significant effects on learning or dropout rates but they do appear to have positive effects on the reputation of high school graduates. As a result, students from MCE states earn significantly more than students from states without MCEs and the effect lasts at least eight years. Students who attend schools with studious, well-behaved classmates learn more. Disruptive students generate negative production externalities and cooperative, hard-working students create positive production externalities. Peer effects are also generated by the norms of student peer cultures that encourage disruptive students and harass nerds. In addition, learning is poorly signaled to employers and colleges. Thus, market signals and the norms of student peer culture do not internalize the externalities that are pervasive in school settings, and as a result students typically devote less effort to studying than the taxpayers who fund schools would wish.

Posted Content
TL;DR: In this article, the authors argue that the notion that a real security is redundant when it can be synthesized by a dynamic trading strategy ignores the informational role of real securities markets.
Abstract: Recent advances in financial theory have created an understanding of the environments in which a real security can be synthesized by a dynamic trading strategy in a risk free asset and other securities. We contend that there is a crucial distinction between a synthetic security and a real security; in particular, the notion that a real security is redundant when it can be synthesized by a dynamic trading strategy ignores the informational role of real securities markets. The replacement of a real security by synthetic strategies may in itself cause enough uncertainty about the price volatility of the underlying security that the real security is no longer redundant. Portfolio insurance provides a good example of the difference between a synthetic security and a real security. One form of portfolio insurance uses a trading strategy in risk free securities ("cash") and index futures to synthesize a European put on the underlying portfolio. In the absence of a real traded put option (of the appropriate striking price and maturity), there will be less information about the future price volatility associated with current dynamic hedging strategies. There will thus be less information transmitted to those people who could make capital available to liquidity providers. It will therefore be more difficult for the market to absorb the trades implied by the dynamic hedging strategies. In effect, the stocks' future price volatility can rise because of a current lack of information about the extent to which dynamic hedging strategies are in place.
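
To make the synthetic put concrete, a Black-Scholes-style sketch of the replicating position is given below: the strategy holds a short exposure to the index equal to the put's delta plus a lending position, rebalanced as prices move. The parameters, the use of the spot index rather than index futures, and the absence of dividends and transaction costs are simplifying assumptions, not a description of actual portfolio insurance programs.

from math import log, sqrt, exp
from statistics import NormalDist

def synthetic_put_position(S, K, T, r, sigma):
    # Replicating portfolio for a European put under Black-Scholes assumptions:
    # a (negative) delta position in the index plus a cash (lending) position.
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = NormalDist().cdf
    delta = N(d1) - 1.0                    # short position in the index
    cash = K * exp(-r * T) * N(-d2)        # lending leg
    return delta, cash

print(synthetic_put_position(S=100.0, K=100.0, T=0.5, r=0.05, sigma=0.20))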

Posted Content
TL;DR: Easterlin shows that population size can be as restrictive a factor as sex, race, or class on equality of opportunity in the U.S., and in showing this demonstrates that the size of a generation can be a limiting factor on the personal welfare of its members.
Abstract: In this influential work, Richard A. Easterlin shows how the size of a generation—the number of persons born in a particular year—directly and indirectly affects the personal welfare of its members, the make-up and breakdown of the family, and the general well-being of the economy. "[Easterlin] has made clear, I think unambiguously, that the baby-boom generation is economically underprivileged merely because of its size. And in showing this, he demonstrates that population size can be as restrictive a factor as sex, race, or class on equality of opportunity in the U.S."—Jeffrey Madrick, Business Week

Posted Content
TL;DR: In this paper, the authors derived a discrete choice model of the demand for medical care from a theoretical model that implies a natural interrelation between price and income, and showed that if health is a normal good, then demand becomes more elastic as income falls, implying that user fees would reduce the access to care for the poor proportionally more than for the rich.
Abstract: In this paper, we derive a discrete choice model of the demand for medical care from a theoretical model that implies a natural interrelation between price and income. We show that, in the context of a discrete choice model, if health is a normal good, then the price elasticity of the demand for health care must decline as income rises. This implies that the models in previous discrete choice studies which restrict the price effect to be independent of income are misspecified. The model is estimated using data from a 1984 Peruvian survey, and a parsimonious flexible functional form. Unlike previous studies, we find that price plays a significant role in the demand for health care, and that demand becomes more elastic as income falls, implying that user fees would reduce the access to care for the poor proportionally more than for the rich. Our simulations show that user fees can generate substantial revenues, but are accompanied by substantial reductions in aggregate consumer welfare, with the burden of the loss on the poor. These results demonstrate that undiscriminating user fees would be regressive both in terms of access and welfare.
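
The interaction between price and income can be illustrated with a small logit-style sketch in which price enters utility relative to income, so the implied price response shrinks as income rises. The alternatives, parameters, and functional form below are hypothetical, not the authors' estimated model of the Peruvian data.

import numpy as np

def choice_probs(prices, income, alpha, beta):
    # Logit choice probabilities with price entering relative to income,
    # so the price effect depends on income rather than being separable.
    v = alpha - beta * prices / income
    ev = np.exp(v - v.max())
    return ev / ev.sum()

prices = np.array([0.0, 5.0, 20.0])        # e.g., self-care, clinic, hospital
alpha = np.array([0.0, 1.0, 1.5])          # alternative-specific constants
beta = 2.0

for income in (50.0, 500.0):
    p0 = choice_probs(prices, income, alpha, beta)
    p1 = choice_probs(prices * 1.01, income, alpha, beta)   # 1% price rise
    print(income, np.round((p1 - p0) / p0 / 0.01, 3))       # arc elasticities
# Choice probabilities respond far more to price at the lower income level.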

Posted Content
TL;DR: In this Prize lecture, the author takes the theory of economic growth as his topic, noting that the subject of the lecture should be "on or associated with the work for which the Prize was awarded."
Abstract: I have been told that everybody has dreams, but that some people habitually forget them even before they wake up. That seems to be what happens to me. So I do not know if I have ever dreamt about giving this Lecture. I know that I have been in this room before, but that was in real life, and I was awake. If I have given this lecture in my dreams, there is no doubt that the topic was the theory of economic growth. I am told that the subject of the lecture should be "on or associated with the work for which the Prize was awarded." That is pretty unambiguous. But I would not even wish to use the leeway offered by the phrase "associated with." Growth theory is exactly what I want to talk about: for itself, for its achievements, for the gaps that remain to be filled, and also as a vehicle for some thoughts about the nature of theoretical research in macroeconomics, and empirical research as well.

Posted Content
TL;DR: In this article, the authors argue that to understand the behavior of productivity statistics, it is necessary to reexamine the basic assumptions underlying growth accounting, and they offer theoretical and empirical support for the assertion that the elasticity of output with respect to an input like capital or labor might differ from the share of the input in total factor income.
Abstract: This article argues that to understand the behavior of productivity statistics, it is necessary to reexamine the basic assumptions underlying growth accounting. In particular, it offers theoretical and empirical support for the assertion that the elasticity of output with respect to an input like capital or labor might differ from the share of the input in total factor income. The theories offered in support of this possibility allow for spillovers of knowledge, specialization with monopolistic competition, and endogenous accumulation of labor-saving technological change. Evidence on the form of aggregate production is drawn from data for many countries and for long historical time periods. The specific interpretation of the productivity slowdown that is offered is that a low elasticity of output with respect to labor causes labor productivity growth rates to fall when labor growth speeds up.
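
The accounting identity at issue can be written in generic notation as

\frac{\dot{Y}}{Y} = \frac{\dot{A}}{A} + \varepsilon_{K}\,\frac{\dot{K}}{K} + \varepsilon_{L}\,\frac{\dot{L}}{L},

where \varepsilon_{K} and \varepsilon_{L} are the output elasticities of capital and labor. Standard growth accounting replaces these elasticities with observed factor income shares; the article's point is that this substitution may fail, for instance if the elasticity with respect to labor is well below labor's share of income.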

Posted Content
TL;DR: A survey of empirical models of the 1970s, published in Economic Interdependence and Flexible Exchange Rates, edited by J. Bhandari (M.I.T. Press: Cambridge), in 1983, is here supplemented with a brief epilogue to update the literature to 1987 as discussed by the authors.
Abstract: “Monetary and Portfolio-Balance Models of Exchange Rate Determination” was a survey of empirical models of the 1970s, published in Economic Interdependence and Flexible Exchange Rates, edited by J. Bhandari (M.I.T. Press: Cambridge), in 1983. It is here supplemented with a brief epilogue to update the literature to 1987, including some skeptical observations on recent claims that “random walk” results constitute evidence in favor of an “equilibrium” model of the exchange rate.

Posted Content
TL;DR: In this article, survey data on interest rate expectations are used to separate the forward interest rate into an expected future rate and a term premium, and to test separately two competing alternative hypotheses in tests of the term structure: that the expectations hypothesis does not hold, and that expected future long rates over- or underreact to changes in short rates.
Abstract: Survey data on interest rate expectations are used to separate the forward interest rate into an expected future rate and a term premium. These components are used to test separately two competing alternative hypotheses in tests of the term structure: that the expectations hypothesis does not hold, and that expected future long rates over- or underreact to changes in short rates. While the spread consistently fails to predict future interest rate changes, we find that the nature of this failure is different for short versus long maturities. For short maturities, expected future rates are rational forecasts. The poor predictions of the spread can therefore be attributed to variation in term premia. For longer-term bonds, however, we are unable to reject the expectations theory, in that a steeper yield curve reflects a one-for-one increase in expected future long rates. Here the perverse predictions of the spread reflect investors' failure to raise sufficiently their expectations of future long rates when the short rate rises. We confirm earlier findings that bond rates underreact to short rate changes, but now this result cannot be attributed to the term premium.
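
The decomposition underlying these tests can be written compactly in generic notation (not necessarily the authors' symbols) as

f_{t}^{(n)} = E_{t}\, r_{t+n} + \theta_{t}^{(n)},

where f_{t}^{(n)} is the forward rate for period t+n, E_{t} r_{t+n} is the expected future spot rate (here measured with survey data), and \theta_{t}^{(n)} is the term premium. Under the expectations hypothesis the premium is constant, so failures of the spread to predict future rate changes must come either from time-varying premia or from expectations that are not rational forecasts; the survey data are used to tell the two apart.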


Posted Content
TL;DR: In this article, the authors argue that the theoretical case for long run neutrality is extremely weak, in that it depends upon improbable assumptions that are either directly or indirectly falsified through empirical observation, and that the approximate validity of short run neutrality depends primarily upon assumptions that have at least an aura of plausibility.
Abstract: In evaluating the existing theory and evidence on Ricardian equivalence, it is essential to distinguish between the short run effects of government borrowing (primarily the potential for stimulating aggregate demand) and the long run effects (primarily the potential for depressing capital accumulation). I argue that the theoretical case for long run neutrality is extremely weak, in that it depends upon improbable assumptions that are either directly or indirectly falsified through empirical observation. In contrast, the approximate validity of short run neutrality depends primarily upon assumptions that have at least an aura of plausibility. Nevertheless, even in this case behavioral evidence weighs heavily against the Ricardian view. Efforts to measure the economic effects of deficits directly through aggregate data confront a number of problems which, taken together, may well be insuperable. It is therefore not at all surprising that this evidence has, by itself, proven inconclusive. Overall, the existing body of theory and evidence establishes a significant likelihood that deficits have large effects on current consumption, and there is good reason to believe that this would drive up interest rates. In addition, I find a complete lack of either evidence or coherent theoretical argument to dispute the view that sustained deficits significantly depress capital accumulation in the long run.

Book Chapter
TL;DR: By absolute as well as comparative standards, the economics of science has remained lamentably underdeveloped, a state of affairs the authors trace in good part to Arrow's famous 1962 essay, "Economic Welfare and the Allocation of Resources for Inventions".
Abstract: Economists understand technology less deeply than some might hope. But they understand the world of technology far better than they do the world of science (see, for example, Rosenberg, 1982, especially chapter 7). Kenneth Arrow’s famous 1962 essay, and the literature it inspired, is in good part to blame for this state of affairs. In ‘Economic Welfare and the Allocation of Resources for Inventions’, Arrow laid the foundations for modern economic analysis of research and development (R&D) activities. On that base, a large and impressive edifice of research devoted to the economics of technological invention and innovation has since been erected. By absolute as well as comparative standards, the economics of science has remained lamentably underdeveloped. That too is traceable to the 1962 essay.

Posted Content
TL;DR: This book presents a comprehensive survey of developments in the theory of welfare measurement and its application to environmental economics; the first part derives consumer surplus measures in a timeless world, and the second part looks at intertemporal issues.
Abstract: This book is an advanced text in welfare economics and its application to environmental economics. It provides, in the first chapters, a comprehensive survey of developments in the theory of measurement of welfare, and then applies this theory to environmental economics. The first part derives consumer surplus measures in a timeless world. Throughout, the emphasis is on the circumstances in which a money measure correctly reflects the underlying utility change. Four main cases are considered: unrationed private goods, rationed private goods, public goods or externalities, and discrete choices. Reviews of practical methodologies for the calculation of consumers' surplus for these classes of goods are also given. The second part looks at intertemporal issues. In particular, it derives consumer surplus measures for the case in which the consumer faces risk and uncertainty. The book is intended for advanced courses in environmental and welfare economics, and as a reference work for those interested in the theory of measurement of welfare and its application to environmental economics.
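
As background for the money-measure question emphasized here, the two standard exact welfare measures for a price change from p^{0} to p^{1} can be written with the expenditure function e(p,u) in one common textbook convention as

EV = e(p^{0}, u^{1}) - e(p^{0}, u^{0}), \qquad CV = e(p^{1}, u^{1}) - e(p^{1}, u^{0}),

where u^{0} and u^{1} are the utility levels attained before and after the change, so EV values the change at initial prices and CV at new prices. Ordinary Marshallian consumer surplus coincides with these measures only in special cases, such as when income effects are absent (quasilinear preferences), which is one version of the question of when a money measure correctly reflects the utility change.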