
Showing papers by "London School of Economics and Political Science published in 1986"


Journal ArticleDOI
TL;DR: The paper provides an informal introduction to some of the main themes of the recent literature on "non-cooperative" or "sequential" bargaining models, focusing in particular on the relationship between the new approach and the traditional axiomatic approach exemplified by "Nash bargaining theory".
Abstract: The paper provides an informal introduction to some of the main themes of the recent literature on "non-cooperative" or "sequential" bargaining models. It focuses in particular on the relationship between the new approach and the traditional axiomatic approach exemplified by "Nash bargaining theory". It illustrates the new insights offered by the non-cooperative approach, by reference to a detailed analysis of the manner in which the presence of an outside option available to one of the parties will affect the negotiated outcome. Finally, the difficulties which arise in extending this analysis to two-person bargaining with incomplete information, and to n-person bargaining, are discussed. This is a revised version of the fourth Review of Economic Studies Lecture presented in April 1985 at the joint meeting of the Association of University Teachers of Economics and the Royal Economic Society held in Oxford. The choice of lecturer is made by a panel whose members are currently Professors Hahn, Mirrlees and Nobay, and the paper is refereed in the usual way. This paper aims to provide an informal and elementary introduction to an approach to bargaining which has received a great deal of attention over the past few years. The approach involves writing down some particular sequence of moves (offers and replies) to be made over time in the course of negotiations, and then looking for a non-cooperative equilibrium in the game thus specified (in practice, a perfect equilibrium, or in games of incomplete information, a sequential equilibrium). While the approach appears at first sight to be very different in spirit from the traditional axiomatic approach (in which a bargaining solution is specified by appealing to a number of general requirements deemed appropriate on a priori grounds), the two approaches are in fact complementary.

549 citations


Journal ArticleDOI
TL;DR: Pissarides uses data on unemployment flows and job vacancies to shed light on the phenomenal rise of unemployment in Britain, from under 3% in the 1960s to over 15% (male) unemployment in the 1980s.
Abstract: This paper uses data on unemployment flows and job vacancies to shed light on the phenomenal rise of unemployment in Britain, from under 3% in the 1960s to over 15% (male) unemployment in the 1980s. It finds that most of the rise is due to a fall in the demand for labour. A more expansionary fiscal policy and improved international competitiveness would have ameliorated this fall. Some of the fall, however, can also be attributed to supply pressure, which stopped wages from falling fast enough. Social security and a more relaxed attitude by the state in the provision of unemployment and supplementary benefits also contributed to the rise in unemployment, by making workers more choosy. The paper investigates whether it is possible for any given set of underlying factors to give rise to more than one equilibrium unemployment rate. If so, a temporary stimulus might release the economy from a low-level equilibrium. However, the conditions necessary for this do not appear to hold in practice. Thus fiscal and monetary policy can permanently affect unemployment only if they can permanently alter aggregate demand. Microeconomic policy, like marginal employment subsidies, can, however, have a permanent effect on unemployment and the paper investigates whether there are any grounds for wishing to use such policy tools to alter the free-market equilibrium unemployment rate. There is evidence that the allocation of workers to jobs is done more efficiently at a fairly high level of overall labour demand. It follows that, unless job vacancies exceed unemployment, job-creating policy measures will be beneficial.

360 citations


Journal ArticleDOI
TL;DR: In this paper, structural modelling intervention techniques are used to estimate the changes in casualty rates for various categories of road users following the introduction of the seat belt law, that is car rear seat passengers, pedestrians and cyclists.
Abstract: Monthly data on road casualties in Great Britain are analyzed in order to assess the effect on casualty rates of the seat belt law introduced on January 31, 1983. Such analysis is known technically as intervention analysis. The form of intervention analysis that is used in this paper is based upon structural time series modelling and differs in significant respects from standard intervention analysis based upon ARIMA modelling. The relative merits of the two approaches are compared. Structural modelling intervention techniques are used to estimate the changes in casualty rates for various categories of road users following the introduction of the seat belt law. We first note the high rate of compliance with the seat belt law. By February 1983, the wearing rate had jumped to 90 percent and the rate has remained at approximately 95 percent from March 1983 onwards. There can be no doubt of the success of the law as regards compliance. In considering the casualty figures we distinguish between those directly affected by the law, namely car drivers and front seat passengers, and those not directly affected by the law, that is car rear seat passengers, pedestrians and cyclists. Taking first numbers killed and seriously injured (KSI), we found a reduction of 23 percent for car drivers and 30 percent for front seat passengers. Thus, for those directly affected by the law, there have been substantial reductions. For rear seat passengers KSI we found a rise of 3 percent, for pedestrians a fall of one-half percent, and for cyclists an increase of 5 percent -- all three of these values being statistically insignificant. We conclude that there is no significant evidence of a change in numbers of KSI of those not directly affected. For numbers killed, we found for those directly affected a reduction of 18 percent for car drivers and 25 percent for front seat passengers.
However, for those indirectly affected by the law, our model gave an increase of 27 percent for rear seat passengers, 8 percent for pedestrians, and 13 percent for cyclists. The value for rear seat passengers is highly significant and the other two values are on the borderline of significance. We conclude that there was an increase in fatalities of those not directly affected. We are unable to provide a completely satisfactory explanation of the difference between the figures for KSI and killed for rear seat passengers, pedestrians and cyclists. The article contains 13 pages of comments by discussants and 3 pages of authors' response.

335 citations


Book ChapterDOI
TL;DR: In this paper, the authors present a discussion of dynamic models of labor demand, examine theoretical explanations of the facts, and investigate the extent to which these explanations are consistent with empirical data.
Abstract: Publisher Summary The chapter presents a discussion on the dynamic models of labor demand. A firm does not hire its workforce afresh each day for the reason that it is cheaper not to do so. Hiring and firing generate costs for the firm over and above the weekly wage payment. As it is discussed, these costs ensure that the firm's demand for labor depends not only on current exogenous factors but also on the initial size of the workforce and expectations about the future levels of such factors. The firm's demand for labor cannot be described by a static model. The chapter examines the theoretical explanations of facts and investigates the extent to which these explanations are consistent with empirical data. The chapter discusses the size and structure of the “adjustment” costs imposed on the firm by turnover. This is an important issue because the structure of these costs is crucial in determining the temporal pattern of labor demand in response to exogenous shocks. This is followed by analyses of a number of dynamic models of the demand for labor. The chapter also discusses the formulation of empirical models and presents some of the limited amount of empirical work, which is explicitly based on a formulated dynamic theory. The chapter concludes with some general remarks on the directions in which research in this area might proceed.

312 citations


Journal ArticleDOI
TL;DR: The authors describes the geographical pattern of wages in Britain between 1760 and 1914 and draws out some of the implications of the wages pattern and considers, in particular, the implications for the "growth pole" debate on the likely effect of industrialization upon regional income inequalities.
Abstract: This paper describes the geographical pattern of wages in Britain between 1760 and 1914. It then draws out some of the implications of the wages pattern and considers, in particular, the implications for the “growth pole” debate on the likely effect of industrialization upon regional income inequalities. The market forces responsible for creating and maintaining these differentials are then described, followed by a final section which discusses the significance of changing regional wage differentials to the standard-of-living debate. It concludes that from a regional perspective the overall effects of industrialization upon living standards are indisputably favorable.

210 citations


Book ChapterDOI
TL;DR: In this article, the authors present the time path of unemployment in the United States, the United Kingdom, and Europe, and construct three main types of model to explain unemployment; the first is based on supply and demand and assumes that without government intervention these are, in the long run, equal.
Abstract: Publisher Summary This chapter describes unemployment in the long run. It presents the time path of unemployment in the United States, the United Kingdom, and Europe. There is an upward trend for unemployment in the United States; but in Europe, there has been an astonishing growth with unemployment rising in all but one of the last 15 years. This explains why the unemployment rates are so different for different groups of people in society. For example, unemployment rates are typically higher for young people than for older people. They are also higher for the unskilled, for blacks, and for workers in certain industries. These differences are closely related to the different rates of turnover of the different groups. To explain unemployment, three main types of model are constructed. The first of these is based on supply and demand and assumes that without government intervention these are, in the long run, equal. However, even without wage regulation there may be excess supply of labor and involuntary unemployment may persist because of the wage-setting behavior of monopsonistic firms or of monopolistic unions or because of bargaining between the two. Thus, the second set of models has wages set by firms, and the third has wages set by unions.

181 citations


Journal ArticleDOI
TL;DR: In this paper, the authors explore some of the possibilities, and argue that in the absence of index-linked loans, higher inflation implies higher liquidation rates and default premia, which is one possible reason why higher inflation would depress share values.
Abstract: In this paper, we explore some of the possibilities, and argue that in the absence of index-linked loans, higher inflation implies higher liquidation rates and default premia. This would also be one possible reason why higher inflation would depress share values. Our evidence also suggests that the Modigliani and Cohn (1979) 'valuation errors hypothesis' is extremely important. In order to see why higher inflation might lead to more bankruptcies, consider the following example. Suppose that the real interest rate, ρ, is 1%, and that a firm has borrowed £1000. If there is no inflation, the firm's interest payments are £10. Now, suppose that inflation (π) rises to 10%, and that the nominal interest rate (r) is given by the formula (1 + r) = (1 + ρ)(1 + π). Then, r rises to 11.1%, and total interest payments to £111. Hence, at a time when revenue has risen by only 10%, interest payments rise elevenfold, and this creates cashflow problems for the firm. Index-linked debt would avoid this problem because the principal could be indexed, and would rise to £1100, leaving the firm to pay only £11 in interest.
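The arithmetic in this example can be checked with a short sketch (a generic illustration in Python; the figures and the Fisher relation are taken from the abstract, while the variable names are ours):

```python
# Check of the abstract's worked example, using the Fisher relation
# (1 + r) = (1 + rho) * (1 + pi) quoted there.
principal = 1000.0   # the firm borrows £1000
rho = 0.01           # real interest rate: 1%
pi = 0.10            # inflation rate: 10%

r = (1 + rho) * (1 + pi) - 1      # nominal interest rate
interest_nominal = principal * r  # interest on a conventional loan

# Index-linked alternative: the principal is scaled up by inflation
# and the firm pays only the real rate on the indexed principal.
interest_indexed = principal * (1 + pi) * rho

print(round(r, 3))              # 0.111, i.e. an 11.1% nominal rate
print(round(interest_nominal))  # 111: £111, up from £10 with no inflation
print(round(interest_indexed))  # 11: £11 on the indexed loan
```

As the abstract notes, interest payments rise elevenfold (from £10 to £111) while revenue rises by only 10%, which is the source of the cash-flow squeeze.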

145 citations


Journal ArticleDOI
TL;DR: In this paper, a radical reconstruction of instrumental models of bureaucracy explains the privatization boom in terms of the primacy of bureau-shaping motivations in the welfare functions of policy-level bureaucrats.
Abstract: Public choice theories of bureaucracy, especially the budget maximization thesis, have been influential in stimulating the drive towards privatization in Britain and the USA. But these accounts are strangely silent about why changes in state agency practices have come about under 'new right' governments. They apparently attribute the scope of change entirely to 'virtuous' political direction overcoming previously inherent features of bureaucratic behaviour and democratic politics. By contrast, a radical reconstruction of instrumental models of bureaucracy explains the privatization boom in terms of the primacy of bureau-shaping motivations in the welfare functions of policy-level bureaucrats. Privatization is seen as a development of earlier strategies (such as the separation of control and line agencies, the creation of 'dual state' structures, and automation) by which the class interests of senior bureaucrats have been advanced at the expense of rank and file state workers and service recipients. An examination of divergences in the internal and social costs of public agency functions explains why legislators and policy-level bureaucrats (especially in control agencies) push ahead with the 'inappropriate' privatization of public service delivery systems where overall social welfare is reduced.

134 citations


Journal ArticleDOI
TL;DR: In this article, the authors present a method for estimating a class of models in which "news" or "surprises" appear and expectations are formed rationally, which is an extension of the "errors-in-variables" method of McCallum and Wickens.
Abstract: In the first part of the paper we outline a method for estimating a class of models in which "news" or "surprises" appear and expectations are formed rationally. The method is an extension of the "errors-in-variables" method of McCallum and Wickens. As a by-product, some of Pagan's results on the circumstances under which the commonly used "two-step" method of estimating "surprise" models is efficient are shown to be a consequence of well-known theorems on the efficiency of sub-system estimation when a subset of equations is exactly identified. In the second part of the paper the method is applied to Hall's random-walk model of consumption, which is extended to allow for stochastic interest rates and for leisure and government spending to be substitutes for private spending. The extended formulation is a great deal more successful at capturing the salient features of the data. We also derive approximate restrictions across the parameters of the model due to the rational expectations hypothesis but find that they are marginally rejected by the data. Finally, we evaluate the ability of the life-cycle model with rational expectations to encompass alternative models.

114 citations


Journal ArticleDOI
TL;DR: The economic conjuncture of the 1970s and 1980s was extremely disturbed, with high and volatile inflation, interest and exchange rates, which led the monetary authorities of most Western coun tries to adopt monetary targets as a financial discipline to constrain inflationary pressures as mentioned in this paper.
Abstract: The economic conjuncture of the 1970s and 1980s was extremely disturbed, with high and volatile inflation, interest and exchange rates. This led the monetary authorities of most Western countries to adopt monetary targets as a financial discipline to constrain inflationary pressures. But this same conjuncture also encouraged the process of financial innovation which subsequently eroded the stability of the links between certain monetary aggregates and nominal incomes, which had provided the empirical basis for such targetry. Thus, in his recent book, Financial Innovations and the Money Supply, Podolski (1986) argues that

77 citations


Journal ArticleDOI
TL;DR: In this paper, the authors aim to establish intuitively appealing and verifiable conditions for the existence and weak consistency of ML estimators in a multi-parameter framework, assuming neither the independence nor the identical distribution of the observations.


Journal ArticleDOI
TL;DR: The multivariate exponential smoothing model of Enns, Machak, Spivey and Wrobleski is examined and it is found that its structure is such that it can be estimated by using techniques designed for a univariate exponential smoother.
Abstract: The multivariate exponential smoothing model of Enns, Machak, Spivey and Wrobleski is examined and it is found that its structure is such that it can be estimated by using techniques designed for a univariate exponential smoothing model. Similarly forecasts can be made using algorithms for the univariate model. The model can therefore be handled very easily. A more general univariate time series model, which can include polynomial trends and seasonal factors, is then set up and a multivariate generalisation, analogous to the multivariate exponential smoothing model, is introduced. It is shown that this model can also be handled using algorithms designed for the univariate case.
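The univariate building block that the multivariate model reduces to can be illustrated with the textbook simple exponential smoothing recursion (a generic Python sketch of the standard recursion, not the authors' algorithms):

```python
def exp_smooth(series, alpha):
    """Simple exponential smoothing.

    Level update: l_t = alpha * y_t + (1 - alpha) * l_{t-1}.
    Returns the smoothed levels; the forecast for t+1 is the level at t.
    """
    level = series[0]  # initialise the level at the first observation
    levels = [level]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
        levels.append(level)
    return levels
```

Under the paper's result, each component series of the multivariate exponential smoothing model can be passed through such a univariate routine, so no genuinely multivariate machinery is needed for estimation or forecasting.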


Journal ArticleDOI
TL;DR: In this article, the dual relationship between the prices of private goods and the quantities of public goods is considered and the conditions for optimum public good provision can be expressed as a modification of the Samuelson conditions with extra terms representing (a) the distortionary effect of taxes on the willingness to pay for the public good, and (b) distributional effects.

Journal ArticleDOI
TL;DR: Levels of health expenditure per head have fallen in many countries and the cumulative effects on health of increased poverty, unemployment, underemployment and famine, and the reduced capacity of health services to respond to health problems can be documented with facts for a number of countries in Latin America and Africa.
Abstract: The widespread economic crisis has resulted in a fall in living standards in the western hemisphere of over 9% (1981-83), and in sub-Saharan Africa they have fallen to the levels of 1970. Food production in the African countries most seriously affected by drought dropped by 15% between 1981-83. Living standards also fell in some countries in Europe and in some of the poorest Asian countries. The high cost of fuel, the heavy burden of interest payments, and unfavorable terms of trade in Africa and Latin America led to serious unemployment, devaluation of national currencies, and formidable austerity policies. While some countries have succeeded in protecting their health services from cuts in public expenditure, in many others cuts in the health budget have been substantial. The effects of the crisis in some countries have amounted to the virtual disintegration of rural health services. There are limited data available to show what has been happening to levels of expenditure on health, but those presented have demonstrated that levels of health expenditure per head have fallen in many countries. The cumulative effects on health of increased poverty, unemployment, underemployment and famine, and the reduced capacity of health services to respond to health problems, can be documented with facts for a number of countries in Latin America and Africa. Malnutrition has increased and improvements in infant mortality have been checked or reversed. The economic crisis has placed at risk the health of the most vulnerable.

Journal ArticleDOI
TL;DR: The principal intention is to promote discussion amongst simulation practitioners about their own ‘ideal’ of a computer support environment and the nature of deficiencies in the current systems.
Abstract: The modeller approaching discrete-event simulation has expected and received a high degree of computer support. The processing power simply to run a model and analyse the results would, of course, be taken for granted, but support has gone far beyond this in promoting the easier and speedier construction of models through specialized program structures, languages and lately program generators. Computer graphics capabilities of mini- and microcomputers have been exploited to secure a readier acceptance of simulation models and results. These support facilities constitute the computer environment within which the fortunate modeller works at present. What more could be expected?

Journal ArticleDOI
TL;DR: In this article, the effect of serial dependence on variability of kernel estimators of conditional expectations and joint probability densities is studied in the context of a vector-valued stationary time series.
Abstract: Kernel estimators of conditional expectations and joint probability densities are studied in the context of a vector-valued stationary time series. Weak consistency is established under minimal moment conditions and under a hierarchy of weak dependence and bandwidth conditions. Prompted by these conditions, some finite-sample theory explores the effect of serial dependence on variability of estimators, and its implications for choice of bandwidth.
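A standard kernel estimator of a conditional expectation E[Y | X = x] of the kind studied here is the Nadaraya-Watson form; the sketch below is a generic illustration assuming a Gaussian kernel, not the paper's exact estimator class:

```python
import math

def nadaraya_watson(x_obs, y_obs, x, bandwidth):
    """Kernel estimate of E[Y | X = x]:
    sum_i K((x - x_i) / h) * y_i  /  sum_i K((x - x_i) / h),
    here with a Gaussian kernel K(u) = exp(-u^2 / 2)."""
    weights = [math.exp(-0.5 * ((x - xi) / bandwidth) ** 2) for xi in x_obs]
    return sum(w * y for w, y in zip(weights, y_obs)) / sum(weights)
```

The bandwidth h is the tuning parameter at issue in the abstract: serial dependence in the observations inflates the variability of such estimators, which bears on how the bandwidth should be chosen.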

Journal ArticleDOI
TL;DR: A model of consensus leads to examples in which the ergodic behavior of a nonstationary product of random nonnegative matrices depends discontinuously on a continuous parameter as discussed by the authors.

Journal ArticleDOI
TL;DR: In this article, the authors consider a well-structured model of capital accumulation with a preference structure that reflects altruism in the specific sense that each generation's welfare is defined not only on its own consumption level but also on that of its immediate descendants.
Abstract: We consider a well-structured model of capital accumulation with a preference structure that reflects altruism in the specific sense that each generation's welfare is defined not only on its own consumption level but also on that of its immediate descendants. As altruism is limited, the interests of distinct generations, who each live for one period only, come into conflict. The solution concept is that of a perfect Nash equilibrium in which each generation chooses a consumption schedule (possibly non-linear). The savings of each generation form the inherited endowment of the next. Our purpose is to utilize recent results by Leininger [6] on the properties of the equilibrium sequence of consumption schedules to provide a simple proof that all feasible programs that can be generated from this sequence are Pareto-efficient in the modified sense first proposed in Lane and Mitra [5]. The restriction is made that the schedules (strategies) be differentiable. By application of Leininger's "levelling" result [6] and Sard's theorem on critical values [7] we show, in Section 3 of this paper, that marginal propensities to consume are interior (i.e., belong to the open interval (0, 1)) except on a closed null set.

Journal ArticleDOI
TL;DR: In recent years there has been a considerable degree of interest in the measurement of the capital stock, with special reference to the degree of premature scrapping of capital equipment that might have occurred.
Abstract: In recent years there has been a considerable degree of interest in the measurement of the capital stock, with special reference to the degree of premature scrapping of capital equipment that might have occurred. It has been argued that this might be one reason for the observed slowdown in productivity growth (see e.g. Muellbauer (1984) or Baily (1981)). Others have alleged that there has been a large decrease in manufacturing capacity in the UK since 1980, and that, therefore, we should not reflate the economy (see e.g. Ball (1985)), for we run the risk of re-igniting inflation. The latter view is commonly expressed in the press, both by financial commentators and politicians. It is, therefore, of some importance to assess the extent to which the official stock measures are misleading.

Journal ArticleDOI
TL;DR: A three-phase simulation system written in Pascal for use on microcomputers, minis and mainframes is presented and provides a basis for further research in simulation within both institutions.
Abstract: A three-phase simulation system written in Pascal for use on microcomputers, minis and mainframes is presented. The advantages of the three-phase method are discussed, and the basis of the system explained. Although a traditional, textbook example is used to aid explanation and discussion, the system described in this paper has been used in real applications and in the teaching of simulation at Lancaster University and the London School of Economics. It also provides a basis for further research in simulation within both institutions.
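The three-phase (A/B/C) executive at the heart of such a system can be outlined as follows; this is a generic Python sketch of the method, not the Pascal system described in the paper, and all names are illustrative:

```python
import heapq
from itertools import count

def three_phase_run(b_events, c_activities, until):
    """Minimal three-phase simulation executive.

    A-phase: advance the clock to the next scheduled (bound) event time.
    B-phase: execute every bound event due at that time; each event may
             schedule further B-events by returning (time, callback) pairs.
    C-phase: repeatedly attempt conditional activities until none can start.
    """
    seq = count()  # tie-breaker so the heap never compares callbacks
    calendar = [(t, next(seq), cb) for t, cb in b_events]
    heapq.heapify(calendar)
    clock = 0.0
    while calendar and calendar[0][0] <= until:
        clock = calendar[0][0]                       # A-phase: time scan
        while calendar and calendar[0][0] == clock:  # B-phase: due events
            _, _, event = heapq.heappop(calendar)
            for t, cb in (event() or []):
                heapq.heappush(calendar, (t, next(seq), cb))
        while any(activity() for activity in c_activities):
            pass                                     # C-phase: rescan until quiet
    return clock
```

Each pass through the outer loop is one A-B-C cycle; the C-phase rescan of conditional activities is what distinguishes the three-phase method from a plain event-scheduling executive.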


Journal ArticleDOI
TL;DR: The project was to construct a planning system for a regional health council in Ontario, Canada, which would take account of the possible alternative future states of the health-care system's environment and would aim to keep options for future development open.
Abstract: Earlier work has criticized the dominant tendencies in operational research contributions to health services planning as characterized by optimization, implausible demands for data, depoliticization, hierarchy and inflexibility. This paper describes an effort which avoids at least some of these pitfalls. The project was to construct a planning system for a regional health council in Ontario, Canada, which would take account of the possible alternative future states of the health-care system's environment and would aim to keep options for future development open. The planning system devised is described in the paper. It is based on robustness analysis, which evaluates alternative initial action sets in terms of the useful flexibility they preserve. Other features include the explicit incorporation of pressures for change generated outside the health-care system, and a satisficing approach to the identification of both initial action sets and alternative future configurations of the health-care system. It was found possible to borrow and radically ‘re-use’ techniques or formulations from the mainstream of O.R. contributions. Thus the ‘reference projection’ method was used to identify inadequacies in performance which future health-care system configurations must repair. And Delphi analysis, normally a method for generating consensus, was used in conjunction with cluster analysis of responses to generate meaningfully different alternative futures.


Journal ArticleDOI
TL;DR: In this paper, the authors present a causal account of how these changes are translated into state policy via business pressure on governments undergoing monetary or fiscal crises, and how electorates are persuaded to convey functionally appropriate signals to governments in making their political choices, nor the apparently central, partially autonomous role played by political institutions in triggering major policy changes.
Abstract: Existing radical explanations of the turndown in public expenditure and welfare state growth across most liberal democracies from the mid-1970s are of two types. In economic explanations the shift is attributed to the changing economic imperatives of capitalist economies in the new world recession. In ideological explanations it is argued that governments have tried to reestablish the predominance of market disciplines and allocation systems in social life, more indirectly responding to business pressure and fiscal strains. The economic approach does have a causal account of how these changes are translated into state policy via business pressure on governments undergoing monetary or fiscal crises. But neither approach can adequately explain how electorates are persuaded to convey functionally appropriate signals to governments in making their political choices, nor the apparently central, partially autonomous role played by political institutions in triggering major policy changes. By contrast, attention...

Journal ArticleDOI
TL;DR: In this article, the authors draw attention to the danger of leakage from omitted frequencies and show that the consequent bias can be reduced by means of tapering, which is a common technique in the errors-in-variables problem.

Journal ArticleDOI
TL;DR: The way in which the natural language understanding system accepts sentences and organizes data structures based on them is outlined, along with the supporting error-handling mechanisms and editing facilities.
Abstract: Research into computer systems to aid simulation model formulation has led to the construction of a natural language understanding system (N.L.U.S.). A description and short history of N.L.U.S.s leads into a discussion of the syntactic and semantic problems which have to be met. A classification scheme for N.L.U.S.s is proposed, and the simulation problem formulator described in this context. The N.L.U.S. simulation model formulator takes as input a sentence in English which conforms to a prescribed set of sentence structures. These restricted sentence structures resolve syntactic and semantic problems. The way in which the system accepts these sentences and organizes data structures based on them is outlined, along with the supporting error-handling mechanisms and editing facilities. A simulation model example is used to illustrate the flavour of the system by reference to complete interactive session listings.

Journal ArticleDOI
TL;DR: In this article, the authors show that Begg's (1984) interpretation of a loss of confidence is incomplete, and formalise his notion of financial panic within a rational expectations model of equilibrium bond pricing.
Abstract: Begg (1984) has recently presented a rational expectations model of equilibrium bond pricing. One important feature of this model is that lenders making portfolio choices between short and long assets are risk-averse, so that the well-known principle of certainty equivalence no longer applies to the optimal demand for bonds. When the policy for the supply of bonds and a simple policy rule for interest rates are specified, an equilibrium model of bond pricing is obtained where expectations are self-fulfilling in both the mean and variance. The key feature of this interesting model of the bonds market is that it is nonlinear. As long as the degree of risk, the degree of risk-aversion or the supply of bonds are not too large, the forward evolution of the model will be unstable and a unique rational expectations solution exists as the backward evolution of equilibrium bond prices will converge. This is very much in the spirit of the conventional saddlepoint approach to solving rational expectations models (Blanchard and Kahn, 1980), where equilibrium asset prices are considered as forward-looking jump variables. However, if the degree of risk, the degree of risk-aversion, the coupon or the supply of bonds become large enough, the forward evolution of the model becomes locally stable. Begg's (1984) interpretation is that this leads to a loss of confidence and financial panic, because there is now an infinite number of convergent rational expectations trajectories. The main objective of this paper is to show that this interpretation is incomplete and to formalise Begg's notion of financial panic. Because the forward evolution of asset prices is stable, the backward evolution is unstable when there is a large degree of risk or risk-aversion.
It therefore seems as if equilibrium bond prices have become predetermined and are no longer jump variables, so that it seems as if risk-averse investors have 'given up' their forward-looking behaviour in the face of too many shocks. This is only a locally valid argument and contradicts intuition, because efficient asset prices are generally regarded as jump variables that incorporate 'news' about current and future events. It is possible, however, still to have forward-looking asset prices, because increasing the degree of uncertainty or the degree of risk-aversion eventually leads to unique speculative bubbles and possibly financial chaos for the backward evolution of equilibrium bond prices. There exists therefore, in addition to the infinity of convergent and pre-determined rational expectations paths, a unique non-

Journal ArticleDOI
TL;DR: Bruno [1984] argues that a fall in material input, for given levels of capital and labor input, can explain a measured fall in productivity growth which in most OECD countries first appeared after 1973.
Abstract: Bruno [1984] argues that a fall in material input, for given levels of capital and labor input, can explain a measured fall in productivity growth which in most OECD countries first appeared after 1973. This comment asks the following: 1. Do the available output statistics show a fall in gross output, as the materials hypothesis explanation of the productivity slowdown requires? Or do they show a fall in real value added, defined as gross output less material input, which indicates a fall in "technical progress"? I conclude that the materials hypothesis cannot explain falls in productivity growth measured using GDP data, but could explain part (perhaps 0.5 percent per annum) of the fall measured using the index of manufacturing production. 2. What materials price and quantity movements have occurred? To isolate the component of material prices that is exogenous, common to all OECD countries, and excludes energy costs, I look at the U.N. price index for basic commodities imported by developed countries. The rise in real raw material prices was much smaller in terms of its cost to importing countries than the oil price rise, and by the late 1970s real industrial materials prices had fallen back to the levels of the 1960s. While a general index of material input quantities is not available, data for metals, one major component of materials, suggest that this input has fallen by about 2 percent per annum since the early 1970s relative to output and to earlier trends. So statistics which directly evaluate measurement biases, and material input prices and quantities, suggest that substitution away from material inputs cannot explain much of the fall in productivity growth.