
Showing papers in "Research Papers in Economics in 1999"


Book ChapterDOI
TL;DR: This article examines the use of autoregressive distributed lag (ARDL) models for the analysis of long-run relations when the underlying regressors are I(1) or I(0).
Abstract: This paper examines the use of autoregressive distributed lag (ARDL) models for the analysis of long-run relations when the underlying variables are I(1). It shows that after appropriate augmentation of the order of the ARDL model, the OLS estimators of the short-run parameters are √T-consistent with an asymptotically singular covariance matrix, the ARDL-based estimators of the long-run coefficients are super-consistent, and valid inferences on the long-run parameters can be made using standard normal asymptotic theory. The paper also examines the relationship between the ARDL procedure and the fully modified OLS approach of Phillips and Hansen to estimation of cointegrating relations, and compares the small sample performance of these two approaches via Monte Carlo experiments. These results provide strong evidence in favour of a rehabilitation of the traditional ARDL approach to time series econometric modelling. The ARDL approach has the additional advantage of yielding consistent estimates of the long-run coefficients that are asymptotically normal irrespective of whether the underlying regressors are I(1) or I(0). JEL Classifications: C12, C13, C15, C22. Key Words: Autoregressive distributed lag model, Cointegration, I(1) and I(0) regressors, Model selection, Monte Carlo simulation. This is a revised version of a paper presented at the Symposium at the Centennial of Ragnar Frisch, The Norwegian Academy of Science and Letters, Oslo, March 3-5, 1995. We are grateful to Peter Boswijk, Clive Granger, Alberto Holly, Kyung So Im, Brendan McCabe, Steve Satchell, Richard Smith, Ron Smith and an anonymous referee for helpful comments. Partial financial support from the ESRC (Grant No. R000233608) and the Isaac Newton Trust of Trinity College, Cambridge is gratefully acknowledged.
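The long-run coefficient described in the abstract can be illustrated with a stylized simulation (not from the paper; the data-generating process and parameter values here are hypothetical). In an ARDL(1,1) regression y_t = a + φ·y_{t-1} + β0·x_t + β1·x_{t-1} + e_t, the implied long-run coefficient is θ = (β0 + β1)/(1 − φ):

```python
import numpy as np

rng = np.random.default_rng(0)
T = 500

# Simulate an I(1) regressor and a y cointegrated with it;
# the true long-run slope implied by the DGP below is 2.0.
x = np.cumsum(rng.normal(size=T))
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.5 * y[t - 1] + 1.2 * x[t] - 0.2 * x[t - 1] + rng.normal(scale=0.5)

# Fit ARDL(1,1) by OLS: y_t on a constant, y_{t-1}, x_t, x_{t-1}
Y = y[1:]
X = np.column_stack([np.ones(T - 1), y[:-1], x[1:], x[:-1]])
a, phi, b0, b1 = np.linalg.lstsq(X, Y, rcond=None)[0]

# Long-run coefficient recovered from the short-run estimates
theta = (b0 + b1) / (1 - phi)
```

With the steady state of the DGP satisfying y = 2x, the super-consistency result in the abstract is why `theta` lands very close to 2 even though the individual short-run estimates are only √T-consistent.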

4,711 citations


Posted Content
TL;DR: The Mechanisms of Governance as discussed by the authors is an important work in the field of transaction cost economics, a branch of the New Institutional Economics with which Oliver Williamson is especially associated.
Abstract: This book brings together in one place the work of one of our most respected economic theorists, on a field which he has played a large part in originating: the New Institutional Economics. Transaction cost economics, which studies the governance of contractual relations, is the branch of the New Institutional Economics with which Oliver Williamson is especially associated. Transaction cost economics takes issue with one of the fundamental building blocks in microeconomics: the theory of the firm. Whereas orthodox economics describes the firm in technological terms, as a production function, transaction cost economics describes the firm in organizational terms, as a governance structure. Alternative feasible forms of organization--firms, markets, hybrids, bureaus--are examined comparatively. The analytical action resides in the details of transactions and the mechanisms of governance. Transaction cost economics has had a pervasive influence on current economic thought about how and why institutions function as they do, and it has become a practical framework for research in organizations by representatives of a variety of disciplines. Through a transaction cost analysis, The Mechanisms of Governance shows how and why simple contracts give way to complex contracts and internal organization as the hazards of contracting build up. That complicates the study of economic organization, but a richer and more relevant theory of organization is the result. Many testable implications and lessons for public policy accrue to this framework. Applications of both kinds are numerous and growing. Written by one of the leading economic theorists of our time, The Mechanisms of Governance is sure to be an important work for years to come. It will be of interest to scholars and students of economics, organization, management, and law.

4,106 citations


Posted Content
TL;DR: In this paper, the authors developed and estimated a structural model of inflation that allows for a fraction of firms that use a backward looking rule to set prices, and they concluded that the New Keynesian Phillips curve provides a good first approximation to the dynamics of inflation.
Abstract: We develop and estimate a structural model of inflation that allows for a fraction of firms that use a backward looking rule to set prices. The model nests the purely forward looking New Keynesian Phillips curve as a particular case. We use measures of marginal cost as the relevant determinant of inflation, as the theory suggests, instead of an ad-hoc output gap. Real marginal costs are a significant and quantitatively important determinant of inflation. Backward looking price setting, while statistically significant, is not quantitatively important. Thus, we conclude that the New Keynesian Phillips curve provides a good first approximation to the dynamics of inflation.

2,644 citations


Posted Content
TL;DR: The authors empirically analyze the characteristics of the Internet as a channel for two categories of homogeneous products (books and CDs) using a data set of over 4,500 price observations collected over a period of 9 months.
Abstract: There have been many claims that the Internet represents a new "frictionless market." Our research empirically analyzes the characteristics of the Internet as a channel for two categories of homogeneous products -- books and CDs -- using a data set of over 4,500 price observations collected over a period of 9 months. We compare pricing behavior at 37 Internet and conventional retail outlets. We find support for the hypothesis of increased efficiency in Internet channels on several dimensions, but we also find evidence of considerable price dispersion. We find that prices on the Internet are 8-15% lower than prices in conventional outlets, depending on whether taxes, shipping and shopping costs are included. Additionally, we find that Internet retailers' price adjustments over time are up to 100 times smaller than conventional retailers' price adjustments -- presumably reflecting lower menu costs in Internet channels. However, we also find substantial price dispersion across retailers on the Internet. Prices for books and CDs differ by as much as 47% across Internet retailers at any one time and the dispersion of posted prices on the Internet is equal to or greater than comparable measures of dispersion across conventional retailers. Moreover, the Internet retailers with the lowest prices do not receive the most sales for either books or CDs. The observed dispersion is not easily explained as a response to existing models of search costs. However, price dispersion on the Internet may be explained by retailer heterogeneity with respect to factors such as branding and consumer "trust." We note that branding and trust may play an enhanced role on the Internet because of the spatial and temporal separation between buyer, seller, and product in Internet channels for books and CDs.

1,810 citations


Posted Content
TL;DR: The authors summarize recent research in economics that investigates differentials by race and gender in the labor market, including recent extensions of taste-based theories, theories of occupational exclusion, and theories of statistical discrimination.
Abstract: This chapter summarizes recent research in economics that investigates differentials by race and gender in the labor market. We start with a statistical overview of the trends in labor market outcomes by race, gender and Hispanic origin, including some simple regressions on the determinants of wages and employment. This is followed in Section 3 by an extended review of current theories about discrimination in the labor market, including recent extensions of taste-based theories, theories of occupational exclusion, and theories of statistical discrimination. Section 4 discusses empirical research that provides direct evidence of discrimination in the labor market, beyond "unexplained gaps" in wage or employment regressions. The remainder of the chapter reviews the evidence on race and gender gaps, particularly wage gaps. Section 5 reviews research on the impact of pre-market human capital differences in education and family background that differ by race and gender. Section 6 reviews the impact of differences in both the levels and the returns to experience and seniority, with discussion of the role of training and labor market search and turnover on race and gender differentials. Section 7 reviews the role of job characteristics (particularly occupational characteristics) in the gender wage gap. Section 8 reviews the smaller literature on differences in fringe benefits by gender. Section 9 is an extensive discussion of the empirical work that accounts for changes in the trends in race and gender differentials over time. Of particular interest is the new research literature that investigates the impact of widening wage inequality on race and gender wage gaps. Section 10 reviews research that relates policy changes to race and gender differentials, including anti-discrimination policy. The chapter concludes with comments about a future research agenda.

1,717 citations



Posted Content
TL;DR: This paper analyzes the relationship between inequality and economic growth from two directions, showing that when capital markets are imperfect there is not necessarily a trade-off between equity and efficiency, which explains two recent empirical findings: the negative impact of inequality, and the positive effect of redistribution, on growth.
Abstract: We analyze the relationship between inequality and economic growth from two directions. The first part of the survey examines the effect of inequality on growth, showing that when capital markets are imperfect, there is not necessarily a trade-off between equity and efficiency. It therefore provides an explanation for two recent empirical findings, namely, the negative impact of inequality and the positive effect of redistribution upon growth. The second part analyzes several mechanisms whereby growth may increase wage inequality, both across and within education cohorts. Technical change, and in particular the implementation of "General Purpose Technologies," stands as a crucial factor in explaining the recent upsurge in wage inequality.

1,590 citations


MonographDOI
TL;DR: In this article, the authors present a co-operative research effort that allowed contributors to evaluate different policy rules using their own specific approaches, reporting findings on the potential response of interest rates to an array of variables, including changes in inflation, unemployment, and exchange rates.
Abstract: This volume presents late-1990s thinking on monetary policy rules and seeks to determine just what types of rules and policy guidelines function best. A co-operative research effort that allowed contributors to evaluate different policy rules using their own specific approaches, this collection presents their findings on the potential response of interest rates to an array of variables, including changes in inflation, unemployment, and exchange rates. This work illustrates that simple policy rules are more robust and more efficient than complex rules with multiple variables. A state-of-the-art appraisal of the fundamental issues facing the Federal Reserve Board and other central banks, the text should be of interest for economic analysts and policymakers alike.

1,586 citations


Posted Content
TL;DR: In this paper, the authors studied what determines group formation and the degree of participation when the population is heterogeneous, both in terms of income and race or ethnicity, and found that those individuals who express views against racial mixing are less prone to participate in groups the more racially heterogeneous their community is.
Abstract: This paper studies what determines group formation and the degree of participation when the population is heterogeneous, both in terms of income and race or ethnicity. We are especially interested in whether and how much the degree of heterogeneity in communities influences the amount of participation in different types of groups. Using survey data on group membership and data on U.S. localities, we find that, after controlling for many individual characteristics, participation in social activities is significantly lower in more unequal and in more racially or ethnically fragmented localities. We also find that those individuals who express views against racial mixing are less prone to participate in groups the more racially heterogeneous their community is. These results are consistent with our model of group formation.

1,560 citations


Posted Content
TL;DR: In this article, the unconditional expectation of average household utility is expressed in terms of the unconditional variances of the output gap, price inflation, and wage inflation; the model exhibits a tradeoff between stabilizing the output gap, price inflation, and wage inflation.
Abstract: We formulate an optimizing-agent model in which both labor and product markets exhibit monopolistic competition and staggered nominal contracts. The unconditional expectation of average household utility can be expressed in terms of the unconditional variances of the output gap, price inflation, and wage inflation. Monetary policy cannot replicate the Pareto-optimal equilibrium that would occur under completely flexible wages and prices; that is, the model exhibits a tradeoff between stabilizing the output gap, price inflation, and wage inflation. The Pareto optimum is attainable only if either wages or prices are completely flexible. For reasonable calibrations of the model, we characterize the optimal policy rule. Furthermore, strict price inflation targeting is clearly suboptimal, whereas rules that also respond to either the output gap or wage inflation are nearly optimal.

1,449 citations


Posted Content
TL;DR: In this paper, the authors present several applications of state-space models and Markov switching models, including decomposition of time series into trend and cycle, a new index of coincident economic indicators, approaches to modeling monetary policy uncertainty, Friedman's "plucking" model of recessions, the detection of turning points in the business cycle and the question of whether booms and recessions are duration-dependent.
Abstract: Both state-space models and Markov switching models have been highly productive paths for empirical research in macroeconomics and finance. This book presents recent advances in econometric methods that make feasible the estimation of models that have both features. One approach, in the classical framework, approximates the likelihood function; the other, in the Bayesian framework, uses Gibbs-sampling to simulate posterior distributions from data. The authors present numerous applications of these approaches in detail: decomposition of time series into trend and cycle, a new index of coincident economic indicators, approaches to modeling monetary policy uncertainty, Friedman's "plucking" model of recessions, the detection of turning points in the business cycle and the question of whether booms and recessions are duration-dependent, state-space models with heteroskedastic disturbances, fads and crashes in financial markets, long-run real exchange rates, and mean reversion in asset returns.

Posted Content
TL;DR: In this article, the authors examined stock market co-movements and applied these concepts to test for stock market contagion during the 1997 East Asian crises, the 1994 Mexican peso collapse, and the 1987 U.S. stock market crash.
Abstract: This paper examines stock market co-movements. It begins with a discussion of several conceptual issues involved in measuring these movements and how to test for contagion. Standard tests examine if cross-market correlation in stock market returns increase during a period of crisis. The measure of cross-market correlations central to this standard analysis, however, is biased. The unadjusted correlation coefficient is conditional on market movements over the time period under consideration, so that during a period of turmoil when stock market volatility increases, standard estimates of cross-market correlations will be biased upward. It is straightforward to adjust the correlation coefficient to correct for this bias. The remainder of the paper applies these concepts to test for stock market contagion during the 1997 East Asian crises, the 1994 Mexican peso collapse, and the 1987 U.S. stock market crash. In each of these cases, tests based on the unadjusted correlation coefficients find evidence of contagion in several countries, while tests based on the adjusted coefficients find virtually no contagion. This suggests that high market co-movements during these periods were a continuation of strong cross-market linkages. In other words, during these three crises there was no contagion, only interdependence.
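The bias correction the abstract describes can be sketched numerically (a stylized illustration, not the paper's data or code): when volatility in the source market rises, the raw crisis-period correlation rises mechanically even if the cross-market linkage is unchanged, and scaling by the relative increase in the source market's variance undoes this.

```python
import numpy as np

def adjusted_corr(r_source, r_other, crisis):
    """Scale the crisis-period correlation by the relative increase in the
    source market's return variance (a volatility-bias correction of the
    kind described in the abstract)."""
    rho = np.corrcoef(r_source[crisis], r_other[crisis])[0, 1]
    delta = np.var(r_source[crisis]) / np.var(r_source[~crisis]) - 1.0
    return rho / np.sqrt(1.0 + delta * (1.0 - rho ** 2))

rng = np.random.default_rng(1)
n = 4000
crisis = np.zeros(n, dtype=bool)
crisis[3000:] = True

# Constant linkage (beta = 0.5) throughout; only the source market's
# volatility triples during the simulated "crisis".
shock = rng.normal(size=n) * np.where(crisis, 3.0, 1.0)
r_source = shock
r_other = 0.5 * shock + rng.normal(size=n)

raw = np.corrcoef(r_source[crisis], r_other[crisis])[0, 1]
tranquil = np.corrcoef(r_source[~crisis], r_other[~crisis])[0, 1]
adj = adjusted_corr(r_source, r_other, crisis)
```

Here `raw` exceeds `tranquil` purely because of higher volatility (the spurious "contagion"), while `adj` falls back to roughly the tranquil-period correlation, i.e. interdependence without contagion.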

Posted Content
TL;DR: In this article, the authors take the reader through the central issues of regulation from a number of disciplinary perspectives, including business, political science, sociology, social administration, and anthropology.
Abstract: The way in which regulation works is a key concern of industries, consumers, citizens, and governments alike. Understanding Regulation takes the reader through the central issues of regulation and discusses these from a number of disciplinary perspectives. This book is written by a lawyer and an economist, but looks also towards business, political science, sociology, social administration, anthropology, and other disciplines. The fundamental strategies, institutions, and explanations of regulation are reviewed and the means of identifying `good' regulation are outlined. Individual chapters look at such topics as self-regulation, the regulation of risks, the cost-benefit testing of regulation, the importance of enforcement, and the challenge of regulating within Europe. The book's second part considers a series of issues of particular concern in modern utilities regulation, including the use of RPI-X price caps, the control of service quality, franchising techniques and ways of measuring regulatory performance. Questions of accountability and procedure are then examined and recent public debates on regulatory reform are reviewed. A central argument of Understanding Regulation is that regulation inevitably gives rise to political contention but that persons of different political persuasion can nevertheless converse sensibly on the search for better regulation.

Posted Content
TL;DR: In this article, the authors survey recent work in equilibrium models of labor markets characterized by search and recruitment frictions and by the need to reallocate workers across productive activities, and use the framework to study the influence of alternative labor market institutions and policies on wages and unemployment.
Abstract: This paper surveys recent work in equilibrium models of labor markets characterized by search and recruitment frictions and by the need to reallocate workers across productive activities. The duration of unemployment and jobs and wage determination are treated as endogenous outcomes of job creation and job destruction decisions made by workers and firms. The solutions studied are dynamic stochastic equilibria in the sense that time and uncertainty are explicitly modeled, expectations are rational, private gains from trade are exploited and the actions taken by all agents are mutually consistent. A number of alternative wage determination mechanisms are explored, including the frequently studied non-cooperative wage bargaining and wage posting by firms. We use the framework to study the influence of alternative labor market institutions and policies on wages and unemployment.

Posted Content
TL;DR: In this paper, the authors show that the basic RBC model requires large technology shocks to produce realistic business cycles; while Solow residuals are sufficiently volatile, they imply frequent technological regress, suggesting the imminent demise of real business cycles unless unobserved factor variation, such as varying capital utilization, amplifies small shocks.
Abstract: The Real Business Cycle (RBC) research program has grown spectacularly over the last decade, as its concepts and methods have diffused into mainstream macroeconomics. Yet, there is increasing skepticism that technology shocks are a major source of business fluctuations. This chapter exposits the basic RBC model and shows that it requires large technology shocks to produce realistic business cycles. While Solow residuals are sufficiently volatile, these imply frequent technological regress. Productivity studies permitting unobserved factor variation find much smaller technology shocks, suggesting the imminent demise of real business cycles. However, we show that greater factor variation also dramatically amplifies shocks: a RBC model with varying capital utilization yields realistic business cycles from small, nonnegative changes in technology.

Posted Content
TL;DR: This chapter reviews the literature linking health and labor market behavior, with a focus on the U.S. and developing countries, where health is thought to be a major determinant of wages, hours, and labor force participation.
Abstract: ". . . that the labor force status of an individual will be affected by his health is an unassailable proposition [because] a priori reasoning and casual observation tell us it must be so, not because there is a mass of supporting evidence." (Bowen and Finegan, 1969) "Despite the near universal finding that health is a significant determinant of work effort, the second major inference drawn from [this] review is that the magnitude of measured health effects varies substantially across studies." (Chirikos, 1993) This chapter provides an overview of some of the literature linking health and labor market behavior. The question is important because for groups as diverse as single mothers and older people, health is thought to be a major determinant of wages, hours, and labor force participation. Thus, an understanding of the effects of health on labor market activity is necessary for evaluations of the cost effectiveness of interventions designed to prevent or cure disease. Moreover, since the relationship between health and the labor market is mediated by social programs, an understanding of this relationship is necessary if we are to assess the effectiveness and solvency of these programs. In countries with aging populations, these questions will only become more pressing over time as more individuals reach the age where health has the greatest impact on labor market outcomes. The two quotations above, one from 1969 and one from 1993, illustrate that a good deal of empirical evidence linking health and labor market activity has sprung up over the last 25 years. Indeed, the literature we review suggests that health has a pervasive effect on most outcomes of interest to labor economists including wages, earnings, labor force participation, hours worked, retirement, job turnover, and benefits packages. But unfortunately there is no consensus about the magnitude of the effects or about their size relative to the effects of other variables.
We will, however, be able to shed some light on factors that cause the estimates to disagree. Much of the best work linking health and labor market outcomes focuses on developing countries. This may be because the link between health and work is more obvious in societies in which many prime age adults are under-nourished and in poor health, and also because the theory of efficiency wages provides a natural starting point for investigations of this issue. However, several excellent recent surveys of health and labor markets in developing countries already exist (see Behrman and Deolalikar 1988 and Strauss and Thomas 1998). In order to break newer ground, this survey will have as its primary focus papers written since 1980 using U.S. data, although we will refer to the developing country literature where appropriate.

Posted Content
TL;DR: In this article, the authors derive a poverty-efficient allocation of aid and compare it with actual aid allocations, and find that the actual allocation is radically different from the poverty-efficient allocation: in the efficient allocation, for a given level of poverty, aid tapers in with policy reform.
Abstract: The authors derive a poverty-efficient allocation of aid and compare it with actual aid allocations. They build the poverty-efficient allocation in two stages. First they use new World Bank ratings of 20 different aspects of national policy to establish the current relationship between aid, policies, and growth. Onto that, they add a mapping from growth to poverty reduction, which reflects the level and distribution of income. They compare the effects of using headcount and poverty-gap measures of poverty. They find the actual allocation of aid to be radically different from the poverty-efficient allocation. In the efficient allocation, for a given level of poverty, aid tapers in with policy reform. In the actual allocation, aid tapers out with reform. In the efficient allocation, aid is targeted disproportionately to countries with severe poverty and adequate policies - the type of country where 74 percent of the world's poor live. In the actual allocation, such countries receive a much smaller share of aid (56 percent) than their share of the world's poor. With the present allocation, aid is effective in sustainably lifting about 30 million people a year out of absolute poverty. With a poverty-efficient allocation, this would increase to about 80 million people. Even with political constraints introduced to keep allocations for India and China constant, poverty reduction would increase to about 60 million. Reallocating aid is politically difficult, but it may be considerably less difficult than quadrupling aid budgets, which is what the authors estimate would be necessary to achieve the same impact on poverty reduction with existing aid allocations.

Journal Article
TL;DR: In this paper, the authors argue that in actual practice development cooperation has already moved beyond aid and that issues such as the ozone hole, global climate change, HIV, drug trafficking and financial volatility are not really poverty-related.
Abstract: Edited by the United Nations Development Programme, this collection of papers offers a new rationale and framework for international development cooperation. Its main argument is that in actual practice development cooperation has already moved beyond aid. In the name of aid (i.e. assistance to poor countries), we are today dealing with issues such as the ozone hole, global climate change, HIV, drug trafficking and financial volatility. All of these issues are not really poverty-related. Rather, they concern global housekeeping: ensuring an adequate provision of global public goods.

Posted Content
TL;DR: In this paper, the authors present a model of the economy and pose the problem of optimal monetary policy, and characterize the responses of endogenous variables, including nominal interest rates, to shocks under an optimal regime, and highlight the advantages of commitment, by contrasting the optimal responses with those that would result from optimization under discrestion.
Abstract: The first section of this paper presents a model of the economy and poses the problem of optimal monetary policy. The second characterizes the responses of endogenous variables, including nominal interest rates, to shocks under an optimal regime, and highlights the advantages of commitment by contrasting the optimal responses with those that would result from optimization under discretion. The next section considers the optimal assignment of an objective to a central bank with instrument (but not goal) independence, that is expected to pursue its assigned goal under discretion. The last section considers the form of interest rate feedback rule that can achieve the desired dynamic responses to shocks, if the central bank's commitment to such a rule is credible to the private sector.

Posted Content
TL;DR: The authors examined segregation in American cities from 1890 to 1990 and found that there is a strong positive relation between urban population or density and segregation, and that the legal barriers enforcing segregation had been replaced by decentralized racism, where whites pay more than blacks to live in predominantly white areas.
Abstract: This paper examines segregation in American cities from 1890 to 1990. From 1890 to 1940, ghettos were born as blacks migrated to urban areas and cities developed vast expanses filled with almost entirely black housing. From 1940 to 1970, black migration continued and the physical areas of the ghettos expanded. Since 1970, there has been a decline in segregation as blacks have moved into previously all-white areas of cities and suburbs. Across all these time periods there is a strong positive relation between urban population or density and segregation. Data on house prices and attitudes toward integration suggest that in the mid-twentieth century, segregation was a product of collective actions taken by whites to exclude blacks from their neighborhoods. By 1990, the legal barriers enforcing segregation had been replaced by decentralized racism, where whites pay more than blacks to live in predominantly white areas.

Posted Content
TL;DR: In this article, a simple variant of an unobserved components model is used to combine the information from different sources of governance data into country-specific aggregate governance indicators; the authors illustrate the methodology by constructing aggregate indicators of bureaucratic quality, rule of law, and graft.
Abstract: In recent years the growing interest of academics and policymakers in governance has been reflected in the proliferation of cross-country indices measuring various aspects of governance. The authors explain how a simple variant of an unobserved components model can be used to combine the information from these different sources into aggregate governance indicators. The main advantage of this method is that it allows quantification of the precision of both individual sources of governance data and country-specific aggregate governance indicators. The authors illustrate the methodology by constructing aggregate indicators of bureaucratic quality, rule of law, and graft for a sample of 160 countries. Although these aggregate governance indicators are more informative about the level of governance than any single indicator, the standard errors associated with estimates of governance are still large relative to the units in which governance is measured. In light of these margins of error, it is misleading to offer very precise rankings of countries according to their level of governance: small differences in country rankings are unlikely to be statistically - let alone practically - significant. Nevertheless, these aggregate governance indicators are useful because they allow countries to be sorted into broad groupings according to levels of governance, and they can be used to study the causes and consequences of governance in a much larger sample of countries than previously used (see for example the companion paper by the authors, "Governance matters", Policy Research Working Paper no. 2196).
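The core intuition of such an unobserved-components aggregation can be sketched in a few lines (a stylized illustration with hypothetical numbers, not the authors' estimator): if each source k reports the true governance level plus independent noise with variance s2[k], the aggregate is a precision-weighted mean, and its standard error makes the margin of error explicit.

```python
import numpy as np

def aggregate(ratings, s2):
    """Precision-weighted aggregate of noisy ratings.

    ratings: source scores for one country (mean-zero scale)
    s2: error variance of each source; noisier sources get less weight
    Returns (estimate, standard error of the estimate).
    """
    s2 = np.asarray(s2, dtype=float)
    w = 1.0 / s2                      # precision weights
    w = w / w.sum()
    estimate = float(np.dot(w, ratings))
    # variance of the weighted mean under independent source errors
    se = float(np.sqrt(np.sum(w ** 2 * s2)))
    return estimate, se

# Three hypothetical sources rating one country
ratings = [0.4, 0.1, 0.3]
s2 = [0.5, 1.0, 0.25]
est, se = aggregate(ratings, s2)
```

The nonzero `se` is the point the abstract stresses: even after pooling sources, the aggregate carries a margin of error large enough that fine-grained country rankings are not meaningful.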

Posted Content
TL;DR: In this article, the authors used a new data set to assess whether capital structure theory is portable across countries with different institutional frameworks, and they analyzed capital structure choices of firms in 10 developing countries and provided evidence that these decisions are affected by the same variables as in developed countries.
Abstract: This study uses a new data set to assess whether capital structure theory is portable across countries with different institutional frameworks. We analyze capital structure choices of firms in 10 developing countries, and provide evidence that these decisions are affected by the same variables as in developed countries.

Posted Content
TL;DR: Lawson et al. as discussed by the authors explored the relationship between codifiable and tacit knowledge in the innovation process, and investigated the claim that tacit knowledge, because it is difficult to transfer in the absence of labour mobility, may constitute a basis for sustained regional competitive advantage.
Abstract: LAWSON C. and LORENZ E. (1999) Collective learning, tacit knowledge and regional innovative capacity, Reg. Studies 33, 305-317. The paper reviews key ideas in the firm capabilities literature and shows how they can be usefully extended to develop a conception of collective learning among regionally clustered enterprises. The paper also explores the relationship between codifiable and tacit knowledge in the innovation process, and investigates the claim that tacit knowledge, because it is difficult to transfer in the absence of labour mobility, may constitute a basis for sustained regional competitive advantage. The closing section uses case study material based on Minneapolis and Cambridge to illustrate the importance for innovation of a regional capability for combining and integrating diverse knowledge, and of the sources of such capabilities as pre-conditions for successful high technology regions.

Posted Content
TL;DR: The authors examined several episodes in U.S. monetary history using the framework of an interest rate rule for monetary policy and found that a monetary policy rule in which the interest rate responds to inflation and real output more aggressively than it did in the 1960s and 1970s, or than during the time of the international gold standard, and more like the late 1980s and 1990s, is a good policy rule.
Abstract: This paper examines several episodes in U.S. monetary history using the framework of an interest rate rule for monetary policy. The main finding is that a monetary policy rule in which the interest rate responds to inflation and real output more aggressively than it did in the 1960s and 1970s, or than during the time of the international gold standard, and more like the late 1980s and 1990s, is a good policy rule. Moreover, if one defines a policy mistake as a deviation from such a rule, then such mistakes have been associated with either high and prolonged inflation or drawn out periods of low capacity utilization.
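An interest rate rule of the kind the abstract describes can be written in the familiar Taylor form. The sketch below is illustrative only: the coefficient values, neutral real rate, and inflation target are standard textbook choices, not taken from this paper; the paper's point is that larger response coefficients characterize the better-performing episodes.

```python
def policy_rate(inflation, output_gap,
                real_rate=2.0, inflation_target=2.0,
                a_pi=0.5, a_y=0.5):
    """Taylor-style rule: nominal rate = neutral real rate + inflation
    + responses to the inflation gap and the output gap.
    All parameter values here are illustrative, not the paper's estimates."""
    return (real_rate + inflation
            + a_pi * (inflation - inflation_target)
            + a_y * output_gap)

# With inflation at 4% and output 1% above potential:
print(policy_rate(4.0, 1.0))  # -> 7.5
```

Raising `a_pi` and `a_y` makes the rule respond more aggressively, which is the sense in which late-1980s and 1990s policy differed from the 1960s and 1970s.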

Posted Content
TL;DR: In this paper, the authors examine whether hostile takeovers can be distinguished from friendly takeovers, empirically, based on accounting and stock performance data, and show that most deals described as hostile in the press are not distinguishable from friendly deals in economic terms, and negotiations are publicized earlier in hostile transactions.
Abstract: This paper examines whether hostile takeovers can be distinguished from friendly takeovers, empirically, based on accounting and stock performance data. Much has been made of this distinction in both the popular and the academic literature, where gains from hostile takeovers are typically attributed to the value of replacing incumbent managers and the gains from friendly takeovers are typically attributed to strategic synergies. Alternatively, hostility could reflect just a perceptual distinction arising from different patterns of public disclosure, where negotiated outcomes are the rule and transactions tend to be characterized as friendly when bargaining remains undisclosed throughout, and hostile when the public becomes aware of the negotiation before its resolution. Empirical tests show that most deals described as hostile in the press are not distinguishable from friendly deals in economic terms, and that negotiations are publicized earlier in hostile transactions.

Posted Content
TL;DR: In this paper, a team of researchers from the International Food Policy Research Institute, the Food and Agricultural Organization of the United Nations (FAO), and the International Livestock Research Institute (ILRI) collaborated to produce a comprehensive and even-handed attempt at defining the nature, extent, scope, and implications of what they termed the "Livestock Revolution" in developing countries.
Abstract: A team of researchers from the International Food Policy Research Institute (IFPRI), the Food and Agricultural Organization of the United Nations (FAO), and the International Livestock Research Institute (ILRI) collaborated to produce this comprehensive and even-handed attempt at defining the nature, extent, scope, and implications of what they term the "Livestock Revolution" in developing countries. Looking forward to 2020, they argue convincingly that the structural shifts in world agriculture being brought about by shifts in developing-country demand for foods of animal origin will continue and that increasingly global markets have the ability to supply both cereal and animal products in desired quantities without undue price rises. They emphasize, however, that policy decisions taken for the livestock sector of developing countries will determine whether the Livestock Revolution helps or harms the world's poor and malnourished. The report emphasizes the importance of continued investment in both research on and development of animal and feed grain production and processing, and the need for policy action to help small, poor livestock producers become better integrated with commercial livestock marketing and processing. It details a host of requirements in the area of technology development for production and processing of livestock products, potential benefits from new technologies, and critical policy issues for environmental conservation and protection of public health.

Posted Content
TL;DR: This paper developed a stochastic new open economy macroeconomic model based on sticky nominal wages to analyze the effects of the monetary regime on welfare, expected output, and the expected terms of trade.
Abstract: The paper develops a simple stochastic new open economy macroeconomic model based on sticky nominal wages. Explicit solution of the wage-setting problem under uncertainty allows one to analyze the effects of the monetary regime on welfare, expected output, and the expected terms of trade. Despite the potential interplay between imperfections due to sticky wages and monopoly, the optimal monetary policy rule has a closed-form solution. To motivate our model, we show that observed correlations between terms of trade and exchange rates are more consistent with our traditional assumptions about nominal rigidities than with a popular alternative based on local-currency pricing.

Posted Content
TL;DR: In this paper, the authors review some recent microeconometric studies evaluating effects of government-sponsored commercial R&D, and pay particular attention to the conceptual problems involved. Neither the firms receiving support, nor those not applying, constitute random samples. Furthermore, those not receiving support may be affected by the programs due to spillover effects.
Abstract: A number of market failures have been associated with R&D investments, and significant amounts of public money have been spent on programs to stimulate innovative activities. In this paper, we review some recent microeconometric studies evaluating effects of government-sponsored commercial R&D. We pay particular attention to the conceptual problems involved. Neither the firms receiving support, nor those not applying, constitute random samples. Furthermore, those not receiving support may be affected by the programs due to spillover effects, which often are the main justification for R&D subsidies. Constructing a valid control group under these circumstances is challenging, and we relate our discussion to recent advances in econometric methods for evaluation studies based on non-experimental data. We also discuss some analytical questions, beyond these estimation problems, that need to be addressed in order to assess whether R&D support schemes can be justified. For instance, what are the implications of firms' R&D investments being complementary to each other, and to what extent are potential R&D spillovers internalized in the market?

Posted Content
TL;DR: In this paper, the authors show that as the average bank tilts its product mix toward fee-based activities and away from traditional lending activities, the bank's revenue volatility, its degree of total leverage, and the level of its earnings all increase.
Abstract: Commercial banks’ lending and deposit-taking business has declined in recent years. Deregulation and new technology have eroded banks’ comparative advantages and made it easier for nonbank competitors to enter these markets. In response, banks have shifted their sales mix toward noninterest income — by selling ‘nonbank’ fee-based financial services such as mutual funds; by charging explicit fees for services that used to be ‘bundled’ together with deposit or loan products; and by adopting securitized lending practices which generate loan origination and servicing fees and reduce the need for deposit financing by moving loans off the books. The conventional wisdom in the banking industry is that earnings from fee-based products are more stable than loan-based earnings, and that fee-based activities reduce bank risk via diversification. However, there are reasons to doubt this conventional wisdom a priori. Compared to fees from nontraditional banking products (e.g., mutual fund sales, data processing services, mortgage servicing), revenue from traditional relationship lending activities may be relatively stable, because switching costs and information costs reduce the likelihood that either the borrower or the lender will terminate the relationship. Furthermore, traditional lending business may employ relatively low amounts of operating and/or financial leverage, which will dampen the impact of fluctuations in loan-based revenue on bank earnings. We test this conventional wisdom using data from 472 U.S. commercial banks between 1988 and 1995, and a new ‘degree of total leverage’ framework which conceptually links a bank’s earnings volatility to fluctuations in its revenues, to the fixity of its expenses, and to its product mix. Unlike previous studies that compare earnings streams of unrelated financial firms, we observe various mixes of financial services produced and marketed jointly within commercial banks. 
Thus, the evidence that we present reflects the impact of production synergies (economies of scope) and marketing synergies (cross-selling) not captured in previous studies. To implement this framework, we modify standard degree of leverage estimation methods to conform with the characteristics of commercial banks. Our results do not support the conventional wisdom. As the average bank tilts its product mix toward fee-based activities and away from traditional lending activities, we find that the bank’s revenue volatility, its degree of total leverage, and the level of its earnings all increase. The first two results imply increased earnings volatility (because earnings volatility is the product of revenue volatility and the degree of total leverage) and the third result implies a possible risk premium. These results have implications for bank regulators, who must set capital requirements at levels that balance the volatility of bank earnings against the probability of bank insolvency. These results also suggest another explanation for the shift toward fee-intensive product mixes: a belief by bank managers that increased earnings volatility will enhance shareholder value (or at least will increase the value of the managers’ call options on their banks’ stock). Our results have no direct implications for the expanded bank powers debate: we examine only currently permissible fee-based activities, and these activities may have demand and production characteristics different from insurance underwriting, investment banking, or real estate brokerage.
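The key identity in the degree-of-total-leverage framework is that earnings volatility equals revenue volatility times the degree of total leverage, so a product-mix shift that raises both raises earnings volatility on two margins. The numbers below are made up purely to illustrate the arithmetic, not estimates from the study.

```python
def earnings_volatility(revenue_volatility, dtl):
    """Degree-of-total-leverage identity: earnings volatility is revenue
    volatility scaled by the degree of total leverage (DTL)."""
    return revenue_volatility * dtl

# Illustrative numbers only: a fee-intensive product mix with both higher
# revenue volatility and higher DTL than a lending-intensive mix implies
# higher earnings volatility through both channels.
lending = earnings_volatility(0.05, 2.0)
fee_based = earnings_volatility(0.08, 3.0)
print(round(lending, 2), round(fee_based, 2))  # -> 0.1 0.24
```

Here the fee-intensive mix more than doubles earnings volatility even though revenue volatility rises by only sixty percent, because the leverage multiplier rises at the same time.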

Posted Content
TL;DR: In this paper, the authors study a contest with multiple (not necessarily equal) prizes and show that for any number of contestants having linear, convex or concave cost functions, and for any distribution of abilities, it is optimal for the designer to allocate the entire prize sum to a single ''first'' prize.
Abstract: We study a contest with multiple (not necessarily equal) prizes. Contestants have private information about an ability parameter that affects their costs of bidding. The contestant with the highest bid wins the first prize, the contestant with the second-highest bid wins the second prize, and so on until all the prizes are allocated. All contestants incur their respective costs of bidding. The contest's designer maximizes the expected sum of bids. Our main results are: 1) We display bidding equilibria for any number of contestants having linear, convex or concave cost functions, and for any distribution of abilities. 2) If the cost functions are linear or concave, then, no matter what the distribution of abilities is, it is optimal for the designer to allocate the entire prize sum to a single "first" prize. 3) We give necessary and sufficient conditions ensuring that several prizes are optimal if contestants have a convex cost function.