
Showing papers in "Research Papers in Economics in 2001"


Journal Article
TL;DR: In this paper, the authors highlight the role of business in national economies, show that there is more than one path to economic success, and explain national differences in social and economic policy.
Abstract: What are the most important differences among national economies? Is globalization forcing nations to converge on an Anglo-American model? What explains national differences in social and economic policy? This pathbreaking work outlines a new approach to these questions. It highlights the role of business in national economies and shows that there is more than one path to economic success.

5,778 citations


Posted Content
TL;DR: In this article, the authors provide a general framework for integration of high-frequency intraday data into the measurement, modeling and forecasting of daily and lower frequency volatility and return distributions.
Abstract: This paper provides a general framework for integration of high-frequency intraday data into the measurement, modeling and forecasting of daily and lower frequency volatility and return distributions. Most procedures for modeling and forecasting financial asset return volatilities, correlations, and distributions rely on restrictive and complicated parametric multivariate ARCH or stochastic volatility models, which often perform poorly at intraday frequencies. Use of realized volatility constructed from high-frequency intraday returns, in contrast, permits the use of traditional time series procedures for modeling and forecasting. Building on the theory of continuous-time arbitrage-free price processes and the theory of quadratic variation, we formally develop the links between the conditional covariance matrix and the concept of realized volatility. Next, using continuously recorded observations for the Deutschemark/Dollar and Yen/Dollar spot exchange rates covering more than a decade, we find that forecasts from a simple long-memory Gaussian vector autoregression for the logarithmic daily realized volatilities perform admirably compared to popular daily ARCH and related models. Moreover, the vector autoregressive volatility forecast, coupled with a parametric lognormal-normal mixture distribution implied by the theoretically and empirically grounded assumption of normally distributed standardized returns, gives rise to well-calibrated density forecasts of future returns, and correspondingly accurate quantile estimates. Our results hold promise for practical modeling and forecasting of the large covariance matrices relevant in asset pricing, asset allocation and financial risk management applications.
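As a rough illustration of the realized-volatility construction the abstract describes, the sketch below sums squared intraday returns into a daily realized variance and then fits a simple AR(1) to the log realized volatilities. The simulated data, the 5-minute sampling grid, and the AR(1) forecast (standing in for the paper's long-memory Gaussian vector autoregression) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Minimal sketch: realized variance as the sum of squared intraday returns,
# followed by an AR(1) forecast of log realized volatility. Data and model
# choices here are illustrative, not those of the paper.
rng = np.random.default_rng(0)
n_days, bars = 500, 78                                           # ~5-minute bars in a 6.5-hour session
daily_vol = 0.01 * np.exp(0.5 * rng.standard_normal(n_days))     # latent daily volatility
intraday = rng.standard_normal((n_days, bars)) * daily_vol[:, None] / np.sqrt(bars)

rv = (intraday ** 2).sum(axis=1)                                 # daily realized variance
log_rvol = 0.5 * np.log(rv)                                      # log realized volatility

# AR(1) by least squares as a stand-in for the long-memory vector autoregression
y, x = log_rvol[1:], log_rvol[:-1]
X = np.column_stack([np.ones_like(x), x])
intercept, slope = np.linalg.lstsq(X, y, rcond=None)[0]
print(f"persistence = {slope:.2f}, next-day log-vol forecast = {intercept + slope * log_rvol[-1]:.2f}")
```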

2,898 citations


Posted Content
TL;DR: This article surveys the economic literature on boards of directors and finds that board composition is not related to corporate performance, while board size has a negative relation with corporate performance, and that boards appear to evolve over time as a function of the bargaining power of the CEO relative to the existing directors.
Abstract: This paper surveys the economic literature on boards of directors. Although a legal requirement for many organizations, boards are also an endogenously determined governance mechanism for addressing agency problems inherent to many organizations. Formal theory on boards of directors has been quite limited to this point. Most empirical work on boards has been aimed at answering one of three questions: 1) How are board characteristics such as composition or size related to profitability? 2) How do board characteristics affect the observable actions of the board? 3) What factors affect the makeup of boards and how they evolve over time? The primary findings from the empirical literature on boards are: Board composition is not related to corporate performance, while board size has a negative relation to corporate performance. Both board composition and size are correlated with the board's decisions regarding CEO replacement, acquisitions, poison pills, and executive compensation. Finally, boards appear to evolve over time as a function of the bargaining power of the CEO relative to the existing directors. Firm performance, CEO turnover, and changes in ownership structure appear to be important factors affecting changes to boards.

2,804 citations


Posted Content
TL;DR: The authors present a model embodying moderate amounts of nominal rigidities which accounts for the observed inertia in inflation and persistence in output, and the key features of their model are those that prevent a sharp rise in marginal costs after an expansionary shock to monetary policy.
Abstract: We present a model embodying moderate amounts of nominal rigidities which accounts for the observed inertia in inflation and persistence in output. The key features of our model are those that prevent a sharp rise in marginal costs after an expansionary shock to monetary policy. Of these features, the most important are staggered wage contracts with an average duration of three quarters, and variable capital utilization.

2,580 citations


Book ChapterDOI
TL;DR: In this paper, the authors used fully modified OLS principles to develop new methods for estimating and testing hypotheses for cointegrating vectors in dynamic panels in a manner that is consistent with the degree of cross sectional heterogeneity that has been permitted in recent panel unit root and panel cointegration studies.
Abstract: This chapter uses fully modified OLS principles to develop new methods for estimating and testing hypotheses for cointegrating vectors in dynamic panels in a manner that is consistent with the degree of cross sectional heterogeneity that has been permitted in recent panel unit root and panel cointegration studies. The asymptotic properties of various estimators are compared based on pooling along the ‘within’ and ‘between’ dimensions of the panel. By using Monte Carlo simulations to study the small sample properties, the group mean estimator is shown to behave well even in relatively small samples under a variety of scenarios.

2,234 citations


Book ChapterDOI
TL;DR: In this article, the local power of panel unit root statistics against a sequence of local alternatives is studied and the results of a Monte Carlo experiment suggest that avoiding the bias can improve the power of the test substantially.
Abstract: To test the hypothesis of a difference stationary time series against a trend stationary alternative, Levin & Lin (1993) and Im, Pesaran & Shin (1997) suggest bias-adjusted t-statistics. Such corrections are necessary to account for the nonzero mean of the t-statistic in the case of an OLS detrending method. In this chapter the local power of panel unit root statistics against a sequence of local alternatives is studied. It is shown that the local power of the test statistics is affected by two different terms. The first term represents the asymptotic effect on the bias due to the detrending method and the second term is the usual location parameter of the limiting distribution under the sequence of local alternatives. It is argued that both terms can offset each other so that the test has no power against the sequence of local alternatives. These results suggest constructing test statistics based on alternative detrending methods. We consider a class of t-statistics that do not require a bias correction. The results of a Monte Carlo experiment suggest that avoiding the bias can improve the power of the test substantially.

2,038 citations


Posted Content
TL;DR: The authors found that the share of income accruing to the bottom quintile does not vary systematically with average income, and that when average income rises, the average incomes of the poorest fifth of society rise proportionately.
Abstract: When average income rises, the average incomes of the poorest fifth of society rise proportionately. This is a consequence of the strong empirical regularity that the share of income accruing to the bottom quintile does not vary systematically with average income. The authors document this empirical regularity in a sample of 92 countries spanning the past four decades and show that it holds across regions, periods, income levels, and growth rates. The authors next ask whether the factors that explain cross-country differences in the growth rates of average incomes have differential effects on the poorest fifth of society. They find that several determinants of growth--such as good rule of law, openness to international trade, and developed financial markets--have little systematic effect on the share of income that accrues to the bottom quintile. Consequently, these factors benefit the poorest fifth of society as much as everyone else. There is some weak evidence that stabilization from high inflation and reductions in the overall size of government not only increase growth but also increase the income share of the poorest fifth in society. Finally, the authors examine several factors commonly thought to disproportionately benefit the poorest in society, but find little evidence of their effects. The absence of robust findings emphasizes that relatively little is known about the broad forces that account for the cross-country and intertemporal variation in the share of income accruing to the poorest fifth of society.

1,952 citations


Posted Content
TL;DR: This excellent text provides a comprehensive treatment of the state space approach to time series analysis, where observations are regarded as made up of distinct components such as trend, seasonal, regression elements and disturbence terms, each of which is modelled separately.
Abstract: This excellent text provides a comprehensive treatment of the state space approach to time series analysis. The distinguishing feature of state space time series models is that observations are regarded as made up of distinct components such as trend, seasonal, regression elements and disturbance terms, each of which is modelled separately. The techniques that emerge from this approach are very flexible and are capable of handling a much wider range of problems than the main analytical system currently in use for time series analysis, the Box-Jenkins ARIMA system. The book provides an excellent source for the development of practical courses on time series analysis.

1,931 citations


Posted Content
TL;DR: A theory of reciprocity is developed for extensive games in which the sequential structure of a strategic situation is made explicit, and a general equilibrium existence result is proved.
Abstract: Many experimental studies indicate that people are motivated by reciprocity. Rabin (1993) develops techniques for incorporating such concerns into game theory and economics. His model, however, does not fare well when applied to situations with an interesting dynamic structure (like many experimental games), because it is developed for normal form games in which information about the sequential structure of a strategic situation is suppressed. In this paper we develop a theory of reciprocity for extensive games in which the sequential structure of a strategic situation is made explicit. We propose a new solution concept—sequential reciprocity equilibrium—which is applicable to extensive games, and we prove a general equilibrium existence result. The model is applied in several examples, including some well known experimental games like the Ultimatum game and the Sequential Prisoners' Dilemma. (This abstract was borrowed from another version of this item.)

1,518 citations


Posted Content
TL;DR: This article developed a simple framework to analyze the forces that shape these biases, and applied this framework to discuss a range of issues including: Why technical change over the past 60 years was skill-biased, and why the skill bias may have accelerated over the past twenty-five years.
Abstract: For many problems in macroeconomics, development economics, labor economics, and international trade, whether technical change is biased towards particular factors is of central importance. This paper develops a simple framework to analyze the forces that shape these biases. There are two major forces affecting equilibrium bias: the price effect and the market size effect. While the former encourages innovations directed at scarce factors, the latter leads to technical change favoring abundant factors. The elasticity of substitution between different factors regulates how powerful these effects are, and this has implications about how technical change and factor prices respond to changes in relative supplies. If the elasticity of substitution is sufficiently large, the long-run relative demand for a factor can slope up. I apply this framework to discuss a range of issues including: Why technical change over the past 60 years was skill-biased, and why the skill bias may have accelerated over the past twenty-five years. Why new technologies introduced during the late eighteenth and early nineteenth centuries were unskill-biased. Why biased technical change may increase the income gap between rich and poor countries. Why international trade may induce skill-biased technical change. Why a large wage-push, as in continental Europe during the 1970s, may cause capital-biased technical change. Why technical change may be generally labor-augmenting rather than capital-augmenting.
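To make the price and market-size effects concrete, the block below writes down the standard CES expression for relative factor rewards under factor-augmenting technologies; the notation is the conventional one used in this literature, not quoted from the paper itself.

```latex
% Relative reward of the H-factor under a CES aggregate with
% factor-augmenting technologies A_H and A_L (standard notation).
\[
\frac{w_H}{w_L}
  \;=\;
  \left(\frac{A_H}{A_L}\right)^{\frac{\sigma-1}{\sigma}}
  \left(\frac{H}{L}\right)^{-\frac{1}{\sigma}} .
\]
```

Holding technologies fixed, a larger relative supply H/L lowers the relative reward (the usual downward-sloping demand); if directed innovation makes A_H/A_L respond strongly enough to H/L, the induced technology response can dominate, which is the sense in which the abstract says the long-run relative demand for a factor can slope up when the elasticity of substitution is sufficiently large.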

1,317 citations


Posted Content
TL;DR: In this paper, the authors show that loss aversion determines seller behavior in the housing market, and that owners subject to nominal losses set higher asking prices of 25-35 percent of the difference between the property's expected selling price and their original purchase price, and exhibit a much lower sale hazard than other sellers.
Abstract: Data from downtown Boston in the 1990s show that loss aversion determines seller behavior in the housing market. Condominium owners subject to nominal losses 1) set higher asking prices of 25-35 percent of the difference between the property's expected selling price and their original purchase price; 2) attain higher selling prices of 3-18 percent of that difference; and 3) exhibit a much lower sale hazard than other sellers. The list price results are twice as large for owner-occupants as for investors, but hold for both. These findings are consistent with prospect theory and help explain the positive price-volume correlation in real estate markets.

Posted Content
TL;DR: The notion of a convex measure of risk is introduced, an extension of the concept of a coherent risk measure defined in Artzner et al. (1999), and a corresponding extension of the representation theorem in terms of probability measures on the underlying space of scenarios is proved.
Abstract: We introduce the notion of a convex measure of risk, an extension of the concept of a coherent risk measure defined in Artzner et al. (1999), and we prove a corresponding extension of the representation theorem in terms of probability measures on the underlying space of scenarios. As a case study, we consider convex measures of risk defined in terms of a robust notion of bounded shortfall risk. In the context of a financial market model, it turns out that the representation theorem is closely related to the superhedging duality under convex constraints.
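For readers unfamiliar with the terminology, the block below restates the defining properties of a convex risk measure and the general shape of the representation theorem the abstract refers to; the notation is the standard textbook one rather than a quotation from the paper.

```latex
% Convex risk measure \rho on positions X: standard axioms and representation
% in terms of probability measures Q on the scenario space, with penalty \alpha.
\[
\begin{aligned}
&\text{Monotonicity:}    && X \le Y \;\Rightarrow\; \rho(X) \ge \rho(Y),\\
&\text{Cash invariance:} && \rho(X + m) = \rho(X) - m \quad \text{for all } m \in \mathbb{R},\\
&\text{Convexity:}       && \rho\bigl(\lambda X + (1-\lambda)Y\bigr)
                             \le \lambda\rho(X) + (1-\lambda)\rho(Y),
                             \quad \lambda \in [0,1].
\end{aligned}
\]
\[
\rho(X) \;=\; \sup_{Q}\Bigl(\mathbb{E}_{Q}[-X] - \alpha(Q)\Bigr).
\]
```

A coherent risk measure is the special case in which convexity is strengthened to positive homogeneity plus subadditivity and the penalty $\alpha$ takes only the values 0 and $+\infty$.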

Posted Content
TL;DR: In this article, the authors estimate plant level production functions that include variables that allow for two types of scale externalities which plants experie nce in their local industrial environments: externalities from other plants in the same industry locally, usually called localization economies or, in a dynamic context, Marshall, Arrow, Romer [MAR] economies.
Abstract: In this paper, using panel data, I estimate plant level production functions that include variables that allow for two types of scale externalities which plants experie nce in their local industrial environments. First are externalities from other plants in the same industry locally, usually called localization economies or, in a dynamic context, Marshall, Arrow, Romer [MAR] economies. Second are externalities from the scale or diversity of local economic activity outside the own industry involving some type of cross- fertilization, usually called urbanization economies or, in a dynamic context, Jacobs economies. Estimating production functions for plants in high tech industries and in capital goods, or machinery industries, I find that local own industry scale externalities, as measured specifically by the count of other own industry plants locally, have strong productivity effects in high tech but not machinery industries. I find evidence that single plant firms both benefit more from and generate greater external benefits than corporate plants. On timing, I find evidence that high tech single plant firms benefit from the scale of past own industry activity, as well as current activity. I find no evidence of urbanization economies from the diversity of local economic activity outside the own industry and limited evidence of urbanization economies from the overall scale of local economic activity.

Posted Content
TL;DR: In this article, the authors present evidence that supports the individual-based model of social capital formation, including seven facts: (1) the relationship between social capital and age is first increasing and then decreasing, (2) social capital declines with expected mobility, (3) social capital investment is higher in occupations with greater returns to social skills, (4) social capital is higher among homeowners, (5) social connections fall sharply with physical distance, (6) people who invest in human capital also invest in social capital, and (7) social capital appears to have interpersonal complementarities.
Abstract: To identify the determinants of social capital formation, it is necessary to understand the social capital investment decision of individuals. Individual social capital should then be aggregated to measure the social capital of a community. This paper assembles the evidence that supports the individual-based model of social capital formation, including seven facts: (1) the relationship between social capital and age is first increasing and then decreasing, (2) social capital declines with expected mobility, (3) social capital investment is higher in occupations with greater returns to social skills, (4) social capital is higher among homeowners, (5) social connections fall sharply with physical distance, (6) people who invest in human capital also invest in social capital, and (7) social capital appears to have interpersonal complementarities.

Posted Content
David Popp
TL;DR: In this paper, the authors used U.S. patent data from 1970 to 1994 to estimate the effect of energy prices on energy-efficient innovations and found that both energy prices and the quality of existing knowledge have strongly significant positive effects on innovation.
Abstract: I use U.S. patent data from 1970 to 1994 to estimate the effect of energy prices on energy-efficient innovations. Using patent citations to construct a measure of the usefulness of the existing base of scientific knowledge, I consider the effect of both demand-side factors, which spur innovative activity by increasing the value of new innovations, and supply-side factors, such as scientific advancements that make new innovations possible. I find that both energy prices and the quality of existing knowledge have strongly significant positive effects on innovation. Furthermore, I show that omitting the quality of knowledge adversely affects the estimation results.

Posted Content
TL;DR: This article shows that when the time series are persistent, the first-differenced GMM estimator can be poorly behaved, since lagged levels of the series provide only weak instruments for subsequent first differences.
Abstract: This paper highlights a problem in using the first-differenced GMM panel data estimator to estimate cross-country growth regressions. When the time series are persistent, the first-differenced GMM estimator can be poorly behaved, since lagged levels of the series provide only weak instruments for subsequent first differences. Revisiting the work of Caselli, Esquivel and Lefort (1996), we show that this problem may be serious in practice. We suggest using a more efficient GMM estimator that exploits stationarity restrictions and this approach is shown to give more reasonable results than first-differenced GMM in our estimation of an empirical growth model.
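To see why persistence weakens the instruments, the simulation sketch below draws an AR(1) series for several values of the autoregressive parameter and reports the correlation between the lagged level (the instrument used by first-differenced GMM) and the first difference it is meant to explain; the data-generating process and parameter values are illustrative assumptions, not the paper's design.

```python
import numpy as np

# Illustrative simulation (not the paper's setup): as the AR(1) coefficient rho
# approaches one, the lagged level y_{t-2} becomes an increasingly weak
# instrument for the first difference y_{t-1} - y_{t-2}.
rng = np.random.default_rng(1)
T, reps = 200, 500

for rho in (0.3, 0.8, 0.95, 0.99):
    corrs = []
    for _ in range(reps):
        e = rng.standard_normal(T)
        y = np.empty(T)
        y[0] = e[0]
        for t in range(1, T):
            y[t] = rho * y[t - 1] + e[t]
        instrument = y[:-2]              # lagged level y_{t-2}
        diff = y[1:-1] - y[:-2]          # first difference y_{t-1} - y_{t-2}
        corrs.append(np.corrcoef(instrument, diff)[0, 1])
    print(f"rho = {rho:.2f}: average corr(instrument, first difference) = {np.mean(corrs):+.3f}")
```

Under these assumptions the correlation shrinks toward zero as rho approaches one, which is the weak-instrument problem the abstract describes and the motivation for the more efficient GMM estimator that exploits stationarity restrictions.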

Posted Content
TL;DR: In Globalization and History, O'Rourke and Williamson present a coherent picture of trade, migration, and international capital flows in the Atlantic economy in the century prior to 1914, the first great globalization boom.
Abstract: Globalization is not a new phenomenon; nor is it irreversible. In Globalization and History, Kevin O'Rourke and Jeffrey Williamson present a coherent picture of trade, migration, and international capital flows in the Atlantic economy in the century prior to 1914--the first great globalization boom. The book's originality lies in its application of the tools of open-economy economics to this critical historical period--differentiating it from most previous work, which has been based on closed-economy or single-sector models. The authors also keep a close eye on globalization debates of the 1990s, using history to inform the present and vice versa.

Posted Content
TL;DR: In this article, the authors describe the database on U.S. patents that they have developed over the past decade, with the goal of making it widely accessible for research, and present main trends in U. S. patenting over the last 30 years.
Abstract: This paper describes the database on U.S. patents that we have developed over the past decade, with the goal of making it widely accessible for research. We present main trends in U.S. patenting over the last 30 years, including a variety of original measures constructed with citation data, such as backward and forward citation lags, indices of "originality" and "generality", self-citations, etc.

Posted Content
TL;DR: The initial impact of the Asian financial crisis in Malaysia reduced the expected value of government subsidies to politically favored firms, and the evidence suggests Malaysian capital controls provided a screen behind which favored firms could be supported.
Abstract: The initial impact of the Asian financial crisis in Malaysia reduced the expected value of government subsidies to politically favored firms. Of the estimated $60 billion loss in market value for politically connected firms from July 1997 to August 1998, roughly 9% can be attributed to the fall in the value of their connections. Firing the Deputy Prime Minister and imposing capital controls in September 1998 primarily benefited firms with strong ties to Prime Minister Mahathir. Of the estimated $5 billion gain in market value for Mahathir-connected firms during September 1998, approximately 32% was due to the increase in the value of their connections. The evidence suggests Malaysian capital controls provided a screen behind which favored firms could be supported.

Posted Content
TL;DR: In this paper, the authors used a comprehensive data set of Portuguese manufacturing firms and showed that the firm size distribution is significantly right-skewed, evolving over time toward a log-normal distribution.
Abstract: Using a comprehensive data set of Portuguese manufacturing firms, we show that the firm size distribution is significantly right-skewed, evolving over time toward a log-normal distribution. We also show that selection accounts for very little of this evolution. Instead, we propose a simple theory based on financing constraints. A calibrated version of our model does a good job of explaining the evolution of the firm size distribution.

Posted Content
TL;DR: In this article, Laffont and Tirole analyze regulatory reform and the emergence of competition in network industries using the state-of-the-art theoretical tools of industrial organization, political economy, and the economics of incentives.
Abstract: In Competition in Telecommunications, Jean-Jacques Laffont and Jean Tirole analyze regulatory reform and the emergence of competition in network industries using the state-of-the-art theoretical tools of industrial organization, political economy, and the economics of incentives. The book opens with background information for the reader who is unfamiliar with current issues in the telecommunications industry. The following sections focus on four central aspects of the recent deregulatory movement: the introduction of incentive regulation; one-way access; the special nature of competition in an industry requiring two-way access; and universal service, in particular, the use of engineering models to compute subsidies and the design of universal service auctions.

Posted Content
TL;DR: In this paper, the authors assess two broad and competing theories of government regulation: the helping-hand approach, according to which governments regulate to correct market failures, and the grabbing-hand approach, according to which governments regulate to support political constituencies.
Abstract: The authors draw on their new database on bank regulation and supervision in 107 countries to assess different governmental approaches to bank regulation and supervision and evaluate the efficacy of different regulatory and supervisory policies. First, the authors assess two broad and competing theories of government regulation: the helping-hand approach, according to which governments regulate to correct market failures, and the grabbing-hand approach, according to which governments regulate to support political constituencies. Second, they assess the effect of an extensive array of regulatory and supervisory policies on the development and fragility of the banking sector. These policies include the following: Regulations on bank activities and the mixing of banking and commerce. Regulations on entry by domestic and foreign banks. Regulations on capital adequacy. Design features of deposit insurance systems. Supervisory power, independence, and resources; stringency of loan classification; provisioning standards; diversification guidelines; and powers to take prompt corrective action. Regulations governing information disclosure and fostering private sector monitoring of banks. Government ownership of banks. The results raise a cautionary flag with regard to reform strategies that place excessive reliance on a country's adherence to an extensive checklist of regulatory and supervisory practices that involve direct government oversight of and restrictions on banks. The findings, which are much more consistent with the grabbing-hand view of regulation than with the helping-hand view, suggest that the regulatory and supervisory practices most effective in promoting good performance and stability in the banking sector are those that force accurate information disclosure, empower private sector monitoring of banks, and foster incentives for private agents to exert corporate control.

Posted Content
TL;DR: In this article, the problem of multivariate conditional variance estimation can be simplified by estimating univariate GARCH models for each asset, and then, using transformed residuals resulting from the first stage, estimating a conditional correlation estimator.
Abstract: In this paper, we develop the theoretical and empirical properties of a new class of multivariate GARCH models capable of estimating large time-varying covariance matrices, Dynamic Conditional Correlation Multivariate GARCH. We show that the problem of multivariate conditional variance estimation can be simplified by estimating univariate GARCH models for each asset, and then, using transformed residuals resulting from the first stage, estimating a conditional correlation estimator. The standard errors for the first stage parameters remain consistent, and only the standard errors for the correlation parameters need be modified. We use the model to estimate the conditional covariance of up to 100 assets using S&P 500 Sector Indices and Dow Jones Industrial Average stocks, and conduct specification tests of the estimator using an industry standard benchmark for volatility models. This new estimator demonstrates very strong performance especially considering ease of implementation of the estimator.
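As a concrete illustration of the two-step logic the abstract describes, the sketch below standardizes each asset's returns with a univariate GARCH(1,1) filter and then runs the dynamic conditional correlation recursion on the standardized residuals. The GARCH and DCC parameters are fixed, illustrative values and the data are simulated; in the paper both stages are estimated, so this is a sketch of the mechanics rather than the estimator itself.

```python
import numpy as np

# Two-step DCC illustration on simulated data. Parameter values are assumed
# for demonstration; in practice each stage is estimated by quasi-maximum
# likelihood as the abstract describes.
rng = np.random.default_rng(2)
T, n_assets = 1000, 3
returns = rng.standard_normal((T, n_assets)) * 0.01

# Step 1: univariate GARCH(1,1) variance filter per asset -> standardized residuals
omega, alpha, beta = 1e-6, 0.05, 0.90
h = returns.var(axis=0)                       # starting conditional variances
eps = np.empty_like(returns)
for t in range(T):
    eps[t] = returns[t] / np.sqrt(h)          # standardize using info up to t-1
    h = omega + alpha * returns[t] ** 2 + beta * h

# Step 2: DCC recursion on the standardized residuals
a, b = 0.03, 0.95
S = np.corrcoef(eps, rowvar=False)            # unconditional correlation target
Q = S.copy()
for t in range(T):
    d = 1.0 / np.sqrt(np.diag(Q))
    R_t = Q * np.outer(d, d)                  # conditional correlation matrix for period t
    Q = (1 - a - b) * S + a * np.outer(eps[t], eps[t]) + b * Q

print("final conditional correlation matrix:\n", np.round(R_t, 3))
```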

Posted Content
TL;DR: Bennett and Blamey bring together an overview of environmental choice modelling and a set of case studies, including the options for the Canberra water supply, non-market valuation of remnant vegetation and wetlands protection, and green product choice, together with methodological issues such as choice set design, opt-out alternatives in anglers' stated preferences, yea-saying, and framing effects.
Abstract: Introduction, Jeff Bennett and Russell Blamey. Part 1, The technique: choice experiments - an overview of concepts and issues, Jordan J. Louviere; some fundamentals of environmental choice modelling, Jeff Bennett and Vic Adamowicz. Part 2, Case studies: assessing the options for the Canberra water supply - an application of choice modelling, Jenny Gordon et al; remnant vegetation and wetlands protection - non-market valuation, Jeff Bennett et al; green product choice, Russell Blamey et al. Part 3, Exploring some methodological issues: choice set design, Russell Blamey et al; opt-out alternatives and anglers' stated preferences, Mellisa Ruby Banzhaf et al; yea-saying and validation of a choice model of green product choice, Russell Blamey and Jeff Bennett; framing effects, John Rolfe and Jeff Bennett; the strengths and weaknesses of environmental choice modelling, Jeff Bennett and Russell Blamey.

Posted Content
TL;DR: It is hypothesized that this area is involved in integrating theory-of-mind processing with cooperative actions in prefrontal cortex, and data is reported from a functional MRI experiment designed to test this hypothesis.
Abstract: Cooperation between individuals requires the ability to infer each other's mental states to form shared expectations over mutual gains and make cooperative choices that realize these gains. From evidence that the ability for mental state attribution involves the use of prefrontal cortex, we hypothesize that this area is involved in integrating theory-of-mind processing with cooperative actions. We report data from a functional MRI experiment designed to test this hypothesis. Subjects in a scanner played standard two-person "trust and reciprocity" games with both human and computer counterparts for cash rewards. Behavioral data shows that seven subjects consistently attempted cooperation with their human counterpart. Within this group prefrontal regions are more active when subjects are playing a human than when they are playing a computer following a fixed (and known) probabilistic strategy. Within the group of five noncooperators, there are no significant differences in prefrontal activation between computer and human conditions.

Posted Content
TL;DR: In this article, the authors study equilibrium firm-level stock returns in two economies: one in which investors are loss averse over the fluctuations of their stock portfolio and another in which they are loss-averse over fluctuations of individual stocks that they own.
Abstract: We study equilibrium firm-level stock returns in two economies: one in which investors are loss averse over the fluctuations of their stock portfolio and another in which they are loss averse over the fluctuations of individual stocks that they own. Both approaches can shed light on empirical phenomena, but we find the second approach to be more successful: in that economy, the typical individual stock return has a high mean and excess volatility, and there is a large value premium in the cross-section which can, to some extent, be captured by a commonly used multifactor model.

Journal Article
TL;DR: The authors present a collection of essays on global city-regions, discussing the challenges of globalization and city-region development, the competitive advantages of global city-regions, and the political and economic challenges facing city-regions in Africa, Asia, and Latin America.
Abstract: PART I: OPENING ARGUMENTS PART II: ON PRACTICAL QUESTIONS OF GLOBALIZATION AND CITY-REGION DEVELOPMENT PART III: THE GLOBAL CITY-REGION: A NEW GEOGRAPHIC PHENOMENON? PART IV: THE COMPETITIVE ADVANTAGES OF GLOBAL CITY-REGIONS PART V: GLOBAL CITY-REGIONS IN AFRICA, ASIA, AND LATIN AMERICA: POLITICAL AND ECONOMIC CHALLENGES PART VI: SOCIAL INEQUALITIES AND IMMIGRANT NICHES IN GLOBAL CITY-REGIONS PART VII: QUESTIONS OF CITIZENSHIP PART VIII: THE NEW COLLECTIVE ORDER OF GLOBAL CITY-REGIONS PART IX: CODA: ENVIRONMENTAL ISSUES

Posted Content
TL;DR: In this article, the authors developed a technique useful for obtaining more precise estimates of demand and supply curves when constrained to market-level data and applied the technique to the automobile market, estimating the economic effects of the minivan introduction.
Abstract: I develop a technique useful for obtaining more precise estimates of demand and supply curves when constrained to market-level data. It augments the estimation routine with data on the average characteristics of consumers that purchase different products. I apply the technique to the automobile market, estimating the economic effects of the minivan introduction. I show that standard approaches yield results that are meaningfully different from those obtained with my extension. I report benefits accruing to both minivan and non-minivan consumers. I complete the welfare picture by measuring the extent of first-mover advantage and of profit cannibalization both initially by the innovator and later by the imitators. My results support a simple economic story where large improvements in consumers' standard of living arise from competition as firms, ignoring the externalities they impose on one another, cannibalize each other's profits by continually seeking new goods that give them some temporary market power.

Posted Content
TL;DR: In this paper, the authors argue that as Internet penetration increases, students of inequality of access to the new information technologies should shift their attention from the "digital divide" - inequality between "haves" and "have-nots" differentiated by dichotomous measures of access or use of the new technologies - to digital inequality, by which they refer not just to differences in access, but also to inequality among persons with formal access.
Abstract: The authors of this paper contend that as Internet penetration increases, students of inequality of access to the new information technologies should shift their attention from the "digital divide" - inequality between "haves" and "have-nots" differentiated by dichotomous measures of access to or use of the new technologies - to digital inequality, by which they refer not just to differences in access, but also to inequality among persons with formal access to the Internet. After reviewing data on Internet penetration, the paper describes five dimensions of digital inequality - in equipment, autonomy of use, skill, social support, and the purposes for which the technology is employed - that deserve additional attention. In each case, hypotheses are developed to guide research, with the goal of developing a testable model of the relationship between individual characteristics, dimensions of inequality, and positive outcomes of technology use. Finally, because the rapidity of organizational as well as technical change means that it is difficult to presume that current patterns of inequality will persist into the future, the authors call on students of digital inequality to study institutional issues in order to understand patterns of inequality as evolving consequences of interactions among firms' strategic choices, consumers' responses, and government policies.

Posted Content
TL;DR: In this paper, the authors present an authoritative analysis of modern economic growth from the Industrial Revolution to the 'New Economy' of today, charting the history of five technological revolutions: water-powered mechanization, steam-powered mechanization, electrification, motorization, and computerization.
Abstract: How can we best understand the impact of revolutionary technologies on the business cycle, the economy, and society? Why is economics meaningless without history and without an understanding of institutional and technical change? Does the 'new economy' mean the 'end of history'? How can we best understand the impact of revolutionary technologies on business organization and the business cycle? These are some of the questions addressed in this authoritative analysis of modern economic growth from the Industrial Revolution to the 'New Economy' of today. Chris Freeman has long been one of the foremost researchers on innovation, and his colleague Francisco Louca is an outstanding historian of economic theory and an analyst of econometric models and methods. Together they chart the history of five technological revolutions: water-powered mechanization, steam-powered mechanization, electrification, motorization, and computerization. They demonstrate the necessity to take account of politics, culture, organizational change, and entrepreneurship, as well as science and technology, in the analysis of economic growth. This is a well-informed, highly topical, and persuasive study of interest across all the social sciences. Available in OSO: http://www.oxfordscholarship.com/oso/public/content/economicsfinance/0199251053/toc.html