
Showing papers in "Econometrica in 1995"


Journal ArticleDOI
TL;DR: In this article, the authors developed techniques for empirically analyzing demand and supply in differentiated products markets and then applied these techniques to analyze equilibrium in the U.S. automobile industry.
Abstract: This paper develops techniques for empirically analyzing demand and supply in differentiated products markets and then applies these techniques to analyze equilibrium in the U.S. automobile industry. Our primary goal is to present a framework which enables one to obtain estimates of demand and cost parameters for a class of oligopolistic differentiated products markets. These estimates can be obtained using only widely available product-level and aggregate consumer-level data, and they are consistent with a structural model of equilibrium in an oligopolistic industry. When we apply the techniques developed here to the U.S. automobile market, we obtain cost and demand parameters for (essentially) all models marketed over a twenty year period.
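As a concrete illustration of the share-inversion step at the heart of such demand models, the sketch below (not the authors' code; the data, taste draws, and parameter values are all hypothetical) shows a Berry-style contraction mapping that recovers mean product utilities from observed market shares in a simple random-coefficients logit.

# Illustrative sketch only: fixed-point inversion of market shares into mean
# utilities (delta) in a simple random-coefficients logit demand model.
import numpy as np

def predicted_shares(delta, mu):
    # mu: (n_consumers, n_products) simulated individual taste deviations
    u = delta[None, :] + mu
    expu = np.exp(u)
    probs = expu / (1.0 + expu.sum(axis=1, keepdims=True))  # outside good utility normalized to 0
    return probs.mean(axis=0)

def invert_shares(observed_shares, mu, tol=1e-12, max_iter=5000):
    # Berry-style contraction: delta <- delta + log(s_obs) - log(s_pred)
    delta = np.log(observed_shares) - np.log(1 - observed_shares.sum())
    for _ in range(max_iter):
        s_pred = predicted_shares(delta, mu)
        step = np.log(observed_shares) - np.log(s_pred)
        delta = delta + step
        if np.abs(step).max() < tol:
            break
    return delta

rng = np.random.default_rng(0)
mu = rng.normal(scale=0.5, size=(2000, 3))   # simulated consumer heterogeneity (hypothetical)
s_obs = np.array([0.20, 0.15, 0.10])         # hypothetical observed market shares
print(invert_shares(s_obs, mu))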

4,803 citations


Book ChapterDOI
TL;DR: In economics, the most commonly used tool has been the strategic equilibrium of Nash (Ann Math 54:286-295, 1951) or one or another of its so-called refinements as discussed by the authors.
Abstract: Game theoretic reasoning has been widely applied in economics in recent years. Undoubtedly, the most commonly used tool has been the strategic equilibrium of Nash (Ann Math 54:286–295, 1951), or one or another of its so-called “refinements.” Though much effort has gone into developing these refinements, relatively little attention has been paid to a more basic question: Why consider Nash equilibrium in the first place?

869 citations


Journal ArticleDOI
TL;DR: In this article, the authors developed and estimated a model of the U.S. Automobile Industry using micro data from the Consumer Expenditure Survey (CES) and used it in conjunction with population weights to derive aggregate demand.
Abstract: This paper develops and estimates a model of the U.S. Automobile Industry. On the demand side, a discrete choice model is adopted and estimated using micro data from the Consumer Expenditure Survey. The estimation results are used in conjunction with population weights to derive aggregate demand. On the supply side, the automobile industry is modelled as an oligopoly with product differentiation. Equilibrium is characterized by the first order conditions of the profit maximizing firms. The estimation results are used in counterfactual simulations to investigate two trade policy issues: the effects of the voluntary export restraint (VER), and exchange rate pass-through.

693 citations


Journal ArticleDOI
TL;DR: In this article, the authors introduce and characterize a weighting function according to which an event has greater impact when it turns impossibility into possibility, or possibility into certainty, than when it merely makes a possibility more or less likely.
Abstract: To accommodate the observed pattern of risk-aversion and risk-seeking, as well as common violations of expected utility (e.g., the certainty effect), we introduce and characterize a weighting function according to which an event has greater impact when it turns impossibility into possibility, or possibility into certainty, than when it merely makes a possibility more or less likely. We show how to compare such weighting functions (of different individuals) with respect to the degree of departure from expected utility, and we present a method for comparing an individual's weighting functions for risk and for uncertainty.
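To make the idea of a probability weighting function concrete, the sketch below evaluates one commonly used parametric form (the Tversky-Kahneman 1992 specification), which overweights small probabilities and underweights near-certain ones. This particular functional form is only an illustration; the paper characterizes such functions axiomatically rather than parametrically.

# Illustrative only: one common parametric weighting function, showing how
# moves away from impossibility (p near 0) and toward certainty (p near 1)
# get disproportionate weight.
import numpy as np

def w(p, gamma=0.61):
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

for p in [0.0, 0.01, 0.10, 0.50, 0.90, 0.99, 1.0]:
    print(f"p = {p:.2f}  ->  w(p) = {w(p):.3f}")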

591 citations


Report SeriesDOI
TL;DR: In this paper, the authors develop grouping estimators that address the identification problems created by the 1980s tax reforms and the changing dispersion of wages; the estimates reveal positive and moderately sized wage elasticities and negative income effects for women with children.
Abstract: The 1980s tax reforms and the changing dispersion of wages offer one of the best opportunities yet to estimate labor supply effects. Nevertheless, changing sample composition, aggregate shocks, the changing composition of the tax paying population, and discontinuities in the tax system create serious identification and estimation problems. We develop grouping estimators that address these issues. Our results reveal positive and moderately sized wage elasticities. We also find negative income effects for women with children.
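The sketch below is a rough illustration of the mechanics of a grouping estimator: average the outcome and regressors within group-by-period cells, then regress cell means on cell means with group and time effects, so that identification comes from differential changes across groups. The column names and simulated data are hypothetical, and the paper's estimators also handle composition, participation, and tax-discontinuity issues not shown here.

# Rough sketch of a grouping estimator on hypothetical data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
n = 5000
df = pd.DataFrame({
    "group": rng.integers(0, 4, n),          # e.g., cohort/education groups (hypothetical)
    "year": rng.integers(0, 8, n),
    "log_wage": rng.normal(size=n),
    "other_income": rng.normal(size=n),
})
df["hours"] = 20 + 3.0 * df["log_wage"] - 1.0 * df["other_income"] + rng.normal(scale=5, size=n)

# Average within group x year cells
cells = df.groupby(["group", "year"], as_index=False).mean()

# Regress cell-mean hours on cell-mean wage and income plus group and year dummies
dummies = pd.get_dummies(cells[["group", "year"]].astype("category"), drop_first=True)
X = pd.concat([cells[["log_wage", "other_income"]], dummies], axis=1).astype(float)
X.insert(0, "const", 1.0)
beta, *_ = np.linalg.lstsq(X.to_numpy(), cells["hours"].to_numpy(), rcond=None)
print(dict(zip(X.columns, np.round(beta, 3))))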

546 citations


Journal ArticleDOI
TL;DR: In this paper, the authors consider the use of FM regression in the context of vector autoregressions (VAR's) with some unit roots and some cointegrating relations.
Abstract: Fully modified least squares (FM-OLS) regression was originally designed in work by Phillips and Hansen (1990) to provide optimal estimates of cointegrating regressions. The method modifies least squares to account for serial correlation effects and for the endogeneity in the regressors that results from the existence of a cointegrating relationship. This paper provides a general framework which makes it possible to study the asymptotic behavior of FM-OLS in models with full rank I(1) regressors, models with I(1) and I(0) regressors, models with unit roots, and models with only stationary regressors. This framework enables us to consider the use of FM regression in the context of vector autoregressions (VAR's) with some unit roots and some cointegrating relations. The resulting FM-VAR regressions are shown to have some interesting properties. For example, when there is some cointegration in the system, FM-VAR estimation has a limit theory that is normal for all of the stationary coefficients and mixed normal for all of the nonstationary coefficients. Thus, there are no unit root limit distributions even in the case of the unit root coefficient submatrix (i.e., I_{n-r}, for an n-dimensional VAR with r cointegrating vectors). Moreover, optimal estimation of the cointegration space is attained in FM-VAR regression without prior knowledge of the number of unit roots in the system, without pretesting to determine the dimension of the cointegration space and without the use of restricted regression techniques like reduced rank regression. The paper also develops an asymptotic theory for inference based on FM-OLS and FM-VAR regression. The limit theory for Wald tests that rely on the FM estimator is shown to involve a linear combination of independent chi-squared variates. This limit distribution is bounded above by the conventional chi-squared distribution with degrees of freedom equal to the number of restrictions. Thus, conventional critical values can be used to construct valid (but conservative) asymptotic tests in quite general FM time series regressions. This theory applies to causality testing in VAR's and is therefore potentially useful in empirical applications.
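For reference, a compact statement of the basic Phillips-Hansen correction that FM-OLS applies in a single cointegrating regression is sketched below; the notation here is illustrative, and the paper's framework is considerably more general. For y_t = beta'x_t + u_t with Delta x_t = v_t, and with Omega and Delta denoting kernel (HAC) estimates of the long-run and one-sided long-run covariance matrices of (u_t, v_t'),

\[
  y_t^{+} = y_t - \hat{\Omega}_{uv}\hat{\Omega}_{vv}^{-1}\,\Delta x_t, \qquad
  \hat{\lambda}^{+} = \hat{\Delta}_{vu} - \hat{\Delta}_{vv}\hat{\Omega}_{vv}^{-1}\hat{\Omega}_{vu},
\]
\[
  \hat{\beta}^{+} = \Big(\sum_{t=1}^{T} x_t x_t'\Big)^{-1}\Big(\sum_{t=1}^{T} x_t\, y_t^{+} - T\,\hat{\lambda}^{+}\Big),
\]

so that the endogeneity and serial-correlation effects named in the abstract are removed before least squares is applied.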

532 citations


Journal ArticleDOI
TL;DR: In this article, an estimation method for the empirical study of theoretical auction models is proposed, which relies on a simulated nonlinear least squares objective function appropriately adjusted so as to obtain consistent estimates of the parameters of interest.
Abstract: In this paper we propose an estimation method for the empirical study of theoretical auction models. We focus on first-price sealed bid and descending auctions and we adopt the private value paradigm, where each bidder is assumed to have a different private value, only known to him, for the object that is auctioned. Following McFadden (1989) and Pakes and Pollard (1989), our proposed method is based on simulations. Specifically, the method relies on a simulated nonlinear least squares objective function appropriately adjusted so as to obtain consistent estimates of the parameters of interest. We illustrate the proposed method by studying a market of agricultural products, where descending auctions are used. Our analysis takes into account heterogeneity of the auctioned objects and the fact that only the winning bid is observed. We estimate the parameters that characterize the distribution of the unobserved private values for each auctioned object.

382 citations


Journal ArticleDOI
TL;DR: In this article, it is shown that under error models used in robust estimation, unidentified population parameters can often be bounded, and that when the data may be contaminated or corrupted, estimating the bounds is more natural than attempting point estimation of unidentified parameters.
Abstract: Robust estimation aims at developing point estimators that are not highly sensitive to errors in the data. However, the population parameters of interest are not identified under the assumptions of robust estimation, so the rationale for point estimation is not apparent. This paper shows that under error models used in robust estimation, unidentified population parameters can often be bounded. The bounds provide information that is not available in robust estimation. For example, it is possible to obtain finite bounds on the population mean under contaminated sampling. A method for estimating the bounds is given and illustrated with an application. It is argued that when the data may be contaminated or corrupted, estimating the bounds is more natural than attempting point estimation of unidentified parameters.
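As a minimal sketch of the bounding idea (an illustration of the logic, not the paper's estimator): if at most a known fraction of the sample may be contaminated, worst-case bounds on the mean of the uncontaminated distribution are one-sided trimmed means of the observed data. The contamination fraction and data below are hypothetical.

# Minimal sketch: bounds on a mean when at most a fraction lam of the data
# may come from an arbitrary contaminating distribution.
import numpy as np

def mean_bounds(y, lam):
    y = np.sort(np.asarray(y, dtype=float))
    n = len(y)
    k = int(np.floor((1 - lam) * n))     # number of observations kept
    lower = y[:k].mean()                 # worst case: drop the largest lam share
    upper = y[n - k:].mean()             # worst case: drop the smallest lam share
    return lower, upper

rng = np.random.default_rng(1)
y = rng.lognormal(mean=0.0, sigma=1.0, size=1000)   # hypothetical data
print(mean_bounds(y, lam=0.05))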

313 citations


Journal ArticleDOI
TL;DR: In this article, generalized method of moments estimators and tests for continuous-time Markov processes are presented, designed to be applied to discrete-time data obtained by sampling continuous-time Markov processes.
Abstract: Continuous-time Markov processes can be characterized conveniently by their infinitesimal generators. For such processes there exist forward and reverse-time generators. We show how to use these generators to construct moment conditions implied by stationary Markov processes. Generalized method of moments estimators and tests can be constructed using these moment conditions. The resulting econometric methods are designed to be applied to discrete-time data obtained by sampling continuous-time Markov processes.
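A small numerical illustration of the kind of moment condition the generator delivers: for a stationary scalar diffusion dX = mu(X) dt + sigma(X) dW with generator A f(x) = mu(x) f'(x) + 0.5 sigma(x)^2 f''(x), stationarity implies E[A f(X)] = 0 for suitable test functions f, and such moments can feed a GMM criterion. The sketch below checks this on a simulated Ornstein-Uhlenbeck process; the process and parameter values are hypothetical and the full estimation/testing machinery of the paper is not reproduced.

# Illustrative check of generator-based moment conditions on a simulated
# Ornstein-Uhlenbeck process: dX = kappa*(theta - X) dt + sigma dW.
import numpy as np

kappa, theta, sigma = 0.8, 1.0, 0.5          # hypothetical parameters

def generator_f(x, f1, f2):
    # A f(x) = mu(x) f'(x) + 0.5 sigma(x)^2 f''(x) for this diffusion
    return kappa * (theta - x) * f1(x) + 0.5 * sigma**2 * f2(x)

rng = np.random.default_rng(2)
dt, n = 0.01, 500_000
shocks = sigma * np.sqrt(dt) * rng.standard_normal(n)
x = np.empty(n)
x[0] = theta
for t in range(1, n):                         # Euler scheme for the sample path
    x[t] = x[t-1] + kappa * (theta - x[t-1]) * dt + shocks[t]

# Sample analogues of E[A f(X)] for f(x) = x and f(x) = x^2
m1 = generator_f(x, lambda z: np.ones_like(z), lambda z: np.zeros_like(z)).mean()
m2 = generator_f(x, lambda z: 2 * z, lambda z: 2 * np.ones_like(z)).mean()
print(m1, m2)   # both should be close to zero under stationarity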

299 citations


Journal ArticleDOI
TL;DR: In this paper, stability properties of evolutionary selection dynamics in normal-form games are investigated, focusing on deterministic dynamics in continuous time and on asymptotic stability of sets of population states, more precisely of faces of the mixed strategy space.
Abstract: This paper investigates stability properties of evolutionary selection dynamics in normal-form games. The analysis is focused on deterministic dynamics in continuous time and on asymptotic stability of sets of population states, more precisely of faces of the mixed-strategy space. The main result is a characterization of those faces which are asymptotically stable in all dynamics from a certain class, and we show that every such face contains an essential component of the set of Nash equilibria, and hence a strategically stable set in the sense of Kohlberg and Mertens (1986).

296 citations


Journal ArticleDOI
TL;DR: This paper proposes a model of the process by which players learn to play repeated coordination games, with the goal of understanding the results of some recent experiments, in which the dynamics of subjects' strategy choices and the resulting patterns of discrimination among equilibria varied systematically with the rule for determining payoffs and the size of the interacting groups, in ways that are not adequately explained by available methods of analysis.
Abstract: This paper proposes a model of the process by which players learn to play repeated coordination games, with the goal of understanding the results of some recent experiments. In those experiments the dynamics of subjects' strategy choices and the resulting patterns of discrimination among equilibria varied systematically with the rule for determining payoffs and the size of the interacting groups, in ways that are not adequately explained by available methods of analysis. The model suggests a possible explanation by showing how the dispersion of subjects' beliefs interacts with the learning process to determine the probability distribution of its dynamics and limiting outcome.

Journal ArticleDOI
TL;DR: In this article, the authors apply nonparametric regression models to estimation of demand curves of the type most often used in applied research and derive estimates of exact consumers surplus and deadweight loss from the demand curve estimators.
Abstract: We apply nonparametric regression models to estimation of demand curves of the type most often used in applied research. From the demand curve estimators we derive estimates of exact consumers surplus and deadweight loss, which are the most widely used welfare and economic efficiency measures in areas of economics such as public finance. We also develop tests of the symmetry and downward sloping properties of compensated demand. We work out asymptotic normal sampling theory for kernel and series nonparametric estimators, as well as for the parametric case. The paper includes an application to gasoline demand. Empirical questions of interest here are the shape of the demand curve and the average magnitude of welfare loss from a tax on gasoline. In this application we compare parametric and nonparametric estimates of the demand curve, calculate exact and approximate measures of consumers surplus and deadweight loss, and give standard error estimates. We also analyze the sensitivity of the welfare measures to components of nonparametric regression estimators such as the number of terms in a series approximation.
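To fix ideas, the sketch below shows the two basic ingredients in their simplest form: a Nadaraya-Watson kernel regression of quantity on price, and an approximate (Marshallian) consumer-surplus loss from a price increase obtained by numerically integrating the fitted demand curve. The paper computes exact surplus, handles covariates, and derives the sampling theory; the data, bandwidth, and functional form below are hypothetical.

# Illustrative sketch: kernel demand estimation and an approximate
# consumer-surplus loss from a price rise, on hypothetical data.
import numpy as np

rng = np.random.default_rng(3)
price = rng.uniform(1.0, 2.0, size=500)
quantity = 10.0 * price**-0.8 * np.exp(0.1 * rng.normal(size=500))   # fake demand data

def kernel_demand(p, h=0.1):
    w = np.exp(-0.5 * ((p - price) / h) ** 2)   # Gaussian kernel weights
    return (w * quantity).sum() / w.sum()

def surplus_loss(p0, p1, grid_size=200):
    # Approximate surplus change = integral of fitted demand from p0 to p1
    grid = np.linspace(p0, p1, grid_size)
    demand = np.array([kernel_demand(p) for p in grid])
    return np.trapz(demand, grid)

print(surplus_loss(1.2, 1.4))   # surplus lost when price rises from 1.2 to 1.4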

Journal ArticleDOI
TL;DR: In this article, the authors model and estimate congestion prices and capacity for large hub airports, including stochastic queues, time-varying traffic rates, and endogenous, intertemporal adjustment of traffic in response to queuing delay and fees.
Abstract: This paper models and estimates congestion prices and capacity for large hub airports. The model includes stochastic queues, time-varying traffic rates, and endogenous, intertemporal adjustment of traffic in response to queuing delay and fees. Relative costs of queuing and schedule delays are estimated using data from Minneapolis-St. Paul. Simulations calculate equilibrium traffic patterns, queuing delays, schedule delays, congestion fees, airport revenues, airport capacity, and efficiency gains. The paper also investigates whether a dominant airline internalizes delays its aircraft impose. It tests game-theoretic specifications with atomistic, Nash-dominant, Stackelberg-dominant, and collusive-airline traffic.

Journal ArticleDOI
TL;DR: A modification of the original Sen poverty index yields an index which is not only continuous in individual incomes and consistent with the transfer axiom, but also admits a geometric interpretation in terms of the area beneath a graph called the inverse generalized Lorenz curve for normalized poverty gaps by Jenkins and Lambert as mentioned in this paper.
Abstract: A modification of the original Sen poverty index yields an index which is not only continuous in individual incomes and consistent with the transfer axiom, but also admits a geometric interpretation in terms of the area beneath a graph called the inverse generalized Lorenz curve for normalized poverty gaps by Jenkins and Lambert (1993) or the poverty gap profile by Shorrocks (1994).
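As a numerical illustration, one commonly quoted closed form of the resulting modified index (often called the Sen-Shorrocks-Thon index) is sketched below; treat the exact formula as an assumption of this sketch, and the income data and poverty line as hypothetical.

# Sketch of one common statement of the modified Sen poverty index: incomes
# sorted poorest first, normalized gaps g_i = max(0, (z - y_i)/z), with larger
# gaps receiving greater weight.
import numpy as np

def modified_sen_index(incomes, z):
    y = np.sort(np.asarray(incomes, dtype=float))      # poorest first
    n = len(y)
    gaps = np.clip((z - y) / z, 0.0, None)              # normalized poverty gaps
    ranks = np.arange(1, n + 1)
    return np.sum((2 * n - 2 * ranks + 1) * gaps) / n**2

print(modified_sen_index([100, 250, 400, 800, 1500], z=500))   # hypothetical incomes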

Journal ArticleDOI
TL;DR: In this article, the authors considered a k-player sequential bargaining game, where the size of the cake and the order in which players move follow a general Markov process, and characterized the sets of subgame perfect and stationary subgame perfect payoffs.
Abstract: We consider a k-player sequential bargaining model in which the size of the cake and the order in which players move follow a general Markov process. For games in which one agent makes an offer in each period and agreement must be unanimous, we characterize the sets of subgame perfect and stationary subgame perfect payoffs. With these characterizations, we investigate the uniqueness and efficiency of the equilibrium outcomes, the conditions under which agreement is delayed, and the advantage to proposing. Our analysis generalizes many existing results for games of sequential bargaining which build on the work of Stahl (1972), Rubinstein (1982), and Binmore (1987).

Journal ArticleDOI
TL;DR: In this article, the authors propose two consistent one-sided specification tests for parametric regression models, one based on the sample covariance between the residual from the parametric model and the discrepancy between parametric and nonparametric fitted values, and the other based on a difference in sums of squared residuals between the parameterized and non-parametric models, which can be viewed as a test of the joint hypothesis that the true parameters of a series regression model are zero.
Abstract: This paper proposes two consistent one-sided specification tests for parametric regression models, one based on the sample covariance between the residual from the parametric model and the discrepancy between the parametric and nonparametric fitted values ; the other based on the difference in sums of squared residuals between the parametric and nonparametric models. We estimate the nonparametric model by series regression. The new test statistics converge in distribution to a unit normal under correct specification and grow to infinity faster than the parametric rate (n -1/2 ) under misspecification, while avoiding weighting, sample splitting, and non-nested testing procedures used elsewhere in the literature. Asymptotically, our tests can be viewed as a test of the joint hypothesis that the true parameters of a series regression model are zero, where the dependent variable is the residual from the parametric model, and the series terms are functions of the explanatory variables, chosen so as to support nonparametric estimation of a conditional expectation. We specifically consider Fourier series and regression splines, and present a Monte Carlo study of the finite sample performance of the new tests in comparison to consistent tests of Bierens (1990), Eubank and Spiegelman (1990), Jayasuriya (1990), Wooldridge (1992), and Yatchew (1992) ; the results show the new tests have good power, performing quite well in some situations. We suggest a joint Bonferroni procedure that combines a new test with those of Bierens and Wooldridge to capture the best features of the three approaches.
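A simplified illustration of the residual-on-series idea behind such tests is sketched below: fit the parametric model, regress its residuals on series terms in the regressors, and look at their joint significance (here via a crude n*R^2 statistic). The paper's actual statistics are centered and scaled differently and are one-sided; the data and series choice below are hypothetical.

# Simplified illustration of a residual-on-series specification check.
import numpy as np

rng = np.random.default_rng(4)
n = 400
x = rng.uniform(-2, 2, size=n)
y = 1.0 + 0.5 * x + 0.4 * x**2 + rng.normal(scale=0.5, size=n)   # true model is nonlinear

# Step 1: parametric (linear) fit and its residuals
X1 = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X1, y, rcond=None)[0]
resid = y - X1 @ beta

# Step 2: regress residuals on series terms (powers of x) and form n * R^2
X2 = np.column_stack([np.ones(n), x, x**2, x**3, x**4])
gamma = np.linalg.lstsq(X2, resid, rcond=None)[0]
fitted = X2 @ gamma
r2 = 1.0 - np.sum((resid - fitted) ** 2) / np.sum((resid - resid.mean()) ** 2)
print("n * R^2 =", n * r2)   # large values point toward misspecification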

ReportDOI
TL;DR: In this article, the authors examined the effect of prospective payment for hospital care on adverse medical outcomes and found that, in hospitals with price declines, a greater share of deaths occurs in the hospital or shortly after discharge, but by one year post-discharge, mortality is no higher.
Abstract: This paper examines the effect of prospective payment for hospital care on adverse medical outcomes. In 1983, the federal government replaced its previous cost-based reimbursement method with a Prospective Payment System, under which reimbursement depends only on the diagnosis of the patient. Hospitals thus lost the marginal reimbursement they formerly received for providing additional treatments. In addition, the average price each hospital received for patients with different diagnoses changed. This paper relates each of these changes to adverse outcomes, with two conclusions. First, there is a change in the timing of deaths associated with changes in average prices. In hospitals with price declines, a greater share of deaths occurs in the hospital or shortly after discharge, but by one year post-discharge, mortality is no higher. Second, there is a trend increase in readmission rates caused by the elimination of marginal reimbursement. This appears to be due to accounting changes on the part of hospitals, however, rather than true changes in morbidity.

Journal ArticleDOI
TL;DR: In this article, the mean preserving spread relation between the likelihood ratio distributions derived from the original information systems is found to be sufficient to rank information systems under quite general assumptions about the agent's utility function.
Abstract: Different information systems are compared in terms of their relative efficiencies in an agency model. The mean preserving spread relation between the likelihood ratio distributions derived from the original information systems is found to be sufficient to rank information systems under quite general assumptions about the agent's utility function. Furthermore, it is shown that the mean preserving spread criterion can be applied to a broader set of information systems than Holmstrom's informativeness criterion and Blackwell's theorem.

Journal ArticleDOI
TL;DR: In this article, the authors consider the allocation of goods in exchange economies with a finite number of agents who may have private information about their preferences and characterize the set of allocation rules which are incentive compatible.
Abstract: We consider the allocation of goods in exchange economies with a finite number of agents who may have private information about their preferences. In such a setting, standard allocation rules such as Walrasian equilibria or rational expectations equilibria are not compatible with individual incentives. We characterize the set of allocation rules which are incentive compatible, or in other words, the set of strategy-proof social choice functions. Social choice functions which are strategy-proof are those which can be obtained from trading according to a finite number of pre-specified proportions. The number of proportions which can be accommodated is proportional to the number of agents. Such rules are necessarily inefficient, even in the limit as the economy grows.

Journal ArticleDOI
TL;DR: In this paper, a statistical model of dynamic intra-family investment behavior incorporating endowment heterogeneity is estimated to evaluate alternative estimation procedures that have exploited family and kinship data, which place alternative restrictions on the endowment structure and on behavior, including generalized least squares, instrumental variables, fixed effects based on the children of sisters, fixed effects based on siblings, and sibling fixed effects with instrumental variables.
Abstract: A statistical model of dynamic intrafamily investment behavior incorporating endowment heterogeneity is estimated to evaluate alternative estimation procedures that have exploited family and kinship data. These procedures, which place alternative restrictions on the endowment structure and on behavior, include generalized least squares, instrumental-variables, fixed-effects based on the children of sisters, fixed-effects based on siblings, and sibling fixed-effects with instrumental variables. The framework is applied to data on birth outcomes, with focus on the effects of teen-age childbearing net of other maternal behavior. The empirical results imply that the least restrictive statistical formulation, consistent with dynamic behavior and heterogeneity among siblings, fits the data best. All of the estimation procedures that control for a family-specific endowment indicate, however, that the biological effect of having a birth at younger ages is to marginally increase birthweight and to increase fetal growth.
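For intuition about the family fixed-effects component, the sketch below demeans a birth outcome and a regressor within families and regresses one on the other, which removes any family-specific endowment common to siblings. The data, effect sizes, and variable names are entirely hypothetical, and the paper's framework (dynamic behavior, instrumental variables, heterogeneity among siblings) goes well beyond this.

# Bare-bones sketch of a sibling fixed-effects regression on simulated data.
import numpy as np

rng = np.random.default_rng(6)
n_families, kids = 300, 2
family = np.repeat(np.arange(n_families), kids)
endowment = np.repeat(rng.normal(size=n_families), kids)          # unobserved family effect
teen_birth = rng.integers(0, 2, size=n_families * kids).astype(float)
birthweight = 3400 + 120 * endowment - 60 * teen_birth + rng.normal(scale=80, size=n_families * kids)

def demean_within(v, groups):
    # Subtract the group (family) mean from each observation
    means = np.zeros(groups.max() + 1)
    counts = np.bincount(groups)
    np.add.at(means, groups, v)
    return v - (means / counts)[groups]

y = demean_within(birthweight, family)
x = demean_within(teen_birth, family)
beta = (x @ y) / (x @ x)
print(beta)   # should be close to the simulated effect of -60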

Journal ArticleDOI
TL;DR: In this article, the authors analyze optimal mechanisms in environments where sellers are privately informed about quality and derive conditions that are necessary and sufficient to determine when two simple trading environments maximize either social or private surplus.
Abstract: We analyze optimal mechanisms in environments where sellers are privately informed about quality. A methodology is provided for deriving conditions that are necessary and sufficient to determine when two simple trading environments maximize either social or private surplus. The commonly used auction mechanism is frequently inefficient in procurement environments. Often, the optimal mechanism is simply to order potential suppliers and to tender take-it-or-leave-it offers to each sequentially. We completely characterize the environments in which either mechanism is optimal. In doing so, we develop a general methodology that determines when and if a given trading institution is optimal.

Journal ArticleDOI
TL;DR: In this article, belief potential of the information system and p-dominance of Nash-equilibria of the game were introduced, and it was shown that a Nash equilibrium is uniquely selected whenever its p-Dominance is below the belief potential.
Abstract: This paper elucidates the logic behind recent papers which show that a unique equilibrium is selected in the presence of higher order uncertainty, i.e., when players lack common knowledge. We introduce two new concepts: belief potential of the information system and p-dominance of Nash-equilibria of the game, and show that a Nash-equilibrium is uniquely selected whenever its p-dominance is below the belief potential. This criterion applies to many-action games, not merely 2 × 2 games. It also applies to games without dominant strategies, where the set of equilibria is shown to be smaller and simpler than might be initially conjectured. Finally, the new concepts help us understand the circumstances under which the set of equilibria varies with the amount of common knowledge among players.

Journal ArticleDOI
TL;DR: The authors argue that aggregate information is not very important for individual consumption decisions and study models of life-cycle consumption in which individuals react optimally to their own income process but have incomplete or no information on economywide variables.
Abstract: Individual income is much more variable than aggregate per capita income. I argue that aggregate information is therefore not very important for individual consumption decisions and study models of life-cycle consumption in which individuals react optimally to their own income process but have incomplete or no information on economy-wide variables. Since individual income is less persistent than aggregate income, consumers will react too little to aggregate income variation. Aggregate consumption will be excessively smooth. Since aggregate information is slowly incorporated into consumption, aggregate consumption will be autocorrelated and correlated with lagged income. On the other hand, the model has the same prediction for micro data as the standard permanent income model. The second part of the paper provides empirical evidence on individual and aggregate income processes. Different models for individual income are fit to quarterly data from the Survey of Income and Program Participation, making various adjustments for measurement error. Calibrating the consumption model using the estimated parameters for the income process yields predictions which qualitatively correspond to the empirical findings for aggregate consumption but do not match them well in magnitude.

Journal ArticleDOI
TL;DR: In this paper, the existence of precommitment effects through public announcements of contracts is analyzed in a model where agency contracts, designed ex-ante, can always be secretly renegotiated at the ex-ante and interim stages.
Abstract: We consider a model where two agents, privately informed about their own characteristics, play a (normal form) game on behalf of two uninformed principals. We analyze the existence of precommitment effects through public announcements of contracts, in a model where agency contracts, designed ex-ante, can always be secretly renegotiated, at the ex-ante and interim stages. We show that the existence of precommitment effects depends both on the strategic complementarity of the agents' actions and on the direct effect of the opponents' actions on each principal's welfare. In our model, the possibility of renegotiation is crucial for the existence of precommitment effects. The results are introduced through an example of Cournot and Bertrand competition between firms, viewed as vertical structures.

Journal ArticleDOI
TL;DR: In this paper, the authors characterize the set of strategies that are stable with respect to a stochastic dynamic adaptive process in a finite two-player game played by a population of players.
Abstract: We add a round of pre-play communication to a finite two-player game played by a population of players. Pre-play communication is cheap talk in the sense that it does not directly enter the payoffs. The paper characterizes the set of strategies that are stable with respect to a stochastic dynamic adaptive process. Periodically, players have an opportunity to replace their strategy with one that is more successful against the current population. Any strategy that weakly improves upon the current poorest performer in the population enters with positive probability. When there is no conflict of interest between the players, only the efficient outcome is stable with respect to these dynamics. For general games the set of stable payoffs is typically large. Every efficient payoff recurs infinitely often.

Journal ArticleDOI
TL;DR: In this paper, the second order properties of various quantities of interest in the partially linear regression model were examined, and a stochastic expansion with remainder o_P(n^(-2μ)), where μ < 1/2, was obtained for the standardized semiparametric least squares estimator, a standard error estimator, and a studentized statistic.
Abstract: We examine the second order properties of various quantities of interest in the partially linear regression model. We obtain a stochastic expansion with remainder o_P(n^(-2μ)), where μ < 1/2, for the standardized semiparametric least squares estimator, a standard error estimator, and a studentized statistic. We use the second order expansions to correct the standard error estimates for second order effects, and to define a method of bandwidth choice. A Monte Carlo experiment provides favorable evidence on our method of bandwidth choice.

Journal ArticleDOI
TL;DR: Using an axiom called independence of the utilities of the dead, the authors characterize critical-level generalized utilitarian rules and show that social discounting is ruled out in an intertemporal welfarist environment.
Abstract: This paper considers the problem of social evaluation in a model where population size, individual lifetime utilities, lengths of life, and birth dates vary across states. In an intertemporal framework, we investigate principles for social evaluation that allow history to matter to some extent. Using an axiom called independence of the utilities of the dead, we provide a characterization of critical-level generalized utilitarian rules. As a by-product of our analysis, we show that social discounting is ruled out in an intertemporal welfarist environment. A simple population-planning example is also discussed.

ReportDOI
TL;DR: In this article, the U.S. offshore oil and gas lease sales conducted by the Department of the Interior since 1954 are described, and equilibrium models of the associated bidding and drilling decisions are analyzed and their predictions compared to outcomes in the data.
Abstract: This paper describes the U.S. offshore oil and gas lease sales conducted by the Department of the Interior since 1954. Several decisions are discussed, including bidding for leases, the government's decision whether to accept the highest bid, the incidence and timing of exploratory drilling, and the formation of bidding consortia. Equilibrium models of these decisions that emphasize informational and strategic issues and that account for institutional features of the leasing program are analyzed, and their predictions compared to outcomes in the data.

An important aspect of any market is the information available to participants. The social and private costs of information imperfections are compounded in strategic settings, where participants may exploit informational asymmetries. Information can play a crucial role in an auction. A seller (or a buyer in a procurement auction) often resorts to an auction market because of uncertainty about the market price for the item in question. That is, the seller is uncertain about others' willingness to pay. At the same time, buyers may be uncertain about their rivals' valuations of the item, and they may be uncertain about the value of the item for themselves, such as when there is an unknown common valuation component. If one buyer has access to information superior to that of its rivals, such as a more precise signal of the item's worth on a future resale market, informational rents may be obtained. Even if buyers have symmetric information, in the sense of equally precise signals, they must account for the winner's curse in uncertain environments because the item will be won by the buyer with the most optimistic assessment of the item's worth. Buyers have incentives to pool information or to gain an advantage by learning of a rival's intentions. If ex post signals of the item's worth are available, the seller can increase profits by making payment contingent on the ex post signal, say via a royalty payment that supplements any fixed payment. However, if the buyer can affect the value by ex post actions, a moral hazard problem arises, and excessive reliance on a royalty rate may distort incentives. The game is not zero-sum, so inefficiencies may result.

Journal ArticleDOI
TL;DR: A definition of extended memory, generalizing the ideas of long memory and persistence and based on the properties of forecasts over long horizons, is provided, and it is suggested that many more types of misspecification can occur than in the stationary, short-memory, linear case, which could produce important specification errors.
Abstract: Many economic variables have a persistence property which may be called extended memory, and the relationship between variables may well be nonlinear. This pair of properties allows for many more types of model misspecification than encountered with stationary or short-memory variables and linear relationships, and misspecifications lead to greater modelling difficulties. Examples are given using the idea of a model being balanced. Alternative definitions of extended memory are considered and a definition based on the properties of optimum forecasts is selected for later use. An important but not necessarily pervasive class of processes consists of those that are extended-memory but whose changes are short-memory. For this case, called I(1), standard cointegration ideas will apply. Tests of linearity are discussed both in the I(1) case, where a possible group of tests is easily found, and more generally. Similarly, methods of building nonlinear models based on familiar techniques, such as neural networks and projection pursuit, are briefly considered for I(1) and the more general case. A number of topics requiring further work in this new area are emphasized.

Journal ArticleDOI
TL;DR: In this article, the authors consider the theory of market versus optimal product diversity in the light of two recent advances in oligopoly theory, namely, the development of discrete choice models to describe heterogeneous consumer tastes, and the application of such models to oligopolistic competition.
Abstract: This paper considers the theory of market versus optimal product diversity in the light of two recent advances in oligopoly theory. The first is the development of discrete choice models to describe heterogeneous consumer tastes, and the application of such models to oligopolistic competition. The second advance is the proof that logconcavity of the consumer taste density guarantees the existence of a price equilibrium. We analyze an oligopoly model with price competition and free entry, taking explicit account of the integer constraint. Under the Chamberlinian symmetry assumption (that tastes are i.i.d.), we first show that logconcavity of the taste density implies there is excessive market provision of variety when each consumer buys one unit of the product from one of the firms. We then show that this result extends to price-sensitive individual demands by proving that the equilibrium number of firms is at least as great as that which would be provided at the second-best social optimum subject to a zero-profit constraint for firms. Our results call into question previous findings for representative consumer models that left open the possibility of insufficient product diversity.