
Showing papers in "Econometrica in 1979"


Book ChapterDOI
TL;DR: In this paper, the authors present a critique of expected utility theory as a descriptive model of decision making under risk, and develop an alternative model, called prospect theory, in which value is assigned to gains and losses rather than to final assets and in which probabilities are replaced by decision weights.
Abstract: This paper presents a critique of expected utility theory as a descriptive model of decision making under risk, and develops an alternative model, called prospect theory. Choices among risky prospects exhibit several pervasive effects that are inconsistent with the basic tenets of utility theory. In particular, people underweight outcomes that are merely probable in comparison with outcomes that are obtained with certainty. This tendency, called the certainty effect, contributes to risk aversion in choices involving sure gains and to risk seeking in choices involving sure losses. In addition, people generally discard components that are shared by all prospects under consideration. This tendency, called the isolation effect, leads to inconsistent preferences when the same choice is presented in different forms. An alternative theory of choice is developed, in which value is assigned to gains and losses rather than to final assets and in which probabilities are replaced by decision weights. The value function is normally concave for gains, commonly convex for losses, and is generally steeper for losses than for gains. Decision weights are generally lower than the corresponding probabilities, except in the range of low probabilities. Overweighting of low probabilities may contribute to the attractiveness of both insurance and gambling. EXPECTED UTILITY THEORY has dominated the analysis of decision making under risk. It has been generally accepted as a normative model of rational choice (24), and widely applied as a descriptive model of economic behavior, e.g. (15, 4). Thus, it is assumed that all reasonable people would wish to obey the axioms of the theory (47, 36), and that most people actually do, most of the time. The present paper describes several classes of choice problems in which preferences systematically violate the axioms of expected utility theory. 
In the light of these observations we argue that utility theory, as it is commonly interpreted and applied, is not an adequate descriptive model and we propose an alternative account of choice under risk.

35,067 citations
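
The qualitative shape of the value and weighting functions the abstract describes can be sketched in code. The functional forms and parameter values below are not from this 1979 paper; they are illustrative parameterizations chosen only to exhibit the stated properties (concave for gains, convex and steeper for losses, overweighting of low probabilities).

```python
# Illustrative prospect-theory value and decision-weight functions.
# Functional forms and parameters are assumptions for demonstration,
# not taken from the paper itself.

ALPHA = 0.88   # curvature for gains (concave)
BETA = 0.88    # curvature for losses (convex)
LAM = 2.25     # loss aversion: losses loom larger than gains
GAMMA = 0.61   # probability-weighting curvature

def value(x):
    """Value of a gain/loss x relative to a reference point of 0."""
    if x >= 0:
        return x ** ALPHA                # concave for gains
    return -LAM * ((-x) ** BETA)        # convex and steeper for losses

def weight(p):
    """Decision weight attached to probability p."""
    return p ** GAMMA / ((p ** GAMMA + (1 - p) ** GAMMA) ** (1 / GAMMA))

# A $100 loss outweighs a $100 gain; low probabilities are overweighted,
# high probabilities underweighted.
print(value(100), value(-100))
print(weight(0.01), weight(0.9))
```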



Journal ArticleDOI
TL;DR: In this article, the bias that results from using non-randomly selected samples to estimate behavioral relationships is treated as an ordinary specification error or "omitted variables" bias; a simple consistent two-stage estimator is proposed, and its asymptotic distribution is derived.
Abstract: Sample selection bias as a specification error: this paper discusses the bias that results from using non-randomly selected samples to estimate behavioral relationships as an ordinary specification error or "omitted variables" bias. A simple consistent two stage estimator is considered that enables analysts to utilize simple regression methods to estimate behavioral functions by least squares methods. The asymptotic distribution of the estimator is derived.

23,995 citations
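
The "omitted variables" reading of selection bias can be made concrete on simulated data. The sketch below is a simplification, not the paper's estimator: the first-stage selection index is taken as known (0.5·x) rather than estimated, so that the omitted regressor, the inverse Mills ratio, can be added directly in the second stage.

```python
import numpy as np
from scipy.stats import norm

# Simulated selection model: y observed only when 0.5*x + v > 0, and the
# outcome error u is correlated with the selection error v.
rng = np.random.default_rng(0)
n = 200_000
x = rng.standard_normal(n)
v = rng.standard_normal(n)                      # selection-equation error
u = 0.8 * v + 0.6 * rng.standard_normal(n)      # corr(u, v) = 0.8
y = 1.0 + 2.0 * x + u                           # true slope = 2
s = (0.5 * x + v > 0)                           # selection indicator

xs, ys = x[s], y[s]

# Naive OLS on the selected sample is biased: E[u | selected, x] != 0.
X_naive = np.column_stack([np.ones(xs.size), xs])
beta_naive = np.linalg.lstsq(X_naive, ys, rcond=None)[0]

# Second stage: add the inverse Mills ratio phi(w)/Phi(w), evaluated at
# the (here, known) selection index, as the omitted regressor.
w = 0.5 * xs
mills = norm.pdf(w) / norm.cdf(w)
X_corr = np.column_stack([np.ones(xs.size), xs, mills])
beta_corr = np.linalg.lstsq(X_corr, ys, rcond=None)[0]

print("naive slope:", beta_naive[1])      # noticeably below 2
print("corrected slope:", beta_corr[1])   # close to 2
```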


Journal ArticleDOI
TL;DR: In this paper, a simple test for heteroscedastic disturbances in a linear regression model is developed using the framework of the Lagrangian multiplier test, and the criterion is given as a readily computed function of the OLS residuals.
Abstract: A simple test for heteroscedastic disturbances in a linear regression model is developed using the framework of the Lagrangian multiplier test. For a wide range of heteroscedastic and random coefficient specifications, the criterion is given as a readily computed function of the OLS residuals. Some finite sample evidence is presented to supplement the general asymptotic properties of Lagrangian multiplier tests.

3,629 citations
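
The "readily computed function of the OLS residuals" can be sketched as follows. This is an illustrative implementation in the spirit of the abstract, with made-up simulated data: regress scaled squared OLS residuals on the variables thought to drive the variance; half the explained sum of squares is asymptotically chi-squared.

```python
import numpy as np
from scipy.stats import chi2

def lm_het_test(y, X, Z):
    """LM-type test for heteroscedastic disturbances: regress scaled
    squared OLS residuals on Z; the statistic is ESS / 2."""
    n = y.size
    Xc = np.column_stack([np.ones(n), X])
    e = y - Xc @ np.linalg.lstsq(Xc, y, rcond=None)[0]   # OLS residuals
    g = e**2 / (e**2).mean()                             # scaled squares
    Zc = np.column_stack([np.ones(n), Z])
    ghat = Zc @ np.linalg.lstsq(Zc, g, rcond=None)[0]
    lm = 0.5 * ((ghat - g.mean())**2).sum()              # ESS / 2
    df = Zc.shape[1] - 1
    return lm, chi2.sf(lm, df)

rng = np.random.default_rng(1)
n = 5000
x = rng.standard_normal(n)
# Strongly heteroscedastic errors: standard deviation grows with x.
y = 1 + 2 * x + rng.standard_normal(n) * np.exp(0.5 * x)
lm, p = lm_het_test(y, x, x)
print(lm, p)   # large statistic, tiny p-value
```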


Journal ArticleDOI
TL;DR: In this article, it is shown that the set of expected utility allocations which are feasible with incentive-compatible mechanisms is compact and convex, and the generalized Nash solution proposed by Harsanyi and Selten is applied to this set to define a bargaining solution for Bayesian collective choice problems.
Abstract: Collective choice problems are studied from the Bayesian viewpoint. It is shown that the set of expected utility allocations which are feasible with incentive-compatible mechanisms is compact and convex, and includes the equilibrium allocations for all other mechanisms. The generalized Nash solution proposed by Harsanyi and Selten is then applied to this set to define a bargaining solution for Bayesian collective choice problems.

2,011 citations


Journal ArticleDOI
TL;DR: In this article, the authors show that the so-called Principle of Minimum Differentiation, as based on Hotelling's celebrated 1929 paper (Hotelling [3]), is invalid, because no equilibrium price solution exists when the two sellers are not far enough from each other.
Abstract: The purpose of this note is to show that the so-called Principle of Minimum Differentiation, as based on Hotelling's celebrated 1929 paper "Stability in Competition" (Hotelling [3]), is invalid. Firstly, we assert that, contrary to the statement formulated by Hotelling in his model, nothing can be said about the tendency of both sellers to agglomerate at the center of the market. The reason is that no equilibrium price solution will exist when both sellers are not far enough from each other. Secondly, we consider a slightly modified version of Hotelling's example, for which there exists a price equilibrium solution everywhere. We show however that, for this version, there is a tendency for both sellers to maximize their differentiation. This example thus constitutes a counterexample to Hotelling's conclusions. We shall first recall Hotelling's model and notations. On a line of length ℓ, two sellers A and B of a homogeneous product, with zero production cost, are located at respective distances a and b from the ends of this line (a + b ≤ ℓ; a ≥ 0, b ≥ 0). Customers are evenly distributed along the line, and each customer consumes exactly a single unit of this commodity per unit of time, irrespective of its price. Since the product is homogeneous, a customer will buy from the seller [Econometrica, 47(5), 1145–1150, September 1979. Center for Operations Research and Econometrics]

1,911 citations
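
The nonexistence argument can be illustrated numerically. All numbers below are assumptions for illustration (line length 1, unit transport cost, a symmetric candidate price of 1, sellers at 0.45 and 0.55): when the sellers are close, a seller can profitably undercut the rival by slightly more than the transport cost between them and capture the whole market, so no candidate price pair survives as an equilibrium.

```python
# Hotelling line with linear transport cost; parameters are illustrative.
L, t = 1.0, 1.0          # market length, transport cost rate
x1, x2 = 0.45, 0.55      # seller locations, close together

def demand_1(p1, p2):
    """Mass of customers buying from seller 1 (uniform density 1)."""
    d = x2 - x1
    if p1 - p2 > t * d:      # seller 1 is undercut: sells nothing
        return 0.0
    if p2 - p1 > t * d:      # seller 1 undercuts: sells to everyone
        return L
    # Otherwise the indifferent customer lies between the sellers.
    return (x1 + x2) / 2 + (p2 - p1) / (2 * t)

p = 1.0                                   # symmetric candidate price
profit_candidate = p * demand_1(p, p)     # each seller serves half
p_dev = p - t * (x2 - x1) - 0.001         # undercut just past the rival
profit_deviation = p_dev * demand_1(p_dev, p)

print(profit_candidate, profit_deviation)  # the deviation is profitable
```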



Journal ArticleDOI
TL;DR: In this paper, the authors characterize the solution to the problem of searching for the best outcome from alternative sources with different properties, and the optimal strategy is an elementary reservation price rule, where the reservation prices are easy to calculate and have an intuitive economic interpretation.
Abstract: This paper completely characterizes the solution to the problem of searching for the best outcome from alternative sources with different properties. The optimal strategy is an elementary reservation price rule, where the reservation prices are easy to calculate and have an intuitive economic interpretation.

1,034 citations
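
A reservation price in the sense of the abstract can be computed with a few lines of code. The sketch below is illustrative, not the paper's general construction: for a source with sampling cost c and reward distribution F, the reservation price z solves c = E[max(X − z, 0)]; for a two-point reward (H with probability q, 0 otherwise) the closed form is z = H − c/q, which bisection recovers.

```python
def reservation_price(expected_gain, c, lo, hi, tol=1e-10):
    """Solve expected_gain(z) = c by bisection; expected_gain must be
    decreasing in z on [lo, hi]."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if expected_gain(mid) > c:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Illustrative source: prize H = 100 with probability q = 0.5, cost c = 10.
H, q, c = 100.0, 0.5, 10.0
gain = lambda z: q * max(H - z, 0.0)     # E[max(X - z, 0)]
z = reservation_price(gain, c, 0.0, H)
print(z)   # close to H - c/q = 80
```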


Journal ArticleDOI
TL;DR: In this article, it was shown that Theil's coefficient (T) and the logarithm of the arithmetic mean over the geometric mean (L) are the only decomposable inequality measures such that the weights of the "within" components in the total inequality of a partitioned population sum to a constant.
Abstract: A decomposable inequality measure is defined as a measure such that the total inequality of a population can be broken down into a weighted average of the inequality existing within subgroups of the population and the inequality existing between them. Thus, decomposable measures differ only by the weights given to the inequality within the subgroups of the population. It is proven that the only zero-homogeneous "income-weighted" decomposable measure is Theil's coefficient (T) and that the only zero-homogeneous "population-weighted" decomposable measure is the logarithm of the arithmetic mean over the geometric mean (L). More generally, it is proved that T and L are the only decomposable inequality measures such that the weights of the "within" components in the total inequality of a partitioned population sum to a constant. More general decomposable measures are also analyzed.

999 citations
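
The decomposability property described in the abstract can be checked numerically: T decomposes with income-share weights, L with population-share weights, and in both cases total inequality equals the within term plus the between term. The income data below are made up for illustration.

```python
import numpy as np

def theil_T(x):
    m = x.mean()
    return np.mean((x / m) * np.log(x / m))

def theil_L(x):
    m = x.mean()
    return np.mean(np.log(m / x))

rng = np.random.default_rng(2)
groups = [rng.lognormal(0.0, 0.5, 300), rng.lognormal(0.8, 0.3, 700)]
x = np.concatenate(groups)
m, n = x.mean(), x.size

# T: within-group terms weighted by income shares, plus a between term.
T_within = sum((g.size * g.mean() / (n * m)) * theil_T(g) for g in groups)
T_between = sum((g.size * g.mean() / (n * m)) * np.log(g.mean() / m)
                for g in groups)

# L: within-group terms weighted by population shares, plus a between term.
L_within = sum((g.size / n) * theil_L(g) for g in groups)
L_between = sum((g.size / n) * np.log(m / g.mean()) for g in groups)

print(theil_T(x), T_within + T_between)   # equal
print(theil_L(x), L_within + L_between)   # equal
```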



Journal ArticleDOI
TL;DR: In this article, a representation theorem is given which "rationalizes" preference for flexibility over opportunity sets as the behavior of an individual who is uncertain about his future tastes: early choices amount to choosing a subset of items from which a subsequent choice will be made.
Abstract: This paper concerns individual choice among "opportunity sets," from which the individual will later choose a single object. In particular, it concerns preference relations on opportunity sets which satisfy "preference for flexibility" (a set is at least as good as all of its subsets) but which may not satisfy "revealed preference" (the union of two sets may be strictly preferred to each one taken separately). A representation theorem is given which "rationalizes" such choice behavior as being as if the individual is "uncertain about future tastes." IN MANY PROBLEMS of individual choice, the choice is made in more than one stage. At early stages, the individual makes decisions which will constrain the choices that are feasible later. In effect, these early choices amount to choice of a subset of items from which subsequent choice will be made. This paper concerns choice among such opportunity sets, where the individual has a "desire for flexibility" which is "irrational" if the individual knows what his subsequent preferences will be.

Journal ArticleDOI
TL;DR: In this article, an econometric method for selecting macroeconomic policy rules when expectations are formed rationally is proposed, addressing the failure of conventional policy-evaluation methods to take account of business and consumer reactions to the policies formulated.
Abstract: The paper investigates an econometric method for selecting macroeconomic policy rules when expectations are formed rationally. A simple econometric model of the U.S. is estimated subject to a set of rational expectations restrictions using a minimum distance estimation technique. The estimated model is then used to calculate optimal monetary policy rules to stabilize fluctuations in output and inflation, and to derive a long run tradeoff between price stability and output stability which incorporates the rationally formed expectations. The optimal tradeoff curve is compared with actual U.S. price and output stability and with the results of a monetary policy rule with a constant growth rate of the money supply. A TROUBLESOME SHORTCOMING with contemporary methods of quantitative macroeconomic policy is the failure to take full account of business and consumer reactions to the policies formulated. This problem is characteristic of both policy simulation and formal optimal control techniques, each of which is based on reduced form econometric models in which output and price expectations are formed by fixed coefficient distributed lag structures. Since these lag structures show no direct relationship to government policy, the mechanisms generating expectations are in general inconsistent with the expectations of firms and consumers who are aware of this policy. Finding empirical methods to deal with this problem is potentially important for a number of reasons. The social welfare gains expected from plans which rely on unresponsive expectations are likely to be significantly cut short, and perhaps made perversely negative, as people learn about policy through observation. Proper policy formulation therefore requires either the difficult task of modelling how people learn about unannounced plans, or the apparently easier task of publicly announcing plans, assuming that these will be incorporated in peoples'







Journal ArticleDOI
TL;DR: In this article, it is proved that Groves' scheme is unique on restricted domains which are smoothly connected, in particular convex domains; this generalizes earlier uniqueness results by Green and Laffont and Walker.
Abstract: It is proved that Groves’ scheme is unique on restricted domains which are smoothly connected, in particular convex domains. This generalizes earlier uniqueness results by Green and Laffont and Walker. An example shows that uniqueness may be lost if the domain is not smoothly connected.
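
The object whose uniqueness the paper studies, a Groves scheme, can be illustrated on a toy problem. The example below is an assumption-laden sketch, not the paper's construction: a binary public decision with two agents, where each agent's transfer is the sum of the others' reported values at the chosen decision (the h_i term is omitted for simplicity), and truthful reporting is a dominant strategy on a small grid of valuations.

```python
import itertools

def groves_outcome(reports):
    """Choose the project iff reported total value is positive; each
    agent receives the sum of the OTHERS' reported values if chosen."""
    decide = sum(reports) > 0
    transfers = [(sum(reports) - r) * decide for r in reports]
    return decide, transfers

def utility(true_value, my_report, other_report):
    decide, transfers = groves_outcome([my_report, other_report])
    return true_value * decide + transfers[0]

# Truth-telling is dominant: no misreport ever beats reporting v_true,
# whatever the other agent reports.
grid = [-3, -2, -1, 0, 1, 2, 3]
for v_true, other in itertools.product(grid, grid):
    truthful = utility(v_true, v_true, other)
    for lie in grid:
        assert utility(v_true, lie, other) <= truthful

print("truth-telling is dominant on the grid")
```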


Journal ArticleDOI
TL;DR: In this paper, a simultaneous equations model of bid and offer functions for housing attributes (dwelling quality, dwelling size, and lot size) is estimated in order to account for the heterogeneity of the housing good.
Abstract: A simultaneous equations model of bid and offer functions for housing attributes (dwelling quality, dwelling size, and lot size) is estimated in order to account for the heterogeneity of the housing good. Estimation of a traditional, nonlinear hedonic price equation in the first stage provides a basis for calculating implicit prices for housing attributes which are used in the second stage simultaneous equations model. Empirical results confirm the theoretically expected negative coefficient for each attribute in its own bid function and the expected positive or zero coefficient for each attribute in its own offer function. Cross price relationships reveal a general pattern of complementarity in consumption of housing attributes.

Journal ArticleDOI
TL;DR: In this article, a class of statistical models which generate simultaneous equation models with both discrete and continuous endogenous variables is introduced, which can also be regarded as a new class of switching simultaneous equation models which are of general interest.
Abstract: In this paper, a class of statistical models which generate simultaneous equation models with both discrete and continuous endogenous variables is introduced. This class of models can also be regarded as a new class of switching simultaneous equation models which are of general interest. Identification and estimation problems are investigated. Several simple consistent two stage methods are proposed. The consistency of those estimators is proved. Two step maximum likelihood procedures are then developed.

Journal ArticleDOI
TL;DR: In this paper, the concept of a dominance solvable voting scheme is introduced, and voting by elimination is proved to be an anonymous dominance solvable voting procedure which always selects an efficient alternative.
Abstract: The concept of a dominance solvable voting scheme is presented as a weakening of the strategy-proofness requirement: it relies on successive elimination of dominated strategies and generalizes the well known concept of "sophisticated voting." Dominance solvable decision schemes turn out to contain many usual voting procedures such as voting by veto, kingmaker, and voting by binary choices. The procedure of voting by elimination is proved to be an anonymous dominance solvable voting scheme which always selects an efficient alternative.
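
The abstract's key device, successive elimination of dominated strategies, can be sketched for an arbitrary two-player game in strategic form. The payoff matrices below are a made-up dominance solvable game, not one of the paper's voting schemes; iterated deletion of strictly dominated rows and columns leaves a single outcome.

```python
def eliminate_dominated(R, C):
    """Iteratively delete strictly dominated rows (row player, payoffs R)
    and columns (column player, payoffs C); return surviving indices."""
    rows = list(range(len(R)))
    cols = list(range(len(R[0])))
    changed = True
    while changed:
        changed = False
        for i in list(rows):
            if any(all(R[k][j] > R[i][j] for j in cols)
                   for k in rows if k != i):
                rows.remove(i)
                changed = True
        for j in list(cols):
            if any(all(C[i][k] > C[i][j] for i in rows)
                   for k in cols if k != j):
                cols.remove(j)
                changed = True
    return rows, cols

# A dominance solvable 3x3 game: elimination leaves one strategy each.
R = [[5, 4, 0],
     [2, 6, 1],
     [1, 1, 9]]
C = [[4, 3, 1],
     [4, 3, 1],
     [0, 3, 1]]
print(eliminate_dominated(R, C))
```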



Journal ArticleDOI
TL;DR: In this paper, the authors investigate the properties of the winning bid in a sealed bid tender auction where each player has private information and find that it is possible for the winning bid to converge in probability to the true value of the object at auction, even though no bidder knows the true value.
Abstract: IN THIS PAPER we investigate the properties of the winning bid in a sealed bid tender auction where each player has private information. We find that it is possible for the winning bid to converge in probability to the true value of the object at auction, even though no bidder knows the true value. Necessary and sufficient conditions for this phenomenon are derived, extending and generalizing certain of Wilson's results [3]. We study an auction in which a seller offers to sell at the highest bid an item of unknown value V. The kth bidder receives a private signal Sk (for k = 1, 2, ...) and submits a bid without knowledge of the other signals. A finitely additive probability measure P reflects the bidders' unanimous beliefs about V and the signals. Conditional on V, the signals are independent and identically distributed. The signals take their values in some space S. With n bidders, a bidding strategy for k is a function b_nk: S -> R. k's strategy specifies that upon receiving the signal Sk, he shall bid b_nk(Sk). Thus the winning



Journal ArticleDOI
TL;DR: This article proposed the Gini coefficient of the censored income distribution truncated from above by the poverty line as an index of poverty, which was introduced by Professor Sen. In comparison with Sen's index, their alternative measure is simpler and more concerned with relative deprivation; it can be regarded as a more natural translation of the Gini coefficient from the measurement of inequality into that of poverty.
Abstract: This paper proposes the Gini coefficient of the censored income distribution truncated from above by the poverty line as an index of poverty. An ordinalist axiomatic approach, which was introduced by Professor Sen, is used to justify this measure. In comparison with Sen's index, our alternative measure is simpler and more concerned with relative deprivation; it can be regarded as a more natural translation of the Gini coefficient from the measurement of inequality into that of poverty.
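
The proposed index is mechanical enough to sketch directly: censor incomes at the poverty line z (everyone at or above z is assigned exactly z) and compute the Gini coefficient of the censored distribution. The income vectors below are made up for illustration.

```python
import numpy as np

def gini(x):
    """Gini coefficient via the mean absolute difference."""
    x = np.asarray(x, dtype=float)
    mad = np.abs(x[:, None] - x[None, :]).mean()
    return mad / (2 * x.mean())

def poverty_index(incomes, z):
    """Gini coefficient of the distribution censored at the poverty line."""
    censored = np.minimum(np.asarray(incomes, dtype=float), z)
    return gini(censored)

z = 10.0
print(poverty_index([12, 15, 20, 30], z))   # no one poor -> index 0
print(poverty_index([2, 6, 15, 30], z))     # positive when poverty exists
```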

Journal ArticleDOI
TL;DR: In this article, the authors investigated the notion of the Nash social welfare function and made a fundamental assumption that there exists a distinguished alternative called an origin, which represents one of the worst states for all individuals in the society.
Abstract: We investigate the notion of the Nash social welfare function and make a fundamental assumption that there exists a distinguished alternative called an origin, which represents one of the worst states for all individuals in the society. Under this assumption, in Sections 1 and 2, we formulate several rationality criteria that a reasonable social welfare function should satisfy. Then we introduce the Nash social welfare function and the Nash social welfare indices which are the images of the welfare function. The function is proved to satisfy the criteria. In Section 3 it is shown that the Nash social welfare function is the unique social welfare function that satisfies the criteria. Then, in Section 4, we examine two examples which display the plausibility of the welfare function.
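
A minimal sketch of the Nash social welfare function on a toy problem, with assumed data: one unit of a good is divided between two agents whose utilities are measured from an origin (here "receive nothing", utility 0). The Nash rule maximizes the product of utilities above the origin, and its choice is invariant to rescaling any individual's utility.

```python
import numpy as np

# Candidate divisions of one unit between two agents (grid search).
shares = np.linspace(0.001, 0.999, 999)

def nash_optimum(u1, u2):
    """Division maximizing the product of utilities above the origin."""
    welfare = u1(shares) * u2(shares)
    return shares[np.argmax(welfare)]

best = nash_optimum(lambda s: s, lambda s: 1 - s)
# Rescaling agent 2's utility by 100 leaves the chosen division unchanged.
best_rescaled = nash_optimum(lambda s: s, lambda s: 100 * (1 - s))
print(best, best_rescaled)   # both 0.5
```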