
Showing papers in "Econometrica in 1990"


Journal Article•DOI•
TL;DR: In this article, the authors consider estimation and hypothesis testing in linear time series models when some or all of the variables have unit roots and show that parameters that can be written as coefficients on mean zero, nonintegrated regressors have jointly normal asymptotic distributions, converging at the rate T^(1/2).

Abstract: This paper considers estimation and hypothesis testing in linear time series models when some or all of the variables have unit roots. Our motivating example is a vector autoregression with some unit roots in the companion matrix, which might include polynomials in time as regressors. In the general formulation, the variable might be integrated or cointegrated of arbitrary orders, and might have drifts as well. We show that parameters that can be written as coefficients on mean zero, nonintegrated regressors have jointly normal asymptotic distributions, converging at the rate T^(1/2). In general, the other coefficients (including the coefficients on polynomials in time) will have nonnormal asymptotic distributions. The results provide a formal characterization of which t or F tests-such as Granger causality tests-will be asymptotically valid, and which will have nonstandard limiting distributions.
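The nonstandard asymptotics can be seen in a small Monte Carlo sketch (illustrative, not the paper's general framework): the t-statistic for rho = 1 in a driftless AR(1) with a true unit root is not asymptotically N(0, 1) but follows the left-shifted Dickey-Fuller distribution.

```python
import numpy as np

def unit_root_tstat(T, rng):
    """t-statistic for rho = 1 in y_t = rho * y_{t-1} + e_t, via OLS of dy on y_lag."""
    y = np.cumsum(rng.standard_normal(T))      # random walk: the null is true
    ylag, dy = y[:-1], np.diff(y)
    b = ylag @ dy / (ylag @ ylag)              # OLS slope = rho_hat - 1
    resid = dy - b * ylag
    s2 = resid @ resid / (len(dy) - 1)
    return b / np.sqrt(s2 / (ylag @ ylag))

rng = np.random.default_rng(0)
stats = np.array([unit_root_tstat(500, rng) for _ in range(2000)])
# the simulated distribution is centered well below zero, unlike N(0, 1),
# so a nominal 5% one-sided normal test rejects too often
print(stats.mean(), np.mean(stats < -1.645))
```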

2,529 citations


Journal Article•DOI•
TL;DR: In this paper, an asymptotic theory for residual based tests for cointegration is developed. Power properties of the tests are also studied: the tests are consistent if suitably constructed, but the ADF and Z_t tests have slower rates of divergence under cointegration than the other tests.

Abstract: This paper develops an asymptotic theory for residual based tests for cointegration. Attention is given to the augmented Dickey-Fuller (ADF) test and the Z_alpha and Z_t unit root tests. Two new tests are also introduced. The tests are shown to be asymptotically similar, simple representations of their limiting distributions are given, and asymptotic critical values are tabulated. The ADF and Z_t tests are asymptotically equivalent. Power properties of the tests are also studied. The tests are consistent if suitably constructed, but the ADF and Z_t tests have slower rates of divergence under cointegration than the other tests. Copyright 1990 by The Econometric Society.
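A minimal numpy sketch of the residual-based idea (illustrative data, no lag augmentation): regress y on x, then run a Dickey-Fuller style regression on the residuals. Note the statistic must be compared with tabulated residual-based critical values, not the standard normal.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 500
x = np.cumsum(rng.standard_normal(T))     # I(1) regressor
y = 2.0 * x + rng.standard_normal(T)      # cointegrated: y - 2x is stationary

# Step 1: cointegrating regression; keep the residuals
X = np.column_stack([np.ones(T), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
e = y - X @ beta

# Step 2: Dickey-Fuller style regression on the residuals
de, elag = np.diff(e), e[:-1]
b = elag @ de / (elag @ elag)
resid = de - b * elag
s2 = resid @ resid / (len(de) - 1)
tstat = b / np.sqrt(s2 / (elag @ elag))
print(beta[1], tstat)   # slope near 2; large negative statistic rejects "no cointegration"
```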

2,012 citations


Journal Article•DOI•
TL;DR: In this article, a rich class of noncooperative games is studied, including models of oligopoly competition, macroeconomic coordination failures, arms races, bank runs, technology adoption and diffusion, R&D competition, pretrial bargaining, coordination in teams, and many others.
Abstract: We study a rich class of noncooperative games that includes models of oligopoly competition, macroeconomic coordination failures, arms races, bank runs, technology adoption and diffusion, R&D competition, pretrial bargaining, coordination in teams, and many others. For all these games, the sets of pure strategy Nash equilibria, correlated equilibria, and rationalizable strategies have identical bounds. Also, for a class of models of dynamic adaptive choice behavior that encompasses both best-response dynamics and Bayesian learning, the players' choices lie eventually within the same bounds. These bounds are shown to vary monotonically with certain exogenous parameters. WE STUDY THE CLASS of (noncooperative) supermodular games introduced by Topkis (1979) and further analyzed by Vives (1985, 1989), who also pointed out the importance of these games in industrial economics. Supermodular games are games in which each player's strategy set is partially ordered, the marginal returns to increasing one's strategy rise with increases in the competitors' strategies (so that the game exhibits "strategic complementarity") and, if a player's strategies are multidimensional, marginal returns to any one component of the player's strategy rise with increases in the other components. This class turns out to encompass many of the most important economic applications of noncooperative game theory. In macroeconomics, Diamond's (1982) search model and Bryant's (1983, 1984) rational expectations models can be represented as supermodular games. In each of these models, more activity by some members of the economy raises the returns to increased levels of activity by others. In oligopoly theory, some models of Bertrand oligopoly with differentiated products qualify as supermodular games. In these games, when a firm's competitors raise their prices, the marginal profitability of the firm's own price increase rises.
A similar structure is present in games of new technology adoption such as those of Dybvig and Spatt (1983), Farrell and Saloner (1986), and Katz and Shapiro (1986). When more users hook into a communication system or more manufacturers adopt an interface standard, the marginal return to others of doing the same often rises. Similarly, in some specifications of the bank runs model introduced by Diamond and Dybvig (1983), when more depositors withdraw their funds from a bank, it is more worthwhile for other depositors to do the same. In the warrant exercise
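Strategic complementarity can be sketched in a few lines with a differentiated-products Bertrand duopoly (illustrative linear-demand parameters, not from the paper): each firm's best response is increasing in the rival's price, and best-response iteration from the extremes of the strategy space converges monotonically to the same point, mirroring the bounds result above.

```python
def best_response(p_other, a=10.0, b=2.0, c=1.0, m=1.0):
    """Profit-maximizing price against rival price p_other for demand
    q = a - b*p + c*p_other (c > 0: prices are strategic complements)."""
    return (a + b * m + c * p_other) / (2 * b)

p_lo, p_hi = 0.0, 20.0            # start from the extremes of the price space
for _ in range(100):
    p_lo, p_hi = best_response(p_lo), best_response(p_hi)

# both sequences converge to the unique equilibrium (a + b*m) / (2b - c) = 4
print(p_lo, p_hi)
```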

1,795 citations


Report•DOI•
TL;DR: In this paper, the theory of precautionary saving is shown to be isomorphic to the Arrow-Pratt theory of risk aversion, making possible its application to the theory of optimal choice under risk. A measure of the strength of the precautionary saving motive, analogous to the Arrow-Pratt measure of risk aversion, is used to establish a number of new propositions about precautionary saving and to give a new interpretation of the Dreze-Modigliani substitution effect.
Abstract: The theory of precautionary saving is shown to be isomorphic to the Arrow-Pratt theory of risk aversion, making possible the application of a large body of knowledge about risk aversion to precautionary saving--and more generally, to the theory of optimal choice under risk. In particular, a measure of the strength of the precautionary saving motive analogous to the Arrow-Pratt measure of risk aversion is used to establish a number of new propositions about precautionary saving and to give a new interpretation of the Dreze-Modigliani substitution effect. Copyright 1990 by The Econometric Society.
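The precautionary motive can be illustrated in a two-period toy model (illustrative numbers, not the paper's framework): with CRRA utility, which has u''' > 0, a mean-preserving spread in second-period income raises optimal saving.

```python
import numpy as np

def crra(c, gamma=3.0):
    """CRRA utility; gamma > 1 implies u''' > 0, i.e. positive prudence."""
    return c ** (1.0 - gamma) / (1.0 - gamma)

def optimal_saving(incomes, probs, w=1.0, gamma=3.0):
    """Grid-search the s maximizing u(w - s) + E[u(s + y)]."""
    s = np.linspace(0.0, 0.95, 9501)
    expected = sum(p * crra(s + y, gamma) for p, y in zip(probs, incomes))
    return s[np.argmax(crra(w - s, gamma) + expected)]

s_certain = optimal_saving([1.0], [1.0])            # sure second-period income
s_risky = optimal_saving([0.5, 1.5], [0.5, 0.5])    # mean-preserving spread
print(s_certain, s_risky)   # the spread raises saving: the precautionary motive
```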

1,555 citations


Report•DOI•
TL;DR: In this article, the effects of the level and length of unemployment insurance benefits on unemployment durations are tested, with particular attention to individual behavior in the weeks just before benefits lapse. Higher benefits are found to have a strong negative effect on the probability of leaving unemployment, but that probability rises dramatically just before benefits lapse.
Abstract: This paper tests the effects of the level and length of unemployment insurance benefits on unemployment durations. The paper particularly studies individual behavior during the weeks just prior to when benefits lapse. Higher unemployment insurance benefits are found to have a strong negative effect on the probability of leaving unemployment. However, the probability of leaving unemployment rises dramatically just prior to when benefits lapse. Individual data are used with accurate information of spell durations, and the level and length of benefits. The semiparametric estimation techniques used in the paper yield more plausible estimates than conventional approaches and provide useful diagnostics. Copyright 1990 by The Econometric Society.

1,367 citations


Journal Article•DOI•
TL;DR: In this paper, the authors investigate pure strategy sequential equilibria of repeated games with imperfect monitoring and show that the static optimization problems embedded in extremal equilibria have solutions with a "bang-bang" property, which affords a significant simplification of the equilibria that need be considered.
Abstract: This paper investigates pure strategy sequential equilibria of repeated games with imperfect monitoring. The approach emphasizes the equilibrium value set and the static optimization problems embedded in extremal equilibria. A succession of propositions, central among which is "self-generation," allow properties of constrained efficient supergame equilibria to be deduced from the solutions of the static problems. We show that the latter include solutions having a "bang-bang" property; this affords a significant simplification of the equilibria that need be considered. These results apply to a broad class of asymmetric games, thereby generalizing our earlier work on optimal cartel equilibria. The bang-bang theorem is strengthened to a necessity result: under certain conditions, efficient sequential equilibria have the property that after every history, the value to players of the remainder of the equilibrium must be an extreme point of the equilibrium value set. General implications of the self-generation and bang-bang propositions include a proof of the monotonicity of the equilibrium average value set in the discount factor, and an iterative procedure for computing the value set.

1,013 citations


Report•DOI•
TL;DR: In this paper, the authors use the GARCH model to specify heteroskedasticity across intra-daily market segments and, with a volatility analogue of a vector autoregression, examine the impact of news in one market on the time path of per-hour volatility in other markets; the empirical evidence is generally against the null hypothesis of the heat wave.
Abstract: This paper seeks to explain the causes of volatility clustering in exchange rates. Careful examination of intra-daily exchange rates provides a test of two hypotheses-heat waves and meteor showers. The heat wave hypothesis is that the volatility has only country-specific autocorrelation. Alternatively, the meteor shower is a phenomenon of intra-daily volatility spillovers from one market to the next. Using the GARCH model to specify the heteroskedasticity across intra-daily market segments, we find that the empirical evidence is generally against the null hypothesis of the heat wave. Using a volatility type of vector autoregression we examine the impact of news in one market on the time path of per-hour volatility in other markets.
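The volatility clustering the paper sets out to explain is the defining feature of the GARCH process it uses. A single-series sketch (illustrative parameters; the paper works with intra-daily market segments) simulates the GARCH(1,1) variance recursion and checks that squared returns are autocorrelated:

```python
import numpy as np

rng = np.random.default_rng(2)
T, omega, alpha, beta = 20_000, 0.05, 0.10, 0.85
r = np.empty(T)
h = omega / (1 - alpha - beta)        # start at the unconditional variance
for t in range(T):
    r[t] = np.sqrt(h) * rng.standard_normal()
    h = omega + alpha * r[t] ** 2 + beta * h   # GARCH(1,1) recursion

sq = r ** 2
acf1 = np.corrcoef(sq[:-1], sq[1:])[0, 1]
print(acf1)   # positive: calm and turbulent periods cluster together
```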

825 citations


Journal Article•DOI•
TL;DR: In this article, the authors consider the identifiability of the Roy model from data on earnings distributions. They show that the normal theory version is identifiable without regressors or exclusion restrictions, and that with sufficient price variation the model can be identified from multimarket data.
Abstract: This paper clarifies and extends the classical Roy model of self selection and earnings inequality. The original Roy model, based on the assumption that log skills are normally distributed, is shown to imply that pursuit of comparative advantage in a free market reduces earnings inequality compared to the earnings distribution that would result if workers were randomly assigned to sectors. Aggregate log earnings are right skewed even if one sectoral distribution is left skewed. Most major implications of the log normal Roy model survive if differences in skills are log concave. However few implications of the model survive if skills are generated from more general distributions. We consider the identifiability of the Roy model from data on earnings distributions. The normal theory version is identifiable without regressors or exclusion restrictions. Sectoral distributions can be identified knowing only the aggregate earnings distribution. For general skill distributions, the model is not identified and has no empirical content. With sufficient price variation, the model can be identified from multimarket data. Cross-sectional variation in regressors can substitute for price variation in restoring empirical content to the Roy model.
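The inequality-reducing effect of self-selection can be checked in a toy parameterization (independent standard normal log skills, equal skill prices; illustrative, not the paper's general setup): each worker taking the max of two draws has lower variance than a random assignment, since var(max) = 1 - 1/pi < 1 for two independent standard normals.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
z = rng.standard_normal((n, 2))   # log skills in the two sectors

self_selected = z.max(axis=1)     # comparative advantage: enter the sector paying more
coin = rng.integers(0, 2, size=n)
random_assign = z[np.arange(n), coin]

print(self_selected.var(), random_assign.var())
# selection compresses the log-earnings distribution relative to random assignment
```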

765 citations


Report•DOI•
TL;DR: In this article, the authors present a new model of retirement and uses it to estimate the effects of pension plan provisions on the departure rates of older salesmen from a large Fortune 500 firm.
Abstract: The effects of firm pension plan provisions on the retirement decisions of older employees are analyzed. The empirical results are based on data from a large firm, with a typical defined benefit pension plan. The "option value" of continued work is the central feature of the analysis. Estimation relies on a retirement decision rule that is close in spirit to the dynamic programming rule but is considerably less complex than a comprehensive implementation of that rule, thus greatly facilitating the numerical analysis. THE TYPICAL FIRM PENSION PLAN presents very large incentives to retire from the firm at an early age, often as young as 55. Although the labor supply effects of Social Security provisions have been the subject of a great deal of analysis, much less attention has been directed to the implications of firm pension plans. Yet the retirement inducements in the provisions of firm plans are typically much greater than the incentives inherent in Social Security benefit formulae, as demonstrated by Kotlikoff and Wise (1985, 1987). Indeed, the provisions of most firm plans are at odds with the planned increase in the Social Security retirement age; private plans encourage early retirement. This paper presents a new model of retirement and uses it to estimate the effects of pension plan provisions on the departure rates of older salesmen from a large Fortune 500 firm. An important goal is to develop a model that can be used to predict the effects on retirement of potential changes in pension plan provisions. The analysis is based on longitudinal personnel records from the firm. The option value of continued work is the central feature of the model. Pension plan provisions typically provide a large bonus if the employee works until a certain age, often the early retirement age, and then a substantial inducement to leave thereafter. Employees who retire later may do so under less advantageous conditions.
If the employee retires before the early retirement age, the option of a later bonus is lost. Continuing to work preserves the option of retiring later, hence the terminology: the "option value" of work. The provisions of firm pension plans that have motivated our work are described in the next section. The option value model is described in Section 2. Results are presented and the model fit is discussed in Section 3. Simulations of illustrative potential changes in pension plan provisions are presented in Section 4. A summary and conclusions are in Section 5.

750 citations


Journal Article•DOI•
TL;DR: In this article, the authors analyzed a competitive credit market where banks use imperfect and independent tests to assess the ability of a potential creditor to repay credit, and they showed that in a situation where all banks charge the same interest rate, a bank always has the incentive to undercut in order to improve the average creditworthiness of its own clientele.
Abstract: This paper analyzes a competitive credit market where banks use imperfect and independent tests to assess the ability of a potential creditor to repay credit. The banks compete by announcing interest rates at which they will provide credit to those applicants who pass the banks' tests. The proportion of applicants who pass the test of at least one bank increases with the number of banks providing credit, so the average creditworthiness decreases. It is then shown that in a situation where all banks charge the same interest rate, a bank always has the incentive to undercut in order to improve the average creditworthiness of its own clientele. This feature represents the major difference from the situations in standard Bertrand and Bertrand-Edgeworth models. Copyright 1990 by The Econometric Society.

582 citations


Report•DOI•
TL;DR: In this article, it is shown that optimal consumption is not a smooth function of wealth; it is optimal for the consumer to wait until a large change in wealth occurs before adjusting his consumption.
Abstract: We analyze a model of optimal consumption and portfolio selection in which consumption services are generated by holding a durable good. The durable good is illiquid in that a transaction cost must be paid when the good is sold. It is shown that optimal consumption is not a smooth function of wealth; it is optimal for the consumer to wait until a large change in wealth occurs before adjusting his consumption. As a consequence, the consumption based capital asset pricing model fails to hold. Nevertheless, it is shown that the standard, one factor, market portfolio based capital asset pricing model does hold in this environment. It is shown that the optimal durable level is characterized by three numbers (not random variables), say x, y, and z (where x < y < z). The consumer views the ratio of consumption to wealth (c/W) as his state variable. If this ratio is between x and z, then he does not sell the durable. If c/W is less than x or greater than z, then he sells his durable and buys a new durable of size S so that S/W = y. Thus y is his "target" level of c/W. If the stock market moves up enough so that c/W falls below x, then he sells his small durable to buy a larger durable. However, there will be many changes in the value of his wealth for which c/W stays between x and z, and thus consumption does not change. Numerical simulations show that small transactions costs can make consumption changes occur very infrequently. Further, the effect of consumption transactions costs on the demand for risky assets is substantial.
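The band policy described above can be sketched in a few lines (the band (x, y, z) and the wealth process below are illustrative, not the paper's calibration): wealth moves every period, but the durable is reset to the target ratio only when c/W leaves the band, so adjustments are infrequent.

```python
import numpy as np

rng = np.random.default_rng(4)
x, y, z = 0.03, 0.04, 0.053       # hypothetical band for c/W
W, c = 100.0, 4.0                  # start at the target ratio c/W = y
adjustments = 0
for _ in range(10_000):
    W *= np.exp(0.01 * rng.standard_normal())   # wealth shocks move c/W around
    if not (x <= c / W <= z):      # act only when the ratio leaves [x, z]
        c = y * W                  # sell the durable, buy one at the target ratio
        adjustments += 1

print(adjustments)   # consumption adjusts only occasionally over 10,000 periods
```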

Journal Article•DOI•
TL;DR: In this paper, the authors modify the standard principal-agent model with moral hazard by allowing the contract to be renegotiated after the agent's choice of action and before the observation of the action's consequences.

Abstract: The authors modify the standard principal-agent model with moral hazard by allowing the contract to be renegotiated after the agent's choice of action and before the observation of the action's consequences. In equilibrium, the agent randomizes over effort levels. The optimal contract gives the agent a menu of compensation schemes: safe ones intended for low-effort workers and risky ones for those whose effort is high. The optimal contract may give the agent a positive rent, in contrast to the case without renegotiation. Copyright 1990 by The Econometric Society.

Journal Article•DOI•
TL;DR: The main objective of this paper is to determine which aspects of life cycle fertility, if any, are sensitive to male income and female wages.
Abstract: "This paper estimates semiparametric reduced-form neoclassical models of life-cycle fertility in Sweden.... The estimated model integrates aspects of life cycle fertility that have previously been studied in isolation of each other: completed fertility, childlessness, interbirth intervals, and the time series of annual birth rates. The main objective of this paper is to determine which aspects of life cycle fertility, if any, are sensitive to male income and female wages."

Journal Article•DOI•
TL;DR: In this paper, the principal-agent relationship when the principal has private information was analyzed as a three-stage non-cooperative game: contract proposal, acceptance/refusal, and contract execution.
Abstract: We analyze the principal-agent relationship when the principal has private information as a three-stage noncooperative game: contract proposal, acceptance/refusal, and contract execution. We assume that the information does not directly affect the agent's payoff (private values). Equilibrium exists and is generically locally unique. Moreover, it is Pareto optimal for the different types of principal. The principal generically does strictly better than when the agent knows her information. Equilibrium allocations are the Walrasian equilibria of an "economy" where the traders are different types of principal and "exchange" the slack on the agent's individual rationality and incentive compatibility constraints.

Journal Article•DOI•
TL;DR: In this paper, it was shown that any conditional moment test of functional form of nonlinear regression models can be converted into a chi-square test that is consistent against all deviations from the null hypothesis that the model represents the conditional expectation of the dependent variable relative to the vector of regressors.
Abstract: In this paper, it will be shown that any conditional moment test of functional form of nonlinear regression models can be converted into a chi-square test that is consistent against all deviations from the null hypothesis that the model represents the conditional expectation of the dependent variable relative to the vector of regressors. Copyright 1990 by The Econometric Society.
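The conversion of conditional moment restrictions into a chi-square statistic can be sketched as follows (illustrative data and moment functions; for brevity this ignores the correction for the estimated regression parameters that the paper's construction handles): stack sample moments of residuals times functions of the regressors and form a quadratic form in their inverse covariance.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 2000
x = rng.standard_normal(n)
y = 1.0 + 2.0 * x + rng.standard_normal(n)   # the null: the linear model is correct

X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
e = y - X @ beta

W = np.column_stack([x**2, x**3])            # hypothetical extra moment functions w(x)
m = W.T @ e / n                               # sample moments E[e * w(x)]
V = (W * e[:, None]**2).T @ W / n             # (uncorrected) covariance estimate
stat = n * m @ np.linalg.solve(V, m)          # chi-square statistic, 2 d.o.f.
print(stat)   # small under the null; compare with chi-square(2) critical values
```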

Journal Article•DOI•
TL;DR: In this paper, the authors show that, starting from an arbitrary matching, the process of allowing randomly chosen blocking pairs to match will converge to a stable matching with probability one, and that every stable matching can arise in this way.

Abstract: EMPIRICAL STUDIES OF TWO SIDED MATCHING have so far concentrated on markets in which certain kinds of market failures were addressed by resorting to centralized, deterministic matching procedures. Loosely speaking, the results of these studies are that those centralized procedures which achieved stable outcomes resolved the market failures, while those markets organized through procedures that yielded unstable outcomes continued to fail. So the market failures seem to be associated with instability of the outcomes. But many entry-level labor markets and other two-sided matching situations don't employ centralized matching procedures, and yet aren't observed to experience such failures. So we can conjecture that at least some of these markets may reach stable outcomes by means of decentralized decision making. And decentralized decision making in complex environments presumably introduces some randomness into what matchings are achieved. However, as far as we are aware, no nondeterministic models leading to stable outcomes have yet been studied. The present paper demonstrates that, starting from an arbitrary matching, the process of allowing randomly chosen blocking pairs to match will converge to a stable matching with probability one. (This resolves an open question raised by Knuth (1976), who showed that such a process may cycle.) Furthermore, every stable matching can arise in this way.
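The random process can be sketched directly (a small marriage market with random preferences; illustrative, and the iteration cap is a safety device, since convergence is almost sure rather than guaranteed in any fixed number of steps): repeatedly satisfy a randomly chosen blocking pair, freeing the displaced partners, until no blocking pair remains.

```python
import random

def rank(pref, partner):
    """Position of partner in pref; being single ranks below everyone."""
    return pref.index(partner) if partner is not None else len(pref)

def blocking_pairs(m_match, w_match, m_pref, w_pref):
    """All (m, w) who each prefer the other to their current situation."""
    return [(m, w) for m in m_pref for w in m_pref[m]
            if m_pref[m].index(w) < rank(m_pref[m], m_match[m])
            and w_pref[w].index(m) < rank(w_pref[w], w_match[w])]

random.seed(5)
n = 4
m_pref = {m: random.sample(range(n), n) for m in range(n)}
w_pref = {w: random.sample(range(n), n) for w in range(n)}
m_match = {m: None for m in range(n)}
w_match = {w: None for w in range(n)}

steps = 0
while steps < 100_000:
    blocks = blocking_pairs(m_match, w_match, m_pref, w_pref)
    if not blocks:
        break                          # no blocking pair left: the matching is stable
    m, w = random.choice(blocks)       # satisfy a randomly chosen blocking pair
    if m_match[m] is not None:
        w_match[m_match[m]] = None     # m's old partner becomes single
    if w_match[w] is not None:
        m_match[w_match[w]] = None     # w's old partner becomes single
    m_match[m], w_match[w] = w, m
    steps += 1

print(steps, m_match)   # the random process has reached a stable matching
```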

Report•DOI•
TL;DR: In this paper, new results on the exact small sample distribution of the instrumental variable estimator are presented and compared with its asymptotic distribution.

Abstract: We present new results on the exact small sample distribution of the instrumental variable estimator. In particular, we compare the small sample distribution to the asymptotic distribution.

Journal Article•DOI•
TL;DR: In this article, a characterization of optimal strategies for playing repeated coordination games whose players have identical preferences is proposed, where players' optimal coordination strategies reflect their uncertainty about how their partners will respond to multiple-equilibrium problems; this uncertainty constrains the statistical relationships between their strategy choices that players can bring about.

Abstract: This paper proposes a characterization of optimal strategies for playing certain repeated coordination games whose players have identical preferences. Players' optimal coordination strategies reflect their uncertainty about how their partners will respond to multiple-equilibrium problems; this uncertainty constrains the statistical relationships between their strategy choices that players can bring about. The authors show that optimality is nevertheless consistent with subgame-perfect equilibrium. Examples are analyzed in which players use precedents as focal points to achieve and maintain coordination, and in which they play dominated strategies with positive probability in early stages in the hope of generating a useful precedent. Copyright 1990 by The Econometric Society.

Journal Article•DOI•
TL;DR: A number of authors have shown that competitive economies may possess "sunspot equilibria" as mentioned in this paper, i.e., rational expectations equilibria in which purely extrinsic uncertainty affects equilibrium prices and allocations.

Abstract: A NUMBER OF AUTHORS have shown that competitive economies may possess "sunspot equilibria", that is, rational expectations equilibria in which purely extrinsic uncertainty affects equilibrium prices and allocations. Such results demonstrate that it does not require a lack of faith in the rationality of market participants to believe that competitive markets may be subject to purely speculative fluctuations, driven solely by expectations. The mere existence of sunspot equilibria as solutions to a system of market-clearing conditions, however, might not be judged sufficient to indicate that competitive markets with rational participants could ever be subject to speculative instability. The sunspot equilibria represent states of affairs in which agents act differently in the case of different realizations of the "sunspot" variable, and it is rational for each agent to do so. But it might be thought unlikely that the beliefs of all the participants in the market could ever come to be coordinated so as to bring about an equilibrium of that kind. It is rational to believe that sunspots convey information about future states of affairs once the economy is in a sunspot equilibrium, but why would rational agents ever begin to believe in such a thing, so as to create the conditions under which the belief is rational? In order to address such a question, one must go beyond the mere statement of the conditions for equilibrium and discussion of what states of affairs satisfy them; one must specify an explicit dynamic process according to which the beliefs of agents adjust when out of equilibrium. Any exercise of this kind is necessarily unsatisfactory, as there is no univocal meaning for the postulate of "rational" behavior outside of an equilibrium. It may well be the case that

Journal Article•DOI•
TL;DR: In this article, it is shown that nonparametric estimates of the optimal instruments can give asymptotically efficient instrumental variables estimators for nonlinear models in an i.i.d. environment.
Abstract: This paper considers asymptotically efficient instrumental variables estimation of nonlinear models in an i.i.d. environment. The class of models includes nonlinear simultaneous equations models and other models of interest. A problem in constructing efficient instrumental variables estimators for such models is that the optimal instruments involve a conditional expectation, calculation of which can require functional form assumptions for the conditional distribution of endogenous variables, as well as integration. Nonparametric methods provide a way of avoiding this difficulty. Here it is shown that nonparametric estimates of the optimal instruments can give asymptotically efficient instrumental variables estimators. Also, ways of choosing the nonparametric estimate in applications are discussed. Two types of nonparametric estimates of the optimal instruments are considered. Each involves nonparametric regression, one by nearest neighbor and the other by series approximation. The finite sample properties of the estimators are considered in a small sampling experiment involving an endogenous dummy variable model.
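A toy sketch of the nearest-neighbor idea in a linear-in-parameters model with a nonlinear first stage (the data-generating process, the choice k = 50, and the simple just-identified IV formula are all illustrative assumptions, not the paper's full construction): estimate E[x | z] by averaging the x values of the k nearest neighbors in z, then use the fitted values as the instrument.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 2000
zvar = rng.standard_normal(n)
u = rng.standard_normal(n)
xvar = np.sin(zvar) + 0.5 * u             # endogenous regressor, nonlinear in z
yvar = 1.0 + 2.0 * xvar + u               # u appears in both equations

# nearest-neighbor estimate of E[x | z], used as the instrument
k = 50
xhat = np.array([xvar[np.argpartition(np.abs(zvar - zi), k)[:k]].mean()
                 for zi in zvar])

X = np.column_stack([np.ones(n), xvar])
Z = np.column_stack([np.ones(n), xhat])
beta_iv = np.linalg.solve(Z.T @ X, Z.T @ yvar)
beta_ols = np.linalg.lstsq(X, yvar, rcond=None)[0]
print(beta_iv[1], beta_ols[1])   # IV is close to 2; OLS is biased upward
```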

Journal Article•DOI•
TL;DR: In this article, the authors derived asymptotic distributions for the ordinary least squares estimate of a first order autoregression model when the series is fractionally integrated and introduced the fractional unit root distribution to describe the limiting distribution.
Abstract: Asymptotic distributions are derived for the ordinary least squares estimate of a first order autoregression model when the series is fractionally integrated. The fractional unit root distribution is introduced to describe the limiting distribution. The unit root distribution is shown to be an atypical member of this family because its density is nonzero over the entire real line. For -1/2 Copyright 1990 by The Econometric Society.
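In the stationary range of the memory parameter, the OLS AR(1) estimate for a fractionally integrated series converges not to 1 but to the first autocorrelation, d / (1 - d). A seeded sketch with illustrative parameters (d = 0.3, so the target is 3/7), building the series from the MA weights of (1 - L)^(-d):

```python
import numpy as np

rng = np.random.default_rng(6)
T, d = 5000, 0.3
# MA weights of (1 - L)^(-d): psi_0 = 1, psi_k = psi_{k-1} * (k - 1 + d) / k
psi = np.ones(T)
for k in range(1, T):
    psi[k] = psi[k - 1] * (k - 1 + d) / k

e = rng.standard_normal(2 * T)
series = np.convolve(e, psi)[:2 * T][T:]     # drop a burn-in of length T

rho_hat = series[1:] @ series[:-1] / (series[:-1] @ series[:-1])
print(rho_hat)   # near d / (1 - d), well below 1: not a true unit root
```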

Journal Article•DOI•
TL;DR: In this paper, the authors take the payoffs of economic problems that lead to discontinuous games to be only partially determined, and let the sharing rule at points of discontinuity be determined endogenously, as part of the solution to the model; their principal result is that a game with an endogenous sharing rule always admits a solution.
Abstract: University of California, Berkeley, Department of Economics, Working Paper 8756: "Discontinuous Games and Endogenous Sharing Rules," by Leo K. Simon and William R. Zame. Key words: discontinuous games, existence, Nash equilibrium, Hotelling, Bertrand, sharing rules, rationing. We propose a different approach to the kinds of economic problems that lead to discontinuous games. We take the view that the underlying payoffs for these problems are only partially determined, rather than discontinuous. At points where ties occur, we propose that the sharing rule should be determined endogenously, i.e., as part of the solution to the model rather than as part of the description of the model. This leads us to define a game with an endogenous sharing rule. It consists of a strategy space for each of a finite number of players, together with a payoff correspondence, interpreted as the union of all possible sharing rules. A solution for such a game is a selection from the payoff correspondence together with a strategy profile satisfying the usual (Nash) best response criterion. Our principal result is that such a solution always exists.

Journal Article•DOI•
TL;DR: In this article, the authors extend Maskin's results on Nash implementation to the case of three or more agents and derive simpler sufficiency conditions that are applicable in a wide variety of economic environments.
Abstract: The authors extend E. Maskin's results on Nash implementation. First, they establish a condition that is both necessary and sufficient for Nash implementability if there are three or more agents (the case covered by Maskin's sufficiency result). Second--and more important--they examine the two-agent case (for which there existed no general sufficiency results). The two-agent model is the leading case for applications to contracting and bargaining. For this case, too, they establish a condition that is both necessary and sufficient. The authors use their theorems to derive simpler sufficiency conditions that are applicable in a wide variety of economic environments. Copyright 1990 by The Econometric Society.

Journal Article•DOI•
TL;DR: In this article, the authors report the results from laboratory asset markets designed to test the rational expectations hypothesis that markets aggregate and transmit the information of differentially informed traders, and examine which features of their environment are necessary or sufficient to achieve a rational expectations equilibrium.

Abstract: In this study, the authors report the results from laboratory asset markets designed to test the rational expectations hypothesis that markets aggregate and transmit the information of differentially informed traders. After documenting evidence in favor of the rational expectations model, they examine which features of their environment are necessary or sufficient to achieve a rational expectations equilibrium. The authors find that trading experience and common knowledge of dividends are jointly sufficient to achieve a rational expectations equilibrium, but that neither is a sufficient condition by itself. They also present some stylized facts about the convergence process leading to a rational expectations equilibrium. Copyright 1990 by The Econometric Society.

Journal Article•DOI•
TL;DR: In this paper, the author analyzes three different versions of expected utility for two-stage lotteries and suggests several different compound dominance axioms as possible replacements of the reduction axiom.
Abstract: Preference relations over two-stage lotteries are analyzed. Empirical evidence indicates that decision makers do not always behave in accordance with the reduction of compound lotteries axiom, but they do seem to satisfy a compound independence axiom. Although the reduction and compound independence axioms, together with continuity, imply expected utility theory, each of them by itself is compatible with all possible preference relations over simple lotteries. Using these axioms, the author analyzes three different versions of expected utility for two-stage lotteries and suggests several compound dominance axioms, strictly weaker than the reduction of compound lotteries axiom, as possible replacements for it. Copyright 1990 by The Econometric Society.
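The reduction of compound lotteries axiom can be illustrated with a small sketch: collapsing a two-stage lottery into its single-stage equivalent by multiplying through the stage probabilities. The function and example below are hypothetical illustrations, not from the paper; the paper's point is precisely that decision makers need not treat the two forms as equivalent.

```python
from fractions import Fraction

def reduce_compound(stages):
    """Collapse a two-stage lottery into a single-stage lottery.

    `stages` is a list of (prob, simple_lottery) pairs, where each
    simple_lottery maps outcomes to probabilities.  Under the reduction
    axiom, the decision maker is indifferent between the compound
    lottery and this reduced form.
    """
    reduced = {}
    for p, lottery in stages:
        for outcome, q in lottery.items():
            reduced[outcome] = reduced.get(outcome, Fraction(0)) + Fraction(p) * Fraction(q)
    return reduced

# Example: a 50/50 chance of either a sure $100 or a 50/50 bet on $100 vs $0.
compound = [
    (Fraction(1, 2), {100: Fraction(1)}),
    (Fraction(1, 2), {100: Fraction(1, 2), 0: Fraction(1, 2)}),
]
print(reduce_compound(compound))  # {100: Fraction(3, 4), 0: Fraction(1, 4)}
```

Compound independence, by contrast, constrains preferences over the two-stage objects directly, without requiring this collapse.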

Journal Article•DOI•
TL;DR: In this article, the authors compare two methods for a monopolist to sell information to traders in a financial market and show that the optimal selling method depends on how much information is revealed by equilibrium prices.
Abstract: The authors compare two methods for a monopolist to sell information to traders in a financial market. In a direct sale, information buyers observe versions of the seller's signal while in an indirect sale the seller sells shares in a portfolio based on his private information. It is shown that, when traders are identical and pricing is linear, there is a trade-off between optimal surplus extraction that is possible under direct sale and more effective control of the usage of information that is possible under indirect sale. The optimal selling method depends on how much information is revealed by equilibrium prices. Copyright 1990 by The Econometric Society.

Journal Article•DOI•
TL;DR: In this article, the authors contrast an equilibrium search model based on Albrecht and Axell (1984) with an unrestricted model built on a mixture of negative binomial distributions, and show that the duration data do not support the existence of significant heterogeneity.
Abstract: The equilibrium model is adapted from Albrecht and Axell (1984) and is based on workers who are homogeneous in terms of market productivity and heterogeneous in terms of nonmarket productivity, and on firms which are heterogeneous in terms of productive efficiency. The equilibrium model is contrasted, in terms of its fit to the data, with an unrestricted model based on a mixture of negative binomial distributions. The equilibrium model fails to conform to the data in exactly the dimension of its major focus: it implies that measurement error accounts for almost all of the dispersion in observed wages. The equilibrium model also does not fit the unemployment duration distribution as well as the unrestricted model. The problem is that the duration distribution itself does not support the existence of significant heterogeneity, as evidenced by the estimates of the unrestricted model. The paper also illustrates the use of such models for policy analysis by simulating the welfare effects of a minimum wage.

Journal Article•DOI•
TL;DR: In this article, the authors re-examine the data from O'Neill's experiment involving a repeated, two-person, constant-sum game and find no evidence that players' behavior approached minimax behavior as players became more experienced.
Abstract: The authors re-examine the data from O'Neill's experiment involving a repeated, two-person, constant-sum game. They find that there is less evidence in support of the minimax hypothesis than indicated by O'Neill. There is strong evidence of serial correlation in players' choices, with several players displaying statistically significant dependence on the past moves of their opponents. They interpret this finding as evidence that the players themselves rejected minimax play as the appropriate model of their opponents' behavior. They find no evidence that players' behavior approached minimax behavior as players became more experienced.
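The serial-correlation evidence can be illustrated with a toy check: under minimax play in a repeated constant-sum game, each player's choices should be i.i.d. draws from a fixed mixed strategy, so a lag-1 autocorrelation z-statistic should be near zero. The statistic and sequences below are a hypothetical stand-in for the tests actually used in the paper.

```python
def lag1_autocorr_z(choices):
    """Z-statistic for lag-1 serial correlation in a 0/1 choice sequence.

    Under i.i.d. play from a fixed mixed strategy, the sample lag-1
    autocorrelation r1 is approximately N(0, 1/n), so z = r1 * sqrt(n)
    and |z| > 1.96 suggests serially dependent choices at the 5% level.
    """
    n = len(choices)
    mean = sum(choices) / n
    var = sum((c - mean) ** 2 for c in choices) / n
    cov = sum((choices[t] - mean) * (choices[t + 1] - mean)
              for t in range(n - 1)) / n
    return (cov / var) * n ** 0.5

# Hypothetical choice sequences, not data from the experiment:
alternating = [t % 2 for t in range(200)]     # systematic switching
streaky = [(t // 3) % 2 for t in range(198)]  # runs of three repeats
print(lag1_autocorr_z(alternating))  # strongly negative (about -14)
print(lag1_autocorr_z(streaky))      # strongly positive (about +4.8)
```

A player whose choices react to the opponent's past moves would likewise show significant cross-correlation, which is the pattern the paper documents.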

Journal Article•DOI•
TL;DR: In this paper, an empirical investigation of equilibrium restrictions on household consumption and male labor supply is presented, which exploits a simple factor structure, rationalized by two assumptions, that household allocations are Pareto optimal and that the labor market is competitive.
Abstract: This paper is an empirical investigation of equilibrium restrictions on household consumption and male labor supply. It exploits a simple factor structure, rationalized by two assumptions, that household allocations are Pareto optimal and that the labor market is competitive. The paper estimates household preferences, and tests how well this parsimonious factor structure represents panel data on married couples and time series data on asset returns. Most of the estimates are roughly comparable to those found in previous work; no evidence against the simple factor representation is found and the intertemporal capital asset pricing model is not rejected. Copyright 1990 by The Econometric Society.

Journal Article•DOI•
TL;DR: In this paper, the authors compare the predictions of a theoretical model of a common knowledge inference process with actual behavior, and find that the theoretical model roughly predicts the observed behavior, although the actual inference process is clearly less efficient than the theoretical standard.
Abstract: This paper reports on an experimental study of the way in which individuals make inferences from publicly available information. We compare the predictions of a theoretical model of a common knowledge inference process with actual behavior. In the theoretical model, "perfect Bayesians," starting with private information, take actions; an aggregate statistic is made publicly available; the individuals perform optimal Bayesian updating and take new actions; and the process continues until there is a common knowledge equilibrium with complete information pooling. We find that the theoretical model roughly predicts the observed behavior, but the actual inference process is clearly less efficient than the standard of the theoretical model, and while there is some pooling, it is incomplete.
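The inference process described above can be sketched in a deliberately simplified Gaussian setting. This is an illustrative assumption, not the paper's experimental design: here the public average happens to be a sufficient statistic, so pooling completes in one round, whereas the experiment's subjects pooled only incompletely.

```python
import random
import statistics

def simulate_pooling(theta, n_agents, sigma, prior_mean=0.0, prior_var=1.0, seed=1):
    """One round of the public-statistic process in a toy Gaussian model.

    Each agent sees a private signal theta + noise, posts a posterior
    mean, and the average posted mean is made public.  Because that
    average is an invertible function of the average signal, which is
    sufficient for the pooled sample, beliefs reach the full-information
    posterior after a single announcement in this stylized setting.
    """
    rng = random.Random(seed)
    signals = [theta + rng.gauss(0, sigma) for _ in range(n_agents)]
    # Round 1: each agent updates the common prior on one private signal.
    w = prior_var / (prior_var + sigma ** 2)
    private_posts = [prior_mean + w * (s - prior_mean) for s in signals]
    public_avg = statistics.mean(private_posts)
    # Agents invert the public average to recover the mean signal...
    mean_signal = prior_mean + (public_avg - prior_mean) / w
    # ...and update as if they had observed all n signals (precisions add).
    w_all = (n_agents / sigma ** 2) / (1 / prior_var + n_agents / sigma ** 2)
    pooled_post = prior_mean + w_all * (mean_signal - prior_mean)
    return private_posts, pooled_post

posts, pooled = simulate_pooling(theta=0.7, n_agents=400, sigma=1.0)
print(round(pooled, 2))  # close to the true theta of 0.7
```

The experiment asks, in effect, whether real subjects carry out the inversion-and-update step that "perfect Bayesians" perform here; the paper's finding is that they do so only imperfectly.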